- The brain consumes about 12 W of power. Ideally, artificial simulations of the brain should respect this energy budget, but that seems far from possible: the individual elements used in artificial intelligence consume far more power, and the factors involved run into the thousands.
- It should be parallel in nature, with neuron firings modeled like thread initiations that propagate along dendrites and synapses. If that is not feasible, the model should handle the scheduling within a single thread of operation.
- It should be stack-less, with no function-call unwinding.
- The brain has about 100 billion neurons.
- The fanout (connections with other neurons) is between 1,000 and 10,000 (others report up to 20,000).
- It's not so much the working neural network that is interesting, but the DNA, the construction rules, that generates the working neural network. That is, it is more interesting to come up with rules that determine how a network is to be built than to build a network that works without being able to reconstruct it elsewhere.
- How can the state of the network be observed in an intelligent way, so that conclusions can be deduced from the observation? (And does the network itself need to do so?)
- It is possible that successfully working networks can only evolve and develop over a certain period of time, and that the initial results look like nothing interesting at all. This idea can be explored further by observing the development of infants.
- How does the state of a brain transcend into consciousness? (Or is thinking the re-excitation of network assemblies by faking nerve input, that is, imagination, so that images and audio seem to be there?)
- Zero-point measurement: my computer (a dual-core Intel E6850 with 2 GB of low-latency memory) can process 500,000,000 (500 million) neuron structures in 0.87 seconds. That is about 1.15 sweeps per second over 500,000,000 neurons, still a factor of 100 * 1,000 = 100,000 slower than the human brain: roughly 100x fewer neurons than the 50 billion assumed below, at roughly 1,000x lower sweep frequency, assuming the brain re-evaluates all neurons in one sweep. (A timing sketch in C appears after this list.)
- For a very simple neuron structure over 50 billion neurons that does not yet contain connection information, but stores 3 bytes per neuron for threshold, fatigue and excitation information, about 140 GB of memory is required to hold the network (50 billion * 3 bytes is roughly 140 GiB).
- In 2 GB of memory, you can fit about 715,000,000 neurons without connection information (2 GiB / 3 bytes).
- 50 billion neurons with an average of 1,000 connections each, at a pointer size of 4 bytes, need 186,404 GB of memory (50 billion * (1,000 * 4 + 3) bytes). The memory arithmetic is replayed in a code sketch after this list.
- On my CPU (E6850), a single thread/process can reasonably handle about 400,000 neurons per sweep, which works out to about 1,500 sweeps per second across the entire neuron array.
- In 2 GB of memory, it's possible to fit about 500,000 neurons with connection information (roughly 4 KB per neuron at 1,000 connections * 4 bytes).
500,000 seems to be the limit, both because of memory and because of the CPU cycles needed to attain the same sweep frequency. That is a factor of 100,000 below the 50 billion neurons assumed above, and it more or less maxes out the machine.
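
As a rough illustration of the zero-point measurement, here is a minimal C sketch that allocates a flat array of 3-byte neuron records (threshold, fatigue, excitation, as above) and times one full sweep. The update rule inside the loop is a placeholder of my own, since the post does not specify one; only the record layout and the 500 million count come from the text.

```c
/* Timing sketch for the zero-point measurement: one full sweep over a
 * flat array of minimal 3-byte neuron records. The update rule is a
 * placeholder. Compile with: gcc -O2 sweep.c -o sweep */
#include <stdio.h>
#include <stdlib.h>
#include <time.h>

typedef struct {
    unsigned char threshold;   /* firing threshold */
    unsigned char fatigue;     /* refractory/fatigue counter */
    unsigned char excitation;  /* accumulated input */
} Neuron;                      /* 3 bytes, no connection data yet */

int main(void)
{
    /* 500 million neurons as in the post needs ~1.5 GB; shrink N to fit. */
    const size_t N = 500000000UL;
    Neuron *net = calloc(N, sizeof *net);
    if (!net) { perror("calloc"); return 1; }

    struct timespec t0, t1;
    clock_gettime(CLOCK_MONOTONIC, &t0);

    /* One full sweep: re-evaluate every neuron once. */
    for (size_t i = 0; i < N; i++) {
        Neuron *n = &net[i];
        if (n->excitation >= n->threshold && n->fatigue == 0) {
            n->fatigue = 10;       /* fire, then rest */
            n->excitation = 0;
        } else if (n->fatigue > 0) {
            n->fatigue--;
        }
    }

    clock_gettime(CLOCK_MONOTONIC, &t1);
    double secs = (t1.tv_sec - t0.tv_sec) + (t1.tv_nsec - t0.tv_nsec) / 1e9;
    printf("%zu neurons, one sweep in %.2f s (%.2f sweeps/s)\n",
           N, secs, 1.0 / secs);
    free(net);
    return 0;
}
```

On a machine with only 2 GB of RAM the 1.5 GB allocation may fail, in which case N has to be reduced; the sweeps-per-second figure then scales accordingly.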
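
The memory figures in the list all follow from two assumed sizes: 3 bytes of state per neuron and 4 bytes per connection pointer. A small C sketch that replays the arithmetic:

```c
/* Replays the back-of-envelope memory figures from the post, assuming
 * 3 bytes of state per neuron and 4-byte connection pointers. */
#include <stdio.h>

int main(void)
{
    const double GiB   = 1024.0 * 1024.0 * 1024.0;
    const double state = 3.0;   /* threshold + fatigue + excitation */
    const double ptr   = 4.0;   /* bytes per connection pointer */
    const double n50   = 50e9;  /* 50 billion neurons */

    /* 50 billion neurons, state only: ~140 GiB */
    printf("state only, 50e9 neurons: %.0f GiB\n", n50 * state / GiB);

    /* neurons that fit in 2 GiB without connections: ~715 million */
    printf("fit in 2 GiB, no connections: %.0f neurons\n", 2.0 * GiB / state);

    /* 50 billion neurons with 1,000 connections each: ~186,404 GiB */
    printf("with 1000 connections: %.0f GiB\n",
           n50 * (state + 1000.0 * ptr) / GiB);

    /* neurons with 1,000 connections that fit in 2 GiB: ~536,000 */
    printf("fit in 2 GiB, with connections: %.0f neurons\n",
           2.0 * GiB / (state + 1000.0 * ptr));
    return 0;
}
```

The last figure comes out near 536,000 neurons; the rounder 500,000 quoted in the list leaves some headroom for the rest of the process.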