Thursday, May 08, 2008

Important numbers and statements

The following are statements that are important to remember and re-assess for validity:
  1. The brain consumes about 12W of energy. Ideally, artificial simulations of the brain should respect this energy budget, but that seems far out of reach: the individual elements used in artificial intelligence consume far more power, and factors of thousands are involved in this calculation.
  2. It should be parallel in nature, similar to neuron firings (thread initiations) that propagate along dendrites and synapses. If not, the model should handle the scheduling in a single thread of operation.
  3. It should be stack-less and not rely on function unwinds.
  4. The brain has about 100 billion neurons.
  5. The fanout (connections to other neurons) is between 1,000 and 10,000 (others report up to 20,000).
  6. It's not so much the working neural network that is interesting, but the DNA or construction rules that generate it. That is, it's more interesting to come up with rules that determine how a network is to be built than to build a working network without being able to reconstruct it elsewhere.
  7. How can the state of the network be observed in an intelligent way, so that conclusions can be deduced from the observation? (Does the network itself need to do so?)
  8. It is possible that successfully working networks can only evolve / develop over a certain period of time, and that the initial results look like nothing interesting at all. This idea can be explored further by observing the development of infants.
  9. How does the state of a brain transcend into consciousness? (Or is thinking the re-excitation of network assemblies by faking nerve input (imagination), so that images and sounds seem to be present?)
  10. Zero-point measurement: my computer (a dual-core Intel E6850 with 2GB of low-latency memory) can process 500,000,000 (500 million) neuron structures in 0.87 seconds. That is about 1.15 sweeps per second over 500,000,000 neurons, still a factor on the order of 100,000 slower than the human brain (about 200 times the neurons at a far higher sweep frequency), assuming the brain re-evaluates all of its neurons in one sweep.
  11. For a very simple neuron structure over 50 billion neurons that does not yet contain connection information, but only 3 bytes per neuron for threshold, fatigue and excitation, about 140 GB of memory is required to store the network.
  12. In 2 GB of memory, you can fit 715,000,000 neurons without connection information.
  13. 50 billion neurons with an average of 1,000 connections each, at a pointer size of 4 bytes per connection, need about 186,000 GB of memory (50 billion x 1,000 x 4 bytes, roughly 200 TB).
  14. On my CPU (E6850), a single thread/process can reasonably process about 400,000 neurons per sweep while sustaining about 1,500 sweeps per second across the entire neuron array.
  15. In 2GB of memory, it's possible to fit 500,000 neurons with connection information.
I'm therefore choosing 500,000 neurons as the basis of the network, which might eventually translate to a frequency of about 1000Hz if the sweeps are designed more carefully. (The 1000Hz is derived from the highest firing rates observed in the human brain, about 200 pulses per second; add the absolute refractory period, which lasts 3-4 cycles, and 1000Hz emerges.)
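As a sanity check on the figures in items 11-13 and the 1000Hz derivation, here is a quick back-of-the-envelope sketch in Python (my own, not from the original notes). It assumes binary gigabytes (1 GB = 2^30 bytes), which is what makes the 50 billion x 3 bytes figure come out at roughly 140 GB.

```python
# Recompute the memory and frequency figures from the list above.
GB = 2 ** 30  # binary gigabyte

# Item 11: 50 billion neurons, 3 bytes each (threshold, fatigue, excitation).
state_bytes = 50_000_000_000 * 3
print(state_bytes / GB)        # ~139.7, i.e. the "about 140 GB" figure

# Item 12: how many 3-byte neurons fit in 2 GB?
print(2 * GB // 3)             # 715,827,882, i.e. "715,000,000 neurons"

# Item 13: add 1,000 connections per neuron at 4 bytes per pointer.
conn_bytes = 50_000_000_000 * 1_000 * 4
print(conn_bytes / GB)         # ~186,264 GB, roughly 200 TB

# The 1000Hz derivation: ~200 pulses/s peak firing rate, with an
# absolute refractory period of ~4 sweep cycles between pulses.
print(200 * (1 + 4))           # 1000
```

The connection pointers dominate by a factor of more than 1,000, which is why dropping to 500,000 neurons is what makes the network fit in 2GB.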

500,000 seems to be the limit, both in memory and in CPU cycles, for attaining that frequency. That is a factor of 100,000 lower than the human brain, and it more or less maxes out the machine.
