"Over the past twenty years, a wide variety of modes of synaptic transmission have been discovered, in addition to simple, chemically mediated increases in permeability leading to excitation and inhibition. Such modes of transmission include electrical excitation and inhibition, combined chemical-electrical synapses, chemical synaptic changes produced by reductions in membrane permeability, and prolonged synaptic potentials mediated by chemical reactions in the postsynaptic cell."
-- page 208, From Neuron to Brain, Second Edition, by Stephen W. Kuffler, John G. Nicholls, and A. Robert Martin; 1984, Sinauer Associates, Inc., Sunderland, Massachusetts
In On Intelligence, Jeff Hawkins presents a predictive-memory model of human intelligence that he believes can serve as a basis for intelligent machines [which I call machine understanding, to emphasize that the machines would not merely be mimicking surface features of human intelligence, as is the case with, for example, expert systems]. I agree with him that while neuroscience and computing science have made great progress in understanding many details of the human brain, and in writing software, we need an overview that allows us to make progress on the central issues that interest us. Thus, in his model, Hawkins does not worry about the exact nature of neural synapses.
But at the very least, we should be aware of how complicated human synapses are. This should allow us greater freedom of thought when modeling the mechanics of machine understanding than if we simply assumed all synapses work alike. I suspect the Hebbian learning model for neurons would benefit from considering that the real world may be complex, and that in that complexity there may be keys to progress that we leave out with overly simple models.
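For readers unfamiliar with the Hebbian model mentioned above, here is a minimal sketch of its core idea: a connection is strengthened in proportion to the correlated activity of the neurons on either side of it ("cells that fire together wire together"). The function name, learning rate, and toy activity values are all illustrative assumptions, not part of Hawkins' model or any particular neuroscience result.

```python
import numpy as np

def hebbian_update(weights, pre, post, learning_rate=0.01):
    """Strengthen each weight in proportion to the product of
    presynaptic and postsynaptic activity (an outer product).
    This captures the classic Hebbian rule in its simplest form."""
    return weights + learning_rate * np.outer(post, pre)

# Usage: two presynaptic neurons driving one postsynaptic neuron.
w = np.zeros((1, 2))
pre = np.array([1.0, 0.0])   # only the first input is active
post = np.array([1.0])       # the output neuron fires
w = hebbian_update(w, pre, post)
# Only the weight from the active input grows; the silent input's
# weight is unchanged.
```

Note how uniform this rule is: every synapse obeys the same update, with no analogue of the electrical, chemical-electrical, or permeability-reducing varieties the quoted paragraph describes, which is exactly the kind of simplification the text is questioning.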
What struck me most about the above paragraph about synapses, however, is the role played by evolution. Charles Darwin wrote about the process of species creation, but we have long grown used to the idea that evolution takes place on a molecular level. Each nerve cell, presumably, contains the full set of genes of the organism, yet many different types of synapses are manifested. There must be controlling, blueprint genes that tell the cells which types of synapses they are to form as they develop. In turn, we can expect that many different blueprints have been tried over the last four million years or so. The most successful gave their human organisms survival advantages.
It would be interesting to know how much synaptic variation exists in the current human population. Is this variation responsible, or partly responsible, for variations in basic intelligence capabilities of human beings?
This brings us back to the hard-wiring versus plasticity debate. Human beings are very adaptable, as is shown by their ability to learn any of thousands of human languages as children. We tend to think that we are very plastic and programmable creatures. But nerve transmission speeds and synaptic types are hard-wired, as is the basic design of the brain. One might say we have the hard-wired capability to be flexible.
And that is what we aim to build into the new machines: hard wiring (a stable system with a design we understand and can reproduce) that is capable of showing the flexibility required to exhibit intelligence and understanding.