Hebbian learning
Hebbian learning is a hypothesis for how neuronal connections are reinforced in animal brains, and also a technique for weight selection in artificial neural networks.
The idea is named after Donald Hebb, who presented it in his 1949 book The Organization of Behavior (and kickstarted research into neural nets as a result). His idea specified how much the strength of a connection between two neurons should be altered according to how they are firing at the time. Hebb's original principle was essentially that if one neuron is stimulating another neuron, and at the same time the receiving neuron is also firing, then the strength of the connection between the two neurons is increased; conversely, if one is firing and the other is not, the connection strength is decreased.
Hebbian Learning in Artificial Intelligence Research
From the point of view of artificial neurons and artificial neural networks, Hebb's principle can be described as a method of determining how to alter the weights between model neurons. The weight between two neurons increases if the two neurons activate simultaneously; it is reduced if they activate separately. Nodes which tend to be either both positive or both negative at the same time have strong positive weights, while those which tend to be opposite have strong negative weights. It is sometimes stated more simply as: neurons that fire together, wire together. A minimal sketch of such an update rule is given below.
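The following is a minimal illustrative sketch (not a canonical implementation) of the basic Hebbian weight update described above: the weight between two model neurons grows when their activations share a sign and shrinks when they differ. The names (hebbian_update, pre, post, eta) and the learning rate value are assumptions for illustration only.

```python
# Hypothetical sketch of a basic Hebbian weight update.

def hebbian_update(w, pre, post, eta=0.1):
    """Return the updated weight for one presynaptic/postsynaptic activation pair."""
    # Weight rises when pre and post have the same sign, falls when they differ.
    return w + eta * pre * post

# Two co-active steps strengthen the weight; one anti-correlated step weakens it.
w = 0.0
for pre, post in [(1.0, 1.0), (1.0, 1.0), (1.0, -1.0)]:
    w = hebbian_update(w, pre, post)
print(w)  # 0.1
```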
This original principle is perhaps the simplest form of weight selection. While this means it can be relatively easily coded into a computer program and used to update the weights for a network, it also limits the range of applications of Hebbian learning. Today the term 'Hebbian learning' generally refers to some form of mathematical abstraction of the original principle proposed by Hebb. In this sense, Hebbian learning involves weights between learning nodes being adjusted so that each weight better represents the relationship between the nodes (see the sketch below). As such, many learning methods can be considered to be somewhat Hebbian in nature.
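As one hedged example of such an abstraction, the sketch below sets each weight to the average co-activation of its two nodes over a set of activity patterns, so that each weight reflects how strongly the nodes tend to fire together. The variable names and the toy patterns are assumptions for illustration, not a standard formulation from any particular source.

```python
import numpy as np

# Toy activity patterns over three nodes (rows are patterns); values are assumed.
patterns = np.array([
    [ 1.0,  1.0, -1.0],
    [ 1.0,  1.0,  1.0],
    [-1.0, -1.0,  1.0],
])

n_patterns = patterns.shape[0]

# W[i, j] = average over patterns of x_i * x_j: a correlation-style Hebbian weight.
W = patterns.T @ patterns / n_patterns
np.fill_diagonal(W, 0.0)  # no self-connections

print(W)
# Nodes 0 and 1 always agree, so W[0, 1] is strongly positive;
# node 2 often disagrees with them, so those weights are weaker or negative.
```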
Hebbian Learning in Biological Systems
Work in the laboratory of Eric Kandel has provided evidence for the involvement of Hebbian learning mechanisms at synapses in the marine invertebrate Aplysia.
Experiments on Hebbian synapse modification mechanisms at the central nervous system synapses of vertebrates are much more difficult to control than are experiments with the relatively simple peripheral nervous system synapses studied in marine invertebrates. Much of the work on long-lasting synaptic changes between vertebrate neurons (such as LTP, long-term potentiation) involves the use of non-physiological experimental stimulation of brain cells. However, some of the physiologically relevant synapse modification mechanisms that have been studied in vertebrate brains do seem to be examples of Hebbian processes. The article Natural patterns of activity and long-term synaptic plasticity by Paulsen and Sejnowski (Curr Opin Neurobiol. 2000 Apr;10(2):172-9) reviews results from experiments indicating that long-lasting changes in synaptic strength can be induced by physiologically relevant synaptic activity working through Hebbian mechanisms in cooperation with non-Hebbian synapse modifications.