Principles
From the point of view of artificial neurons and artificial neural networks, Hebb's principle can be described as a method of determining how to alter the weights between model neurons. The weight between two neurons increases if the two neurons activate simultaneously, and decreases if they activate separately. Nodes that tend to be either both positive or both negative at the same time develop strong positive weights, while those that tend to be opposite develop strong negative weights.
This original principle is perhaps the simplest form of weight selection. While this means it can be coded into a computer program and used to update a network's weights relatively easily, it also limits the range of applications of Hebbian learning. Today, the term Hebbian learning generally refers to some form of mathematical abstraction of the original principle proposed by Hebb. In this sense, Hebbian learning involves adjusting the weights between learning nodes so that each weight better represents the relationship between the nodes. As such, many learning methods can be considered somewhat Hebbian in nature.
For example, we have heard the word 'Nokia' for many years, usually alongside the phrase 'mobile phone', so the two have become associated in our minds. Every time we see a Nokia mobile phone, the association between the words 'Nokia' and 'phone' is strengthened. The association is so strong that if someone claimed Nokia was manufacturing cars and trucks, it would seem odd.
The following is one formulaic description of Hebbian learning (many other descriptions are possible):

    w_ij = x_i * x_j

where w_ij is the weight of the connection from neuron j to neuron i, and x_i is the input for neuron i. Note that this is pattern learning (weights are updated after every training example). In a Hopfield network, connections w_ij are set to zero if i = j (no reflexive connections allowed). With binary neurons (activations either 0 or 1), connections would be set to 1 if the connected neurons have the same activation for a pattern.
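The per-pattern rule above can be sketched in a few lines of NumPy: the co-activation term x_i * x_j for all pairs at once is the outer product of the activation vector with itself. (The learning rate eta and the example patterns below are illustrative assumptions, not part of the text.)

```python
import numpy as np

def hebbian_update(w, x, eta=0.1):
    """One Hebbian update for a single pattern x (pattern learning).

    w   : (n, n) weight matrix
    x   : (n,) activation vector
    eta : learning rate (an assumed hyperparameter)
    """
    w = w + eta * np.outer(x, x)   # strengthen weights between co-active neurons
    np.fill_diagonal(w, 0.0)       # Hopfield convention: no reflexive connections
    return w

# Pattern learning: weights are updated after every training example.
n = 4
w = np.zeros((n, n))
for x in [np.array([1.0, 0.0, 1.0, 0.0]),
          np.array([1.0, 1.0, 0.0, 0.0])]:
    w = hebbian_update(w, x)
```

Because np.outer(x, x) is symmetric, the resulting weight matrix stays symmetric, as the simultaneity of the rule requires.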
Another formulaic description is:

    w_ij = (1/p) * sum_{k=1}^{p} x_i^k * x_j^k

where w_ij is the weight of the connection from neuron j to neuron i, p is the number of training patterns, and x_i^k is the k-th input for neuron i. This is learning by epoch (weights are updated after all the training examples are presented). Again, in a Hopfield network, connections w_ij are set to zero if i = j (no reflexive connections).
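The epoch rule can likewise be computed in one step: stacking the p patterns as rows of a matrix X, the sum of outer products over the epoch is X.T @ X. A minimal sketch, assuming bipolar (+1/-1) patterns as commonly used in Hopfield networks (the example patterns are illustrative):

```python
import numpy as np

def hebbian_epoch(patterns):
    """Compute w_ij = (1/p) * sum_k x_i^k * x_j^k over all p patterns."""
    X = np.asarray(patterns, dtype=float)  # shape (p, n): one pattern per row
    p = X.shape[0]
    w = X.T @ X / p                        # average outer product over the epoch
    np.fill_diagonal(w, 0.0)               # Hopfield convention: w_ij = 0 if i = j
    return w

# Learning by epoch: all training patterns are presented, then the weights are set.
patterns = [[1, -1,  1, -1],
            [1,  1, -1, -1]]
w = hebbian_epoch(patterns)
```

Neurons that agree across the epoch end up with positive weights and neurons that disagree with negative weights, matching the description of the original principle above.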
A variation of Hebbian learning that accounts for blocking and many other neural learning phenomena is the mathematical model of Harry Klopf. Klopf's model reproduces a great many biological phenomena and is also simple to implement.