Hebbian Theory - Generalization and Stability

Hebb's Rule is often generalized as

    Δw_i = η x_i y,

or: the change in the i-th synaptic weight w_i is equal to a learning rate η times the i-th input x_i times the postsynaptic response y. Often cited is the case of a linear neuron,

    y = Σ_j w_j x_j,

and the previous section's simplification takes both the learning rate and the input weights to be 1. This version of the rule is clearly unstable: in any network with a dominant signal, the synaptic weights increase or decrease exponentially. Moreover, it can be shown that Hebb's rule is unstable for any neuron model. For this reason, network models of neurons usually employ other learning theories such as BCM theory, Oja's rule, or the Generalized Hebbian Algorithm.
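The instability, and one standard remedy, can be illustrated numerically. The sketch below (an assumed minimal setup, not from the original text) trains a single linear neuron on one fixed input pattern, once with the plain Hebbian update Δw_i = η x_i y and once with Oja's rule, whose extra normalizing term keeps the weight vector bounded:

```python
import numpy as np

# Minimal sketch (assumed setup): one linear neuron, one fixed input
# pattern, comparing plain Hebbian learning with Oja's rule.
x = np.array([1.0, 0.5])           # fixed input pattern
eta = 0.1                          # learning rate

w_hebb = np.array([0.1, 0.1])      # weights under plain Hebb's rule
w_oja = np.array([0.1, 0.1])       # weights under Oja's rule

for _ in range(100):
    # Plain Hebb: dw_i = eta * x_i * y. The weight component along x
    # grows by a factor (1 + eta * ||x||^2) each step -- exponential blow-up.
    y = w_hebb @ x                 # linear neuron: y = sum_j w_j x_j
    w_hebb = w_hebb + eta * y * x

    # Oja's rule: dw_i = eta * y * (x_i - y * w_i). The subtractive term
    # acts as weight decay proportional to y^2, normalizing ||w|| toward 1.
    y = w_oja @ x
    w_oja = w_oja + eta * y * (x - y * w_oja)

print("Hebb  ||w|| =", np.linalg.norm(w_hebb))   # grows very large
print("Oja   ||w|| =", np.linalg.norm(w_oja))    # settles near 1
```

Under Oja's rule the weight vector converges to the unit vector in the direction of the dominant input, which is why it (and the Generalized Hebbian Algorithm built on it) is the usual stable replacement in network models.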
