Architecture
The basic architecture consists of three types of neuron layers: input, hidden, and output. In feed-forward networks, signals flow strictly from the input units toward the output units. The processing may extend over multiple layers of units, but no feedback connections are present.
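As an illustration, here is a minimal sketch of a strictly feed-forward pass through one hidden layer, assuming NumPy and a tanh activation; the layer sizes and random weights are hypothetical and chosen only to make the example runnable.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical layer sizes: 4 input, 5 hidden, 2 output units.
W_hidden = rng.standard_normal((5, 4))   # input -> hidden weights
W_output = rng.standard_normal((2, 5))   # hidden -> output weights

def feed_forward(x):
    """Propagate the signal strictly forward; no feedback connections."""
    hidden = np.tanh(W_hidden @ x)       # hidden-layer activations
    output = np.tanh(W_output @ hidden)  # output-layer activations
    return output

print(feed_forward(np.array([0.5, -0.2, 0.1, 0.9])))
```

Because the computation only moves forward through the layers, the output depends on the current input alone; no internal state is carried over between presentations.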
Recurrent networks, in contrast, do contain feedback connections, and their dynamical properties become important. In some cases the activation values of the units undergo a relaxation process, so that the network evolves to a stable state in which the activations no longer change. In other applications, the changes in the activation values of the output neurons themselves are the quantity of interest, so that the network's dynamical behavior constitutes its output. Other neural network architectures include adaptive resonance theory maps and competitive networks.
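The relaxation behavior described above can be sketched with a small Hopfield-style network, used here only as one illustrative kind of recurrent network: the activations are updated repeatedly until they stop changing, at which point a stable state has been reached. The stored pattern and starting state are hypothetical.

```python
import numpy as np

# Symmetric weights with a zero diagonal let this toy network settle
# into a stable state; the stored pattern is illustrative only.
pattern = np.array([1, -1, 1, -1])
W = np.outer(pattern, pattern).astype(float)
np.fill_diagonal(W, 0.0)

state = np.array([1, 1, 1, -1])           # noisy starting activations
for _ in range(100):                      # relaxation loop (capped for safety)
    new_state = np.sign(W @ state)        # feedback update of all units
    if np.array_equal(new_state, state):  # stable state: activations no longer change
        break
    state = new_state

print(state)  # relaxes back to the stored pattern
```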