Neural network architectures are also called connectionist architectures.
They are characterized by:
- A very large number of simple neuron-like processing elements.
- A large number of weighted connections between the elements. The weights on the connections encode the knowledge of a network.
- Highly parallel, distributed control.
- An emphasis on learning internal representations automatically.
Hopfield introduced a neural network as a theory of memory: a model of content-addressable memory.
Features of a Hopfield Network:
- Distributed representation
  - A memory is stored as a pattern of activation across a set of processing elements.
- Distributed, asynchronous control
  - Each processing element makes decisions based only on its own local situation.
- Content-addressable memory
  - A number of patterns can be stored in a network. To retrieve a pattern, a portion of it is specified, and the network automatically finds the closest match.
- Fault tolerance
  - If a few processing elements misbehave or fail completely, the network will still function properly.
A simple Hopfield Network is shown below:
- Processing elements, or units, are always in one of two states: active or inactive.
- Units are connected to each other by weighted, symmetric connections. A positive weight indicates that the two units tend to activate each other.
- A negative weight allows an active unit to deactivate a neighboring unit.
Parallel relaxation algorithm
The network operates as follows:
- A random unit is chosen.
- If any of its neighbors are active, the unit computes the sum of the weights on the connections to those active neighbors.
- If the sum is positive, the unit becomes active; otherwise, it becomes inactive.
- Another random unit is chosen, and the process repeats until the network reaches a stable state, i.e. until no unit can change state.
In the figure, a black (positive) connection will attempt to activate the unit connected to it.
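The steps above can be sketched in Python. The weight matrix and initial state below are hypothetical examples (not taken from the figure); units are binary, with 1 = active and 0 = inactive:

```python
import random

# Symmetric weights: W[i][j] == W[j][i], zero diagonal (no self-connections).
# These values are an illustrative assumption.
W = [
    [0,  3, -2,  0],
    [3,  0, -1,  2],
    [-2, -1, 0,  1],
    [0,  2,  1,  0],
]

def parallel_relaxation(state, weights, rng=random.Random(0)):
    """Update randomly chosen units until no unit can change state."""
    n = len(state)
    state = list(state)
    while True:
        changed = False
        # Visit units in random order; a full pass with no change => stable.
        for i in rng.sample(range(n), n):
            # Sum of the weights on connections to active neighbors
            # (inactive neighbors contribute 0).
            total = sum(weights[i][j] * state[j] for j in range(n) if j != i)
            new = 1 if total > 0 else 0
            if new != state[i]:
                state[i] = new
                changed = True
        if not changed:
            return state

stable = parallel_relaxation([1, 0, 0, 1], W)
print(stable)  # a stable state: no unit can change given its neighbors
```

Because the weights are symmetric, this asynchronous update procedure always terminates in some stable state.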
The network can be thought of as storing patterns: given any set of weights and an initial state, the parallel relaxation algorithm will eventually steer the network into a stable state.
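As a sketch of how such patterns can be stored in the weights, a common approach (assumed here, not described in the text) is a Hebbian rule over patterns encoded as ±1; the patterns below are illustrative. Retrieval from a corrupted cue demonstrates the content-addressable behavior:

```python
import random

# Illustrative patterns in the common ±1 encoding (an assumption, not from
# the text; the parallel relaxation rule is the same, with -1 as "inactive").
patterns = [
    [1, 1, 1, -1, -1, -1],
    [1, -1, 1, -1, 1, -1],
]
n = len(patterns[0])

# Hebbian storage: w_ij = sum over patterns of p_i * p_j, zero diagonal.
W = [[0 if i == j else sum(p[i] * p[j] for p in patterns)
      for j in range(n)] for i in range(n)]

def relax(state, rng=random.Random(1)):
    """Parallel relaxation with ±1 units, as in the algorithm above."""
    state = list(state)
    while True:
        changed = False
        for i in rng.sample(range(n), n):
            total = sum(W[i][j] * state[j] for j in range(n) if j != i)
            new = 1 if total > 0 else -1
            if new != state[i]:
                state[i], changed = new, True
        if not changed:
            return state

# Content addressing: start from the first pattern with one unit flipped.
cue = list(patterns[0])
cue[2] = -cue[2]
print(relax(cue))  # settles back to the stored pattern [1, 1, 1, -1, -1, -1]
```

The network "finds the closest match": the corrupted cue is pulled back to the stored pattern it most resembles.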
Problem: sometimes the network cannot find the globally best match. Because the nodes settle into stable states via a completely distributed algorithm, the network can get stuck in a local minimum.