Artificial Neural Network
- An Artificial Neural Network is an information-processing paradigm inspired by the way biological nervous systems, such as the brain, process information.
- Typically, neurons are five to six orders of magnitude slower than silicon logic gates: events in a silicon chip happen in the nanosecond range (about 10^-9 s), whereas neural events happen in the millisecond range (about 10^-3 s).
Figure: Biological Neuron
- The brain is a highly complex, nonlinear, and parallel information-processing system.
- It has the capability of organizing neurons so as to perform certain computations many times faster than the fastest digital computer.
- A brain has great structure and the ability to build up its own rules through what we usually refer to as experience.
- During this early stage of development, about one million synapses are formed per second.
- Synapses are elementary structural and functional units that mediate the interactions between neurons.
- A developing brain is synonymous with a plastic brain: plasticity permits the developing nervous system to adapt to its surrounding environment.
- Axons act as transmission lines, and dendrites represent receptive zones. Neurons come in a wide variety of shapes and sizes in different parts of the brain. A pyramidal cell can receive 10,000 or more synaptic contacts, and it can project onto thousands of target cells.
- In its most general form, a neural network is a machine that is designed to model the way in which the brain performs a particular task or function of interest.
- The network is usually implemented using electronic components or simulated in software on a digital computer.
- To achieve good performance, neural networks employ a massive interconnection of simple computing cells referred to as neurons or processing units.
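Such a processing unit can be sketched minimally as a weighted sum of its inputs passed through an activation function. This is an illustrative example, not from the source; the input, weight, and bias values are arbitrary.

```python
import math

def neuron(inputs, weights, bias):
    # One artificial neuron: weighted sum of inputs plus bias,
    # squashed into (0, 1) by a sigmoid activation function.
    z = sum(w * x for w, x in zip(weights, inputs)) + bias
    return 1.0 / (1.0 + math.exp(-z))

# Hypothetical inputs and weights, chosen only for illustration.
output = neuron([1.0, 0.5], [0.4, -0.2], 0.1)
print(output)  # a value between 0 and 1
```

A network is then just many such units wired together, with the output of one unit feeding the inputs of others.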
- Moreover, a neural network is a massively parallel distributed processor that has a natural propensity for storing experiential knowledge and making it available for use. It resembles the brain in two respects:
- Knowledge is acquired by the network from its environment through a learning process.
- Interneuron connection strengths, known as synaptic weights, are used to store the acquired knowledge.
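The two points above can be illustrated with the classic perceptron learning rule (a standard example, not described in the source): training repeatedly adjusts the synaptic weights, and after training the learned knowledge resides entirely in those weights. The task here, learning the logical AND function, is chosen only for illustration.

```python
def step(z):
    # Threshold activation: fire (1) if the weighted sum is non-negative.
    return 1 if z >= 0 else 0

def train(samples, lr=1, epochs=20):
    # Perceptron learning rule: nudge each weight in proportion to the error.
    w = [0, 0]
    b = 0
    for _ in range(epochs):
        for x, target in samples:
            y = step(w[0] * x[0] + w[1] * x[1] + b)
            err = target - y
            w[0] += lr * err * x[0]
            w[1] += lr * err * x[1]
            b += lr * err
    return w, b

# Training data for logical AND: output is 1 only when both inputs are 1.
data = [([0, 0], 0), ([0, 1], 0), ([1, 0], 0), ([1, 1], 1)]
w, b = train(data)
preds = [step(w[0] * x[0] + w[1] * x[1] + b) for x, _ in data]
print(preds)  # → [0, 0, 0, 1] (the AND function)
```

Note that the trained network contains no explicit rule for AND; its "knowledge" is nothing but the final values of `w` and `b`, exactly as the second bullet above states.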
- Neural networks are also referred to as neurocomputers, connectionist networks, parallel distributed processors, etc.