**Activation Functions**

Activation functions are an important topic in artificial intelligence and the study of neural networks.

- A neuron is an information-processing unit that is fundamental to the operation of a neural network.
- An activation function is used for limiting the amplitude of the output of a neuron. It is also referred to in the literature as a squashing function, in that it squashes (limits) the permissible amplitude range of the output signal to some finite value.
- Typically, the normalized amplitude range of the output of a neuron is written as the closed unit interval [0, 1] or alternatively [-1, 1].
- There are many activation functions; the most common ones are described below.

**Figure: A nonlinear model of a neuron.**

**Threshold activation function (McCulloch–Pitts model)**

In this model, the output of a neuron takes on the value of 1 if the total internal activity level of that neuron is nonnegative and 0 otherwise.
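This rule can be sketched in a few lines of Python (the function name is ours; NumPy is used only for convenience):

```python
import numpy as np

def threshold(v):
    # McCulloch-Pitts output: 1 when the total internal activity
    # level v is nonnegative, 0 otherwise.
    return np.where(np.asarray(v) >= 0, 1, 0)
```

For example, `threshold(0.3)` gives 1 and `threshold(-0.3)` gives 0, so the neuron acts as a hard on/off switch.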

**Piecewise-linear activation function**

- The amplification factor inside the linear region is assumed to be unity.
- A linear combiner arises if the linear region of operation is maintained without running into saturation.
- The piecewise-linear function reduces to a threshold function if the amplification factor of the linear region is made infinitely large.
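A minimal sketch of this function, assuming the conventional form with unit gain and a linear region between −1/2 and +1/2 (so the output stays in [0, 1]):

```python
import numpy as np

def piecewise_linear(v):
    # Unit-gain linear region: output = v + 1/2 for -1/2 < v < 1/2,
    # saturating at 0 below and at 1 above.
    return np.clip(np.asarray(v) + 0.5, 0.0, 1.0)
```

Inside the linear region the unit behaves like a linear combiner; outside it the output saturates at the amplitude limits.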

**Sigmoid (logistic) activation function**

- The sigmoid function is the most common form of activation function used in the construction of artificial neural networks.
- Whereas a threshold function assumes only the values 0 or 1, a sigmoid function assumes a continuous range of values from 0 to 1.
- Note also that the sigmoid function is differentiable, which is an important feature of neural network theory.
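A short sketch of the logistic sigmoid and its derivative (the derivative identity σ′ = σ(1 − σ) is the standard one; function names are ours):

```python
import numpy as np

def sigmoid(v):
    # Logistic function: squashes any real v into the range (0, 1).
    return 1.0 / (1.0 + np.exp(-v))

def sigmoid_deriv(v):
    # Derivative of the logistic function: sigma(v) * (1 - sigma(v)).
    s = sigmoid(v)
    return s * (1.0 - s)
```

The closed-form derivative is what makes the sigmoid convenient for gradient-based training such as backpropagation.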

**Hyperbolic tangent function**

The hyperbolic tangent function can easily be expressed in terms of the logistic function σ: tanh(v) = 2σ(2v) − 1. Its output ranges over [-1, 1] rather than [0, 1].
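The identity tanh(v) = 2σ(2v) − 1 can be checked numerically:

```python
import numpy as np

def logistic(v):
    return 1.0 / (1.0 + np.exp(-v))

# tanh is a scaled and shifted logistic: tanh(v) = 2 * logistic(2v) - 1.
v = np.linspace(-5.0, 5.0, 101)
assert np.allclose(np.tanh(v), 2.0 * logistic(2.0 * v) - 1.0)
```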

**Softmax activation function**

- One approach toward approximating probabilities is to choose the output neuron nonlinearity to be exponential rather than sigmoidal and for each pattern to normalize the outputs to sum to 1.
- Let c be the number of output neurons.
- Each output is generated by the following activation function: y_k = exp(a_k) / Σ_{j=1..c} exp(a_j), where a_k is the net activation of output neuron k.

The softmax activation function is a smoothed version of a winner-take-all nonlinearity, in which the maximum output is transformed to 1 and all the others are reduced to 0.
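The exponentiate-and-normalize step described above can be sketched as follows (subtracting the maximum before exponentiating is a common numerical-stability trick, not part of the definition):

```python
import numpy as np

def softmax(a):
    # Exponentiate each net activation, then normalize so the
    # c outputs sum to 1 and can be read as probabilities.
    e = np.exp(a - np.max(a))  # shift by max for numerical stability
    return e / e.sum()
```

For example, `softmax(np.array([1.0, 2.0, 3.0]))` returns outputs that sum to 1, with the largest mass on the last entry; scaling the inputs up (e.g. multiplying by 100) pushes the result toward the hard winner-take-all limit.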
