Select the *MLP
activation functions* tab of the *SANN
- Automated Network Search (ANS)* dialog box to specify the types of
activation functions to be included in the *Automated Network
Search* (*ANS*) for both hidden (input-hidden) and output (hidden-output)
units of multilayer perceptron networks. For information on the options
that are common to all tabs, see *STATISTICA Automated Network
Search (ANS)*.

Note: For RBF neural networks, the hidden activation functions are always set to isotropic Gaussian basis functions. For regression analyses, the only activation function available for the output units of an RBF network is the linear (identity) function. For classification tasks with the cross-entropy error function, the output activation function of both RBF and MLP networks is set to softmax.
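As a minimal sketch of the softmax output activation mentioned above (plain Python for illustration, not STATISTICA's internal implementation), softmax maps a vector of activation levels to values in (0, 1) that sum to 1, so they can be read as class probabilities:

```python
import math

def softmax(levels):
    # Subtract the maximum for numerical stability before exponentiating.
    m = max(levels)
    exps = [math.exp(v - m) for v in levels]
    total = sum(exps)
    return [e / total for e in exps]

# Larger activation levels receive larger probabilities.
probs = softmax([2.0, 1.0, 0.1])
```

This pairing of softmax outputs with the cross-entropy error function is what makes the network outputs interpretable as class membership probabilities.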

**Hidden
neurons.** This group box provides a list of hidden activation
functions to choose from for inclusion in the *Automated Network Search*
(*ANS*). Note that you can select more than one activation
function at a time:

**Identity.** Uses the identity function.
With this function, the activation level is passed on directly as the
output of the neurons.

**Logistic.** Uses the logistic sigmoid
function. This is an S-shaped (sigmoid) curve, with output in the
range (0,1).

**Tanh.** Uses the hyperbolic
tangent function (recommended). The hyperbolic tangent (tanh)
is a symmetric S-shaped (sigmoid) function whose output lies in the range
(-1, +1). It often performs better than the logistic sigmoid function
because of its symmetry.

**Exp.** Uses the negative
exponential activation function.

**Sine.** Uses the standard
sine activation
function.
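The hidden activation functions listed above can be sketched in plain Python (an illustration of the mathematical functions, not the toolkit's internal code); note that *Exp* here denotes the negative exponential, exp(-x):

```python
import math

def identity(x):
    return x                           # activation level passed through unchanged

def logistic(x):
    return 1.0 / (1.0 + math.exp(-x))  # S-shaped curve, output in (0, 1)

def tanh_act(x):
    return math.tanh(x)                # symmetric S-shape, output in (-1, +1)

def neg_exp(x):
    return math.exp(-x)                # negative exponential

def sine(x):
    return math.sin(x)                 # standard sine
```

Selecting more than one function simply widens the pool of candidate architectures the *ANS* evaluates.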

**Output
neurons.** This group box provides a list of output activation
functions to choose from for inclusion in the *Automated Network
Search* (*ANS*). Note that you can select more than one activation
function at a time:

**Identity.** Uses the identity function
(recommended). With this function, the activation level is passed on directly
as the output of the neurons. This is the only activation function available
for RBF networks when the error function is SOS.

**Logistic.** Uses the logistic sigmoid
function. This is an S-shaped (sigmoid) curve, with output in the
range (0, 1).

**Tanh.** Uses the hyperbolic
tangent function. The hyperbolic tangent (tanh) is a symmetric
S-shaped (sigmoid) function whose output lies in the range (-1, +1).
It often performs better than the logistic sigmoid function because of its
symmetry.

**Exponential.** Uses the
negative
exponential activation function.

**Sine.** Uses the standard
sine activation
function.
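To illustrate how the hidden and output choices above combine in a multilayer perceptron, here is a hedged sketch of a single forward pass through a 2-input, 2-hidden, 1-output MLP using tanh hidden units and identity output units (the weights are illustrative values, not those of a trained network):

```python
import math

# Illustrative weights for a 2-input, 2-hidden, 1-output MLP (not trained values).
W_hidden = [[0.5, -0.3], [0.8, 0.1]]   # one row of input weights per hidden unit
b_hidden = [0.1, -0.2]                 # hidden-unit biases
w_output = [1.2, -0.7]                 # hidden-to-output weights
b_output = 0.05                        # output bias

def forward(x):
    # Hidden layer: weighted sum of the inputs, passed through tanh.
    hidden = [math.tanh(sum(w * xi for w, xi in zip(row, x)) + b)
              for row, b in zip(W_hidden, b_hidden)]
    # Output layer: weighted sum of hidden outputs, identity activation
    # (the activation level is passed on directly).
    return sum(w * h for w, h in zip(w_output, hidden)) + b_output

y = forward([0.4, -0.9])
```

With an identity output the prediction is unbounded, which is why identity is the recommended choice for regression; a bounded output function such as logistic or tanh would constrain predictions to its range.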