Select the *Quick* tab of the *SANN - Automated Network Search (ANS)* dialog box to access the options described here. For information on the options that are common to all tabs, see *SANN - Automated Network Search (ANS)*.

**Network types.** Use the options in this group box to specify the type of network (MLP or RBF). For each selected type, you can also specify a range for the complexity of the neural network models to be tried by the *Automated Network Search* (*ANS*). The complexity of the networks to be tested is specified as a range of values for the number of hidden units. Specifying the number of hidden units exactly (i.e., by setting the minimum equal to the maximum) may be beneficial if you know, or have good cause to suspect, the optimal number. In this case, it allows the *Automated Network Search* (*ANS*) to concentrate its search on other dimensions, such as the activation functions. The larger the number of hidden units in a neural network model, the stronger the model is, i.e., the more capable the network is of modeling complex relationships between the inputs and the target variables.

**MLP.**
Select the *MLP* check box to include multilayer perceptron networks in the network search. The multilayer perceptron is the most common form of network. It requires iterative training, which may be quite slow for networks with many hidden units and for large data sets, but the networks are quite compact, execute quickly once trained, and in most problems yield better results than the other types of networks.
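
As a purely illustrative sketch (not SANN internals), the following Python snippet shows what a single-hidden-layer perceptron computes; the hyperbolic tangent hidden activation and identity output used here are assumptions made only for this example:

```python
import numpy as np

def mlp_forward(x, W1, b1, W2, b2):
    """Single-hidden-layer MLP: inputs -> hidden layer (tanh) -> outputs (identity).
    The number of rows in W1 is the number of hidden units."""
    h = np.tanh(W1 @ x + b1)   # hidden-unit activations
    return W2 @ h + b2         # network outputs

# Example: 3 inputs, 5 hidden units, 1 output (random weights, for illustration only)
rng = np.random.default_rng(0)
W1, b1 = rng.normal(size=(5, 3)), np.zeros(5)
W2, b2 = rng.normal(size=(1, 5)), np.zeros(1)
print(mlp_forward(np.array([0.2, -1.0, 0.5]), W1, b1, W2, b2))
```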

**Min. hidden units.**
Specify the minimum number of hidden units to be tried by the *Automated
Network Search* (*ANS*) when using MLP
networks.

**Max. hidden units.**
Specify the maximum number of hidden units to be tried by the *Automated
Network Search* (*ANS*) when using MLP
networks.

**RBF.**
Select the *RBF* check box to include radial basis function networks in the network search. Radial basis function networks tend to be slower and larger than multilayer perceptrons, and often have relatively inferior performance, but they train extremely quickly when the output activation functions are the identity. They are also usually less effective than multilayer perceptrons if you have a large number of input variables (they are more sensitive to the inclusion of unnecessary inputs).
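
For comparison, here is a similarly minimal sketch of a radial basis function network, assuming Gaussian basis functions and an identity (linear) output layer; again, this is only an illustration, not SANN code:

```python
import numpy as np

def rbf_forward(x, centers, widths, weights, bias):
    """RBF network: each hidden unit responds to the distance of x from its center."""
    phi = np.exp(-np.sum((centers - x) ** 2, axis=1) / (2.0 * widths ** 2))  # Gaussian basis functions
    return weights @ phi + bias   # identity (linear) output layer

# Example: 2 inputs, 4 hidden units (one per center), 1 output
centers = np.array([[0.0, 0.0], [1.0, 1.0], [0.0, 1.0], [1.0, 0.0]])
widths = np.full(4, 0.5)
weights, bias = np.array([0.3, -0.2, 0.5, 0.1]), 0.0
print(rbf_forward(np.array([0.4, 0.6]), centers, widths, weights, bias))
```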

**Min. hidden units.**
Specify the minimum number of hidden units to be tried by the *Automated
Network Search* (*ANS*) when using RBF
networks.

**Max. hidden units.**
Specify the maximum number of hidden units to be tried by the *Automated
Network Search* (*ANS*) when using RBF
networks.

**Note: What effect does the number of hidden units have?** In general, increasing the number of hidden units increases the modeling power of the neural network (it can model a more convoluted, complex underlying function), but also makes it larger, more difficult to train, slower to operate, and more prone to over-fitting (modeling noise instead of the underlying function). Decreasing the number of hidden units has the opposite effect.

If your data comes from a fairly simple function, or is very noisy, or if you have too few cases, a network with relatively few hidden units is preferable. If, in experimenting with different numbers of hidden units, you find that larger networks have better training performance but worse selection performance, then you are probably over-fitting and should revert to smaller networks.

To combat overfitting, *SANN* uses a test sample (which you can specify in the *Data selection* dialog box) to help the training algorithm. This test sample is never used to train the neural network (i.e., to learn the data); instead, it is used to monitor performance throughout training, at the end of each iteration cycle. See *Overfitting* for more details.
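
The role of the test sample can be illustrated with a toy example (a simple linear model fit by gradient descent, not a neural network and not the SANN algorithm itself): the test cases never enter the weight updates, yet they decide which weights are ultimately kept.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: a noisy linear relationship, split into a training and a test sample.
X = rng.uniform(-1.0, 1.0, size=60)
y = 2.0 * X + rng.normal(scale=0.3, size=60)
X_train, y_train, X_test, y_test = X[:40], y[:40], X[40:], y[40:]

w, b, lr = 0.0, 0.0, 0.1
best_error, best_w, best_b = float("inf"), w, b

for cycle in range(200):
    # One training cycle: a gradient step computed from the training sample only.
    pred = w * X_train + b
    w -= lr * np.mean(2.0 * (pred - y_train) * X_train)
    b -= lr * np.mean(2.0 * (pred - y_train))

    # The test sample is never used for the weight updates; it only monitors performance.
    test_error = np.mean((w * X_test + b - y_test) ** 2)
    if test_error < best_error:
        best_error, best_w, best_b = test_error, w, b   # remember the best-generalizing model

w, b = best_w, best_b   # keep the model that performed best on the test sample
```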

**Train/Retain networks.** Use the options
in this group box to specify how many networks should be trained and how
many networks should be retained by the *ANS*.

**Networks to train.**
Specify how many networks the *Automated Network Search* (*ANS*) should train. The larger the number of networks trained, the more thorough the search carried out by the *ANS*. It is recommended that you set this value as large as your hardware speed and resources allow.

**Networks to retain.**
Specify how many of the neural networks tested by the *Automated Network Search* (*ANS*) should be retained (for testing, and then insertion into the current network set). Networks with the lowest error (for regression) or the highest classification rate (for classification) will be retained.
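
Conceptually, retention amounts to ranking the candidate networks by this criterion and keeping the top few; the sketch below only illustrates that idea, with a made-up candidate format, and is not the actual ANS bookkeeping:

```python
def retain_best(candidates, n_retain, task="regression"):
    """Keep the n_retain best candidates.
    Each candidate is a (name, error, classification_rate) tuple (illustrative format only)."""
    if task == "regression":
        ranked = sorted(candidates, key=lambda c: c[1])                 # lowest error first
    else:
        ranked = sorted(candidates, key=lambda c: c[2], reverse=True)   # highest classification rate first
    return ranked[:n_retain]

candidates = [("MLP 3-5-1", 0.12, None), ("MLP 3-8-1", 0.09, None), ("RBF 3-6-1", 0.15, None)]
print(retain_best(candidates, n_retain=2))   # the two lowest-error networks are kept
```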

**Error function.** Specify the error function
to be used in training a network.

**Sum of squares.** Select the *Sum of Squares* check box to generate networks using the sum of squares error function. This error function is one of the most commonly used in training neural networks, and it is available in *SANN* for both MLP and RBF networks in both regression and classification tasks.
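
As a sketch of what this error measures (illustrative code, not SANN internals), the sum of squares error adds up the squared differences between network outputs and targets over all cases and output units:

```python
import numpy as np

def sum_of_squares_error(outputs, targets):
    """Sum, over all cases and output units, of the squared output-target differences."""
    outputs, targets = np.asarray(outputs), np.asarray(targets)
    return np.sum((outputs - targets) ** 2)

print(sum_of_squares_error([0.9, 0.1, 0.4], [1.0, 0.0, 0.5]))   # 0.01 + 0.01 + 0.01 ≈ 0.03
```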

**Cross entropy.** Select the *Cross entropy* check box to generate networks using the cross entropy error function. This error function assumes that the data is drawn from the multinomial family of distributions (see Bishop, 1995, for more details) and supports a direct probabilistic interpretation of the network outputs. Note that this error function is only available for classification problems; it will be disabled for regression-type analyses. When the cross entropy error function is used, the output activation functions are automatically set to *softmax*. This restriction ensures that the network outputs are true class membership probabilities, which is known to enhance the performance of classification neural networks.
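
To make the softmax/cross entropy pairing concrete, here is a small illustrative sketch (the conventions used, such as one-hot target coding, are assumptions for the example and not necessarily SANN's internal representation):

```python
import numpy as np

def softmax(z):
    """Turn raw network outputs into class membership probabilities."""
    e = np.exp(z - np.max(z))   # subtract the maximum for numerical stability
    return e / e.sum()

def cross_entropy_error(probs, target_one_hot):
    """Cross entropy between predicted class probabilities and the observed class."""
    return -np.sum(target_one_hot * np.log(probs))

probs = softmax(np.array([2.0, 0.5, -1.0]))
print(probs.sum())                                             # sums to 1: a proper probability distribution
print(cross_entropy_error(probs, np.array([1.0, 0.0, 0.0])))   # small when the true class gets high probability
```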