Select the Quick tab of the Model Estimation dialog box to access the options described here.

Estimation method. Use the Estimation method drop-down list to select an estimation procedure. Four different estimation procedures are available in Nonlinear Estimation, and they can also be combined. These methods and their strengths and weaknesses are described in Nonlinear Estimation Procedures.

Quasi-Newton. For most applications, the default Quasi-Newton method will yield the best performance; that is, it usually converges fastest. In this method the second-order (partial) derivatives of the loss function are asymptotically estimated and used to determine the movement of parameters from iteration to iteration. To the extent that the second-order derivatives of the loss function are meaningful (and they usually are), this procedure is more efficient than any of the others.
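As an illustrative sketch (not the program's own implementation), the quasi-Newton idea can be demonstrated with SciPy's BFGS minimizer applied to a least-squares loss. The model y = a·exp(b·x), the data, and the start values below are all hypothetical, chosen only for the example:

```python
import numpy as np
from scipy.optimize import minimize

# Hypothetical noise-free data following y = 2.0 * exp(0.8 * x)
x = np.linspace(0.0, 2.0, 20)
y = 2.0 * np.exp(0.8 * x)

def sse(params):
    """Least-squares loss for the hypothetical model y = a * exp(b * x)."""
    a, b = params
    return np.sum((y - a * np.exp(b * x)) ** 2)

# BFGS is a quasi-Newton method: it builds up an approximation to the
# second-order derivatives of the loss from the gradients it observes.
fit = minimize(sse, x0=[1.0, 0.5], method="BFGS")
```

Because the data are noise-free, the minimizer should recover parameters close to a = 2.0 and b = 0.8 with a loss near zero.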

The following procedures do not estimate the second-order derivatives of the loss function but rather use various geometrical approaches to function minimization. They have the general advantage of being more "robust," that is, they are less likely to converge on local minima, and are less sensitive to "bad" (i.e., grossly inadequate) start values.

Simplex. The Simplex algorithm does not rely on the computation or estimation of the derivatives of the loss function. Instead, at each iteration the loss function is evaluated at m+1 points in the m-dimensional parameter space (e.g., at the three vertices of a triangle when two parameters are being estimated), and this simplex of points is progressively moved toward the minimum.

Simplex and quasi-Newton. This estimation procedure combines the Simplex and Quasi-Newton methods (see above).

Hooke-Jeeves pattern moves. At each iteration, the Hooke-Jeeves pattern moves method first defines a pattern of points by moving each parameter one by one, so as to optimize the current loss function. The entire pattern of points is then shifted or moved to a new location; this new location is determined by extrapolating the line from the old base point in the m-dimensional parameter space to the new base point. The step sizes in this process are constantly adjusted to "zero in" on the respective optimum. This method is usually quite effective, and should be tried if both the quasi-Newton and Simplex methods fail to produce reasonable estimates.
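The two phases described above (coordinate-wise exploratory moves, then an extrapolated pattern move with shrinking step sizes) can be sketched in a minimal, simplified implementation; this is an illustration of the technique, not the program's actual algorithm, and the loss function and start values are made up:

```python
import numpy as np

def hooke_jeeves(f, x0, step=0.5, shrink=0.5, tol=1e-6, max_iter=1000):
    """Minimize f by a simplified Hooke-Jeeves pattern search."""
    base = np.asarray(x0, dtype=float)
    fbase = f(base)
    for _ in range(max_iter):
        # Exploratory moves: probe each parameter one by one, +/- step
        trial = base.copy()
        ftrial = fbase
        for i in range(len(trial)):
            for delta in (step, -step):
                cand = trial.copy()
                cand[i] += delta
                fc = f(cand)
                if fc < ftrial:
                    trial, ftrial = cand, fc
                    break
        if ftrial < fbase:
            # Pattern move: extrapolate the line from the old base
            # point through the improved point
            pattern = trial + (trial - base)
            base, fbase = trial, ftrial
            fpat = f(pattern)
            if fpat < fbase:
                base, fbase = pattern, fpat
        else:
            # No improvement: shrink the step size to "zero in"
            step *= shrink
            if step < tol:
                break
    return base, fbase

# Hypothetical quadratic loss with its minimum at (3, -1)
best, fval = hooke_jeeves(lambda p: (p[0] - 3.0) ** 2 + (p[1] + 1.0) ** 2,
                          x0=[0.0, 0.0])
```

Note that no derivatives of the loss function appear anywhere; only function values drive both the exploratory and the pattern moves.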

Hooke-Jeeves and quasi-Newton. This estimation procedure combines the Hooke-Jeeves pattern moves and Quasi-Newton methods (see above).

Rosenbrock pattern search. The Rosenbrock pattern search method rotates the parameter space so that one axis is aligned with a ridge (this method is also called the method of rotating coordinates); all other axes remain orthogonal to this axis. If the loss function is unimodal and has detectable ridges pointing toward the minimum of the function, then this method will proceed with accuracy toward the minimum of the function. However, note that this search algorithm may terminate early when there are several constraint boundaries (resulting in the penalty value; see above) that intersect, leading to a discontinuity in the ridges.

Rosenbrock and quasi-Newton. This estimation procedure combines the Rosenbrock pattern search and quasi-Newton methods.

Note: Choosing combinations of methods. Because the Simplex, Hooke-Jeeves, and Rosenbrock methods are generally less sensitive to local minima, you can use any one of these methods with the quasi-Newton method. This is particularly useful if you are not sure about the appropriate start values for the estimation. In that case, the first method may generate initial parameter estimates that will then be used in subsequent quasi-Newton iterations.
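The two-stage strategy described in this note can be sketched with SciPy: a derivative-free simplex run supplies rough estimates, which then seed a quasi-Newton refinement. The Rosenbrock test function and the deliberately poor start values are chosen only for the example:

```python
import numpy as np
from scipy.optimize import minimize, rosen

# Stage 1: Simplex (Nelder-Mead) from rough start values -- robust,
# derivative-free, and less sensitive to a bad starting point.
rough = minimize(rosen, x0=[-1.5, 2.5], method="Nelder-Mead")

# Stage 2: quasi-Newton (BFGS) refinement, using the simplex
# solution as its start values for fast final convergence.
refined = minimize(rosen, x0=rough.x, method="BFGS")
```

The Rosenbrock function has its minimum at (1, 1); the combined run should land there, with the first stage doing the robust global work and the second stage the efficient local polishing.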