Gauss-Newton Method

The Gauss-Newton method is an iterative method for solving nonlinear least-squares problems. At each iteration, the method uses the Jacobian matrix J of first-order partial derivatives of the residual function to linearize the problem around the current vector of parameter values x and compute a correction step, moving toward the values that minimize the residual sum of squares (the sum of squared deviations of predicted values from observed values). A more robust and widely used refinement of the method is the so-called Levenberg-Marquardt algorithm, which damps the Gauss-Newton step. For a detailed discussion of this class of methods, see Dennis & Schnabel (1983); see also Nonlinear Estimation for a discussion of other methods for nonlinear estimation/regression (e.g., Simplex, Quasi-Newton, etc.).
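The iteration described above can be sketched in a few lines of Python. The exponential model, the synthetic data, and all function names below are illustrative assumptions, not part of the original text; at each step the linearized least-squares subproblem J·step = -r is solved and the parameters are updated:

```python
import numpy as np

# Illustrative model (an assumption, not from the text): y = a * exp(b * x),
# with parameter vector p = [a, b].

def residuals(p, x, y):
    a, b = p
    return y - a * np.exp(b * x)

def jacobian(p, x):
    a, b = p
    # Partial derivatives of each residual with respect to a and b
    return np.column_stack([-np.exp(b * x), -a * x * np.exp(b * x)])

def gauss_newton(p0, x, y, tol=1e-10, max_iter=50):
    p = np.asarray(p0, dtype=float)
    for _ in range(max_iter):
        r = residuals(p, x, y)
        J = jacobian(p, x)
        # Solve the linearized least-squares subproblem J @ step = -r
        step, *_ = np.linalg.lstsq(J, -r, rcond=None)
        p = p + step
        if np.linalg.norm(step) < tol:
            break
    return p

# Noise-free synthetic data generated with a = 2.0, b = 0.5
x = np.linspace(0.0, 2.0, 20)
y = 2.0 * np.exp(0.5 * x)

p_hat = gauss_newton([1.0, 1.0], x, y)
```

On this well-conditioned, noise-free example the iteration recovers the generating parameters; the Levenberg-Marquardt refinement would add a damping term to the subproblem to handle cases where the plain Gauss-Newton step overshoots.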