Levenberg-Marquardt Algorithm

The Levenberg-Marquardt (LM) algorithm is a refinement of the classic Gauss-Newton method for solving nonlinear least-squares regression problems; the method is discussed in detail in Moré (1977). It is the recommended method for nonlinear least-squares (regression) problems, where it is typically more efficient than more general optimization algorithms such as the Quasi-Newton or Simplex methods (see also Nonlinear Estimation for a discussion of other methods for nonlinear estimation/regression).

Consider fitting the nonlinear model y = f(θ,x) to the given data Xi and Yi, i = 1,...,m, where each Xi is of dimension k and θ is of dimension n. The LM method seeks the value of θ that (locally) minimizes:

g(\theta) = \sum_{i=1}^{m} \left( Y_i - f(\theta, X_i) \right)^2
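
For concreteness, the following is a minimal sketch of this objective in Python, assuming a hypothetical two-parameter exponential model f(θ, x) = θ0·exp(-θ1·x); the model and any data passed to it are illustrative assumptions, not part of the original discussion.

```python
import numpy as np

def f(theta, x):
    # Hypothetical model: f(theta, x) = theta[0] * exp(-theta[1] * x).
    return theta[0] * np.exp(-theta[1] * x)

def g(theta, x, y):
    # Sum of squared residuals over the m observations; this is the
    # quantity the LM algorithm minimizes with respect to theta.
    return np.sum((y - f(theta, x)) ** 2)
```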

The LM method finds the solution by applying the following update iteratively:

\theta_{j+1} = \theta_j + \left( J'J + \lambda D \right)^{-1} J' \left( Y - f(\theta_j, X) \right)

where:

Y is the m x 1 vector containing Y1,...,Ym,

X is the m x k matrix whose rows are X1,...,Xm,

f(θj, X) is the m x 1 vector of model predictions f(θj, X1),...,f(θj, Xm),

J is the m x n Jacobian matrix of f(θ,x) with respect to θ, evaluated at θj,

D is an n x n diagonal matrix used to adjust the scale factors, and

λ is the damping parameter, which is adjusted at each iteration.
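
A minimal sketch of this iteration in Python (NumPy) is shown below. It assumes the same hypothetical exponential model as above, synthetic data, D taken as the diagonal of J'J, and a simple damping schedule that shrinks λ after a successful step and inflates it otherwise; all of these choices are illustrative assumptions, not the specific implementation described in Moré (1977).

```python
import numpy as np

def f(theta, x):
    # Hypothetical model: f(theta, x) = theta[0] * exp(-theta[1] * x).
    return theta[0] * np.exp(-theta[1] * x)

def jacobian(theta, x):
    # m x n Jacobian of f with respect to theta, derived analytically
    # for this particular model.
    J = np.empty((x.size, 2))
    J[:, 0] = np.exp(-theta[1] * x)
    J[:, 1] = -theta[0] * x * np.exp(-theta[1] * x)
    return J

def lm_fit(x, y, theta, lam=1e-3, n_iter=50):
    # Iterate theta_{j+1} = theta_j + (J'J + lam*D)^{-1} J'(y - f(theta_j, x)).
    for _ in range(n_iter):
        r = y - f(theta, x)                      # residual vector Y - f(theta_j, X)
        J = jacobian(theta, x)
        A = J.T @ J
        D = np.diag(np.diag(A))                  # one common choice of the n x n scaling matrix
        step = np.linalg.solve(A + lam * D, J.T @ r)
        new_theta = theta + step
        if np.sum((y - f(new_theta, x)) ** 2) < np.sum(r ** 2):
            theta, lam = new_theta, lam * 0.5    # accept the step, relax damping
        else:
            lam *= 10.0                          # reject the step, increase damping
    return theta

# Illustrative usage with synthetic data.
rng = np.random.default_rng(0)
x = np.linspace(0.0, 4.0, 50)
y = 2.5 * np.exp(-1.3 * x) + 0.05 * rng.standard_normal(x.size)
print(lm_fit(x, y, theta=np.array([1.0, 1.0])))
```

Taking D as the diagonal of J'J, as in this sketch, is one common way to make the damping term insensitive to the scaling of the parameters; other choices, such as the identity matrix, are also used.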