In the most general terms, least squares estimation is aimed at minimizing the sum of squared deviations of the observed values for the dependent variable from those predicted by the model. Technically, the least squares estimator of a parameter θ is obtained by minimizing Q with respect to θ, where:

Q = Σ[Yi - fi(θ)]²

Note that fi(θ) is a known function of θ; Yi = fi(θ) + ei for i = 1, ..., n; and the ei are random variables, usually assumed to have expectation 0.
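As a minimal sketch of this idea, consider the simple one-parameter linear model fi(θ) = θ·xi, for which minimizing Q has a closed-form solution (set dQ/dθ = 0). The data values below are made up purely for illustration:

```python
def q(theta, xs, ys):
    """Sum of squared deviations Q = sum of (y_i - theta*x_i)^2."""
    return sum((y - theta * x) ** 2 for x, y in zip(xs, ys))

def least_squares_theta(xs, ys):
    """Closed-form minimizer of Q for the model f_i(theta) = theta*x_i:
    dQ/dtheta = -2 * sum(x_i*(y_i - theta*x_i)) = 0
    => theta = sum(x_i*y_i) / sum(x_i^2)."""
    return sum(x * y for x, y in zip(xs, ys)) / sum(x * x for x in xs)

# Hypothetical data, roughly following y = 2x plus noise.
xs = [1.0, 2.0, 3.0, 4.0]
ys = [2.1, 3.9, 6.2, 8.1]

theta_hat = least_squares_theta(xs, ys)
print(round(theta_hat, 3))
```

For models where fi(θ) is nonlinear in θ, no such closed form generally exists and Q must be minimized numerically; that is the subject of the Nonlinear Estimation topic referenced below.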

For more information, see Mendenhall and Sincich (1984), Bain and Engelhardt (1989), and Neter, Wasserman, and Kutner (1989). See also, Basic Statistics, Multiple Regression, and Nonlinear Estimation.