As mentioned in the topic on Intrinsically Linear Regression Models, it is often possible to transform a nonlinear model

Growth = exp(-b1*Age)

into a linear model. A linear model is fit to the transformed data, and then the results are "untransformed" into the original metric. However, some regression models cannot be transformed into linear ones and can only be estimated via Nonlinear Estimation.
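The linearize-fit-untransform workflow can be sketched as follows. This is a minimal numpy sketch on noiseless simulated data; the variable names (age, growth, b1_hat) are illustrative, not part of any particular software package:

```python
import numpy as np

age = np.linspace(0.5, 10.0, 50)
growth = np.exp(-0.5 * age)            # noiseless data generated from the model

# Linearize: log(Growth) = -b1*Age is a straight line through the origin,
# so the slope can be estimated by ordinary least squares without an intercept.
slope = np.sum(age * np.log(growth)) / np.sum(age ** 2)
b1_hat = -slope

# "Untransform": exponentiate to get predictions back in the original metric.
predicted = np.exp(-b1_hat * age)
```

With noise-free data the recovered b1_hat equals the true value (0.5 here) and the untransformed predictions reproduce the original growth values exactly.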

In the growth rate example in Intrinsically Linear Regression Models, we purposely "forgot" about the random error in the dependent variable. Of course, the growth rate is affected by many variables other than age, and we can expect a considerable amount of random (residual) fluctuation around the fitted line. If we add this error or residual variability to the model, we can rewrite it as follows:

Growth = exp(-b1*Age) + error

Additive error. In this additive error model we assume that the error variability is independent of age, that is, that the amount of residual error variability is the same at any age. Because the error term in this model is additive, you can no longer linearize the model by taking the logarithm of both sides. If, for a given data set, you were to log-transform variable Growth anyway and fit the simple linear model, you would find that the residuals from the analysis are no longer evenly distributed over the range of variable Age, and thus the standard linear regression analysis (via Multiple Regression) would no longer be appropriate. Therefore, the only way to estimate the parameters for this model is via the Nonlinear Estimation module. When Nonlinear Estimation fits this model, it fits the actual exponential model to the untransformed variables and minimizes the least squares error around the predicted values.
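Fitting the untransformed exponential model by least squares can be sketched as below. This is a minimal numpy illustration on simulated data with constant (additive) error; because the model has a single parameter, a simple grid search over the sum-of-squares objective stands in for the iterative minimization a nonlinear estimation routine would perform:

```python
import numpy as np

rng = np.random.default_rng(42)
age = np.linspace(0.5, 10.0, 200)
b1_true = 0.5
# additive error: the same residual spread at every age
growth = np.exp(-b1_true * age) + rng.normal(0.0, 0.02, age.size)

# least squares objective for the untransformed (nonlinear) model
def sse(b1):
    return np.sum((growth - np.exp(-b1 * age)) ** 2)

# one-parameter problem: minimize the sum of squares over a fine grid of b1 values
grid = np.linspace(0.01, 2.0, 2000)
b1_hat = grid[np.argmin([sse(b) for b in grid])]
```

In practice a routine such as Gauss-Newton or Levenberg-Marquardt would replace the grid search, but the quantity being minimized, the squared error around the predicted values in the original metric, is the same.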

Multiplicative error. To "defend" our previous example, in this particular instance it is not likely that the error variability is constant at all ages, that is, that the error is additive. Most likely, there is more random and unpredictable fluctuation of the growth rate at earlier ages than at later ages, when growth comes to a virtual standstill anyway. Thus, a more realistic model including the error would be:

Growth = exp(-b1*Age) * error

Put into words, the greater the age, the smaller the term exp(-b1*Age), and, consequently, the smaller the resultant error variability. If we now take the log of both sides of the equation, the residual error term becomes an additive term in a linear equation, and we can go ahead and estimate b1 via standard multiple regression.

Log (Growth) = -b1*Age + error

Of course, different parameter estimates are expected from this log-linear fit than from fitting the additive error model via Nonlinear Estimation.
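The multiplicative-error case can be sketched in the same way. This is a minimal numpy illustration on simulated data; after taking logs, the model is a straight line through the origin, so a closed-form no-intercept least squares slope stands in for a full multiple regression routine:

```python
import numpy as np

rng = np.random.default_rng(1)
age = np.linspace(0.5, 10.0, 200)
b1_true = 0.5
# multiplicative error: larger absolute fluctuation where exp(-b1*Age) is large,
# i.e., at the younger ages
growth = np.exp(-b1_true * age) * np.exp(rng.normal(0.0, 0.1, age.size))

# log both sides: log(Growth) = -b1*Age + error, a no-intercept linear model
y = np.log(growth)
b1_hat = -np.sum(age * y) / np.sum(age ** 2)
```

On data whose error really is multiplicative, this log-transformed linear fit recovers b1 well; on the same data, a least squares fit of the untransformed model would generally yield a somewhat different estimate.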

Let us now consider some regression models (that are nonlinear in their parameters) which cannot be "made into" linear models through simple transformations of the raw data.

Models for Binary Responses: Probit & Logit

General Logistic Regression Model

Drug Responsiveness and Half-Maximal Response

Discontinuous Regression Models