Gradient Descent

Gradient descent is an optimization technique for nonlinear functions (e.g., the error function of a neural network as the weights are varied) that moves incrementally to successively lower points in the search space in order to locate a minimum. At each step, the current point is moved a small distance in the direction of steepest descent, i.e., opposite the gradient of the function at that point.
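
A minimal sketch of the idea in Python follows; the example function f(w) = (w - 3)^2, its gradient, the starting point, and the learning rate are illustrative assumptions rather than anything specified above.

```python
def grad_f(w):
    # Gradient of the illustrative function f(w) = (w - 3)^2,
    # which has its minimum at w = 3.
    return 2.0 * (w - 3.0)

def gradient_descent(grad, w0, learning_rate=0.1, steps=100):
    """Repeatedly step opposite the gradient to move toward a minimum."""
    w = w0
    for _ in range(steps):
        w = w - learning_rate * grad(w)  # move downhill along -grad
    return w

# Starting from w = 0, the iterates approach the minimum at w = 3.
print(gradient_descent(grad_f, w0=0.0))
```

The learning rate controls the trade-off sketched here: too small and convergence is slow, too large and the steps can overshoot the minimum and diverge.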