What is steepest descent?




















The method of steepest descent, also called the gradient descent method, starts at a point x_0 and, as many times as needed, moves from x_i to x_(i+1) by minimizing along the line extending from x_i in the direction of -∇f(x_i), the local downhill gradient. When applied to a 1-dimensional function f(x), the method takes the form of iterating x_(i+1) = x_i - ε f'(x_i) for a small step size ε. This method has the severe drawback of requiring a great many iterations for functions which have long, narrow valley structures.
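The 1-dimensional iteration above can be sketched as follows. This is a minimal illustration, not the article's implementation; the function f(x) = (x - 3)^2, the step size, and the starting point are all illustrative choices.

```python
# Hedged sketch: fixed-step gradient descent on a 1-D function.
# The objective, step size eps, and starting point x0 are illustrative
# assumptions, not taken from the original article.

def gradient_descent_1d(f_prime, x0, eps=0.1, n_iters=100):
    """Iterate x_{i+1} = x_i - eps * f'(x_i) for a fixed step size eps."""
    x = x0
    for _ in range(n_iters):
        x = x - eps * f_prime(x)
    return x

# Example: f(x) = (x - 3)**2 has its minimum at x = 3, with f'(x) = 2*(x - 3).
x_min = gradient_descent_1d(lambda x: 2 * (x - 3), x0=0.0)
print(round(x_min, 6))  # converges toward 3.0
```

With a fixed step size the iterates contract geometrically toward the minimum here (each step multiplies the error by 0.8), which also hints at the slow-convergence problem the text mentions: in narrow valleys that contraction factor can be arbitrarily close to 1.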

In such cases, a conjugate gradient method is preferable.

Can we call using a fixed step size α in the negative gradient direction, without a line search, steepest descent? Steepest descent is a special case of gradient descent where the step length is chosen to minimize the objective function value along the search direction.
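The "step length chosen to minimize the objective" case has a closed form for quadratics, which makes it easy to sketch. For f(x) = ½ xᵀAx - bᵀx with gradient g = Ax - b, the exact minimizer along -g is α = (gᵀg)/(gᵀAg). The matrix A, vector b, and tolerance below are illustrative assumptions.

```python
import numpy as np

# Hedged sketch: steepest descent with an *exact* line search on a quadratic
# f(x) = 0.5 x^T A x - b^T x. On quadratics the exact step length along -g
# is alpha = (g^T g) / (g^T A g); A and b here are illustrative choices.

def steepest_descent_quadratic(A, b, x0, tol=1e-10, max_iters=10_000):
    x = x0.astype(float)
    for _ in range(max_iters):
        g = A @ x - b                    # gradient of f at x
        if np.linalg.norm(g) < tol:
            break
        alpha = (g @ g) / (g @ (A @ g))  # exact line-search step length
        x = x - alpha * g
    return x

A = np.array([[3.0, 1.0], [1.0, 2.0]])   # symmetric positive definite
b = np.array([1.0, 1.0])
x_star = steepest_descent_quadratic(A, b, np.zeros(2))
print(np.allclose(A @ x_star, b))        # True: minimizer solves A x = b
```

Even with the exact line search, successive directions zig-zag (each is orthogonal to the previous one), which is exactly the behavior conjugate gradient methods are designed to avoid.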

Gradient descent refers to any of a class of algorithms that calculate the gradient of the objective function and then move "downhill" in the indicated direction; the step length can be fixed, estimated (e.g., via a line search), or chosen in some other way. Gradient-based optimization is, as Cliff AB points out in comments to the OP, more general still, referring to any method that uses gradients to optimize a function. Note that this does not mean you necessarily move in the direction indicated by the gradient (see, for example, Newton's method).

The gradient is a multi-variable generalization of the derivative at a point: whereas the derivative is defined for functions of a single variable, the gradient is defined for functions of several variables. Since we are minimizing error, descent corresponds to a negative slope, and the direction of maximum steepness is the one with the most negative slope.
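The gradient collects the partial derivatives of a multi-variable function into a vector, which can be checked numerically with central differences. The test function and evaluation point below are illustrative assumptions.

```python
# Hedged sketch: central-difference estimate of the gradient of a
# multi-variable function, compared against the analytic gradient.
# The function f and the test point are illustrative choices.

def numerical_gradient(f, x, h=1e-6):
    """Central-difference approximation of the gradient of f at x."""
    grad = []
    for i in range(len(x)):
        xp, xm = list(x), list(x)
        xp[i] += h                        # perturb coordinate i upward
        xm[i] -= h                        # perturb coordinate i downward
        grad.append((f(xp) - f(xm)) / (2 * h))
    return grad

f = lambda v: v[0] ** 2 + 3 * v[1]        # analytic gradient: (2*x, 3)
g = numerical_gradient(f, [1.0, 2.0])
print([round(gi, 4) for gi in g])         # approximately [2.0, 3.0]
```

Steepest descent then steps in the direction of the negated gradient, the direction in which the function decreases most rapidly per unit distance.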

What is steepest descent? Is it gradient descent with exact line search?

In some literature, such as this and this, steepest descent means using the negative gradient direction together with an exact line search along that direction.

But in this note, it seems that as long as we follow the negative gradient, the method can be called steepest descent regardless of how the step size is chosen. Is the term "steepest descent" loosely defined? — Haitao Du

This is in comparison with gradient-free methods, such as the bisection method, Nelder–Mead, genetic algorithms, etc.

I know what gradient-based optimization is, but I just wanted to ask for the definition of steepest descent. I revised the question. Thanks for the comment.



