Difference between Hill Climbing and Gradient Descent Search
Hill Climbing
In Hill Climbing, we change only one element of the vector at a time. We then evaluate the function and keep the change if the value improves. We keep changing one element at a time until no single move improves the position.
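A minimal sketch of this one-element-at-a-time procedure (the function name `hill_climb`, the step size, and the example objective are illustrative choices, not part of the original text):

```python
def hill_climb(f, x0, step=0.1, max_iters=1000):
    """Hill climbing: perturb one element of the vector at a time and
    keep the move only if it improves f (here we maximize f)."""
    x = list(x0)
    for _ in range(max_iters):
        improved = False
        for i in range(len(x)):              # change one element at a time
            for delta in (step, -step):
                candidate = list(x)
                candidate[i] += delta
                if f(candidate) > f(x):      # keep the change if it improves
                    x = candidate
                    improved = True
                    break
        if not improved:                     # no single-element move helps: stop
            break
    return x

# Example: maximize f(x, y) = -(x-2)^2 - (y+3)^2, whose peak is at (2, -3)
best = hill_climb(lambda v: -(v[0] - 2)**2 - (v[1] + 3)**2, [0.0, 0.0])
```

Note that each pass only accepts moves that improve the objective, so the search stops at a local optimum, which may not be the global one.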
Gradient Descent
Gradient descent is a very simple optimization algorithm. It takes iterative steps in the direction opposite to the gradient of the function at the current point. It is easiest to understand if we visualize the procedure.
Assume we are trying to find the minimum of the function:
f(x) = (x-1)^2
(we know the minimum is at x=1. Let’s try to apply gradient descent)
We have to make an initial guess for the minimum. Let our initial guess be x(0) = -1
According to the gradient descent update rule, our new point is x(1) = x(0) - η*f'(x(0)),
where η is the learning rate. Take η = 0.6. Since f'(x) = 2*(x-1), we have f'(-1) = 2*(-2) = -4.
So, x(1) = -1 - 0.6*2*(-2) = 1.4, which is already much closer to the minimum at x = 1.
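The update above can be sketched as a short loop (the function name `gradient_descent` and the iteration count are illustrative assumptions):

```python
def gradient_descent(grad, x0, lr=0.6, iters=20):
    """Repeatedly step opposite the gradient: x_{k+1} = x_k - lr * grad(x_k)."""
    x = x0
    for _ in range(iters):
        x = x - lr * grad(x)
    return x

# f(x) = (x-1)^2, so f'(x) = 2*(x-1); the minimum is at x = 1.
# Starting from x(0) = -1 with lr = 0.6, the first step gives
# -1 - 0.6*2*(-2) = 1.4, matching the worked example above.
x_min = gradient_descent(lambda x: 2 * (x - 1), x0=-1.0)
```

Each iteration shrinks the distance to x = 1 by a constant factor here, so a few dozen steps converge to the minimum to high precision.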
Difference between Hill Climbing and Gradient Descent
| Hill Climbing | Gradient Descent |
|---|---|
| 1. In Hill Climbing, you look at all neighboring states and evaluate the cost function in each of them. | 1. In Gradient Descent, you look at the slope of the function in your local neighborhood and move in the direction of steepest descent. |
| 2. Hill Climbing is generally less efficient than Gradient Descent. | 2. Gradient Descent is much more efficient than Hill Climbing. |
| 3. Hill Climbing can optimize discrete problems; you only need to be able to evaluate a cost function for a given state. | 3. Gradient Descent requires a continuous function whose gradient you can compute at a given state. |