How does gradient descent converge with a fixed step size alpha?

  • Each update moves by alpha times the gradient, so the step size is proportional to the magnitude of the gradient, not to alpha alone.
  • As we approach a local minimum, the gradient approaches zero, so gradient descent automatically takes smaller and smaller steps (illustrated in the sketch below).
  • Thus there is no need to decrease alpha over time; a fixed step size can still converge.
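To make this concrete, here is a minimal Python sketch. The objective f(x) = x^2, the starting point, and the alpha value are illustrative assumptions, not from the original material. With alpha held fixed, the printed step sizes shrink on their own as x approaches the minimum at 0.

    # Minimal sketch: gradient descent on f(x) = x^2 with a FIXED alpha.
    # f(x) = x^2 and its gradient f'(x) = 2x are illustrative choices.

    def gradient(x):
        """Gradient of f(x) = x^2."""
        return 2.0 * x

    alpha = 0.1   # fixed step size; never decreased
    x = 5.0       # illustrative starting point

    for i in range(10):
        step = alpha * gradient(x)  # step magnitude shrinks as the gradient shrinks
        x = x - step                # update rule: x := x - alpha * f'(x)
        print(f"iter {i}: x = {x:.5f}, step = {step:.5f}")

Running this prints steps of 1.0, 0.8, 0.64, ... even though alpha never changes, because the gradient 2x itself shrinks as x nears the minimum.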

