How does gradient descent converge with a fixed step size alpha?

Gradient descent updates the parameters as theta := theta - alpha * gradient. As we approach a local minimum, the gradient itself shrinks toward zero, so the update steps automatically become smaller even though alpha stays fixed. Thus there is no need to decrease alpha over time, provided alpha is small enough for the descent to converge in the first place.
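
A minimal sketch in Python makes this concrete; the objective f(x) = x^2, the starting point, and the step size are illustrative assumptions, not part of the original question. The printed step magnitude |alpha * grad(x)| shrinks each iteration even though alpha is constant:

```python
# Sketch: gradient descent on the assumed objective f(x) = x^2
# with a fixed step size alpha that is never decayed.

def grad(x):
    return 2 * x  # derivative of f(x) = x^2

x = 10.0      # assumed starting point
alpha = 0.1   # fixed step size

for i in range(5):
    step = alpha * grad(x)
    x -= step
    print(f"iter {i}: x = {x:.4f}, |step| = {abs(step):.4f}")

# The steps shrink (2.0, 1.6, 1.28, ...) because grad(x) -> 0
# near the minimum, even though alpha itself is constant.
```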