What Does Gradient Descent Algorithm Mean?
The gradient descent algorithm is an optimization strategy used to refine machine learning models. It adjusts parameters such as the input weights of neurons in artificial neural networks, moving step by step toward a local or global minimum of an error (or cost) function.
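The core idea can be shown in a few lines of code. The sketch below is a minimal, illustrative example, not a production implementation: the function f(w) = (w - 3)^2, the learning rate of 0.1, and the starting point of 0.0 are all assumptions chosen only to show the update rule in action.

```python
# Minimal sketch: gradient descent on a one-variable function f(w) = (w - 3)^2,
# whose minimum sits at w = 3. All numbers here are illustrative choices.

def f_gradient(w):
    # Derivative of f(w) = (w - 3)^2.
    return 2 * (w - 3)

w = 0.0              # starting guess (assumed)
learning_rate = 0.1  # step size (assumed)

for step in range(100):
    w -= learning_rate * f_gradient(w)  # move against the gradient

print(w)  # converges toward 3.0, the minimum of f
```

Each iteration applies the same update, w := w - learning_rate * gradient, which is the step that "descends" the gradient toward a minimum.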
The gradient descent algorithm is also known simply as gradient descent.
Techopedia Explains Gradient Descent Algorithm
To understand how gradient descent works, picture a graph of predicted values alongside a graph of the actual observed values, which rarely follow a perfectly predictable path. Gradient descent shrinks the prediction error, the gap between the model's predictions and the observed values in the training set, by repeatedly adjusting the input weights. At each step, the algorithm calculates the gradient of the error with respect to the weights and nudges the weights in the direction that reduces that error, gradually refining the output of the machine learning system. Gradient descent is a popular way to refine the outputs of ANNs as they are applied across many areas of software.
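The following sketch makes that description concrete for a single-weight model. The tiny training set, the linear model prediction = w * x, the learning rate, and the number of passes are all hypothetical assumptions used only to show how adjusting the weight shrinks the prediction gap.

```python
# Hedged sketch: gradient descent adjusting one input weight so that
# predictions (w * x) move closer to observed training values.
# Data, model, and hyperparameters below are illustrative assumptions.

xs = [1.0, 2.0, 3.0, 4.0]   # inputs from a hypothetical training set
ys = [2.1, 3.9, 6.2, 7.8]   # observed actual values (roughly y = 2x)

w = 0.0                     # input weight to be adjusted
learning_rate = 0.01

for epoch in range(200):
    # Gradient of the mean squared error with respect to w.
    grad = sum(2 * (w * x - y) * x for x, y in zip(xs, ys)) / len(xs)
    w -= learning_rate * grad  # step downhill; the prediction gap shrinks each pass

print(round(w, 3))          # approaches roughly 2.0, the weight that best fits the data
```

In a real artificial neural network the same loop runs over many weights at once, with the gradients computed by backpropagation, but the mechanism is the same: measure the error, compute its gradient, and adjust the weights against it.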