Gradient descent is an optimization algorithm that minimizes a machine learning model's loss function by adjusting its parameters iteratively. At each step, it computes the gradient of the loss with respect to the parameters and moves them a small amount in the opposite direction, the direction of steepest descent: θ ← θ − η∇L(θ), where η is the learning rate controlling the step size.
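
As a minimal sketch of that update loop, the snippet below fits a one-parameter linear model by gradient descent on a mean squared error loss. The synthetic data, learning rate, and step count are illustrative assumptions, not values from the original text.

```python
import numpy as np

# Synthetic data: y = 3x + noise (illustrative assumption)
rng = np.random.default_rng(0)
x = rng.uniform(-1, 1, size=100)
y = 3.0 * x + rng.normal(scale=0.1, size=100)

w = 0.0    # model parameter, arbitrary starting point
lr = 0.1   # learning rate (eta), a hypothetical choice

for step in range(200):
    y_pred = w * x                           # model prediction
    grad = 2.0 * np.mean((y_pred - y) * x)   # dL/dw for MSE loss
    w -= lr * grad                           # step opposite the gradient

print(w)  # ends near the true slope, 3.0
```

Here the gradient is computed analytically from the MSE formula; in practice, frameworks obtain it via automatic differentiation, but the update rule itself is the same.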