Linear Regression

December 23, 2020

What is gradient descent, and why is it used?

The gradient descent equation describes the relationship between (1) the current value of the error and the next value of the error, or equivalently (2) the slope (say m) and intercept (say c) of the regression line and the error they produce.
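
As a quick sketch of that relationship (the symbols below are not taken from the original), assume the error E is the mean squared error over n data points and L is a small learning rate; one step of gradient descent then moves m and c as follows:

```latex
E = \frac{1}{n}\sum_{i=1}^{n}\bigl(y_i - (m x_i + c)\bigr)^2,\qquad
m_{\text{next}} = m - L\,\frac{\partial E}{\partial m},\qquad
c_{\text{next}} = c - L\,\frac{\partial E}{\partial c}
```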

The error term is differentiated (partial derivatives are taken) with respect to m and c. At the minimum of the error these first-order derivatives are zero, so m and c are adjusted step by step, using the derivatives, until they (approximately) reach zero; the resulting values of m and c give the best-fit line.
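
Below is a minimal sketch of this procedure in Python, assuming a mean squared error cost, a fixed learning rate of 0.01, and a small made-up data set (none of these specifics come from the original text):

```python
import numpy as np

# Toy data that roughly follows y = 2x + 1 (hypothetical values, for illustration only)
x = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
y = np.array([1.1, 2.9, 5.2, 6.8, 9.1])

m, c = 0.0, 0.0        # initial slope and intercept
L = 0.01               # learning rate (assumed step size)
n = len(x)

for _ in range(5000):
    y_pred = m * x + c
    residual = y - y_pred
    # Partial derivatives of the mean squared error with respect to m and c
    dm = (-2.0 / n) * np.sum(x * residual)
    dc = (-2.0 / n) * np.sum(residual)
    # Step m and c against the gradient; the steps shrink as the derivatives approach zero
    m -= L * dm
    c -= L * dc

print(f"best-fit line: y = {m:.3f}x + {c:.3f}")
```

With these toy numbers the loop settles near m ≈ 2 and c ≈ 1, at which point the partial derivatives are close to zero and the line fits the points well.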

By: Monis Khan
