**December 30, 2020**

When you feed training data to the algorithm, it predicts the value of the target variable for each row, and the cost function computes the difference between this predicted value and the actual value of the target variable. This is how learning takes place: when the difference between the actual and predicted values is small, the model learns that it has identified the right curve; if the difference is large, the model understands that the curve it identified is not suitable and starts looking for another curve.
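As a minimal sketch of this idea, the snippet below computes a prediction with the sigmoid function and measures the difference with binary cross-entropy, the usual cost function for logistic regression (the function names `predict` and `cost` are illustrative, not from any particular library):

```python
import numpy as np

def predict(X, w, b):
    """Predicted probability per row: sigmoid of the linear combination."""
    return 1.0 / (1.0 + np.exp(-(X @ w + b)))

def cost(y_true, y_pred):
    """Binary cross-entropy: small when predictions match the actual labels,
    large when they disagree."""
    eps = 1e-12  # clip to avoid log(0)
    y_pred = np.clip(y_pred, eps, 1 - eps)
    return -np.mean(y_true * np.log(y_pred) + (1 - y_true) * np.log(1 - y_pred))
```

With labels `[0, 1]`, predictions like `[0.01, 0.99]` give a cost near zero, while `[0.99, 0.01]` gives a large cost, which is exactly the signal the model uses to judge its current curve.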

Logistic regression learns via the gradient descent approach, particularly batch gradient descent, where the batch constitutes the entire dataset. As described above, after every forward pass the difference between the actual and predicted values is measured and the curve is altered accordingly. By altering the curve, we mean that the slope and intercept of the line separating the two classes are tweaked. This process continues until we reach the optimum values of the slope and the intercept.
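The loop described above can be sketched as follows: each iteration does a forward pass over the whole batch, measures the difference between predicted and actual values, and nudges the slope(s) and intercept in the direction that reduces the cost. The learning rate and epoch count here are illustrative defaults, not prescribed values:

```python
import numpy as np

def batch_gradient_descent(X, y, lr=0.1, epochs=1000):
    """Fit logistic regression weights by batch gradient descent.

    X: (n_rows, n_features) training data, y: (n_rows,) labels in {0, 1}.
    Returns the learned slope vector w and intercept b.
    """
    n, d = X.shape
    w = np.zeros(d)  # slope(s), one per feature
    b = 0.0          # intercept
    for _ in range(epochs):
        p = 1.0 / (1.0 + np.exp(-(X @ w + b)))  # forward pass over the entire batch
        error = p - y                           # predicted minus actual value
        w -= lr * (X.T @ error) / n             # tweak the slope(s)
        b -= lr * error.mean()                  # tweak the intercept
    return w, b
```

Because the gradient is averaged over every row, each update uses the full dataset; that is what distinguishes batch gradient descent from stochastic or mini-batch variants.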

By: Monis Khan
