Regularized Linear Regression

Hritika Agarwal
2 min read · Jul 21, 2020


We can apply regularization to both linear regression and logistic regression. We will approach linear regression first.

Gradient Descent

We will modify our gradient descent function to separate out θ₀ from the rest of the parameters, because we do not want to penalize θ₀.
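In the standard formulation, the regularized update rules look like this, where h_θ is the hypothesis, m is the number of training examples, α is the learning rate, and λ is the regularization parameter:

$$
\begin{aligned}
&\text{Repeat until convergence:} \\
&\quad \theta_0 := \theta_0 - \alpha\,\frac{1}{m}\sum_{i=1}^{m}\left(h_\theta(x^{(i)}) - y^{(i)}\right)x_0^{(i)} \\
&\quad \theta_j := \theta_j - \alpha\left[\frac{1}{m}\sum_{i=1}^{m}\left(h_\theta(x^{(i)}) - y^{(i)}\right)x_j^{(i)} + \frac{\lambda}{m}\theta_j\right], \qquad j = 1, 2, \dots, n
\end{aligned}
$$

The update for θ_j can equivalently be rearranged as θ_j := θ_j(1 − αλ/m) − α(1/m) Σ (h_θ(x^{(i)}) − y^{(i)}) x_j^{(i)}. Since (1 − αλ/m) is slightly less than 1, each iteration shrinks θ_j a little before applying the usual gradient step.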

Normal Equation

Now let’s approach regularization using the alternate method of the non-iterative normal equation.

To add in regularization, the equation is the same as our original, except that we add another term inside the parentheses:
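$$
\theta = \left(X^{T}X + \lambda \cdot L\right)^{-1} X^{T} y,
\qquad
L = \begin{bmatrix}
0 & & & \\
& 1 & & \\
& & \ddots & \\
& & & 1
\end{bmatrix}
$$

Here X is the design matrix of training examples and y is the vector of target values, just as in the unregularized normal equation.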

L is a matrix with a 0 in the top-left entry, 1's down the rest of the diagonal, and 0's everywhere else. Its dimension is (n+1)×(n+1). Intuitively, it is the identity matrix (except that we do not include the entry for x₀), multiplied by a single real number λ.
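As a quick illustration, here is a minimal NumPy sketch of this closed-form solution. The function name and the toy data are made up for the example:

```python
import numpy as np

def regularized_normal_equation(X, y, lam):
    """Solve (X^T X + lambda * L) theta = X^T y for theta.

    X   : (m, n+1) design matrix whose first column is all ones (x0)
    y   : (m,) vector of target values
    lam : regularization parameter lambda
    """
    L = np.eye(X.shape[1])   # (n+1) x (n+1) identity matrix
    L[0, 0] = 0              # zero out the top-left entry so theta_0 is not penalized
    return np.linalg.solve(X.T @ X + lam * L, X.T @ y)

# Toy example: 3 training examples, one feature plus the intercept column
X = np.array([[1.0, 2.0],
              [1.0, 3.0],
              [1.0, 5.0]])
y = np.array([3.0, 5.0, 9.0])
print(regularized_normal_equation(X, y, lam=1.0))
```

Using np.linalg.solve evaluates the formula without forming the matrix inverse explicitly, which is the numerically preferred approach.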

Read Next - Regularized Logistic Regression
