Linear Regression

1. Regression Model

In statistics, linear regression is a linear approach to modelling the relationship between a dependent variable and one or more independent variables. Let x_1 be the independent variable and y be the dependent variable. We will define a linear relationship between these two variables as follows:

y = \theta_0 + \theta_1 x_1

2. Define the Loss Function

We will use the Mean Squared Error (MSE) as the loss function.

L = \frac{1}{n}\sum_{i=1}^{n} (y_{true} - y_{predicted})^2
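As a quick illustration, the loss can be computed directly with NumPy. This is a minimal sketch; the arrays y_true and y_pred and their values are assumed for the example.

```python
import numpy as np

def mse_loss(y_true, y_pred):
    # Mean Squared Error: average of the squared residuals
    return np.mean((y_true - y_pred) ** 2)

# Illustrative values only
y_true = np.array([1.0, 2.0, 3.0])
y_pred = np.array([1.1, 1.9, 3.2])
print(mse_loss(y_true, y_pred))  # 0.02
```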

3. Utilize the Gradient Descent Algorithm

You might know that the partial derivatives of a function are equal to 0 at its minimum. Gradient descent uses this concept to estimate the parameters, or weights, of our model by iteratively minimizing the loss function.

  1. Initialize the weights: \theta_0 = 0 and \theta_1 = 0

  2. Calculate the partial derivatives of the loss w.r.t. \theta_0 and \theta_1, where \bar{y}_i is the predicted value:

     d_{\theta_0} = -\frac{2}{n} \sum_{i=1}^{n} (y_i - \bar{y}_i)

     d_{\theta_1} = -\frac{2}{n} \sum_{i=1}^{n} (y_i - \bar{y}_i) \times x_i

  3. Update the weights using the learning rate l:

     \theta_0 = \theta_0 - l \times d_{\theta_0}

     \theta_1 = \theta_1 - l \times d_{\theta_1}

Python Implementation
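Below is a minimal sketch of the three steps above implemented with NumPy. The function name, the learning rate default l=0.01, the epoch count, and the sample data are assumptions for illustration, not part of the original text.

```python
import numpy as np

def linear_regression(x, y, l=0.01, epochs=1000):
    """Fit y = theta_0 + theta_1 * x using gradient descent on the MSE loss."""
    n = len(x)
    theta_0, theta_1 = 0.0, 0.0            # step 1: initialize the weights

    for _ in range(epochs):
        y_pred = theta_0 + theta_1 * x     # current predictions (the y-bar values)

        # step 2: partial derivatives of the MSE loss w.r.t. theta_0 and theta_1
        d_theta_0 = -(2 / n) * np.sum(y - y_pred)
        d_theta_1 = -(2 / n) * np.sum((y - y_pred) * x)

        # step 3: update the weights with learning rate l
        theta_0 -= l * d_theta_0
        theta_1 -= l * d_theta_1

    return theta_0, theta_1

# Illustrative data roughly following y = 1 + 2x
x = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
y = np.array([1.1, 2.9, 5.2, 6.8, 9.1])
theta_0, theta_1 = linear_regression(x, y)
print(theta_0, theta_1)  # approaches the intercept and slope of the data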
