
Linear Regression in a Single Variable || Python 3.7

What Is Regression?


Regression is a statistical measurement used in finance, investing, and other disciplines that attempts to determine the strength of the relationship between one dependent variable (usually denoted by Y) and a series of other changing variables (X, also known as independent variables).



Note: There are many other Regression methods, but only Linear Regression is discussed here.


Linear Regression



Linear Regression is one of the most popular and simplest ways of implementing Machine Learning using Supervised Learning.


Supervised Learning is the Machine Learning task of learning a function that maps an input to an output based on example input-output pairs.

This type of Regression is used when the value of Y is Continuous.


Linear Regression is divided into two types:


  • Simple Linear Regression

In this type of Regression, the Dependent Variable (Y) depends linearly on only one Independent Variable (X) and a constant (b0).


The linear dependence in Simple Linear Regression can be written as follows:


Y = mX + C ---(1)


Where equation (1) is the equation of a straight line, and m and C are the slope and the Y-intercept respectively. In regression notation, the same relationship is written as:


Y = b0 + b1X + z


Where b0 and b1 are the constant (bias) and coefficient (weight) respectively, and z is the error term which we need to minimise.

  • Multiple Linear Regression

Multiple Linear Regression is like Simple Linear Regression; the only difference is that the Dependent Variable is linearly dependent on more than one Independent Variable.


The linear dependence in a Multiple linear regression can be written as follows:


Y = b0 + b1X1 + b2X2 + b3X3 + … + bnXn


Where b0 is a constant and b1, b2, b3 ….. bn are the coefficients.


Multiple Linear Regression is generally implemented using the concept of Vectors and Matrices for faster processing.
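
As a rough sketch of this vectorised form (assuming NumPy; the numbers below are made up purely for illustration), the predictions for all samples can be computed with a single matrix-vector product:

import numpy as np

# Made-up design matrix: 4 samples, 3 independent variables (X1, X2, X3)
X = np.array([[1.0, 2.0, 3.0],
              [2.0, 0.5, 1.0],
              [3.0, 1.5, 2.0],
              [4.0, 3.0, 0.5]])

b0 = 0.5                          # constant (bias)
b = np.array([1.2, -0.7, 2.0])    # coefficients b1, b2, b3 (weights)

# Y = b0 + b1*X1 + b2*X2 + b3*X3 for every sample, in one matrix-vector product
Y_pred = X @ b + b0
print(Y_pred)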


The dataset used for applying linear regression is a simple table of X values and their corresponding Y values.




Cost Function


After we have calculated the predicted values, we need to calculate the error and reduce it.

So, here we use the Mean Squared Error (MSE) to calculate the error in the predicted values.

This error is then used to update the weights (b0, b1).


MSE measures the average squared difference between an observation’s actual and predicted values. The output is a single number representing the cost, or score, associated with our current set of weights.

Our goal is to minimize MSE to improve the accuracy of our model.



MSE = (1/n) * Σ (Yi − Ŷi)²

Where n is the number of observations, Yi is the actual value and Ŷi = b0 + b1Xi is the predicted value.
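
A minimal sketch of this cost function in code (assuming NumPy arrays of actual and predicted values; the numbers are made up) could look like this:

import numpy as np

def mse(y_actual, y_pred):
    # Mean Squared Error: the average of the squared differences
    return np.mean((y_actual - y_pred) ** 2)

# Made-up actual and predicted values
y_actual = np.array([3.0, 5.0, 7.0, 9.0])
y_pred = np.array([2.8, 5.3, 6.9, 9.4])
print(mse(y_actual, y_pred))  # a single number: the cost for the current weights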

Gradient Descent


To minimize MSE we use Gradient Descent to calculate the gradient of our cost function.




The gradients of the MSE with respect to the weights are:

∂MSE/∂b0 = -(2/n) * Σ (Yi − Ŷi)
∂MSE/∂b1 = -(2/n) * Σ Xi(Yi − Ŷi)

The weights are then updated by stepping against the gradient:

b0 = b0 − α * ∂MSE/∂b0
b1 = b1 − α * ∂MSE/∂b1

Note: Alpha (α) is the Learning rate.
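
A minimal sketch of a single Gradient Descent update for the two weights (assuming NumPy; the function and variable names are illustrative) could look like this:

import numpy as np

def gradient_step(X, Y, b0, b1, alpha):
    # One Gradient Descent update for the weights of Y = b0 + b1*X
    n = len(Y)
    Y_pred = b0 + b1 * X                       # current predictions
    error = Y - Y_pred                         # (actual - predicted)
    grad_b0 = -(2.0 / n) * np.sum(error)       # dMSE/db0
    grad_b1 = -(2.0 / n) * np.sum(error * X)   # dMSE/db1
    # Step against the gradient, scaled by the learning rate alpha
    return b0 - alpha * grad_b0, b1 - alpha * grad_b1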


Here is the code for implementing Simple Linear Regression in Python to figure out how Y is related to X.


Note: Here vectors and matrices are used for simplicity of the algorithm and faster processing.
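
A minimal, self-contained sketch of such a script (the synthetic dataset, learning rate, and iteration count below are placeholders for illustration):

import numpy as np

# Placeholder dataset: Y is roughly 2*X + 3 plus some noise
np.random.seed(0)
X = np.linspace(0, 10, 50)
Y = 2.0 * X + 3.0 + np.random.randn(50)

# Initialise the weights and hyperparameters
b0, b1 = 0.0, 0.0
alpha = 0.01      # learning rate
epochs = 2000     # number of Gradient Descent iterations
n = len(Y)

for _ in range(epochs):
    Y_pred = b0 + b1 * X                        # predicted values
    error = Y - Y_pred                          # (actual - predicted)
    grad_b0 = -(2.0 / n) * np.sum(error)        # dMSE/db0
    grad_b1 = -(2.0 / n) * np.sum(error * X)    # dMSE/db1
    b0 -= alpha * grad_b0                       # update the bias
    b1 -= alpha * grad_b1                       # update the weight

mse = np.mean((Y - (b0 + b1 * X)) ** 2)
print(f"b0 (intercept): {b0:.3f}  b1 (slope): {b1:.3f}  MSE: {mse:.3f}")

The learned b0 and b1 should come out close to the values used to generate the data (3 and 2 here), which is exactly how Y is related to X in this example.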







Congratulations! You have written the code for your first ML model.
