
Regression Line with Mathematics for the Linear Regression

Last Updated on July 9, 2020 by Editorial Team

Author(s): Gul Ershad

Statistics

Introduction

Regression is a prediction task in which the target is continuous, and it has many applications. Linear regression is the simplest parametric model. Each example in the data-set is a pair consisting of an input feature vector and a label value. The main goal is to estimate the model parameters from the training data-set so that the target values of the test data can be predicted.

The table below has two variables, X and Y. Here, Y is known as the target (dependent) variable, and X is known as the explanatory (independent) variable.

X and Y variables

Predicting the height of a child from their age and weight is an example of a regression problem.

Let X be a set of real values:

Values of X

And, the real values of Y:

Values of Y

So, the regression process is based on the given rule:

Approach to Regression

The following is the general approach to regression:

  1. Collect data.
  2. Prepare data: the regression needs numeric values; nominal values should be mapped to binary values.
  3. Analyze: the data is well suited to visualization in 2D plots.
  4. Train: find the regression weights.
  5. Test: measure R², the correlation between the predicted values and the data, which measures the accuracy of the model (a minimal sketch of these steps follows this list).
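
As a concrete illustration of steps 2 to 5, here is a minimal sketch using scikit-learn on synthetic data (both the library choice and the data are my own additions; the article does not prescribe either):

# A minimal sketch of the prepare/train/test workflow described above.
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.metrics import r2_score
from sklearn.model_selection import train_test_split

# Synthetic data: y is roughly linear in x with some noise.
rng = np.random.default_rng(0)
x = rng.uniform(0, 10, size=(100, 1))
y = 2.5 * x[:, 0] + 1.0 + rng.normal(0, 1, size=100)

# Split, train (find the regression weights), and test with R².
x_train, x_test, y_train, y_test = train_test_split(x, y, random_state=0)
model = LinearRegression().fit(x_train, y_train)
print("slope b:", model.coef_[0], "intercept a:", model.intercept_)
print("R² on test data:", r2_score(y_test, model.predict(x_test)))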

Regression Line

Linear regression consists of finding the best-fitting straight line through the points. The best-fitting line is called a regression line.

Regression Line

The equation of the regression line:

The Equation of Regression Line

The equation of the intercept a:

The Equation of intercept a

The equation of the slope b:
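
The three equations above appear only as images in the original post. In standard least-squares notation (a reconstruction of the usual textbook formulas, not copied from the figures), they are:

$$\hat{y} = a + b\,x$$

$$a = \frac{(\sum Y)(\sum X^{2}) - (\sum X)(\sum XY)}{n\sum X^{2} - (\sum X)^{2}}$$

$$b = \frac{n\sum XY - (\sum X)(\sum Y)}{n\sum X^{2} - (\sum X)^{2}}$$

Equivalently, a = ȳ - b x̄, which is why the fitted line always passes through the point of means.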

Properties of Regression Line

The regression line has the following properties:

  1. The regression line always passes through the point of means, (x̄, ȳ).
  2. This line minimizes the sum of squared differences between observed values and predicted values.
  3. In the regression line, x is the input value and y is the output value.

Residual Error in Regression Line

Residual error is the difference between the observed value of the dependent variable and the predicted value.

Residual Error = Observed value – Predicted value

Residual Error
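
In symbols, with the regression line ŷ = a + bx, the residual of the i-th observation is (a standard restatement of the definition above, not taken from the original figure):

$$e_{i} = y_{i} - \hat{y}_{i} = y_{i} - (a + b\,x_{i})$$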

Derivation of the equation of the Regression Line

Let's consider the following variables x and y with their values:

Variables X and Y with their values

So, to calculate the values of a and b, let's find the values of XY, X², and Y².

Ready values to find Intercept and Slope

Here,

No. of items

Now, find the value of Intercept a:

Value of Intercept

Find the value of Slope b:

Value of Slope

Hence, the Regression Line equation:

The Regression Line equation
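
The numeric table and the resulting values of a and b appear only as images in the original post, so they are not reproduced here. The sketch below (with made-up data) shows how the same sums and formulas can be evaluated in NumPy for any x and y:

import numpy as np

# Hypothetical example values; the article's actual table lives in the images.
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.1, 3.9, 6.2, 8.1, 9.8])

n = len(x)
sum_x, sum_y = x.sum(), y.sum()
sum_xy = (x * y).sum()
sum_x2 = (x ** 2).sum()

# Least-squares slope b and intercept a, following the formulas above.
b = (n * sum_xy - sum_x * sum_y) / (n * sum_x2 - sum_x ** 2)
a = (sum_y * sum_x2 - sum_x * sum_xy) / (n * sum_x2 - sum_x ** 2)
print(f"Regression line: y = {a:.3f} + {b:.3f} * x")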

Linear Regression

Let's take an example: suppose we try to forecast the horsepower of a friend's automobile, so its equation will be:

Horsepower = 0.0018 * annual_salary - 0.99 * hours_listening_radio

This equation is known as a regression equation. The values 0.0018 and 0.99 are known as regression weights, and the process of finding these regression weights is called regression.

Forecasting new values for a given set of inputs is easy once the regression weights are found.

For linear regression, the prediction formula is as below:

Equation of Linear Regression
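
The formula itself is shown as an image in the original post; in the notation used in the paragraph below (w for the weights, b for the offset), the standard multi-feature form is:

$$\hat{y} = w_{0}x_{0} + w_{1}x_{1} + \dots + w_{p}x_{p} + b$$

Here x_0, ..., x_p are the features of a single data point, w and b are the parameters learned from the training data, and ŷ is the model's prediction.
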
# Fit ordinary least squares to mglearn's synthetic one-dimensional "wave" data-set and plot the learned line.
import mglearn
mglearn.plots.plot_linear_regression_wave()
Linear Regression on Wave data-set

There are many different linear models for regression. The difference between these models lies in how the model parameters w and b are learned from the training data, and how model complexity can be controlled.
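
As a rough sketch of what different linear models and complexity control can look like in practice, the snippet below contrasts ordinary least squares with ridge regression in scikit-learn (the library choice and the data are my own illustration, not from the article):

import numpy as np
from sklearn.linear_model import LinearRegression, Ridge

# Synthetic data with three features.
rng = np.random.default_rng(1)
X = rng.normal(size=(60, 3))
y = X @ np.array([1.5, -2.0, 0.5]) + rng.normal(scale=0.3, size=60)

# Ordinary least squares: w and b chosen purely to minimize squared error.
ols = LinearRegression().fit(X, y)

# Ridge regression: same prediction formula, but the weights w are shrunk
# towards zero by the alpha penalty, which is one way to control complexity.
ridge = Ridge(alpha=10.0).fit(X, y)

print("OLS weights:  ", ols.coef_)
print("Ridge weights:", ridge.coef_)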

Pros of Linear Regression:

  1. It is easy to interpret and computationally inexpensive.

Cons of Linear Regression:

  1. It models non-linear data poorly.

Conclusion

Finding the best-fitting straight line through the points is the core of linear regression, and this line is called the regression line. The least-squares method is used to find this best-fitting line.

References

Introduction to Linear Regression: http://onlinestatbook.com/2/regression/introC.html


Regression Line with Mathematics for the Linear Regression was originally published in Towards AI - Multidisciplinary Science Journal on Medium, where people are continuing the conversation by highlighting and responding to this story.

Published via Towards AI
