

Linear Regression Complete Derivation With Mathematics Explained!

Last Updated on May 26, 2020 by Editorial Team

Author(s): Pratik Shukla

Machine Learning

Part 3/5 in Linear Regression

Part 1: Linear Regression From Scratch.

Part 2: Linear Regression Line Through Brute Force.

Part 3: Linear Regression Complete Derivation.

Part 4: Simple Linear Regression Implementation From Scratch.

Part 5: Simple Linear Regression Implementation Using Scikit-Learn.

In the last article, we saw how to find the regression line using brute force. But that approach doesn't scale to real datasets, which often contain millions of points. To handle such datasets, we use Python libraries — but those libraries are built on mathematical theory, right? So let's find the logic behind those intimidating-looking formulas. Believe me, the math behind them is elegant!

Before we begin, knowledge of the following topics might be helpful!

  • Partial Derivatives
  • Summations

Are you excited to find the line of best fit?

Let’s start by defining a few things:

1) We are given n inputs and outputs.

2) We define the line of best fit as:

3) Now we need to minimize the error function, which we named S.

4) Put the value of equation 2 into equation 3.
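In symbols, steps 1–4 can be sketched as follows (a reconstruction of the standard least-squares setup, assuming the line of best fit is written as a + bx):

```latex
\hat{y}_i = a + b x_i
\qquad \text{(eq. 2: line of best fit)}

S = \sum_{i=1}^{n} \left( y_i - \hat{y}_i \right)^2
\qquad \text{(eq. 3: error function)}

S = \sum_{i=1}^{n} \left( y_i - a - b x_i \right)^2
\qquad \text{(eq. 4: substituting eq. 2 into eq. 3)}
```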

To minimize our error function S, we must find where its first partial derivatives with respect to a and b are equal to 0. At the point where both partial derivatives vanish, the total error over all points is at its minimum. Let’s find the partial derivative with respect to a first.

Finding a:

1) Find the partial derivative of S with respect to a.

2) Using the chain rule, let’s say

3) Using the partial derivative

4) Expanding

5) Simplifying

6) To find the extreme values, we set it to zero

7) Dividing both sides by −2

8) Now let’s break the summation into 3 parts

9) Since a is a constant, its summation is simply n times a

10) Substituting it back into the equation

11) Now we need to solve for a

12) The summation of y divided by n is simply its mean, and the same holds for x
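Written out, the derivation for a above can be sketched as (reconstructed from the step descriptions):

```latex
\frac{\partial S}{\partial a}
  = \sum_{i=1}^{n} 2\,(y_i - a - b x_i)(-1)
  = -2 \sum_{i=1}^{n} (y_i - a - b x_i)

\text{Setting it to zero and dividing both sides by } -2 :
\qquad \sum_{i=1}^{n} y_i \;-\; n a \;-\; b \sum_{i=1}^{n} x_i \;=\; 0

a = \frac{\sum_{i=1}^{n} y_i - b \sum_{i=1}^{n} x_i}{n}
  = \bar{y} - b\,\bar{x}
```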

We’ve minimized the cost function with respect to a. Now let’s handle the last part: minimizing S with respect to b.

Finding b:

1) Same as we did with a

2) Finding the partial derivative

3) Expanding it a bit

4) Putting it back into the equation

5) Let’s divide both sides by −2

6) Let’s distribute x for ease of viewing

Now let’s do something fun! Remember, we found the value of a earlier in this article? Why don’t we substitute it? Well, let’s see what happens.

7) Substituting the value of a

8) Let’s distribute the minus sign and x

You don’t like it? Let’s split up the sum into two sums.

9) Splitting the sum

10) Simplifying

11) Solving for b

Great. We did it. We have isolated a and b in terms of x and y. It wasn’t that hard, was it?

I still have some energy left and want to explore it a bit!

12) Simplifying the formula

13) Multiplying the numerator and denominator by n in equation 11

14) Now, if we simplify the value of a using equation 13, we get
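The derivation for b can be sketched the same way (a reconstruction matching the step descriptions above):

```latex
\frac{\partial S}{\partial b}
  = -2 \sum_{i=1}^{n} x_i \,(y_i - a - b x_i) \;=\; 0

\text{Substituting } a = \bar{y} - b\,\bar{x} \text{ and splitting the sums:}
\qquad \sum x_i y_i \;-\; \bar{y} \sum x_i \;+\; b\,\bar{x} \sum x_i \;-\; b \sum x_i^2 \;=\; 0

b = \frac{\sum x_i y_i - \bar{y} \sum x_i}{\sum x_i^2 - \bar{x} \sum x_i}
  = \frac{n \sum x_i y_i - \sum x_i \sum y_i}{n \sum x_i^2 - \left( \sum x_i \right)^2}
```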

Summary 🙂

If you have a dataset with one independent variable, you can find the line of best fit by first calculating b.

Then substituting b into the formula for a.

And finally substituting b and a into the line of best fit.
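To see the summary formulas in action, here is a minimal sketch in Python (the function name fit_line and the sample data are my own, not from this series):

```python
# Simple linear regression via the closed-form least-squares formulas
# derived above: b = (n*Σxy − Σx*Σy) / (n*Σx² − (Σx)²), a = (Σy − b*Σx)/n.

def fit_line(x, y):
    n = len(x)
    sum_x, sum_y = sum(x), sum(y)
    sum_xy = sum(xi * yi for xi, yi in zip(x, y))
    sum_x2 = sum(xi * xi for xi in x)
    b = (n * sum_xy - sum_x * sum_y) / (n * sum_x2 - sum_x ** 2)  # slope
    a = (sum_y - b * sum_x) / n                                   # intercept
    return a, b

# Points lying exactly on y = 2x + 1, so the fit should recover a = 1, b = 2.
x = [1, 2, 3, 4, 5]
y = [3, 5, 7, 9, 11]
a, b = fit_line(x, y)
print(a, b)  # 1.0 2.0
```

Because the data here lie exactly on a line, the residuals are zero and the formulas recover the true slope and intercept.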

Moving Onwards,

In the next article, we’ll see how we can implement simple linear regression from scratch (without sklearn) in Python.

And please let me know whether you liked this article or not! I bet you liked it.

To find more such detailed explanation, visit my blog: patrickstar0110.blogspot.com

(1) Simple Linear Regression Explained With Its Derivation.

(2) How to Calculate the Accuracy of a Model in Linear Regression From Scratch.

(3) Simple Linear Regression Using Sklearn.

You can download the code and some handwritten notes on the derivation from Google Drive.

If you have any additional questions, feel free to contact me: [email protected].


Linear Regression Complete Derivation With Mathematics Explained! was originally published in Towards AI — Multidisciplinary Science Journal on Medium, where people are continuing the conversation by highlighting and responding to this story.
