Assumptions of Linear Regression — What Fellow Data Scientists Should Know
Last Updated on July 24, 2023 by Editorial Team
Author(s): Shaurya Lalwani
Originally published on Towards AI.
Linear Regression is a linear approach to modeling the relationship between a target variable and one or more independent variables, and the fitted relationship is then used for predictive analytics. Running the linear regression algorithm, however, is only half the work; the other half lies in understanding the assumptions the technique depends on, starting with the following:
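As a minimal sketch of the fitting step, the ordinary-least-squares line can be computed directly with NumPy. The data below is synthetic and purely illustrative (it is not from the article): the true relationship is y = 2x + 1 plus noise, and the fit should recover coefficients close to those values.

```python
import numpy as np

# Hypothetical synthetic data (assumption: true slope 2, intercept 1, Gaussian noise)
rng = np.random.default_rng(0)
x = rng.uniform(0, 10, size=100)
y = 2.0 * x + 1.0 + rng.normal(0, 1, size=100)

# Design matrix with a column of ones for the intercept
X = np.column_stack([x, np.ones_like(x)])

# Ordinary least squares: minimizes the sum of squared residuals
slope, intercept = np.linalg.lstsq(X, y, rcond=None)[0]
print(f"slope ≈ {slope:.2f}, intercept ≈ {intercept:.2f}")
```

With 100 points and modest noise, the estimates land near the true slope of 2 and intercept of 1; in practice one would use a library such as scikit-learn or statsmodels, which also report diagnostics for the assumption checks discussed next.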
For linear regression to work on the given data, it is assumed that the errors (residuals) follow a normal distribution, although this is not strictly required when the sample size is very large. Normality can be verified using a Q-Q plot.
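A quick way to check this assumption in code is to fit the model, extract the residuals, and compare them against theoretical normal quantiles. The sketch below uses SciPy's `probplot` (which produces the Q-Q plot data) and the Shapiro-Wilk test; the dataset is again a hypothetical example, not taken from the article.

```python
import numpy as np
from scipy import stats

# Hypothetical data with normally distributed noise (illustrative assumption)
rng = np.random.default_rng(1)
x = rng.uniform(0, 10, size=200)
y = 3.0 * x + 5.0 + rng.normal(0, 2, size=200)

# Fit OLS and compute residuals
X = np.column_stack([x, np.ones_like(x)])
coef, *_ = np.linalg.lstsq(X, y, rcond=None)
residuals = y - X @ coef

# probplot returns (theoretical quantiles, ordered residuals) plus a line fit;
# an r value close to 1 means the Q-Q points lie near a straight line,
# which is what visual inspection of a Q-Q plot checks for.
(osm, osr), (qq_slope, qq_intercept, r) = stats.probplot(residuals, dist="norm")
print(f"Q-Q correlation r = {r:.3f}")

# Shapiro-Wilk test: a large p-value means no evidence against normality
stat, p = stats.shapiro(residuals)
print(f"Shapiro-Wilk p = {p:.3f}")
```

If the residuals were heavily skewed or heavy-tailed, r would drop noticeably below 1 and the Shapiro-Wilk p-value would shrink, signaling that the normality assumption is violated.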