Linear Algebra for ML
Last Updated on December 21, 2020 by Editorial Team
Author(s): Johar M. Ashfaque
Machine Learning, Mathematics
You do not need to learn linear algebra before you get started in machine learning, but at some point, you may wish to dive deeper.
Linear algebra will give you the tools to help you with the other areas of mathematics required to understand and build better intuitions for machine learning algorithms.
What is Linear Algebra?
Linear algebra is a branch of mathematics that concisely describes the coordinates and interactions of planes in higher dimensions and lets you perform operations on them.
Think of it as an extension of algebra (dealing with unknowns) into an arbitrary number of dimensions. Linear Algebra is about working on linear systems of equations (linear regression is an example: y = Ax). Rather than working with scalars, we start working with vectors and matrices.
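To make the y = Ax framing concrete, here is a minimal NumPy sketch; the matrix A and vector y are made-up illustrative values, not taken from any particular model:

```python
import numpy as np

# A small linear system y = Ax with known A and y.
A = np.array([[2.0, 1.0],
              [1.0, 3.0]])
y = np.array([5.0, 10.0])

# Solve for x directly rather than forming the inverse explicitly.
x = np.linalg.solve(A, y)
# x is [1., 3.], since 2*1 + 1*3 = 5 and 1*1 + 3*3 = 10.
print(x)
```

Linear regression generalizes this: instead of solving exactly, you find the x that makes Ax as close to y as possible.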
Broadly speaking, in linear algebra data is represented in the form of linear equations. These linear equations are in turn represented in the form of matrices and vectors.
– Vignesh Natarajan, in answer to the question "How is Linear Algebra used in Machine Learning?"
As a field, it is useful to you because you can describe complex operations used in machine learning using the notation and formalisms from linear algebra.
Linear algebra finds widespread application because it generally parallelizes extremely well. Further, most linear algebra operations can be implemented without message passing, which makes them amenable to MapReduce implementations.
– Raphael Cendrillon, in answer to the question "Why is Linear Algebra a prerequisite behind modern scientific/computational research?"
The Minimum Linear Algebra for Machine Learning
Linear algebra is a foundational field; that is to say, its notation and formalisms are used by other branches of mathematics to express concepts that are also relevant to machine learning.
For example, matrices and vectors are used in calculus, needed when you want to talk about function derivatives when optimizing a loss function. They are also used in probability when you want to talk about statistical inference.
…it's used everywhere in mathematics, so you'll find it used wherever math is used…
– David Joyce, in answer to the question "What is the point of linear algebra?"
The minimum linear algebra you should learn to improve your capabilities in machine learning comes down to the following three topics:
- Notation: Knowing the notation will let you read algorithm descriptions in papers, books, and websites to get an idea of what is going on. Even if you use for-loops rather than matrix operations, at least you will be able to piece things together.
- Operations: Working at the next level of abstraction, in vectors and matrices, can make things clearer. This applies to descriptions, code, and even thinking. Learn how to apply simple operations such as adding, multiplying, inverting, and transposing matrices and vectors.
- Matrix Factorization: Especially matrix decomposition methods like SVD and QR. The numerical precision of computers is limited, and working with decomposed matrices lets you sidestep much of the overflow/underflow madness that can otherwise result. Also, a quick LU, SVD, or QR decomposition using a library will give you ordinary least squares for your regression problem, a bedrock of machine learning and statistics.
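The operations and factorizations above can be sketched in a few lines of NumPy; every matrix here is a made-up illustrative value:

```python
import numpy as np

# Basic operations on small matrices.
A = np.array([[1.0, 2.0],
              [3.0, 4.0]])
B = np.array([[5.0, 6.0],
              [7.0, 8.0]])

C = A + B                 # element-wise addition
D = A @ B                 # matrix multiplication
At = A.T                  # transpose
Ainv = np.linalg.inv(A)   # inverse (A must be non-singular)

# Ordinary least squares via QR decomposition: minimize ||Xb - y||^2.
# X and y are toy data: points (1, 1), (2, 2), (3, 3) with an intercept column.
X = np.array([[1.0, 1.0],
              [1.0, 2.0],
              [1.0, 3.0]])
y = np.array([1.0, 2.0, 3.0])
Q, R = np.linalg.qr(X)
b = np.linalg.solve(R, Q.T @ y)  # solve R b = Q^T y
print(b)  # intercept and slope of the fitted line
```

The QR route avoids forming X.T @ X explicitly, which is exactly the kind of numerical-precision sidestep the factorization bullet describes.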
If you want to get into the theory of it all, you need to know linear algebra. If you want to read white papers and consider cutting-edge new algorithms and systems, you need to know a lot of math.
– Jesse Reiss, in answer to the question "How important is linear algebra in computer science?"
5 Reasons To Improve Your Linear Algebra
Of course, you can dive deeper.
If "you need to know more and get better" does not motivate you down the path, here are five reasons that might give you that push.
- Building Block: Linear algebra is absolutely key to understanding the calculus and statistics you need in machine learning.
- Deeper Intuition: If you can understand machine learning methods at the level of vectors and matrices, you will improve your intuition for how and when they work.
- Get More From Algorithms: A deeper understanding of the algorithm and its constraints will allow you to customize its application and better understand the impact of tuning parameters on the results.
- Implement Algorithms From Scratch: You require an understanding of linear algebra to implement machine learning algorithms from scratch. At the very least, to read the algorithm descriptions and, at best, to effectively use the libraries that provide the vector and matrix operations.
- Devise New Algorithms: The notation and tools of linear algebra can be used directly in environments like Octave and MATLAB, allowing you to prototype modifications to existing algorithms and entirely new approaches very quickly.
Linear algebra will feature heavily in your machine learning journey, whether you like it or not.
2 Video Courses To Learn Linear Algebra
Here are some suggestions.
1. Linear Algebra Refresher
This is a quick whip around the topics in linear algebra you should be familiar with.
The video is titled "Linear Algebra for machine learning" and was created by Patrick van der Smagt using slides from University College London.
2. Linear Algebra Crash Course
The second suggestion is the Linear Algebra crash course presented as an optional module in Week 1 of Andrew Ng's Coursera Machine Learning course.
This is suited to the engineer or programmer who is perhaps less familiar, or not at all familiar, with linear algebra and is looking for a first bootstrap into the topic.
It contains 6 short videos, available as a YouTube playlist titled "Machine Learning – 03. Linear Algebra Review".
The topics covered include:
- Matrices and Vectors
- Addition and Scalar Multiplication
- Matrix-Vector Multiplication
- Matrix-Matrix Multiplication
- Matrix Multiplication Properties
- Inverse and Transpose
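Two of the multiplication properties the course covers can be checked in a couple of lines; the matrices below are arbitrary examples chosen for illustration:

```python
import numpy as np

# Arbitrary small matrices to probe multiplication properties.
A = np.array([[1.0, 2.0],
              [3.0, 4.0]])
B = np.array([[0.0, 1.0],
              [1.0, 0.0]])

# Matrix multiplication is not commutative in general.
print(np.allclose(A @ B, B @ A))          # False for these matrices

# Transposing a product reverses the order of the factors.
print(np.allclose((A @ B).T, B.T @ A.T))  # True always
```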
Programming Linear Algebra
As a programmer or engineer, you likely learn best by doing.
As such, you may wish to grab a programming environment or library and start coding up matrix multiplication, SVD, and QR decompositions with test data.
Below are some options you might like to consider:
- Octave: Octave is the open-source version of MATLAB, and for most operations, they are equivalent. These platforms were built for linear algebra. This is what they do, and they do it very well. They are a joy to use.
- R: It can do it, but it is less beautiful than Octave. Check out this handy report: "Introduction to linear algebra with R" (PDF).
- SciPy/NumPy (numpy.linalg): Easy and fun if you are a Python programmer, with clean syntax and access to all the operations you need.
- BLAS: Basic Linear Algebra Subprograms like multiplication, inverse, and the like. Ported or available in most programming languages.
- LAPACK: Linear Algebra Package, the successor to LINPACK. The place to go for various matrix factorizations and the like. Like BLAS, ported to or available in most programming languages.
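If you pick the NumPy route, a first hands-on experiment along the lines suggested above might look like this; the test matrix is random, seeded only for reproducibility:

```python
import numpy as np

# Generate a small random test matrix.
rng = np.random.default_rng(0)
A = rng.standard_normal((4, 3))

# Decompose it with SVD and confirm the factors reconstruct it.
U, s, Vt = np.linalg.svd(A, full_matrices=False)
reconstructed = U @ np.diag(s) @ Vt
print(np.allclose(reconstructed, A))  # True: A = U Σ V^T

# Same exercise with QR.
Q, R = np.linalg.qr(A)
print(np.allclose(Q @ R, A))          # True: A = QR
```

Round-tripping a decomposition like this is a quick sanity check that you understand what each factor represents.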
Linear Algebra Books
This section lists some of the top textbooks on Linear Algebra for beginners.
Foundations
These are some beginner textbooks that cover the foundations of linear algebra:
- Introduction to Linear Algebra by Serge Lang.
- Introduction to Linear Algebra by Gilbert Strang.
Applied
These are books that lean more towards the application of linear algebra:
- Numerical Linear Algebra by Lloyd Trefethen.
- Linear Algebra and Its Applications by Gilbert Strang.
- Matrix Computations by Gene Golub and Charles Van Loan.
Linear Algebra for ML was originally published in Towards AI on Medium.