Convex Functions and SVMs
Author(s): Kim Hyun Bin
This is the first part of a two-part series introducing convex functions, their optimization, and how it all culminates in an explanation of Support Vector Machines.
You might have heard about convex optimization, especially if you have spent some time on the mathematical side of machine learning and artificial intelligence. Its popularity stems from the fact that it guarantees global optimality and efficient solvability: any local minimum found is also a global minimum, unlike in non-convex problems, which can have multiple local minima. Furthermore, there are numerous efficient algorithms for solving convex problems, such as gradient descent, interior-point methods, and Newton's method. Last but not least, strong duality conditions are also satisfied for convex problems, as we shall see in a minute. These advantages have led to convex optimization appearing in many areas, such as finance, signal processing, and machine learning, and in SVMs in particular.
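To make the "any local minimum is a global minimum" property concrete, here is a minimal sketch (not from the original article) that runs plain gradient descent on a simple convex quadratic; the choice of function, starting point, and step size are all illustrative assumptions.

```python
# A simple convex function: f(x) = (x - 3)^2 + 2, whose unique (global) minimum is at x = 3.
def f(x):
    return (x - 3.0) ** 2 + 2.0

def grad_f(x):
    # Derivative of f: f'(x) = 2 * (x - 3)
    return 2.0 * (x - 3.0)

# Plain gradient descent: because f is convex, the minimum we converge to is the global one.
x = -10.0          # arbitrary starting point (illustrative assumption)
step_size = 0.1    # fixed learning rate (illustrative assumption)
for _ in range(200):
    x -= step_size * grad_f(x)

print(f"x after descent: {x:.4f}, f(x) = {f(x):.4f}")  # expect x ~ 3, f(x) ~ 2
```

For a non-convex objective, the same loop could stall in a poor local minimum depending on where it starts, which is exactly the failure mode that convexity rules out.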
This first part will show you around the world of convexity and its optimization; the second part will then introduce Maximal Margin Classifiers, Support Vector Classifiers, and Support Vector Machines. This will probably…