Kolmogorov-Arnold Networks for Mathematical Discovery
Author(s): Shenggang Li
Originally published on Towards AI.
Exploring Prime Number Distribution and Predictive Modeling with KAN and Other Mathematical Insights
In my previous article, Unpacking Kolmogorov-Arnold Networks, I introduced the mechanism of KAN, a novel framework developed at MIT. After closely studying the theory behind KAN (the Kolmogorov-Arnold Representation Theorem), I realized that KAN could be applied to many mathematical problems and might even support new discoveries.
One impressive feature of KAN is symbolic regression: representing data in a specified symbolic form (a mathematical formula). This can also be used for classification problems built on the regression formulation, making the resulting segmentation interpretable. A drawback, however, is that it is difficult to recover an exact formula from real-world data: complexity and noise make such data hard to describe with parametric forms such as logarithms and exponentials. One way to improve the fit is to use a deeper, multi-layer KAN, which can yield a better formula, but this can make the classification harder to interpret because the formula may become too complicated.
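To make this concrete, here is a minimal sketch of KAN-based symbolic regression using the pykan package. The target function, network width, and symbolic library are illustrative choices on my part, not a setup taken from this post, and the training method name varies between pykan versions (older releases use .train, newer ones .fit).

```python
import torch
from kan import KAN, create_dataset

# Illustrative target: f(x1, x2) = exp(sin(pi*x1) + x2^2).
f = lambda x: torch.exp(torch.sin(torch.pi * x[:, [0]]) + x[:, [1]] ** 2)
dataset = create_dataset(f, n_var=2)

# A small KAN: 2 inputs -> 5 hidden nodes -> 1 output, cubic B-splines on a 5-point grid.
model = KAN(width=[2, 5, 1], grid=5, k=3)

# Fit the spline activations (use model.train(...) on older pykan versions).
model.fit(dataset, opt="LBFGS", steps=20)

# Snap each learned activation to the best-matching symbol from a small library,
# then read out the resulting closed-form expression.
model.auto_symbolic(lib=["x", "x^2", "exp", "log", "sqrt", "sin"])
print(model.symbolic_formula())
```

If the symbolic fit is good, the printed formula should recover something close to the generating function; on noisy real-world data the library of candidate symbols usually has to be chosen more carefully.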
Using Kolmogorov-Arnold Networks (KAN), and in particular their symbolic regression, to study mathematics from data may have an advantage: if mathematical rules exist, KAN can help us uncover them. In this post, I aim to use KAN as an innovative and intriguing approach to the problem of prime number distribution. Here are potential insights and…
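As a rough illustration of the kind of experiment this post sets up, the sketch below tabulates the prime-counting function π(n) with sympy and fits a one-input KAN to it. The sample range, normalization, network shape, and symbolic library are illustrative assumptions of mine, not the exact configuration from the full post.

```python
import torch
from sympy import primepi
from kan import KAN

# Toy dataset: x = n (scaled to [0, 1]), y = pi(n), the number of primes <= n.
n_values = torch.arange(2, 2002, 2, dtype=torch.float32)
pi_values = torch.tensor([float(primepi(int(n))) for n in n_values])

x = (n_values / n_values.max()).unsqueeze(1)
y = (pi_values / pi_values.max()).unsqueeze(1)

# pykan expects a dict with train/test splits.
perm = torch.randperm(len(x))
split = int(0.8 * len(x))
train_idx, test_idx = perm[:split], perm[split:]
dataset = {
    "train_input": x[train_idx], "train_label": y[train_idx],
    "test_input": x[test_idx], "test_label": y[test_idx],
}

# One input (n), a few hidden nodes, one output approximating pi(n).
model = KAN(width=[1, 3, 1], grid=10, k=3)
model.fit(dataset, opt="LBFGS", steps=50)  # use model.train(...) on older pykan versions

# Ask KAN for a symbolic approximation; terms resembling x/log(x)
# would echo the prime number theorem.
model.auto_symbolic(lib=["x", "x^2", "log", "sqrt", "exp"])
print(model.symbolic_formula())
```

The interesting question is whether the recovered expression resembles known asymptotics such as n/ln(n), or suggests a different parametric form for the sampled range.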