Centroid Neural Network for Clustering with NumPy
Last Updated on August 25, 2021 by Editorial Team
Author(s): LA Tran
Let’s shine a light on an algorithm that deserves more attention
Centroid neural network (CentNN) is an efficient and stable clustering algorithm that has been successfully applied to numerous problems. CentNN does not require a pre-determined learning coefficient, yet it yields competitive clustering results compared to K-means Clustering or the Self-Organizing Map (SOM), whose results depend heavily on their initial parameters. If you are not yet familiar with this algorithm, you are welcome to read my explanation with visual examples here. Now, let’s make CentNN clear with several lines of code.
I chose the 2-D data clustering problem as the running example for this post because it is straightforward to understand and visualize, which makes it an effective way to illustrate clustering algorithms.
Firstly, let’s import several necessary libraries and generate a dataset with 10 centers.
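The original snippet is not reproduced here; a minimal pure-NumPy stand-in that builds a 2-D dataset with 10 centers might look like the following (the center locations, spread, and sample counts are assumptions; the post may instead use `sklearn.datasets.make_blobs`):

```python
import numpy as np

# Hypothetical stand-in for the post's dataset: 10 Gaussian blobs in 2-D.
rng = np.random.default_rng(42)
n_centers, pts_per_center = 10, 100
centers = rng.uniform(-10, 10, size=(n_centers, 2))
X = np.concatenate([c + rng.normal(scale=0.8, size=(pts_per_center, 2))
                    for c in centers])
y = np.repeat(np.arange(n_centers), pts_per_center)   # ground-truth labels
```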
Next, we prepare some necessary subroutines for the implementation; everything is available at my GitHub:
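The exact subroutines live in the repo; a sketch of the two helpers the training loop relies on, under the assumed names `find_winner` and `clustering_error`, could be:

```python
import numpy as np

def find_winner(x, weights):
    """Index of the weight vector (neuron) closest to sample x."""
    return int(np.argmin(((weights - x) ** 2).sum(axis=1)))

def clustering_error(X, weights, labels):
    """Total squared distance from each sample to its assigned weight."""
    return float(sum(((X[labels == j] - w) ** 2).sum()
                     for j, w in enumerate(weights)))
```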
Again, I assume that you already understand this algorithm. Otherwise, I highly recommend reading the theory before going ahead; you can check my explanation of the algorithm here.
Now, let’s start training the algorithm:
Find the centroid c of all the data, then split c into two weights w1 and w2 using a small ε.
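As a sketch (the variable names and the ε value here are assumptions), this initialization step is just:

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 2))   # stand-in for the 2-D dataset

eps = 0.05                      # small perturbation; the exact value is an assumption
c = X.mean(axis=0)              # centroid of the entire dataset
w1, w2 = c + eps, c - eps       # two initial weights straddling the centroid
```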
Here are the 2 weights:
Find the winner neuron for each x in X.
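The winner is simply the weight nearest to each sample. A vectorized version using NumPy broadcasting (toy data and names are assumptions) could be:

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(6, 2))                # toy samples
w = np.array([[0.0, 0.0], [5.0, 5.0]])     # two current weights

# Squared distance from every sample to every weight via broadcasting,
# then the index of the nearest weight per sample.
d = ((X[:, None, :] - w[None, :, :]) ** 2).sum(axis=2)
winners = d.argmin(axis=1)
```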
The result after epoch 0:
From epoch 1 onward, keep finding the winner neuron for every data point and updating the winner and loser neurons until the desired number of clusters is reached.
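Putting the steps together, a condensed sketch of the whole training loop (my own simplified rendering, not the repo's exact code; the split-on-largest-error rule and all names are assumptions) might look like:

```python
import numpy as np

def centnn(X, n_clusters, eps=0.05, max_epochs=200):
    """Simplified sketch of Centroid Neural Network clustering."""
    c = X.mean(axis=0)
    w = np.stack([c + eps, c - eps])           # start with two neurons
    counts = np.zeros(len(w), dtype=int)       # members per neuron
    prev = -np.ones(len(X), dtype=int)         # previous winner per sample

    for _ in range(max_epochs):
        changed = False
        for i, x in enumerate(X):
            win = int(np.argmin(((w - x) ** 2).sum(axis=1)))
            if win != prev[i]:
                changed = True
                counts[win] += 1               # winner gains x, moves toward it
                w[win] += (x - w[win]) / counts[win]
                lose = prev[i]
                if lose >= 0:                  # loser gives x up, moves away
                    counts[lose] -= 1
                    if counts[lose] > 0:
                        w[lose] -= (x - w[lose]) / counts[lose]
                prev[i] = win
        if not changed:                        # a stable epoch: converged
            if len(w) >= n_clusters:
                break                          # enough clusters: done
            # Otherwise split the neuron with the largest intra-cluster error.
            errs = [((X[prev == j] - w[j]) ** 2).sum() for j in range(len(w))]
            j = int(np.argmax(errs))
            w = np.vstack([w, w[j] + eps])
            counts = np.append(counts, 0)
    return w, prev
```

Note the update rule: a winner that gains a sample moves toward x by 1/(N+1), and a loser that gives one up moves away by 1/(N-1), which keeps each weight at the exact centroid of its current members.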
The algorithm stops after 81 epochs for the example dataset:
Final clustering result:
In this post, I have presented a tutorial on implementing the Centroid Neural Network (CentNN) algorithm using NumPy. You can find my implementation of CentNN here. If you find it helpful, please do not hesitate to give it a star. You are also welcome to visit my Facebook page, where I share things regarding Machine Learning: Diving Into Machine Learning.
The next post in my CentNN series will combine vector quantization (block quantization) with CentNN in an image compression application.
That’s all for today. Thanks for your time!
Published via Towards AI