
Centroid Neural Network: An Efficient and Stable Clustering Algorithm

Last Updated on August 19, 2021 by Editorial Team

Author(s): LA Tran

Deep Learning

Let's highlight a method with potential that has not been paid much attention

Generally, clustering means grouping multi-dimensional data into closely related groups. Classic representatives of clustering algorithms are K-means Clustering and the Self-Organizing Map (SOM), and you can easily find numerous resources explaining them. This time, let me introduce another efficient clustering algorithm that seemingly few researchers pay attention to: Centroid Neural Network for Unsupervised Competitive Learning. Please click here to take a closer look at the original paper.

Centroid Neural Network result. Image by Author (source)

And now, let's get started!

Centroid Neural Network (CentNN)

To avoid confusion with Convolutional Neural Networks, I would like to use the term "CentNN" in this post.

CentNN is an unsupervised competitive learning algorithm, based on the classical k-means clustering algorithm, that estimates the centroids of the cluster groups in the training data. CentNN requires neither a predetermined schedule for the learning coefficient nor a total number of iterations for clustering. In the paper, simulation results on clustering and image compression problems show that CentNN converges much faster than conventional algorithms with comparable clustering quality, while the other algorithms may give unstable results depending on the initial values of the learning coefficient and the total number of iterations.

The Centroid Neural Network algorithm (figure from paper)

The figure above shows the CentNN algorithm. If you get bored looking at all those mathematical equations, don't worry: I am going to make them more understandable with a secondary-school-maths example.

Weight Update

The core of the algorithm is the weight update procedure: how do we efficiently calculate the centroid of a group of data points when a single data point moves in or out?

For instance, say we have 100 integers; it's very simple to calculate the mean value by averaging all of them. Then what happens if we get one more integer? Do we need to average all 101 of them again? The answer is NO!

Let's say we have N and then N+1 data points; the mean value w is calculated as follows:
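These are just the plain definitions of the mean over N and over N + 1 points:

$$w_N = \frac{1}{N}\sum_{i=1}^{N} x_i, \qquad w_{N+1} = \frac{1}{N+1}\sum_{i=1}^{N+1} x_i$$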

From the two equations above:
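Combining them gives the incremental form (the standard running-mean identity):

$$w_{N+1} = \frac{N\,w_N + x_{N+1}}{N+1} = w_N + \frac{1}{N+1}\left(x_{N+1} - w_N\right)$$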

Now we can calculate the mean of 101 integers without averaging all of them again. It works similarly when we remove one integer and are left with 99.
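As a quick sanity check, here is a minimal Python sketch of both directions of the update; the function names are mine, not from any reference code:

```python
def mean_add(mean, n, x):
    """Mean of n points, updated after adding point x (now n + 1 points)."""
    return mean + (x - mean) / (n + 1)

def mean_remove(mean, n, x):
    """Mean of n points, updated after removing point x (now n - 1 points)."""
    return mean - (x - mean) / (n - 1)

values = list(range(1, 101))        # 100 integers, mean = 50.5
m = sum(values) / len(values)
m101 = mean_add(m, 100, 200)        # add a 101st integer without re-averaging
assert abs(m101 - sum(values + [200]) / 101) < 1e-9
```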

And the example above explains these equations in the paper:

Weight Update Equations
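Spelled out, with M_j denoting the number of data points currently assigned to cluster j (this is my transcription following the derivation above, not the paper's exact typesetting):

Winner: $$w_j(n+1) = w_j(n) + \frac{1}{M_j+1}\left[x - w_j(n)\right]$$

Loser: $$w_i(n+1) = w_i(n) - \frac{1}{M_i-1}\left[x - w_i(n)\right]$$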

Visual Explanation Example

Given a dataset with 30 data points, as in the figure below, our aim is to cluster them into 3 clusters.

Image by Author

Epoch 0: Initialization

Find 1 centroid c for all data, then split c into 2 weights w1, w2 with a small ε.

w1 = c + ε, w2 = c − ε (e.g., ε = 0.05)

Image by Author
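A minimal NumPy sketch of this initialization, assuming the data sit in an array X of shape (n_points, n_dims); the names are mine:

```python
import numpy as np

def init_centroids(X, eps=0.05):
    """Epoch 0: compute one global centroid c, then split it into c + eps and c - eps."""
    c = X.mean(axis=0)                   # centroid of the whole dataset
    return np.stack([c + eps, c - eps])  # the two initial weights w1, w2
```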

Before going ahead, let's talk about the terms "winner neuron" and "loser neuron".

  • winner neuron: given an input data point x, the winner neuron is the weight (w) that is closest to x among all weights.
  • loser neuron: given an input x at epoch n, if in epoch (n−1) w was the closest weight to x, but in epoch n there is another weight w' closer to x than w, then w is a loser neuron and w' is a winner neuron for x at epoch n (see the sketch just below this list).
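In code, finding the winner is just a nearest-weight lookup; a sketch, assuming Euclidean distance:

```python
import numpy as np

def winner(weights, x):
    """Index of the weight closest to x, i.e., the winner neuron for x."""
    return int(np.argmin(np.linalg.norm(weights - x, axis=1)))
```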

Find the winner neuron for each x in X.

  • x1 comes, w1 wins → update w1
  • x2 comes, w2 wins → update w2
  • x3 comes, w1 wins → update w1
  • …
  • x30 comes, w1 wins → update w1
Image by Author

After epoch 0, we always have 2 centroids and the cluster assignment of every data point. Now, keep it here:

  • x1 comes, w1 wins
  • x2 comes, w2 wins
  • x3 comes, w1 wins
  • …
  • x30 comes, w1 wins
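In code, one such pass over the data might look like the sketch below, reusing winner and mean_add from the earlier snippets; counts tracks how many points each weight currently owns (the M_j in the update equations):

```python
import numpy as np

def epoch_zero(X, weights):
    """Assign every point to its winner and update that weight incrementally."""
    counts = np.zeros(len(weights), dtype=int)  # points absorbed per weight
    assign = np.empty(len(X), dtype=int)        # cluster of each point
    for i, x in enumerate(X):
        j = winner(weights, x)
        weights[j] = mean_add(weights[j], counts[j], x)  # pull w_j toward x
        counts[j] += 1
        assign[i] = j
    return assign, counts
```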

Epoch 1:

Find the winner neuron for each x in X.

Image by Author
  • x1 comes, w2 wins → w1 loses, loser += 1, update weights.
  • x2 comes, w2 wins
  • x3 comes, w1 wins
  • …
  • x30 comes, w1 wins

After epoch 1, we have 1 data point whose cluster assignment was updated, and the number of loser neurons increases by 1. By looking at the figure, we can see that x1 is indeed closer to w2 than to w1; at epoch 0, x1 was clustered to w1, and now it is updated correctly. Again, keep the information here:

  • x1 comes, w2 wins
  • x2 comes, w2 wins
  • x3 comes, w1 wins
  • …
  • x30 comes, w1 wins
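When a point switches clusters the way x1 just did, both neurons are updated at once: the loser gives the point up and the winner absorbs it. A sketch of that step, again with mean_add and mean_remove from above:

```python
def reassign(weights, counts, assign, i, x, j_new):
    """Move point x (index i) from its old cluster to cluster j_new."""
    j_old = assign[i]
    weights[j_old] = mean_remove(weights[j_old], counts[j_old], x)  # loser update
    counts[j_old] -= 1
    weights[j_new] = mean_add(weights[j_new], counts[j_new], x)     # winner update
    counts[j_new] += 1
    assign[i] = j_new
```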

Epoch 2:

Keep finding the winner neuron for each x in X.

Image by Author
  • x1 comes, w2 wins
  • x2 comes, w2 wins
  • x3 comes, w1 wins
  • …
  • x30 comes, w1 wins

Now, nothing changes. Let's check whether the number of clusters has reached the desired number. The answer is NOT YET. So we split the centroid with the largest error, again using a small ε.

Image by Author
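Here, "error" means the sum of squared distances from a cluster's points to its centroid. A sketch of the splitting step (names are mine; after a split, the epoch loop resumes with the enlarged weight set):

```python
import numpy as np

def split_worst(X, weights, assign, eps=0.05):
    """Split the centroid with the largest within-cluster error into w + eps and w - eps."""
    errors = [np.sum((X[assign == j] - w) ** 2) for j, w in enumerate(weights)]
    j = int(np.argmax(errors))            # cluster with the most error
    w = weights[j].copy()
    weights[j] = w - eps                  # one half of the split stays in place
    return np.vstack([weights, w + eps])  # the other half becomes a new weight
```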

Now do the same procedure from epoch 1.

Epoch 3:

Images by Author

Epoch 4:

Image by Author

Epoch 5:

Image by Author

Now, nothing changes, and we have also reached the desired number of clusters. Now, time to party!


Below is a comparison between the results of K-Means Clustering and CentNN on image compression. We can see that CentNN outperforms K-Means Clustering in terms of PSNR (peak signal-to-noise ratio), at 47.53 versus 46.35. In the example, the image size is 128×128 and the number of clusters is 48.

Image Compression Results. Image by Author
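For reference, PSNR is computed from the mean squared error between the original and compressed images; a minimal sketch for 8-bit images:

```python
import numpy as np

def psnr(original, compressed, max_val=255.0):
    """Peak signal-to-noise ratio in dB between two same-shaped images."""
    mse = np.mean((original.astype(float) - compressed.astype(float)) ** 2)
    return 10.0 * np.log10(max_val ** 2 / mse)
```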

CentNN Implementation

You can find my implementation of CentNN in Python here. If you find it helpful, please do not hesitate to give it a star.

By the way, you are welcome to visit my Facebook page, where I share things about Machine Learning: Diving Into Machine Learning.

In this post, I have introduced the Centroid Neural Network (CentNN), an efficient clustering algorithm. The main feature of CentNN is that it updates its weights every time a single data point comes in, while K-means Clustering updates its centroids only after every full iteration.

To prevent the algorithm from getting stuck at an undesirable local minimum, CentNN starts by setting the number of groups to two and increases the number of groups one by one until it reaches the predetermined number. Like other unsupervised algorithms, CentNN does not guarantee convergence to the global minimum solution, but it does guarantee convergence to a local minimum.


Centroid Neural Network: An Efficient and Stable Clustering Algorithm was originally published in Towards AI on Medium, where people are continuing the conversation by highlighting and responding to this story.

Published via Towards AI
