

Active Learning and Semi-supervised Learning turn your unlabeled data into annotated data

Last Updated on December 9, 2021 by Editorial Team

Author(s): Edward Ma

Originally published on Towards AI, the world's leading AI and technology news and media company.

Machine Learning

Active Learning and Semi-supervised Learning

Photo by Vadim Fomenok on Unsplash

Active Learning is a teaching strategy that engages learners (e.g., students) to participate actively in the learning process. Compared to the traditional learning process, learners do not just sit and listen but work with teachers interactively. The pace of learning can be adjusted according to feedback from the learners, so the cycle of active learning is very important. If you are not familiar with active learning, you may visit this post.
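If it helps to see the idea in code, here is a minimal sketch (not from the original post) of the query-selection step using predictive entropy; the classifier producing pool_probs and the variable names are purely illustrative.

import numpy as np

def entropy_query(probs, num_sample):
    """Return indices of the most uncertain pool points by predictive entropy.

    probs: (n_pool, n_classes) predicted class probabilities for the unlabeled pool.
    """
    # Higher entropy means the model is less sure, so the point is more valuable to annotate.
    entropy = -np.sum(probs * np.log(probs + 1e-12), axis=1)
    return np.argsort(entropy)[-num_sample:]

# Illustrative usage: pool_probs would normally come from a classifier trained
# on the current labeled set, e.g. model.predict_proba(unlabeled_pool).
pool_probs = np.array([[0.9, 0.1], [0.5, 0.5], [0.6, 0.4]])
print(entropy_query(pool_probs, num_sample=1))  # -> [1], the 50/50 point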

Semi-supervised Learning is a way to combine both labeled data and unlabeled data for model training. The approach rests on the continuity assumption, the cluster assumption, and the manifold assumption. In short, data points are likely to share the same label if they are close to each other.
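As a quick, made-up illustration of this idea (using scikit-learn's self-training wrapper, not the library used later in this post), unlabeled points are marked with -1, and confidently predicted ones are pseudo-labeled and folded back into training:

import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.semi_supervised import SelfTrainingClassifier

# Toy data: two clusters with one labeled point each; -1 marks unlabeled points.
X = np.array([[0.0, 0.0], [0.1, 0.2], [0.2, 0.1],
              [5.0, 5.0], [5.1, 4.9], [4.9, 5.2]])
y = np.array([0, -1, -1, 1, -1, -1])

# Self-training: unlabeled points predicted above the threshold are pseudo-labeled
# and added to the training set over several iterations.
model = SelfTrainingClassifier(LogisticRegression(), threshold=0.8)
model.fit(X, y)
print(model.predict([[0.15, 0.15], [5.05, 5.05]]))  # expected: [0 1]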

Hybrid Learning

Han et al. (2016) proposed combining active learning and semi-supervised learning for sound classification; of course, we may apply the idea in other areas such as text, video, etc. Practitioners leverage active learning to estimate the most valuable data points for labeling, and go further by introducing semi-supervised learning to pseudo-label the most confidently predicted unlabeled data after each iteration.

Overview

You may have a look at the following flow chart to understand how the aforementioned approach works. Here is the pseudocode (a minimal code sketch follows below):

  1. We have a large amount of unlabeled data and only limited labeled data at the beginning.
  2. Go through a normal active learning process to annotate the most valuable data points.
  3. Fit both the initial labeled data and the newly annotated data to train a classification model, and use it to classify the remaining unlabeled data.
  4. If the prediction confidence is higher than a pre-defined threshold (say, 85%), assign the predicted label to those data points.
  5. Repeat steps 2 to 4 until an exit condition is met, for example, 500 annotated data points acquired or the model reaching a target accuracy.
Active Learning + Semi-supervised Learning pseudocode
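To make the loop above concrete, here is a minimal sketch of the combined procedure, assuming a scikit-learn classifier and an illustrative query_oracle callable that stands in for the human annotator; it is not the Han et al. (2016) implementation or the NLPatl API.

import numpy as np
from sklearn.linear_model import LogisticRegression

def hybrid_loop(X_labeled, y_labeled, X_pool, query_oracle,
                num_query=10, threshold=0.85, max_rounds=5):
    """Illustrative active + semi-supervised loop following steps 2 to 5 above.

    query_oracle(X) returns the true labels for the queried points, i.e. a
    stand-in for the human annotator.
    """
    model = LogisticRegression(max_iter=1000).fit(X_labeled, y_labeled)
    for _ in range(max_rounds):
        if len(X_pool) == 0:
            break
        probs = model.predict_proba(X_pool)

        # Step 2: query the most uncertain (highest-entropy) points for human labeling.
        entropy = -np.sum(probs * np.log(probs + 1e-12), axis=1)
        query_idx = np.argsort(entropy)[-num_query:]
        X_labeled = np.vstack([X_labeled, X_pool[query_idx]])
        y_labeled = np.concatenate([y_labeled, query_oracle(X_pool[query_idx])])

        # Step 4: pseudo-label unlabeled points the model predicts above the threshold.
        conf_idx = np.setdiff1d(np.where(probs.max(axis=1) >= threshold)[0], query_idx)
        X_labeled = np.vstack([X_labeled, X_pool[conf_idx]])
        y_labeled = np.concatenate([y_labeled, probs[conf_idx].argmax(axis=1)])

        # Step 5: drop the consumed points from the pool, retrain and repeat.
        X_pool = np.delete(X_pool, np.union1d(query_idx, conf_idx), axis=0)
        model = LogisticRegression(max_iter=1000).fit(X_labeled, y_labeled)
    return model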

Most Confident Sampling

The story does not end here. One outstanding question is how we control the self-learned data. By setting a high threshold, we can ensure that the candidates most likely belong to a particular category. However, we should not blindly trust them, for several reasons.

First of all, it may cause data imbalance. If the model does very well in one or a few categories, the self-learned data will mostly belong to those categories, causing data imbalance after a few iterations.

To control the progress, we should acquire self-learned data gradually. Instead of picking every data point above the threshold, randomly sampling some of them is a better approach, as sketched below.
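As a toy illustration (the helper name, threshold, and per-round cap are mine, not NLPatl's), capping and randomizing the self-learned batch each round could look like this:

import numpy as np

def sample_confident(probs, threshold=0.85, per_round=20, seed=None):
    """Randomly pick at most per_round above-threshold points per iteration.

    Capping and randomizing the self-learned batch slows the feedback loop
    and reduces the risk of one dominant class flooding the labeled set.
    """
    rng = np.random.default_rng(seed)
    candidates = np.where(probs.max(axis=1) >= threshold)[0]
    if len(candidates) <= per_round:
        return candidates
    return rng.choice(candidates, size=per_round, replace=False)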

Python code by NLPatl

NLPatl provides semi-supervised learning on top of active learning. You just need to fit your data to it, and you can annotate the most valuable data points as well as the self-learned data points. Let's get our hands dirty: I will walk through how you can apply active learning in NLP with a few lines of code. You can visit this notebook for the full version of the code.

# SemiSupervisedLearning comes from the nlpatl package; sampling,
# embeddings_model, and classification_model are initialized earlier
# in the linked notebook.
# Initialize the entropy sampling approach to estimate the most valuable data for labeling
learning = SemiSupervisedLearning(
    sampling=sampling,
    embeddings_model=embeddings_model,
    classification_model=classification_model,
)
# Label data in the notebook interactively
learning.explore_educate_in_notebook(train_texts, num_sample=2)


Like to learn?

I am a Data Scientist in the Bay Area, focusing on the state of the art in Data Science and Artificial Intelligence, especially NLP and platform-related work. Feel free to connect with me on LinkedIn or GitHub.

