
When Amazon’s AI Recruited Only Men and Discriminated Against Women

Last Updated on July 20, 2023 by Editorial Team

Author(s): Surya Govind

Originally published on Towards AI.

An AI Failure Case Study on Amazon

The use case:

In 2014, Amazon decided to build a machine learning model into its recruitment system, one that could evaluate hundreds of thousands of applicants and spit out the best profiles for hiring. At that time, most applicants for software engineering and other technical positions were men.


The tool was intended for internal use only. I cannot say whether Amazon was overconfident, but the team was certain this system would surface the best candidates.

The machine learning model:

Amazon’s data scientists collected the job applications the company had received over the 10 to 15 years before 2014, from both freshers and experienced candidates. That gave them every kind of application for training, but would this data also be representative of the future? That remained in doubt, and I do not know whether they thought about it. A machine learning model only learns what we teach it, i.e., how we train it.
They used this data to train the machine learning model and integrated it with the Amazon recruitment system, where it worked as a moderator, separating weaker applications from the best-suited ones.
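
To make that setup concrete, here is a minimal, hypothetical sketch of such a moderator step. The data, labels, and scikit-learn pipeline are purely illustrative and are not Amazon’s actual system.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression

# Toy historical data: application text paired with past hire/no-hire outcomes.
past_applications = [
    "java distributed systems 5 years experience",
    "c++ embedded firmware 3 years experience",
    "customer support 1 year experience",
    "retail cashier no technical experience",
]
past_outcomes = [1, 1, 0, 0]  # 1 = hired, 0 = not hired

vectorizer = TfidfVectorizer()
model = LogisticRegression().fit(vectorizer.fit_transform(past_applications), past_outcomes)

# The "moderator" step: score incoming applications and rank them,
# so recruiters only look at the highest-scoring ones.
new_applications = [
    "python machine learning 4 years experience",
    "warehouse associate no technical experience",
]
scores = model.predict_proba(vectorizer.transform(new_applications))[:, 1]
for app, score in sorted(zip(new_applications, scores), key=lambda x: -x[1]):
    print(f"{score:.2f}  {app}")
```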

The failure:

By 2015, the data scientists started seeing a glaring problem: for technical and software roles, the model was showing a preferential bias toward male applicants and giving low ratings to female applicants.


The model was quickly removed from the recruitment system. The failure was specifically that it preferred men over women, and the fix came a bit late, because by then the model had already discarded many female applicants.

The reason for failure:

Data scientists had used 10 to 15 years of job application data from a time when women’s presence in software and technical jobs was far lower than it is today. I think they needed to retrain the machine learning model every year with the latest job application data.
So the machine learning model learned an incorrect correlation: that men are better suited for technical jobs than women. And yes, any incorrect information fed to a model comes back out as bad predictions.
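
To see the mechanism, here is a tiny, hypothetical sketch (again, not Amazon’s actual model or data): a classifier trained on historically male-dominated hiring outcomes ends up assigning negative weight to words associated with female applicants, even though those words say nothing about ability.

```python
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.linear_model import LogisticRegression

# Tiny synthetic "historical" dataset: most past hires in this toy data were men,
# so words correlated with female applicants end up correlated with rejection.
resumes = [
    "software engineer java distributed systems mens soccer club",
    "backend developer c++ linux chess club captain",
    "machine learning python research mens rowing team",
    "software engineer python womens chess club captain",
    "data engineer sql spark womens coding society",
    "frontend developer javascript womens robotics club",
]
hired = [1, 1, 1, 0, 0, 0]  # historical outcomes reflect past bias, not ability

vectorizer = CountVectorizer()
X = vectorizer.fit_transform(resumes)
model = LogisticRegression().fit(X, hired)

# Inspect the learned weights: "womens" receives a negative coefficient purely
# because of the biased labels the model was trained on.
weights = dict(zip(vectorizer.get_feature_names_out(), model.coef_[0]))
for word, weight in sorted(weights.items(), key=lambda kv: kv[1])[:3]:
    print(f"{word}: {weight:+.3f}")
```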

Lessons to learn here:

Data collection is the most important step in creating machine learning models. Incorrect data will only lead to an incorrect model.
Data scientists should have social awareness when collecting and processing data.


Otherwise, social biases will be passed on to the ML model if they are not dealt with carefully. The contributions of men and women should be kept balanced when training the model. The kind of data we see changes every day, and the requirements change along with it. Updating the model with the latest data is essential; older data will keep giving you predictions that are wrong for current trends. Train with the latest data and get better results, it’s that simple.
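
As a sketch of what that care could look like in practice, here is a minimal, hypothetical audit (not Amazon’s actual process): compare the model’s selection rate across groups on held-out data and treat a large gap as a red flag. The DataFrame columns and numbers below are made up for illustration.

```python
import pandas as pd

# Hypothetical screening decisions from a model on a held-out set of applicants.
results = pd.DataFrame({
    "gender": ["male", "male", "male", "male", "female", "female", "female", "female"],
    "recommended": [1, 1, 1, 0, 1, 0, 0, 0],
})

# Selection rate per group; a large gap suggests the model (or its data) is biased.
rates = results.groupby("gender")["recommended"].mean()
print(rates)

# "Four-fifths rule" style check: the lowest group's rate should be at least
# roughly 80% of the highest group's rate.
print("disparate impact ratio:", round(rates.min() / rates.max(), 2))
```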

I hope you learned something from this simple case study. I have tried to make it very simple to understand, so enjoy it.

Happy AI Learning!


