


How to Use the ‘eli5’ Library to Compute Permutation Importance in scikit-learn

Last Updated on July 20, 2023 by Editorial Team

Author(s): Abhinav Prakash

Originally published on Towards AI.


Understanding how the ‘eli5’ library computes permutation importance for a scikit-learn model on a sample housing dataset, and how to interpret the results


Many data scientists treat their machine learning models as black boxes: they don’t know what is happening under the hood. They load their data, clean it by hand, prepare it for the model, then train the model and predict the target values (in a regression problem, say).

But they often can’t answer a basic question: which features does the model actually consider important?

Permutation importance exists to answer exactly this question.

What is it?

Permutation importance is an algorithm that computes an importance score for each feature in a dataset. The scores are determined by measuring how sensitive the model is to random permutations of each feature’s values.

How does it work?

The concept is really straightforward:
We measure the importance of a feature by calculating the increase in the model’s prediction error after permuting that feature’s values.
A feature is “important” if shuffling its values increases the model error, because in that case the model relied on the feature for its predictions.
A feature is “unimportant” if shuffling its values leaves the model error unchanged, because in that case the model ignored the feature.
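
To make this concrete, here is a minimal sketch of the shuffle-and-score loop written directly in scikit-learn. The California housing dataset and mean absolute error are stand-ins chosen for illustration; any fitted model and metric would do:

```python
import numpy as np
from sklearn.datasets import fetch_california_housing
from sklearn.ensemble import RandomForestRegressor
from sklearn.metrics import mean_absolute_error
from sklearn.model_selection import train_test_split

X, y = fetch_california_housing(return_X_y=True, as_frame=True)
X_train, X_val, y_train, y_val = train_test_split(X, y, random_state=0)

model = RandomForestRegressor(n_estimators=50, random_state=0).fit(X_train, y_train)
baseline = mean_absolute_error(y_val, model.predict(X_val))

rng = np.random.default_rng(0)
for col in X_val.columns:
    X_shuffled = X_val.copy()
    # Shuffle one column, breaking its relationship with the target
    X_shuffled[col] = rng.permutation(X_shuffled[col].values)
    error = mean_absolute_error(y_val, model.predict(X_shuffled))
    # Importance = how much the error grew after the shuffle
    print(f"{col}: {error - baseline:+.4f}")
```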

Should I compute importance on training data or on test (validation) data?

Always measure permutation importance on test (validation) data. Permutation importance computed on the training data is misleading at best: it can make us believe that features are important for the predictions when in reality the model was just overfitting and the features carry no real signal.
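
You can see this for yourself with a small sketch on synthetic data (not the article’s housing dataset), using scikit-learn’s own permutation_importance helper rather than eli5, which we introduce below. A deep forest can pick up a pure-noise feature on the training set, while the validation score exposes it as useless:

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.inspection import permutation_importance
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
X = rng.normal(size=(300, 2))                  # column 0: signal, column 1: pure noise
y = X[:, 0] + rng.normal(scale=0.1, size=300)  # target depends only on column 0

X_tr, X_val, y_tr, y_val = train_test_split(X, y, random_state=0)
model = RandomForestRegressor(n_estimators=50, random_state=0).fit(X_tr, y_tr)

for name, Xs, ys in [("train", X_tr, y_tr), ("validation", X_val, y_val)]:
    r = permutation_importance(model, Xs, ys, n_repeats=10, random_state=0)
    # The noise column typically scores higher on the training data than on
    # validation data, where its importance is near zero (or negative)
    print(name, np.round(r.importances_mean, 3))
```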

eli5 — a library for explaining models

eli5 is a Python library for debugging and explaining machine learning models; among other things, it computes permutation importance for scikit-learn estimators.

Cautions before using eli5:

1. Permutation importance is calculated after a model has been fitted.

2. Always compute permutation importance on test (validation) data.

3. The output of eli5 is in HTML format, so it can only be rendered in IPython-based notebooks (e.g., Jupyter Notebook, Google Colab, Kaggle kernels).

Now, let’s get a taste of the code 😋

I’ve built a rudimentary model (a RandomForestRegressor) to predict the sale price on a housing dataset. It is a good dataset for demonstrating permutation importance because it has many features, so we can see which of them influence the predictions and which do not; a sketch of the setup follows.
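
The original notebook’s code isn’t reproduced here, so the following is only a rough sketch of that setup. It assumes Kaggle’s home-data-for-ml-course train.csv with a SalePrice target, and keeps numeric columns only to stay short:

```python
import pandas as pd
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split

# Kaggle "home-data-for-ml-course" training file (the path is an assumption)
data = pd.read_csv("train.csv")

y = data["SalePrice"]
# Keep numeric predictors only and crudely fill missing values,
# just to keep this sketch short
X = data.select_dtypes(include="number").drop(columns=["SalePrice"]).fillna(0)

X_train, X_val, y_train, y_val = train_test_split(X, y, random_state=1)
model = RandomForestRegressor(n_estimators=100, random_state=1).fit(X_train, y_train)
```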

Now, we use the eli5 library to calculate the permutation importance.
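
A sketch of the eli5 call, continuing from the model above (install with pip install eli5 first):

```python
import eli5
from eli5.sklearn import PermutationImportance

# Fit the permutation-importance wrapper on the held-out validation data
perm = PermutationImportance(model, random_state=1).fit(X_val, y_val)

# In a notebook, this renders an HTML table: one row per feature,
# sorted by importance, displayed as "weight ± spread"
eli5.show_weights(perm, feature_names=X_val.columns.tolist())
```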

The output of the above code is an HTML table of importance scores, one row per feature.

Interpreting the results:

Features are listed in decreasing order of importance.
The first number in each row shows how much the model’s performance dropped when that feature’s values were reshuffled.
The second number measures how much that performance drop varied across different reshuffles of the feature column.
The OverallQual (overall quality) feature of the housing dataset has the biggest impact on the model’s sale-price predictions.
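
If you are not in a notebook (see caution 3 above), eli5 can also return the same scores as a plain pandas DataFrame; a sketch, assuming eli5’s optional pandas-based formatter is available in your install:

```python
from eli5 import explain_weights_df

# Same scores as show_weights, but as a plain pandas DataFrame
# (assumes eli5's optional pandas-based formatter is available)
df = explain_weights_df(perm, feature_names=X_val.columns.tolist())
print(df.head(10))
```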

You can get the housing dataset in .csv format from my GitHub profile:
https://github.com/abhinavsp0730/housing_data/blob/master/home-data-for-ml-course.zip

You can also get the .ipynb file (Kaggle kernel) from my GitHub profile:
https://github.com/abhinavsp0730/housing_data/blob/master/kernel659579854a(2).ipynb

Thank you!

If you enjoyed my article, please clap and follow me ❤️.
