IBM Launches ‘AI Fairness 360’ To Detect Bias In Artificial Intelligence

IBM is launching a new tool that checks algorithms for bias, helps explain how and why that bias crept in, and suggests ways to mitigate it. Called AI Fairness 360 (AIF360), the product is a comprehensive open-source toolkit of metrics for checking unwanted bias in datasets and machine learning models, together with state-of-the-art algorithms to mitigate that bias.

ML models are increasingly used to make high-stakes decisions about people. Although ML, by its very nature, is always a form of statistical discrimination, that discrimination becomes objectionable when it places certain privileged groups at a systematic advantage and certain unprivileged groups at a systematic disadvantage, according to an official release from IBM. Bias in training data, whether due to prejudice in labels or to under- or over-sampling, yields models with unwanted bias.
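To make the idea of dataset-level bias concrete, here is a minimal sketch using the open-source AIF360 package (`pip install aif360`). The toy data, the `sex` column, and the group definitions are illustrative assumptions, not taken from IBM's release; the sketch simply measures how much more often the privileged group receives the favourable label.

```python
import pandas as pd
from aif360.datasets import BinaryLabelDataset
from aif360.metrics import BinaryLabelDatasetMetric

# Toy labelled data: 'sex' is the protected attribute (1 = privileged group),
# 'label' is the favourable outcome we want to check for bias.
df = pd.DataFrame({
    'sex':   [1, 1, 1, 1, 0, 0, 0, 0],
    'score': [0.9, 0.8, 0.7, 0.4, 0.6, 0.5, 0.3, 0.2],
    'label': [1, 1, 1, 0, 1, 0, 0, 0],
})

dataset = BinaryLabelDataset(
    favorable_label=1, unfavorable_label=0,
    df=df, label_names=['label'], protected_attribute_names=['sex'])

metric = BinaryLabelDatasetMetric(
    dataset,
    privileged_groups=[{'sex': 1}],
    unprivileged_groups=[{'sex': 0}])

# Difference in favourable-outcome rates between groups; 0 means balanced.
print('Mean difference:', metric.mean_difference())
print('Disparate impact:', metric.disparate_impact())
```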

This initial release of the AIF360 Python package reportedly contains nine different algorithms, developed by the broader algorithmic fairness research community, to mitigate that unwanted bias. They can all be called in a standard way, very similar to scikit-learn’s fit/predict paradigm.
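As a rough illustration of that calling convention, the sketch below applies one of the package's pre-processing mitigation algorithms (Reweighing) via fit/transform. It loosely follows the package's documented German-credit example; the dataset choice, the age-25 threshold, and the choice of Reweighing are assumptions here, and the raw German credit data must be downloaded separately per AIF360's instructions.

```python
from aif360.datasets import GermanDataset
from aif360.metrics import BinaryLabelDatasetMetric
from aif360.algorithms.preprocessing import Reweighing

# Treat applicants aged 25+ as the privileged group on the 'age' attribute.
dataset = GermanDataset(
    protected_attribute_names=['age'],
    privileged_classes=[lambda x: x >= 25],
    features_to_drop=['personal_status', 'sex'])

privileged = [{'age': 1}]
unprivileged = [{'age': 0}]

# Pre-processing mitigation: reweight examples so that the favourable outcome
# becomes statistically independent of the protected attribute.
rw = Reweighing(unprivileged_groups=unprivileged, privileged_groups=privileged)
dataset_transf = rw.fit_transform(dataset)

# Bias before and after mitigation (mean difference should move toward 0).
before = BinaryLabelDatasetMetric(
    dataset, unprivileged_groups=unprivileged,
    privileged_groups=privileged).mean_difference()
after = BinaryLabelDatasetMetric(
    dataset_transf, unprivileged_groups=unprivileged,
    privileged_groups=privileged).mean_difference()
print('Mean difference before: %.3f, after: %.3f' % (before, after))
```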

AIF360 currently includes three tutorials. They revolve around:


  1. Credit scoring
  2. Predicting medical expenditures
  3. Classifying face images by gender

“In this way, we hope that the package is not only a way to bring all of us researchers together, but also a way to translate our collective research results to data scientists, data engineers, and developers deploying solutions in a variety of industries. AIF360 is a bit different from currently available open source efforts due to its focus on bias mitigation (as opposed to simply on metrics), its focus on industrial usability, and its software engineering,” wrote Kush Varshney, principal research staff member and manager at IBM Research.
