Although Amazon is at the forefront of AI technology, the company could not find a way to make its recruiting algorithm gender neutral. Its failure is a reminder that AI picks up bias from many sources. While algorithms are commonly believed to be free of the prejudices that color human decision-making, an algorithm can unintentionally learn bias from the data used to train it, from the people who use it, and even from seemingly unrelated factors. This case study examines how Amazon's AI recruiting tool came to show bias against women.
Gunay devotes her TEDx Talk to her main research directions: gender and racial bias, and inclusiveness, in AI. This talk was given at a TEDx event using the TED conference format but independently organized by a local community. Learn more at https://www.ted.com/tedx
Every year, Amazon receives more than 200,000 resumes for the various jobs it is hiring for. Google gets ten times that, with over 2 million resumes each year. Imagine being the HR manager responsible for vetting all of those. It seems like an absolutely daunting task, but in the modern age it also seems like a task that could be handed over to something that can process those resumes nearly instantaneously: an artificial intelligence system. In fact, that's exactly what companies like Amazon and Google have tried in the past, though the results were not what they expected. Welcome to Data Demystified. I'm Jeff Galak, and in this episode we're going to talk about gender bias in artificial intelligence. To be sure, there are many examples of bias in machine learning and AI systems, and I plan to make videos about those too, but for now I want to focus on one big example in the world of resume vetting. After all, one of the goals of gender and racial equity is to ensure that everyone, regardless of their gender or race, has a fair shot at the most desirable jobs out there. But when companies let AI algorithms have a say in those decisions, bias has a sneaky way of creeping in. In this episode, I'm going to try to give you the intuition to understand how this type of bias can emerge, even when a big goal of these systems is to take human bias out of the process.
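To make that intuition concrete, here is a minimal, hypothetical sketch (in Python with scikit-learn, not Amazon's actual system) of how a screener trained on historically biased hiring decisions can learn to penalize a gender-correlated proxy feature, such as membership in a women's chess club, even when gender itself is never given to the model:

```python
# Hypothetical, self-contained sketch -- NOT Amazon's actual system -- of how
# a resume screener trained on historically biased hiring decisions can learn
# to penalize a gender-correlated proxy feature even though gender itself is
# never an input to the model.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 10_000
gender = rng.integers(0, 2, n)        # 0 = man, 1 = woman (never shown to the model)
skill = rng.normal(0, 1, n)           # genuinely job-relevant signal
# Proxy feature: e.g. "captain of the women's chess club" on the resume.
womens_club = ((gender == 1) & (rng.random(n) < 0.6)).astype(float)

# Historical labels: past hiring favored men at equal skill levels (the bias).
hired = (skill + 1.0 * (gender == 0) + rng.normal(0, 1, n) > 0.5).astype(int)

X = np.column_stack([skill, womens_club])   # note: gender is NOT a feature
clf = LogisticRegression().fit(X, hired)
print(f"weight on skill:      {clf.coef_[0][0]:+.2f}")  # positive, as expected
print(f"weight on proxy term: {clf.coef_[0][1]:+.2f}")  # negative: proxy penalized
```

The point of the sketch is that dropping the sensitive attribute is not enough: any feature correlated with it inherits the historical bias baked into the training labels.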
“Man is to computer programmer as woman is to ____.” Common sense says the missing term should be computer programmer, because the term is not intrinsically gendered, unlike king and queen. But a computer with a standard word embedding system would probably complete it as “Man is to computer programmer as woman is to homemaker.” In this episode, we explain how our unconscious biases can be passed down to machine learning algorithms. Read more at https://go.unbabel.com/blog/gender-bias-artificial-intelligence/ Illustration, animation, and sound design: Favo Studio https://vimeo.com/favostudio
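As a concrete illustration, below is a brief, hedged sketch of that analogy completion using the gensim library's pretrained word2vec-google-news-300 vectors (our choice of model, not one named in the video; the first run downloads roughly 1.6 GB). Bolukbasi et al. (2016) reported "homemaker" as the top completion for this analogy in those embeddings.

```python
# Analogy arithmetic in a pretrained word embedding: compute
# vector("woman") + vector("X") - vector("man"), then find the nearest
# vocabulary word. Assumes gensim is installed; the first api.load()
# call downloads the ~1.6 GB Google News word2vec model.
import gensim.downloader as api

model = api.load("word2vec-google-news-300")

# Sanity check: "man" is to "king" as "woman" is to ...?
print(model.most_similar(positive=["woman", "king"], negative=["man"], topn=1))
# Expected top answer: 'queen'

# The example from the video: "man" is to "computer programmer" as "woman" is to ...?
print(model.most_similar(positive=["woman", "computer_programmer"],
                         negative=["man"], topn=1))
# Bolukbasi et al. (2016) report 'homemaker' as the top completion here.
```

The bias lives in the geometry of the vectors themselves: because the embeddings were trained on human-written text, gendered usage patterns become directions in the vector space, and the analogy arithmetic simply surfaces them.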