Although Amazon is at the forefront of AI technology, the company could not find a way to make its recruiting algorithm gender neutral. Its failure reminds us that while algorithms are commonly believed to be free of the bias and prejudice that color human decision-making, in reality an algorithm can unintentionally learn bias from many sources: the data used to train it, the people who use it, and even seemingly unrelated factors. This case study examines how Amazon's AI recruiting tool came to show bias against women.

#Amazon #genderequality #AIinrecruitment #marketingcasestudy #mbacasestudy #managementcasestudy #casestudy #HRcasestudy
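The core mechanism at play here, a model absorbing patterns from historically skewed hiring data, can be illustrated with a toy sketch. The resumes, outcomes, and scoring rule below are invented purely for illustration and have nothing to do with Amazon's actual system:

```python
from collections import Counter

# Hypothetical historical hiring data: past hires skew male, so
# gendered words end up correlated with the hiring outcome.
training = [
    ("captain chess club engineering", 1),        # hired
    ("software engineering lead", 1),             # hired
    ("engineering internship", 1),                # hired
    ("captain women's chess club engineering", 0),  # rejected
    ("women's coding society engineering", 0),      # rejected
]

hired_words, rejected_words = Counter(), Counter()
for resume, hired in training:
    (hired_words if hired else rejected_words).update(resume.split())

def score(resume):
    # Each word scores +1 per past hire it appeared in,
    # -1 per past rejection. No one told the model about gender;
    # it simply mirrors the skew in the training data.
    return sum(hired_words[w] - rejected_words[w] for w in resume.split())

print(score("chess club engineering"))           # positive score
print(score("women's chess club engineering"))   # lower score: learned bias
```

Even in this tiny example, the word "women's" acquires a negative weight solely because it appeared only on rejected resumes in the historical data, which is essentially the pattern reported in the Amazon case.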