
Algorithmic Bias and Fairness: Crash Course AI #18

Check out my collab with “Above the Noise” about Deepfakes: https://www.youtube.com/watch?v=Ro8b69VeL9U
Today, we’re going to talk about five common types of algorithmic bias we should pay attention to: data that reflects existing biases, unbalanced classes in training data, data that doesn’t capture the right value, data that is amplified by feedback loops, and malicious data. Bias itself isn’t necessarily a terrible thing; our brains often use it to take shortcuts by finding patterns. But bias becomes a problem if we don’t acknowledge exceptions to patterns or if we allow it to discriminate.
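
To make the "unbalanced classes" idea concrete, here is a minimal sketch (not from the episode, using a made-up dataset and scikit-learn) of how a model trained on data where 95% of examples belong to one class can look accurate while mostly ignoring the minority class, and how reweighting the classes is one common mitigation:

import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score, recall_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Hypothetical dataset: 1,000 examples, only ~5% in the minority class.
n = 1000
y = (rng.random(n) < 0.05).astype(int)
# Features only weakly separate the classes, so the imbalance dominates.
X = rng.normal(size=(n, 2)) + y[:, None] * 0.5

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, stratify=y, random_state=0
)

# A plain classifier scores high accuracy by favoring the majority class...
model = LogisticRegression().fit(X_train, y_train)
pred = model.predict(X_test)
print("accuracy:", accuracy_score(y_test, pred))
print("minority-class recall:", recall_score(y_test, pred))

# ...while weighting classes inversely to their frequency recovers
# much more of the minority class, at some cost in raw accuracy.
balanced = LogisticRegression(class_weight="balanced").fit(X_train, y_train)
print("balanced recall:", recall_score(y_test, balanced.predict(X_test)))

This is only an illustration of the failure mode the episode describes: a high headline accuracy can hide the fact that the group with fewer training examples is the one the model gets wrong.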

Crash Course is produced in association with PBS Digital Studios:
https://www.youtube.com/pbsdigitalstudios

Crash Course is on Patreon! You can support us directly by signing up at http://www.patreon.com/crashcourse

Thanks to the following patrons for their generous monthly contributions that help keep Crash Course free for everyone forever:

Eric Prestemon, Sam Buck, Mark Brouwer, Efrain R. Pedroza, Matthew Curls, Indika Siriwardena, Avi Yashchin, Timothy J Kwist, Brian Thomas Gossett, Haixiang N/A Liu, Jonathan Zbikowski, Siobhan Sabino, Jennifer Killen, Nathan Catchings, Brandon Westmoreland, dorsey, Kenneth F Penttinen, Trevin Beattie, Erika & Alexa Saur, Justin Zingsheim, Jessica Wode, Tom Trval, Jason Saslow, Nathan Taylor, Khaled El Shalakany, SR Foxley, Yasenia Cruz, Eric Koslow, Caleb Weeks, Tim Curwick, DAVID NOE, Shawn Arnold, William McGraw, Andrei Krishkevich, Rachel Bright, Jirat, Ian Dundore

Want to find Crash Course elsewhere on the internet?
Facebook – http://www.facebook.com/YouTubeCrashCourse
Twitter – http://www.twitter.com/TheCrashCourse
Tumblr – http://thecrashcourse.tumblr.com
Support Crash Course on Patreon: http://patreon.com/crashcourse

CC Kids: http://www.youtube.com/crashcoursekids

#CrashCourse #ArtificialIntelligence #MachineLearning