Predictive algorithms may help us shop or discover new music and literature, but do they belong in the courthouse? Dartmouth professor Dr. Hany Farid reverse engineers the inherent dangers and potential biases of recommendation engines built to mete out justice in today’s criminal justice system.

The co-founder and CTO of Fourandsix Technologies, an image authentication and forensics company, Farid works to advance the field of digital forensics. “For the past decade I have been working on technology and policy that will find a balance between an open and free Internet while reining in online abuses,” he said. “With approximately a billion Facebook uploads per day and 400 hours of video uploaded to YouTube every minute, this task is technically and logistically complicated but also, I believe, critical to the long-term health of our online communities.” Farid is the Albert Bradley 1915 Third Century Professor and Chair of Computer Science at Dartmouth, and a Senior Adviser to the Counter Extremism Project.

This talk was given at a TEDx event using the TED conference format but independently organized by a local community. Learn more at https://www.ted.com/tedx
Increasingly, algorithms and machine learning are being implemented at various touch points throughout the criminal justice system, from deciding where to deploy police officers to aiding in bail and sentencing decisions. The question is, will this technology make the system fairer for minorities and low-income residents, or will it simply amplify our human biases?

We all know humans are imperfect. We are subject to biases and stereotypes, and when these come into play in the criminal justice system, the most disadvantaged communities end up suffering. It’s easy to imagine that there’s a better way, that one day we’ll find a tool that can make neutral, dispassionate decisions about policing and punishment. Some think that day has already arrived. Around the country, police departments and courtrooms are turning to artificial intelligence algorithms to help them decide everything from where to deploy police officers to whether to release defendants on bail. Supporters believe the technology will lead to increased objectivity, ultimately creating safer communities. Others, however, say that the data fed into these algorithms is encoded with human bias, meaning the tech will simply reinforce historical disparities; the sketch below illustrates how that can happen. Learn more about the ways communities, police, and judges across the U.S. are using these algorithms to make decisions about public safety and people’s lives.
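To make that feedback-loop concern concrete, here is a minimal, self-contained Python sketch. It is not the code of COMPAS, PredPol, or any real deployed system, and every rate, feature, and group in it is an invented assumption for illustration. It simulates two groups with identical true reoffense behavior, records arrests at different rates because one group is policed more heavily, then fits a simple logistic-regression risk score to the recorded arrests.

```python
# Illustrative sketch only: how a risk score trained on historically
# biased arrest records can reproduce that bias. All numbers, features,
# and groups below are hypothetical assumptions, not real data.
import numpy as np

rng = np.random.default_rng(0)
n = 10_000

# Two groups with IDENTICAL true reoffense behavior (30% base rate).
group = rng.integers(0, 2, n)            # 0 = group A, 1 = group B
true_reoffend = rng.random(n) < 0.30

# Biased labels: group B is policed more heavily, so its reoffenses are
# recorded as arrests 90% of the time vs. 50% for group A.
record_prob = np.where(group == 1, 0.9, 0.5)
arrested = true_reoffend & (rng.random(n) < record_prob)

# Feature: prior arrest count reflects past policing intensity, so it
# silently encodes group membership even without a "group" column.
priors = rng.poisson(lam=np.where(group == 1, 2.0, 1.0))
X = np.column_stack([np.ones(n), priors])

# Fit logistic regression on the RECORDED outcome by gradient descent.
w = np.zeros(X.shape[1])
for _ in range(2000):
    p = 1.0 / (1.0 + np.exp(-X @ w))     # predicted arrest probability
    w -= 0.1 * X.T @ (p - arrested) / n  # gradient of the log-loss

risk = 1.0 / (1.0 + np.exp(-X @ w))
print(f"mean predicted risk, group A: {risk[group == 0].mean():.3f}")
print(f"mean predicted risk, group B: {risk[group == 1].mean():.3f}")
```

Even though both groups reoffend at exactly the same rate in this toy setup, group B ends up with noticeably higher average risk scores, because the model learned from who was arrested rather than who reoffended. That, in miniature, is the critics’ argument: an algorithm can be perfectly neutral about its inputs and still launder historical disparities into ostensibly objective numbers.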