AI algorithms make important decisions about you all the time — like how much you should pay for car insurance or whether or not you get that job interview. But what happens when these machines are built with human bias coded into their systems? Technologist Kriti Sharma explores how the lack of diversity in tech is creeping into our AI, offering three ways we can start making more ethical algorithms.

Get TED Talks recommended just for you! Learn more at https://www.ted.com/signup.

The TED Talks channel features the best talks and performances from the TED Conference, where the world’s leading thinkers and doers give the talk of their lives in 18 minutes (or less). Look for talks on Technology, Entertainment and Design — plus science, business, global issues, the arts and more.

Follow TED on Twitter: http://www.twitter.com/TEDTalks
Like TED on Facebook: https://www.facebook.com/TED

Subscribe to our channel: https://www.youtube.com/TED

Comments

Roman says:

Great talk, thank you. Stop presenting and thinking of yourself as a nerd (yeah, you look like a nerd, but there's no need to behave like one); think of yourself and present yourself as a smart and successful young woman. Maybe then people will take you more seriously.

CR500 KING says:

You misworded your title. It should be "How to keep AI politically correct".

NFox says:

"AI learns men are more likely to be programers than women." => "men make better programmers than women"
This is so inaccurate my ears hurts. This podcast has nothing to do with AI, it's about SJW.

TheBlueBluedoggy says:

I never thought about Alexa and whatnot being all female before you said anything. I'm 14 and I have two younger brothers, and it has never mattered to us at all; this person wants to make it a problem where there isn't one.

James Humphrey says:

Why has TED swallowed a massive blue pill?

Mr. Derek Gets it! says:

THE COLUMBINE SHOOTER'S MOM GETS A CLIP. That loser should be jailed! No comments allowed, so I'll leave a comment on every video! DO NOT CELEBRATE LOSER PARENTS WHO STOOD IN THE WAY OF HELP. When parents are held responsible for their kids' actions, we will see less death! SICKENING TO APPLAUD THE COLUMBINE SHOOTER'S MOM. SHAME.

GlaciusTS says:

She lost me at “based on the biases it has learned from us”.

Those AIs know nothing about human bias. They looked at the numbers and simply came to the same conclusions as biased people. The AI concluded that people of different genders and races are actually quite different from one another when it comes to things like driving ability and debt.

Is it so hard to believe that skin color might not be the only difference between two races? I mean, it only makes sense that in all those years we spent apart, Africans acquired different personality traits and skills on account of their cultural differences and the ecosystems they lived in. If an AI recognizes it, it's probably a common genetic trait among people of that race or gender.

Andrei says:

Please change the title. It's not about how to keep human bias out of AI but about how bias exists in machine learning based on statistics. I'm rarely pissed at TED for wasting my time, but today is one of those days.

Also, I prefer my assistants to be women. Not because I want to shout orders at a woman, but because a female voice is more intelligible in a noisy environment than a male one.

Max Liu says:

This speech is literally biased

Max Liu says:

Why women?!

barry weber says:

AI will at some point create cringeworthy episodes in our everyday lives.

Eli Nope says:

I heard that cell phone companies are biased against people who opt out of using cell phones. 0% of people who refuse to own a cell phone own a cell phone; this is obviously bias from the cell phone companies. And that is basically what she is saying.

Honudes Gai says:

There are only two things I think about Indians online: they are either trying to get a medical degree at 8 or they are scammers.

Vaylain says:

What if she did not suffer from any gender bias after all, and instead it was just a simple case where the online trolls had recognized and attacked her original profile but did not attack her kitty-cat profile? What if her observed profile issue was misunderstood and incorrectly diagnosed as a causal effect? Mayhaps she is also instilling a human bias and a touch of misguided social justice into her TED talk, effectively wasting a lot of effort and time?

Another thing to consider: statistics ("observed and collected data", NOT "racism/sexism plots") are used to determine trends based on collected data. The more data collected, the more accurate the trend may become and the easier it is to detect. Women may not be as abundant as men in the software development industry simply because not as many women are choosing to become software developers. It is NOT because they are forbidden from or ridiculed for becoming one; the statistics just show a trend that fewer women are electing software development as their career path.

Muralykrishnan S says:

I hope she has just started a discussion. It's a good one.

TheAgency says:

cool video!!!

Mr Doodleydoo says:

If the groups involved changed the statistics about themselves, the algorithms would reflect that. If White/Asian people suddenly became the least likely to repay loans, the algorithms would reflect that.
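Mr Doodleydoo's point is easy to demonstrate: a model fitted to group-level statistics reproduces whatever those statistics are, and flips when they flip. Below is a minimal sketch in Python; the loan-repayment rates and the frequency-table "model" are hypothetical, invented purely to illustrate the comment, not taken from the talk.

```python
import random

random.seed(0)

def simulate_loans(repay_rate_a, repay_rate_b, n=10_000):
    """Generate (group, repaid) records with the given per-group repayment rates."""
    data = []
    for _ in range(n):
        group = random.choice("AB")
        rate = repay_rate_a if group == "A" else repay_rate_b
        data.append((group, random.random() < rate))
    return data

def fit(data):
    """The simplest possible 'model': per-group observed repayment frequency."""
    totals = {"A": 0, "B": 0}
    repaid = {"A": 0, "B": 0}
    for group, ok in data:
        totals[group] += 1
        repaid[group] += ok
    return {g: repaid[g] / totals[g] for g in totals}

# World 1: group A repays more often, so the model "prefers" A.
print(fit(simulate_loans(0.80, 0.50)))  # approx {'A': 0.80, 'B': 0.50}

# World 2: the statistics flip, and the model's preference flips with them.
print(fit(simulate_loans(0.50, 0.80)))  # approx {'A': 0.50, 'B': 0.80}
```

Whether that behavior counts as "bias" or just "reflecting the data" is exactly what this thread is arguing about; the sketch only shows that the model tracks its inputs.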

Dr Peter jones says:

This is not about AI at all; this is about discrimination.

This is more frightening than you think. Say a woman goes into a railway station and asks for a ticket from A to B. She is given a ticket at the same cost as a man. A blind person goes into the same railway station to ask for the same ticket and is charged twice the amount, even though the law is very clear that a blind person should be charged the lowest possible fare of all tickets, regardless of gender.

Why is this?
Answer: the algorithm discriminates against the blind person because it was written by a non-blind person. It's a case of a process designed by the sighted for the sighted, which excludes the non-sighted even though the law has been broken.

The only way forward is yearly testing of machines to certify they have been tested for disabled accessibility before they can be used by the public. Simply put, these machines tend to cement existing prejudices against groups and are not fit for purpose.

justletmepostthis says:

"Don't worry, she will never be on the internet"…She IS the internet…Funny how these people (Elitist mindset) will lie, just to get what they want, without any regard to anyone but themselves.

Ayala Crescent The Shield Abode says:

If a machine says so… I can do the dishes.

krishna punyakoti says:

Mark Zuckerberg and Elon Musk didn't look like today's Mark Zuckerberg and Elon Musk when they had to try hard cracking their first deals, building trust, and struggling to make things happen.
There seems to be more bias going on in this talk than in the AI or in the stats. Let's build an AI to screen military job applications by introducing a 50-50 gender split and see how it goes. If we are imposing 50-50, it's not taking out bias; it's adding more bias, because it's totally deviant from the ground truth.

Joxman2k says:

I didn't hear anything about "how" to keep human bias out of AI, just that there is bias. I think this has more to do with machine learning than actual AI. Many viewpoints can be part of an AI algorithm, but being neutral should be the goal. I'm not sure how her framing the apparent male-centric developers' bias as bad and her more woman-centric bias as correct is supposed to balance that out. I mean, exchanging one bias for another is not keeping out human bias. She does bring up an important topic, but it is more about awareness than about solving it.

Kevin Reardon says:

What we need is fewer dickheads in AI.

John Farris says:

I think it's funny that they think they know what I want, when, due to boredom, what I want changes every day.

Isedorgamlit says:

Wow, this sets a new low bar. Now I could give TED talks too, it seems.

Cephalic Miasma says:

The entire argument is based on a flawed assumption: that there are no distinctions between racial and gender groups (whether inherent or the result of socioeconomic factors) and that all statements regarding any differences are inherently prejudiced. This needs to be shown first; you cannot merely assert it.

Husam Starxin says:

I'm sorry, but this is by far one of the dumbest TED talks I've ever seen. And take it from a computer science graduate: she has no idea what she's talking about when it comes to ML and AI.

Tommy Kiddler says:

There are tons of problems AI should be thinking about.
