
How AI Learns to Be Sexist

If I said you had a beautiful bot, would you hold it against me?

TECHNOLOGY

@kmaney

IT WON’T BE LONG BEFORE someone reports a case of sexual harassment by an artificial intelligence. Loutish AI behavior won’t reach the grotesque levels of a Bill O’Reilly or Harvey Weinstein, given that a software bot can’t exactly open its robe and demand a massage. But we’re entering an era when Siri-like conversational AI will be embedded in the workplace, listening and commenting from, say, speakers in conference rooms. Sooner or later, one of these bots will mutate into a personality right out of Mad Men, single out the youngest woman in a meeting, and tell her, “Hey, Sugar, why don’t you get us all some coffee so we can watch you walk across the room.”

This sexist scenario highlights one of the great challenges we’ll face with AI: It learns from humans. And humans can be shitheads. Researchers have found that AI tends to latch on to excesses and go hard in that direction. This means that an AI soaking up the wrong signals from an already biased work culture could double down on what it learns and turn into an HR nightmare. “[AI] could work to not only reinforce existing social biases but actually make them worse,” says Mark Yatskar of the Allen Institute for Artificial Intelligence, which was started by Microsoft co-founder Paul Allen.
