Bias in facial recognition isn’t hard to discover, but it’s hard to get rid of


Joy Buolamwini is a researcher at the MIT Media Lab who pioneered research into the bias built into artificial intelligence and facial recognition. The way she came to this work is almost too on the nose: as a graduate student at MIT, she created a mirror that would project aspirational images onto her face, like a lion or tennis star Serena Williams.

But the facial-recognition software she installed wouldn't work on her Black face until she literally put on a white mask. Buolamwini is featured in a documentary called "Coded Bias," airing tonight on PBS. She told me about one scene in which facial-recognition tech was installed at an apartment complex in the Brownsville neighborhood of Brooklyn, New York. The following is an edited transcript of our conversation.
