Jan 29, 2019

How Facial Recognition Software Has Biases

The facial recognition software … couldn’t ‘see’ her dark-skinned face unless she put on a white mask

MIT grad student Joy Buolamwini was working with facial analysis software when she noticed a problem: the software didn’t detect her face — because the people who coded the algorithm hadn’t taught it to identify a broad range of skin tones and facial structures.

Now she’s on a mission to fight bias in machine learning, a phenomenon she calls the “coded gaze.” It’s an eye-opening talk about the need for accountability in coding … as algorithms take over more and more aspects of our lives.

Note from SunShower: our programs can help you develop your ability to track your biases.
