Lack of diversity in AI development results in predictable problems, yet they persist

There are articles every day about software programs with embedded biases. How does this happen? It shouldn’t be a surprise when the teams that write the code are mainly white men. Unless those teams work to disrupt their unconscious bias and include coders who are women, Black men and women, Hispanic men and women, and people from other marginalized communities, the problems will be hard-wired into their products.

From the Guardian: “As facial recognition tools play a bigger role in fighting crime, inbuilt racial biases raise troubling questions about the systems that create them.”

“If you’re black, you’re more likely to be subjected to this technology and the technology is more likely to be wrong,” the late Elijah Cummings said in a congressional hearing on law enforcement’s use of facial recognition software in March 2017. “That’s a hell of a combination.” (Link)
NPR featured an interview (link) with Angle Bush, the founder of Black Women in A.I. (link), a company providing mentorship, education, and empowerment for Black women in the field.

What is AI? The host points out that every time you ask Alexa to turn on your lights or play a song, you’re using AI. But AI is also put to work in more serious ways, such as facial recognition software used by law enforcement. Some critics say there’s a troubling lack of diversity among those who create these programs, and that it is causing serious harm to people of color.

ANGLE BUSH: When we have these products and services and software, we have to understand that what’s at stake could literally be someone’s life. For example, there was a young man in Detroit. He was actually arrested in front of his family based on facial recognition. And they showed him the photo, and he said, that is not me. But the officers proclaimed, well, the artificial intelligence says that it is you. And so it’s a sad day in America when we have to prove that we’re innocent.

HOST: How could code get something like that wrong? How could code just look at his face and say that he’s somebody else?

BUSH: It’s all in the data. It’s all in the algorithm. For example, if we wanted to train something on what a cat looks like, and we only put in black cats, for example, if you tried to identify an orange cat or a gray cat or something like that, the computer would not identify that because that is not a part of the dataset.
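Bush’s cat analogy can be sketched as a toy program. This is purely illustrative, not any real recognition system: the “model” simply memorizes the fur colors present in its (hypothetical) training data, so anything outside that biased sample goes unrecognized.

```python
# Toy illustration of dataset bias: a "model" trained only on black cats
# has no way to recognize cats of any other color.

def train(examples):
    """'Training' here is just memorizing the fur colors labeled 'cat'."""
    return {color for color, label in examples if label == "cat"}

def predict(known_colors, color):
    """Recognize a cat only if its color appeared in the training data."""
    return "cat" if color in known_colors else "not a cat"

# A biased training set: every example is a black cat.
biased_data = [("black", "cat"), ("black", "cat"), ("black", "cat")]
model = train(biased_data)

print(predict(model, "black"))   # recognized, because black cats were in the data
print(predict(model, "orange"))  # missed, because orange cats never were
```

The failure isn’t in the matching logic, which works exactly as written; it’s in what the data allowed the system to learn, which is Bush’s point.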

HOST: But this can apply not just to, you know, cases of the criminal justice system. I understand that this can apply to, you know, other parts of people’s lives. What are other examples of ways poor AI code could impact somebody’s daily life?

BUSH: It can affect you and determine where you live, whether you get a loan for a home, your FICO score, your scores. It can determine a lot of different things as far as how your health care is provided. And in terms of the criminal justice system, some of the systems are determining whether people are allowed parole based on artificial intelligence and trying to remove the human that is in the loop.

HOST: Now, is this a situation of overt bias or racism, or is this unconscious bias, ways that people don’t know that they are setting up a system to not properly serve all of the people who may be using it?

BUSH: This is a system of unconscious bias when you don’t have diversity, when you don’t have people in the room to say, well, let’s step back on this data, because what’s happening is people are using historical data to solve current problems.

HOST: So can you explain a little bit more about why historical data in particular would be a problem?

BUSH: Because the historical data doesn’t necessarily represent what’s happening in the world right now. Where are you getting this data from? Have you cleaned the data? Have you looked at the data to see if it reflects current trends? Was there diversity when you first collected the data, or is this based on your own bias?

HOST: Is this a problem of big tech companies not being welcoming to Black engineers?

BUSH: It is. If the pandemic has taught us anything, companies are going to have to pivot and reimagine their company culture and what it takes to create a world where everyone feels welcome. And that’s one of the things that Black Women in A.I. is definitely looking to do.

HOST: So you’re talking about diversifying AI engineers. It seems like that’s just one step that could target the issues surrounding this entire ecosystem. But it’s also not an all-encompassing solution. Should we be reconsidering how AI is used in systems like the criminal justice system or even just in more innocuous ways in our lives?

BUSH: Yes, exactly. I think what we have to do is take a step back. Unfortunately, we can’t put everything back in the box, but we have to look at governance. It’s very important that the government – the U.S. government and all other governments start to look at how this is affecting people’s daily lives because as we know, artificial intelligence affects every aspect of our lives. And until we can get a hold and a grasp of how it’s going to affect someone negatively, then we have to pause.
