How does A.I. technology exacerbate inequalities in policing?

WHYY’s “Morning Edition” host Jennifer Lynn speaks with Malcolm Burnley and Ed Felten, hosts of WHYY’s new podcast, “A.I. Nation.”

Ed Felten (left) and Malcolm Burnley (right) host WHYY's podcast A.I. Nation. (WHYY)

Artificial intelligence is a kind of oxygen we need as our everyday lives intertwine with high-tech machines. It helps us make mobile check deposits, connect with ridesharing services, and set reminders.

But, at times, A.I. has its hiccups … when its algorithm, a set of instructions designed to perform a specific task, doesn’t adjust to circumstances and human needs. Fans of “The Office” might recall the scene in which Michael, trusting the GPS over Dwight’s protests, drives straight into a lake.

That scene from “The Office” shows up in this week’s episode of the new podcast “A.I. Nation” from WHYY and Princeton University. A little bit of humor is a starting point for a very serious discussion about A.I. as it’s being used in policing.

WHYY’s “Morning Edition” host Jennifer Lynn met up with the “A.I. Nation” hosts over Zoom to talk about this. First, some introductions:


___

Malcolm: I’m Malcolm Burnley. I’m a journalist in Philadelphia, and I’m the co-host of “A.I. Nation.”

Ed, you’re on.

Ed: I’m Ed Felten. I’m a professor at Princeton University and a co-host of “A.I. Nation.”

And you are now entangled forever.

Ed: That’s right.

I think this is a winner.

Malcolm: Yeah.

How did you meet? How did that happen?

Ed: Well, I think it came out of the desire to have a podcast about this topic. I worked in the White House on A.I. policy and came back from that experience really excited about this area, wanting to talk to people, to communicate with people about it and meet them where they were.

Malcolm?

Malcolm: Conversely, I know nothing about A.I., and I’m not a tech reporter. I’ve focused most of my career on reporting about equity. And I think my interest came from looking at how A.I. has penetrated our lives, looking at previous points in history, like the Industrial Revolution, when great inequity emerged, and wondering whether we’re going to face some of those same difficulties with A.I.

Yeah, and this particular episode, Episode Four, does definitely go down that road. Malcolm, we hear from a man who’s wrongfully arrested due to facial recognition technology, an African American man, Nijeer Parks. He lives in New Jersey. Tell me a little bit more about his story and what happens.

Malcolm: The police picked up and arrested Nijeer even though the suspect in the crime was several inches taller than him. The suspect was described as driving away from the scene, and Nijeer didn’t have a driver’s license at the time. There were all sorts of signs suggesting that, at the very least, they needed to go several steps further than simply the facial recognition match.


Nijeer is accused of messing with a police officer. Here is Nijeer Parks from the podcast “A.I. Nation.”

CLIP: I grew up in the streets most of my life, so I’ve seen things happen. So I know what kind of things that [you] think you have to do. If they think you have the police officers, they’re going to beat the crap out of you when they get you.

Ed, what was the mistake here that the algorithm made?

Ed: The really troubling thing with many facial recognition algorithms is that they seem to do much worse, to make a lot more mistakes for dark-skinned faces. So obviously that’s problematic. People being held by mistake is a problem in itself. And when that burden falls even more on historically discriminated-against communities, then that’s even more of a problem.
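Ed’s point about error rates can be made concrete. Below is a minimal sketch, not from the podcast, of how an audit might measure that disparity: it computes the false-match rate, the share of non-matching face pairs the system wrongly calls a match, separately for each demographic group. The records and group labels are invented for illustration; real audits use labeled benchmark data such as NIST’s vendor tests.

```python
# Minimal sketch: per-group false-match rates for a face recognition system.
# All records below are invented for illustration only.
from collections import defaultdict

# Each record: (demographic group, system said "match", ground truth was a match)
records = [
    ("group_a", True,  False),   # false match
    ("group_a", False, False),
    ("group_a", True,  True),
    ("group_b", True,  False),   # false match
    ("group_b", True,  False),   # false match
    ("group_b", False, False),
    ("group_b", True,  True),
]

false_matches = defaultdict(int)
non_match_pairs = defaultdict(int)

for group, predicted_match, true_match in records:
    if not true_match:                  # only true non-matches can yield false matches
        non_match_pairs[group] += 1
        if predicted_match:
            false_matches[group] += 1

for group in sorted(non_match_pairs):
    rate = false_matches[group] / non_match_pairs[group]
    print(f"{group}: false-match rate = {rate:.0%}")
```

If the printed rates differ sharply between groups, as studies have found for darker-skinned faces, a single overall accuracy number hides exactly the kind of unequal burden Ed describes.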

Understood. And fortunately, Nijeer Parks proves his innocence with a solid alibi. Malcolm, also in this episode, we hear from Rashida Richardson, a lawyer and the director of policy research at the A.I. Now Institute. She talks about her research into dirty data, the biased data that feeds into what are called “predictive policing” algorithms, and asks whether it’s possible for a police department with, say, a history of civil rights violations to make unbiased predictions. Tell me about interviewing her.

Malcolm: Rashida Richardson ultimately comes down on the side of believing that we can’t necessarily correct for the bias that’s in the data, and that predictive policing is inherently flawed.

And she looked at 13 cities across the country, examining the data they were using in predictive policing algorithms. Here’s another clip from “A.I. Nation.” We start with you, Malcolm, and your narration.

CLIP: For example, predictive policing might recommend you send more police to a neighborhood that’s had a lot of 911 calls, which makes sense at face value. People call the police to report a crime, right?

“When you have individuals that call the police on others for non-criminal activity and that type of data is not corrected, then that can also skew both what looks like the prevalence of crime, but also who is committing crime.”

911 calls don’t necessarily mean crime is happening. They just mean somebody called 911. And we’ve seen big instances in the news of white people calling the police on Black people who were just being Black in public.

“There is an African American man who is recording me and threatened myself and my dog.”

Other predictive policing algorithms may use arrests in their data sets, but again, arrests don’t always equal crime.

That was another clip from the podcast “A.I. Nation.” Ed, can this policing technology be more ethical?

Ed: It can. There are really important choices to make about how to use the technology and what we do to measure and fight against the kind of bias that can slip in. If we’re not careful, we’re basically asking the system to have us do things the way we did them before. So you have data about past policing practices, and if you tell the algorithm, predict what we would have done in this situation in the past, well, what you’re going to get is the same kind of policing you had in the past. If you want to do things differently, you need to build that into the process. You need to build it into the algorithm. There are ways of using this technology to detect and fight bias, but you have to actually set out to do that and not just sort of say, “Well, let’s keep doing things the way we did in the past, but maybe 5% more efficiently.”
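Ed’s point about the algorithm reproducing past practice can be sketched in a few lines of code. The neighborhoods, counts, and patrol rule below are invented for illustration and are not from the podcast or any real system: a model that ranks neighborhoods by past recorded arrests simply sends patrols back to the same places, and the extra patrols generate more recorded arrests, which the model then reads as confirmation.

```python
# Minimal sketch of the feedback loop described above.
# Neighborhood names and numbers are invented for illustration only.

# Historical arrest counts reflect where police were sent in the past
# (and what got reported), not necessarily where crime actually occurred.
arrests = {"north": 120, "south": 30, "east": 25, "west": 25}

def predict_hotspots(arrest_counts, num_patrols=2):
    """'Predict' future crime by ranking neighborhoods on past recorded arrests."""
    ranked = sorted(arrest_counts, key=arrest_counts.get, reverse=True)
    return ranked[:num_patrols]

for year in range(1, 4):
    patrols = predict_hotspots(arrests)
    # More patrols in a neighborhood produce more recorded arrests there,
    # which the next round of "prediction" treats as more crime.
    for neighborhood in patrols:
        arrests[neighborhood] = int(arrests[neighborhood] * 1.2)
    print(f"year {year}: patrols sent to {patrols}; recorded arrests now {arrests}")
```

Run for a few iterations, the same neighborhoods get patrolled every year regardless of what is actually happening on the ground, which is why Ed says doing things differently has to be built into the data and the objective rather than bolted on afterward.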

Hey, thank you so much. Ed Felten and Malcolm Burnley, hosts of “A.I. Nation.” Good luck with the new podcast.

Ed: Thanks.

Malcolm: Sure thing.
