Face Recognition and What it Means for Our Privacy
Inside the company that could ‘end privacy as we know it’
In November of 2019, New York Times tech reporter Kashmir Hill got a tip that immediately put her on high alert. It was about a secretive tech company called Clearview AI that claimed to have developed a facial recognition app that could identify people with 99 percent accuracy. The company had apparently scraped billions of images from the internet to create this tool, and was already offering the software to police departments across the U.S.
The tip sent Hill on a chase to find out who was behind this company and how this tool was being used. Her new book, “Your Face Belongs to Us: A Secretive Startup’s Quest to End Privacy as We Know It,” details what she found. On this podcast extra, Hill joins us to talk about the company’s billionaire investors, the mysterious and glamorous tech genius at its center, and what all of this means for our right to privacy.
On the people who had early access to the app
As I start doing all of this digging on the company, I find out that a very elite group of people knew about its existence and were using it, essentially billionaires and investors that the company had approached to try to get funding … Peter Thiel … well-known venture capital firms in San Francisco, Sequoia Capital, a billionaire in New York named John Catsimatidis, who’s run for mayor before and owns the Gristedes supermarkets. He had actually tried out the technology in his stores to catch shoplifters. He kind of memorably told me they had a real problem because people kept stealing Häagen-Dazs ice cream. And then he also had the Clearview AI app on his phone. And he told me about how it could come in useful because he was having dinner one night at an Italian restaurant and his daughter walked in with a guy on her arm. And he wanted to know who this guy was. And so he sent a waiter over to take a picture of them and then ran the guy’s photo through Clearview AI and found out he was a venture capitalist from San Francisco. And … he approved and sent a text to his daughter.
How this technology has gone wrong
Robert Williams is a suburban dad. He lives outside of Detroit. He’s got two young daughters, lives in a nice house … has a full time job … And one day he’s at work and he gets a call from the police saying that he needs to come to the station and turn himself in. It’s a couple of days before his birthday. And so he thought it was a friend pranking him. But when he gets home, he pulls into his driveway. He’s got dinner for his family and a police car pulls in behind and blocks him in. These two police officers kind of rush up to him, handcuff him, and tell him he’s under arrest for larceny and shoplifting and he gets arrested in front of his two young daughters. His wife is horrified. She doesn’t know what’s going on. The neighbors might be looking and … he gets taken to jail, held there overnight. And it turns out that he was arrested for the crime of looking like someone else.
What this app means for privacy
When I first heard about Clearview, I thought that what they had done was a technological breakthrough — that they had developed the best version of the facial recognition algorithm. But what I’ve come to learn since then is that this power had actually been developed earlier on by other technology companies: Facebook and Google had both basically internally developed this tool, the ability to … Google a face, the ability to take a photo of somebody and figure out who they are. And both of them had held it back. And these are companies that are not known necessarily for being super privacy protective. But they both decided this is too risky a technology to make publicly available. And so they sat on it. What Clearview had done was really an ethical breakthrough: they’d been willing to do something that others hadn’t.
If we don’t … claw back this technology, it will mean that any time you’re moving in public, someone with an app like this can take a photo of you and know who you are. And so it means that … sensitive conversation that you’re quietly having over dinner, assuming that no one around you is going to know who you are, that changes.
Or you’re on the subway and … you bump into somebody or you’re rude to somebody or having a bad day. They could take your picture and know who you are, maybe go write nasty things about you online. It just means that people have much more information about you.
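To make the capability Hill describes concrete, here is a minimal sketch of how a face search engine of this kind typically works under the hood: each face photo is reduced to a numeric “embedding” vector, and identifying a new face is a nearest-neighbor lookup against a database of embeddings whose identities are already known. This is an illustration, not Clearview’s actual system; the `embed_face` function and the sample data are stand-ins, since a real system would use a trained neural network and billions of scraped, labeled images.

```python
import numpy as np

rng = np.random.default_rng(0)

def embed_face(image_id: str) -> np.ndarray:
    # Stand-in for a trained face-embedding model. A real model maps a face
    # image to a vector such that photos of the same person land close
    # together; here we fabricate random unit vectors for illustration.
    vec = rng.standard_normal(128)        # 128 dimensions is a common size
    return vec / np.linalg.norm(vec)      # normalize so dot product = cosine similarity

# "Scraped" database: embeddings of photos whose identities are known,
# e.g. because they came from named social-media profiles.
database = {
    "person_a": embed_face("photo_of_a.jpg"),
    "person_b": embed_face("photo_of_b.jpg"),
    "person_c": embed_face("photo_of_c.jpg"),
}

def identify(query: np.ndarray, threshold: float = 0.6) -> str:
    # Nearest-neighbor search by cosine similarity: the closest stored
    # embedding above the threshold is reported as the match.
    best_name, best_score = "no match", -1.0
    for name, emb in database.items():
        score = float(query @ emb)
        if score > best_score:
            best_name, best_score = name, score
    return best_name if best_score >= threshold else "no match"

# A photo snapped on the street becomes a query vector. We simulate a new,
# slightly different photo of person_b by adding noise to their embedding.
noisy = database["person_b"] + 0.05 * rng.standard_normal(128)
query = noisy / np.linalg.norm(noisy)
print(identify(query))  # prints: person_b
```

The unsettling part is not the lookup itself, which is a few lines of code once embeddings exist; it is the database behind it. What reportedly set Clearview apart was scraping billions of identifiable face images from the public web to serve as the index.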
How people can protect themselves
This … advice may be pretty counterintuitive, but because these face recognition apps are already out there, these search apps are out there, I think it could be a good idea to use them while they’re publicly available and see what is out there for you … what’s findable for yourself. And so I actually did this with my colleagues at the Times. One of the public face search engines is called PimEyes. And so I ran their photos with their consent. And I remember one of my colleagues found this … photo of her where she’s kind of cry laughing because she had just been proposed to and … her fiance had … arranged for a photographer to … take pictures. And she didn’t like this photo herself, but the photographer apparently really liked it, so he put it on his Yelp page and she’s like, ‘What? Why? Why is this photo on the public Internet?’ A more extreme version of that is playing out right now in Virginia: one of the political candidates in a local race there has been tied to online sex videos that she made at some point in the past. They weren’t tied to her real name at all, but they were dug up; we don’t know how they were found, but it could have been through one of these face searches. So I think at this point, one thing you can do is figure out, gosh, what is … my face footprint.