Face Scanning and the Freedom To Be Stupid In Public: A … – The Markup

Posted: October 23, 2023 at 10:46 pm

Hello, friends,

As we move further into autumn, and the leaves start to turn and sweaters and scarves come out where I am in New York City, I want to take you back to the last holiday season, when a group of lawyers received a not-so-festive surprise at Radio City Music Hall and Madison Square Garden.

In December 2022, three days before Christmas, New York Times reporter Kashmir Hill, along with her colleague Corey Kilgannon, showed how MSG Entertainment, which owns Radio City, the Garden, and other venues, had created an attorney exclusion list for lawyers and law firms suing the company.

With facial recognition tools, MSG could instantly detect when any of the lawyers on the list visited one of its venues. One attorney was pulled aside while trying to chaperone her 9-year-old daughter's Girl Scout troop to the Christmas Spectacular at Radio City, the Times said. Others were turned away from Rangers and Knicks games and a Mariah Carey concert.

The lawyers had strong words for MSG. "It's a dystopian, shocking act of repression," one told the Times. But, as always, the profession did its real talking in court, with suits filed in a state supreme court and in the federal Southern District of New York.

Hill, who began covering digital privacy nearly a decade and a half ago, kept reporting on facial recognition as it spread across multiple industries in the U.S. and Britain. She showed how stores, including supermarkets, have used facial recognition to monitor and eject alleged shoplifters, police forces have used it to arrest people based on false face matches, and increasingly wary tech giants have begun pumping the brakes on their use of the tools.

Last month, Hill released a gripping and disturbing book, "Your Face Belongs to Us," about Clearview AI, a startup whose aggressive use of facial recognition has made it a key purveyor to law enforcement and other government agencies around the world. For the book, Hill drew on her extensive reporting on the company, starting with her January 2020 exposé "The Secretive Company That Might End Privacy as We Know It," which revealed the company's existence, founders, capabilities, and police client base, generating immediate concern among civil and digital rights groups and government watchdogs.

I recently spoke with Hill, whom I've known since 2009, when she was writing about privacy at Forbes and I was covering Silicon Valley scandals as Gawker's Valleywag columnist. We talked about Clearview's messy origin story; how her own thinking on facial recognition evolved in the course of covering the company and writing the book; how Clearview has changed the world, including tech and law enforcement; possible ways to address the problems created by facial recognition; and much more. You can find our conversation below, edited for brevity and clarity.

Ryan Tate: I expected this book to be a book about technology, but instead I was immediately reading about people who were not hugely technically proficient. Did it surprise you that looking into Clearview AI led you to interview the sort of people who might post on 4Chan?

Kashmir Hill: Yeah, definitely. When I first heard about Clearview AI, I just assumed that there was some mastermind involved in the company that allowed them to do what Facebook and Google and even the government hadn't been able to do: build this crazy tool that looks through the whole internet for a face. I was surprised to never quite find the technological mastermind.

Instead, it was a different story: essentially, this technology had become accessible enough for marginal characters to create a very powerful tool. The barrier to entry had lowered so much. It's kind of like my tagline now, that what Clearview did was not a technological breakthrough, it was an ethical one. They were just willing to do what others hadn't been willing to do.

With so much of this technology now, advances in AI that are really widely accessible, it will be what the marginal characters are willing to do that will create the new lines in the sand. It's not just the big tech giants that wield these powers anymore.

Tate: You have this fascinating chapter in the book where you go into this tactical police center in Miami. I felt like you alternated between showing how invasive this technology could be and almost lamenting how bad some of the surveillance technology was. At one point in the chapter, it's almost like you're marveling at this high-resolution camera on top of a hotel that can really zoom in and see people really closely. Then there are these other cameras they have access to that are totally grainy; a crime happens and they don't capture anything they could run an algorithm against.

In reporting this book, did you ever feel like the reporting put you in the shoes of the users or advocates of facial recognition and gave you insights into why they're interested in it?

Hill: Yeah, talking to officers, especially talking to one officer from the Department of Homeland Security, who works on all these child crime cases, and just hearing about those cases where they find these images of abuse, like on an account in another country, where they have no idea who this person is. Sometimes they can tell that it's in the U.S. because of the electrical outlets, but they have no idea who's this child, who's this abuser. They could be anyone in the country.

And I relate a case where they run the abusers face and they get a lead to this guy in Las Vegas, and they end up going to his Facebook account, seeing photos of the child. That was the first case that the Department of Homeland Security used Clearview in, and it led them to get a subscription. I see the power of a use case like that.

It was funny, when I was working on the first Clearview story, I was really pregnant, and I would get on the subway to ride from Brooklyn, where I lived at the time, to the office in Manhattan. Sometimes no one would get up for me and let me sit down on the subway. I just remember thinking, "I wish I had Clearview. I want to know who these people are who aren't willing to stand for a pregnant lady."

I can see the appeal of tools like this. And I think they can be useful. But I also don't want to live in a world with no anonymity, where we're subject to this all the time, because I do think it would be very chilling.

Tate: Do you believe that whatever legislation comes along for facial recognition should have an exception that would allow facial recognition on people not yielding their seats to pregnant people on the subway? [laughs]

Hill: That's going to be the worst. It's going to be like, "This guy was manspreading," and it's going to have his name attached to it, and there's going to be a whole cycle of abuse on social media.

When I was working on this book, I thought a lot about this vast web of vengeance story I did. It's about a serial defamer who would go after people she had grudges against, and anyone related to them and their colleagues. She was defaming hundreds of people online for a slight that happened at a firm she worked at in the '90s. I just think about someone like that who carries a grudge, who's kind of got a vicious streak, having a tool like Clearview AI or PimEyes, and you bump into her on the subway and she takes your photo and writes horrible things about you online for years to come, and you have no idea where you even encountered her.

I can imagine those kinds of scenarios where brief slights in the real world carry over, because all of a sudden we're not strangers anymore. Or it could make the world more accountable. So, you don't slight anyone anymore, because who knows what happens after that.

Tate: Is there a moment in the Clearview story that you're surprised hasn't resonated more?

Hill: The one thing that surprised me was that time that Clearview AI went to the attorneys general's meeting at Gillette Stadium during the Rolling Stones show and was showing all the attorneys general what they had done. They were like, "that's creepy" or "that's weird." There was no more formal reaction to what they'd just been shown. I was surprised that none of those attorneys general launched investigations into the company after seeing it on display, especially because it made them so uncomfortable. [Hill wrote that the event, for Democratic attorneys general, was in a private box at the stadium. It took place six months before Hill's exposé on Clearview.]

I do feel like that's something that's hard with these kinds of cutting-edge technologies: sometimes people see them, and I think they think it already existed. They don't realize what they're looking at, and how new it is or how groundbreaking it is.

I heard the same thing from lawyers when they were getting banned from Madison Square Garden. It was happening for months before the media reported on it. I was like, "Why didn't you tell anybody this was happening?" They were like, "Oh, I just thought this was a thing that happens in the world." They didn't realize that it was such a shocking use of the technology.

I think sometimes people are looking at the future and they don't realize it.

Tate: Would you put that inability to see the future when it's in front of you on government employees and/or attorneys, or do you think that's happening to all of us?

Hill: I think it's happening to all of us, this belief that all of technology is so powerful and so good. Just all these kinds of assumptions, that smartphones are listening to us, they must be, because the ads I'm getting are so targeted. Just the belief that what you've seen in science fiction movies is real. I think so many of these companies are basically trying to make dystopian depictions of the future real, and maybe that's part of it.

But I find there's real cognitive dissonance between how powerful the technology is believed to be and the understanding of how poorly it can work, and that it can work really well. I really like the Miami chapter for that. You think that law enforcement is so powerful, that they have these eyes everywhere, they can hear everything that happens. When you're in the control room, you see, actually, how blurry their vision is and how limited. I think it's on all of us that we have to try to keep both of those things in our minds.

Tate: The racial inequity problems with this technology are prominent early in the book, but later you write about how the window of time for that criticism to be effective is closing as top developers have focused on addressing the problems of biased algorithms. Can you say more about that?

Hill: I think there's a racial inequity issue in terms of who it will be used on, particularly in policing. Even as the problems have been addressed in terms of the training data, making sure it's trained on more diverse faces and getting rid of what they call differential performance, or bias, we're still seeing, in every single wrongful arrest we know of, that the person is Black.

So, I think there are clearly still racial problems there. Part of it is just that Black people are more subject to policing tools than anyone else. So, they're suffering the harms of it when it goes wrong.

Tate: Is there momentum behind systemic remedies around facial recognition, like legislation? I was struck by what you wrote about how we could have a world where there are speed cameras everywhere that automatically send speeding tickets to people, and we seem to have chosen not to do that. Is there a world where facial recognition goes into the trash can in a similar way? Or do you think it's just too useful?

Hill: It's funny, I was talking to a facial recognition vendor whose company is based in the U.K., and he's like, "Why is the U.S. so opposed to real-time facial recognition? It really makes you safer." The U.K. really likes that, and they have been resisting how we use it here, where you use it retroactively to identify criminal suspects. So, there are some cultural differences in how it's playing out.

There are a lot of technologies that we have constrained, from speed cameras to recording devices. All of our conversations could be recorded by surveillance cameras or on the wires. It would be very easy to just keep records of everything that happens. We have, as a society, resisted that, because it's so chilling to think that every moment can be recorded and that we could have this time machine where you can trace and track everything we've ever done.

I don't think we want that. I also don't think we want perfect enforcement of the law, because people like to jaywalk and they like to speed. And they like to get drunk and be stupid in public sometimes. They want to fondle their first date at a Beetlejuice theater. [laughs] I think people want a little bit of anonymity and the freedom to make bad decisions, you know, within reason.

I do think that the appeal of facial recognition technology to solve horrible crimes is very real and is a reason why activists who want it completely banned are probably not going to see that happen.

Tate: Are there other interesting ways we might constrain this technology that have emerged? Are there ideas you think are particularly promising in that area that might get some momentum?

Hill: I think constraining the commercial use of it, like we've seen in Illinois, where you're not supposed to be using people's biometric information, including their face prints, without consent, has been a powerful law for facial recognition. It's just not being widely deployed there.

My favorite example is Madison Square Garden, which originally installed it for security threats and then, in the past year, used it to keep lawyers out of its New York City venues like MSG and the Beacon Theatre and Radio City Music Hall. But they also have a theater in Chicago, and they don't use facial recognition technology there, because the Illinois law prevents them from doing that. That's a law that works. It's a way to make sure that it's only used in a way that benefits you, and not in a way that penalizes you.

In terms of police use, Massachusetts passed a law that creates rules for how police are allowed to use facial recognition technology, from getting a warrant to running a search. Detroit is a really interesting place where they've had three known cases of bad face matches that have led to arrests, so I think the city is really thinking about this. They want to keep using the tool and they're trying to use it responsibly, but only use it for serious crimes, violent crimes.

Tate: One of Clearview's founders, toward the end of the book, mentions background recognition as a potential new feature, to the point where we see this brick in the wall, and we can determine the age of the brick, or know that it's used in this particular neighborhood of London. What other new technologies or approaches might lie in Clearview's future?

Hill: I don't know if it would be Clearview, but I've been thinking a lot about voice search. You could imagine a Clearview AI that started gathering all the audio that's been recorded and link[ing] it to individuals, so that you can upload a few seconds of somebody's voice and find anything they've ever recorded or said.

The one thing that kept coming up with activists is, if we say it's okay for Clearview to gather everyone's photos and create this database, what stops a company from starting to build a genetic database, whether buying clippings from hairstylists, or going out on garbage collection day and collecting samples? Or what Charles Johnson says he's doing, going to funeral homes and buying genetic material from corpses, so that you could create a genetic database that you then sell access to, to the police or whoever might possibly want that.

There are so many ways that you could reorganize the internet of information and the real world around these markers for us, many of which are quite dystopian.

Thank you, as always, for reading, and may your fall camera moments be uniformly happy ones.

Yours,

Ryan Tate
Editor, The Markup
