The pandemic is testing the limits of face recognition – MIT Technology Review

Posted: September 29, 2021 at 7:30 am

More and more, it's being used in what's presented as the interest of public health. Australia recently expanded a program using facial recognition to enforce covid-19 safety precautions. People who are quarantining are subject to random check-ins, in which they're required to send a selfie to confirm they are following the rules. Location data is also collected, according to Reuters.

When it comes to essentials like emergency benefits to pay for housing and food, the first priority should be making sure everyone is able to access help, Greer says. Preventing fraud is a reasonable objective on the surface, she adds, but the most pressing goal must be to get people the benefits they need.

"Systems have to be built with human rights and with vulnerable people's needs in mind from the start. Those can't be afterthoughts," Greer says. "They can't be bug fixes after it already goes wrong."

ID.me's Hall says his company's services are preferable to the existing methods of verifying identity and have helped states cut down on massive unemployment fraud since implementing face verification checks. He says unemployment claims have around a 91% true pass rate, either on their own or through a video call with an ID.me representative.

"[That] was our goal going in," he says. "If we could automate away 91% of this, then the states that are just outgunned in terms of resources can use those resources to provide white-glove concierge service to the 9%."

When users are not able to get through the face recognition process, ID.me emails them to follow up, according to Hall.

"Everything about this company is about helping people get access to things they're eligible for," he says.

The months that JB survived without income were difficult. The financial worry was enough to cause stress, and other troubles like a broken computer compounded the anxiety. Even their former employer couldn't or wouldn't help cut through the red tape.

"It's very isolating to be like, 'No one is helping me in any situation,'" JB says.

On the government side, experts say it makes sense that the pandemic brought new technology to the forefront, but cases like JB's show that technology in itself is not the whole answer. Anne L. Washington, an assistant professor of data policy at New York University, says it's tempting to consider a new government technology a success when it works most of the time during the research phase but fails 5% of the time in the real world. She compares the result to a game of musical chairs, where in a room of 100 people, five will always be left without a seat.

"The problem is that governments get some kind of technology and it works 95% of the time; they think it's solved," she says. Instead, human intervention becomes more important than ever. Says Washington: "They need a system to regularly handle the five people who are standing."

There's an additional layer of risk when a private company is involved. The biggest issue that arises in the rollout of a new kind of technology is where the data is kept, Washington says. Without a trusted entity that has the legal duty to protect people's information, sensitive data could end up in the hands of others. How would we feel, for example, if the federal government had entrusted a private company with our Social Security numbers when they were created?


Widespread and unchecked use of face recognition tools also has the potential to affect already marginalized groups more than others. Transgender people, for example, have detailed frequent problems with tools like Google Photos, which may question whether pre- and post-transition photos show the same person. It means reckoning with the software over and over.

"[There's] inaccuracy in technology's ability to reflect the breadth of actual diversity and edge cases there are in the real world," says Daly Barnett, a technologist at the Electronic Frontier Foundation. "We can't rely on them to accurately classify and compute and reflect those beautiful edge cases."

Conversations about face recognition typically debate how the technology could fail or discriminate. But Barnett encourages people to think beyond whether the biometric tools work or not, or whether bias shows up in the technology. She pushes back on the idea that we need them at all. Indeed, activists like Greer warn that the tools could be even more dangerous when they work perfectly. Face recognition has already been used to identify, punish, or stifle protesters, though people are fighting back. In Hong Kong, protesters wore masks and goggles to hide their faces from such police surveillance. In the US, federal prosecutors dropped charges against a protester identified using face recognition who had been accused of assaulting police officers.

