This week, King County, Washington, became the first county in the US to ban its sheriff's office and other agencies from using facial recognition technology in nearly all circumstances.
The new measure flies in the face of Washington state's lenient facial recognition law, which was authored by a Microsoft employee and passed last year, and it brings the fight over an emerging set of technologies to greater Seattle, home of both Microsoft and Amazon, two tech giants that sell facial recognition software.
The situation in Washington state reflects a broader national debate. Law enforcement is increasingly reliant on facial recognition, which can help identify suspects; earlier this year, the FBI used it to identify insurgents at the US Capitol on Jan. 6. But as its use spreads, there's a growing backlash from critics, who see the practice as an invasion of privacy, an expansion of police power, and a technology biased against people of color.
In the absence of any federal privacy regulation, some states and localities are taking action on their own. King County joins more than a dozen cities, including San Francisco and Boston, as well as the state of Vermont, in banning police and other agencies from using facial recognition. Experts expect more to join the list.
The King County law, which applies only to countywide agencies, prohibits the government from acquiring facial recognition technology or using that kind of biometric data. County police can still participate in the nationwide program that locates missing children, which relies on facial recognition.
King County was not using any facial recognition technology at the time, and both Amazon and Microsoft halted sales of their technology to police departments last summer following the police killing of George Floyd and nationwide protests over the mistreatment of Black Americans at the hands of law enforcement.
IBM has also stopped selling facial recognition technology to law enforcement agencies, but other companies carry on. Earlier this year, BuzzFeed News found that more than 7,000 people from nearly 2,000 government agencies had used or tested facial recognition software from Clearview AI.
Washington state's nascent law allows facial recognition technology as long as its AI systems are subject to "meaningful human review," a provision widely criticized by civil liberties advocates.
"It is an industry-backed bill that purports to put safeguards around the use of facial recognition technology, but actually legitimizes the infrastructural expansion of facial recognition technology that is sold by tech companies," Jennifer Lee, the technology and liberty project manager at the Washington ACLU, said in an interview.
Facial recognition is likely more widespread in the US than you might expect. According to Georgetown Law School's Center on Privacy & Technology, as of 2016 about half of all American adults had their photo included in a criminal facial recognition database.
While facial recognition technology has been used to catch criminals such as the Capital Gazette newspaper mass shooter, it also carries the risk of overreach and false positives. In recent years, there have been numerous reports of the technology falsely identifying suspects, including cases in which Black people were wrongly arrested.
The Electronic Frontier Foundation (EFF), a leading digital rights group, says it supports the complete abolition of government surveillance via facial recognition. Adam Schwartz, a senior staff attorney at the foundation, said its use is discriminatory, amounts to an Orwellian invasion of privacy, and deters people from participating in public protests.
"There are some surveillance technologies that we think broad requirements and transparency about how it's being used might be enough for, but face surveillance is so dangerous that governments should not be using it at all," Schwartz told Quartz.
Daniel Castro, who directs the Information Technology and Innovation Foundation's (ITIF) Center for Data Innovation, said the King County law is premised on faulty information about how biased these systems are. Castro said the evidence is incontrovertible that the best-performing systems have no racial bias.
A 2019 study from the National Institute of Standards and Technology (NIST), a government agency, found significant differences among the algorithms it tested. NIST concluded that US-developed algorithms had higher rates of false positives for Asian, African American, and Native American faces.
But Castro asserts that there's a stark difference between the best- and worst-performing algorithms, and says the King County ban makes it harder for local government to use everyday technology to keep its citizens and workers safe.
Pressure is mounting to rein in the worst aspects of government use of facial recognition or ban it entirely.
On Thursday, a coalition of civil liberties groups including the ACLU and EFF put out a joint statement saying that even with improvements in facial recognition technology, it's still fundamentally flawed. The coalition, which also includes the NAACP and the Innocence Project, is calling for a ban or moratorium on law enforcement use of the technology. The letter also urged lawmakers not to preempt state and local bans through any federal legislation.
Several bills that would have clamped down on facial recognition, including the George Floyd Justice in Policing Act and the Advancing Facial Recognition Act, were introduced in the last Congress, but none were adopted.
Brian Hengesbaugh, who chairs the Global Data Privacy and Security Business Unit at the law firm Baker McKenzie, expects a wildfire of activity in privacy regulation in the coming years, and he said any future federal and state privacy laws would likely include requirements for public and private entities that capture biometric data. In many of the US jurisdictions that regulate biometric data collection, and under the EU's GDPR, private companies must obtain express consent from the individuals they track or risk legal action. That's not really feasible in a law enforcement context.
For local jurisdictions considering what to do about facial recognition, the question is broader: Does this technology have any place in law enforcement, or should we head it off before it goes too far?