San Francisco supervisors voted 8 to 1 Tuesday to ban the use of facial recognition software by police and other city departments, becoming the first U.S. city to outlaw a rapidly developing technology that has alarmed privacy and civil liberties advocates.
The ban is part of broader legislation that requires city departments to establish use policies and obtain board approval for surveillance technology they want to purchase or are already using. Several other local governments require departments to disclose and seek approval for surveillance technology.
“This is really about saying: ‘We can have security without being a security state. We can have good policing without being a police state.’ And part of that is building trust with the community based on good community information, not on Big Brother technology,” said Supervisor Aaron Peskin, who championed the legislation.
The ban applies to San Francisco police and other municipal departments. It does not affect use of the technology by the federal government at airports and ports, nor does it limit personal or business use.
The San Francisco board did not spend time Tuesday debating the outright ban on facial recognition technology, focusing instead on the possible burdens placed on police, the transit system and other city agencies that need to maintain public safety.
“I worry about politicizing these decisions,” said Supervisor Catherine Stefani, a former prosecutor who was the sole no vote.
The Information Technology and Innovation Foundation, a nonprofit think tank based in Washington, D.C., issued a statement chiding San Francisco for considering the facial recognition ban. It said advanced technology makes it cheaper and faster for police to find suspects and identify missing people.
It is silly for critics to compare surveillance use in the United States with China's, given that one country has strong constitutional protections and the other does not, said Daniel Castro, the foundation’s vice president.
“In reality, San Francisco is more at risk of becoming Cuba than China; a ban on facial recognition will make it frozen in time with outdated technology,” he said.
It’s unclear how many San Francisco departments are using surveillance and for what purposes, Peskin said. There are valid reasons for license-plate readers, body cameras and security cameras, he said, but the public should know how the tools are being used and whether they are being abused.
San Francisco’s police department stopped testing face ID technology in 2017. A representative at Tuesday’s board meeting said the department would need two to four additional employees to comply with the legislation.
Privacy advocates have squared off with public safety proponents at several heated hearings in San Francisco, a city teeming with tech innovation and the home of Twitter, Airbnb and Uber.
Those who support the ban say the technology is flawed and a serious threat to civil liberties, especially in a city that cherishes public protest and privacy. They worry people will one day not be able to go to a mall, the park or a school without being identified and tracked.
But critics say police need all the help they can get, especially in a city with high-profile events and high rates of property crime. Expecting privacy in public spaces is unreasonable given the proliferation of cellphones and surveillance cameras, said Meredith Serra, a member of the resident public safety group Stop Crime SF.
“To me, the ordinance seems to be a costly additional layer of bureaucracy that really does nothing to improve the safety of our citizens,” she said at a hearing.
The city of Oakland is considering similar legislation.
San Francisco’s ban reflects a growing backlash against a technology that’s creeping into airports, motor vehicle departments, stores, stadiums and home security cameras.
Government agencies around the U.S. have used the technology for more than a decade to scan databases for suspects and prevent identity fraud. But recent advances in artificial intelligence have created more sophisticated computer vision tools, making it easier for police to pinpoint a missing child or protester in a moving crowd, or for retailers to analyze shoppers’ facial expressions as they peruse store shelves.
Efforts to restrict its use are getting pushback from law enforcement groups and the tech industry, though it’s far from a united front. Microsoft, while opposed to an outright ban, has urged lawmakers to set limits on the technology, warning that leaving it unchecked could enable an oppressive dystopia reminiscent of George Orwell’s novel “1984.”
“Face recognition is one of those technologies that people get how creepy it is,” said Alvaro Bedoya, who directs Georgetown University’s Center on Privacy and Technology. “It’s not like cookies on a browser. There’s something about this technology that really sets the hairs on the back of people’s heads up.”
Without regulations barring law enforcement from accessing driver’s license databases, people who have never been arrested could be part of virtual police line-ups without their knowledge, skeptics of the technology say.
Already, a handful of big box stores across the U.S. are trying out cameras with facial recognition that can guess their customers’ age, gender or mood as they walk by, with the goal of showing them targeted, real-time ads on in-store video screens.
With San Francisco’s ban in place, other cities, states or even Congress could follow, with lawmakers from both parties looking to curtail government surveillance and others hoping to restrict how businesses analyze the faces, emotions and gaits of an unsuspecting public.
The California Legislature is considering a proposal prohibiting the use of facial ID technology on body cameras. A bipartisan bill in the U.S. Senate would exempt police applications but set limits on businesses analyzing people’s faces without their consent.
On Thursday, another proposed ban was introduced in Somerville, Massachusetts.
Bedoya said a ban in San Francisco, the “most technologically advanced city in our country,” would send a warning to other police departments thinking of trying out the imperfect technology. But Castro said the ordinance is too extreme to serve as a model.
“It might find success in San Francisco, but I will be surprised if it finds success in a lot of other cities,” he said.
The city’s relationship with the tech industry is testy. Some supervisors in City Hall are calling for a tax on stock-based compensation in response to a wave of San Francisco companies going public, including Lyft and Pinterest.
At the same time, San Francisco is big on protecting immigrants, civil liberties and privacy. In November, nearly 60% of voters approved a proposition to strengthen data privacy guidelines.
The face-recognition ban applies only to San Francisco government and does not affect companies or people who want to use the technology. Nor does it affect the use of facial recognition at San Francisco International Airport, where security is mostly overseen by federal agencies.
Police spokesman David Stevenson said in a statement that the department looks forward to “developing legislation that addresses the privacy concerns of technology while balancing the public safety concerns of our growing, international city.”
Peskin acknowledges his legislation, called the “Stop Secret Surveillance Ordinance,” isn’t very tech-friendly. But public oversight is critical given the potential for abuse, he said.
The technology often misfires. Studies have shown error rates in facial-analysis systems built by Amazon, IBM and Microsoft were far higher for darker-skinned women than lighter-skinned men.
Even if facial recognition were perfectly accurate, its use would pose a severe threat to civil rights, especially in a city with a rich history of protest and expression, said Matt Cagle, an attorney at the ACLU of Northern California.
“If facial recognition were added to body cameras or public-facing surveillance feeds, it would threaten the ability of people to go to a protest or hang out in Dolores Park without having their identity tracked by the city,” he said, referring to a popular park in San Francisco’s Mission District.
Local critics of San Francisco’s legislation, however, worry about hampering police investigations in a city with a high number of vehicle break-ins and several high-profile annual parades. They want to make sure police can keep using merchants and residents’ video surveillance in investigations without bureaucratic hassles.
Joel Engardio, vice president of grassroots group Stop Crime SF, wants the city to be flexible.
“Our point of view is, rather than a blanket ban forever, why not a moratorium so we’re not using problematic technology, but we open the door for when technology improves?” he said.
Such a moratorium is under consideration in the Massachusetts Legislature, where it has the backing of Republican and Democratic senators.
Often, a government’s facial recognition efforts happen in secret or go unnoticed. In Massachusetts, the motor vehicle registry has used the technology since 2006 to prevent driver’s license fraud, and some police agencies have used it as a tool for detectives.
“It is technology we use,” said Massachusetts State Police Lt. Tom Ryan, adding that “we tend not to get too involved in publicizing” that fact. Ryan and the agency declined to answer further questions about how it’s used.
Massachusetts Sen. Cynthia Creem, a Democrat and sponsor of the moratorium bill, said she worries about a lack of standards protecting the public from inaccurate or biased facial recognition technology. Until better guidelines exist, she said, “it shouldn’t be used” by government.
The California Highway Patrol does not use face recognition technology, spokeswoman Fran Clader said.
California Department of Motor Vehicles spokesman Marty Greenstein said facial recognition technology “is specifically not allowed on DMV photos.” State Justice Department spokeswoman Jennifer Molina said her agency does not use face ID technology, and its policy states “DOJ and requesters shall not maintain DMV images for the purpose of creating a database” unless authorized.
Legislators also sought a face recognition moratorium this year in Washington, the home state of Microsoft and Amazon, but it was gutted following industry and police opposition. Microsoft instead backed a lighter-touch proposal as part of a broader data privacy bill, but deliberations stalled before lawmakers adjourned late last month.