The city’s ban on the technology could set a nationwide precedent.
San Francisco is the first major city to ban local government agencies’ use of facial recognition, becoming a leader in regulating technology criticized for its potential to expand widespread government surveillance and reinforce police bias.
The “Stop Secret Surveillance” ordinance passed 8-1 in a vote by the city’s board of supervisors Tuesday. The ordinance will implement an all-out ban on San Francisco city agencies’ use of facial surveillance, which tech companies such as Amazon and Microsoft currently sell to various US government agencies, including, in Amazon’s case, US police departments and, in Microsoft’s case, a US prison. These technologies can detect faces in images or live video streams and match those facial characteristics to someone’s identity in a database.
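To make that matching step concrete, here is a minimal, hypothetical sketch of how such a system compares a detected face against a database. It assumes faces have already been converted into numeric embedding vectors by a neural network (the toy 4-dimensional vectors, names, and threshold below are invented for illustration; commercial systems like the ones the article discusses use far larger embeddings and proprietary pipelines):

```python
import numpy as np

# Illustrative sketch only: real systems use deep neural networks to turn a
# face image into an embedding vector. Here we invent tiny embeddings to show
# the matching step: compare a probe face against a database and accept the
# closest identity only if it falls under a distance threshold.

def match_face(probe, database, threshold=0.6):
    """Return the best-matching identity, or None if nothing is close enough.

    probe:    1-D embedding vector for the detected face
    database: dict mapping identity -> stored embedding vector
    """
    best_name, best_dist = None, float("inf")
    for name, stored in database.items():
        dist = np.linalg.norm(probe - stored)  # Euclidean distance
        if dist < best_dist:
            best_name, best_dist = name, dist
    return best_name if best_dist < threshold else None

# Toy 4-dimensional "embeddings" (real ones are typically 128+ dimensions)
db = {
    "alice": np.array([0.1, 0.9, 0.3, 0.5]),
    "bob":   np.array([0.8, 0.2, 0.7, 0.1]),
}

print(match_face(np.array([0.12, 0.88, 0.31, 0.49]), db))  # close to "alice"
print(match_face(np.array([0.5, 0.5, 0.5, 0.5]), db))      # no confident match -> None
```

The threshold is the policy-relevant knob: set it loosely and the system returns more false matches of the kind the ACLU test described later in this article found; set it tightly and it misses real matches.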
Today, facial recognition technology is widely used by the Chinese government for Orwellian mass surveillance of ordinary citizens in public life — most alarmingly to target the Uighur Muslim ethnic minority in what’s been called “automated racism.”
In the US, the tools are far less ubiquitous but becoming increasingly popular with law enforcement agencies. Dozens of local police departments across the US use the technology to match driver’s license pictures and mug shots to criminal databases. It’s also used (in some cases by private citizens, not police) to monitor crowds at events such as protests and concerts, and in venues such as shopping malls, to identify potential suspects in real time, which has caused alarm among civil liberties advocates, who say this use can have a chilling effect on free speech.
The ban is just one part of San Francisco’s surveillance oversight ordinance, which will also require city agencies to get city approval before purchasing other kinds of surveillance technologies, such as automatic license plate readers and camera-enabled drones. It won’t stop private citizens or businesses, however, from using these facial recognition systems. (So, Taylor Swift, if you’re reading this — you’re still in the clear to welcome San Francisco concertgoers with a face scan.) And of course, everyday San Franciscans can continue to willingly participate in pervasive facial recognition technology like the rest of us when we unlock our iPhones, or tag a suggested friend in a Facebook photo, for example.
Supporters of facial recognition technology say that it has the capability to help police departments more efficiently identify and arrest criminal suspects, but critics point to examples of misuses that they say prove it can do more harm than good.
In a particularly egregious example, the American Civil Liberties Union (ACLU) ran a test of Amazon’s facial recognition software and found it misidentified 28 black members of Congress as criminals. Researchers at MIT found that, overall, the software returned worse results for women and darker-skinned individuals (in both cases, Amazon has disputed the findings). And in places like Maryland, police agencies have been accused of generally using facial recognition technology more heavily in black communities and to target activists — for example, police in Baltimore used it to identify and arrest protesters of Freddie Gray’s death at the hands of law enforcement.
“The propensity for facial recognition technology to endanger civil rights and civil liberties substantially outweighs its purported benefits,” reads the San Francisco ordinance, which was authored by City Supervisor Aaron Peskin and five other supervisors on the 11-person board, “and the technology will exacerbate racial injustice and threaten our ability to live free of continuous government monitoring.”
Other cities are following San Francisco’s lead. In nearby Oakland — and across the country in Somerville, Massachusetts — the city councils are set to vote on bills that would implement similar bans.
“These ordinances shift power from law enforcement to the people and ensure democratic debate and oversight,” said Mana Azarmi, policy counsel for the Center for Democracy and Technology. Azarmi praised San Francisco and Oakland as a “vanguard” in developing legislation that is a “unique accountability tool to end secret surveillance.”
The San Francisco Bay Area is home to a robust network of civil liberties and racial justice groups, so it’s no surprise that city governments there are leading regulation on technology that could strip privacy and reinforce societal inequalities.
But with tech giants like Amazon, Microsoft, Google, and Facebook in a race to build all-seeing AI, the ban is a sign of something that we don’t often see — governments trying to get ahead of a potential technological Frankenstein.
The ordinance isn’t without its critics, and it attempts to oversee a complex network of public-private partnerships on surveillance technology that will likely change over time. Here’s some context on what it actually does, and the precedent it could set.
What the ban will — and won’t — do
The facial recognition ban will most directly limit the San Francisco Police Department, which doesn’t currently use facial recognition technology but has tested it in the past. Under the ordinance, the SFPD won’t be able to restart any testing of such tools. That means that it won’t be able to, say, connect security cameras installed on public streets to image-processing technology and databases of criminal mugshots.
In other cities, police departments have been major proponents of facial recognition technology, arguing that it helps them in criminal investigations. In Washington County, Oregon, the Sheriff’s Office said that Amazon’s Rekognition product has “greatly increased the ability of our law enforcement officers to act quickly and decisively” by reducing the time it takes to “identify criminal suspects” from two or three days to minutes, according to a testimonial on Amazon’s customer website for the software.
It’s easy to see the appeal of these tools. Although the SFPD has shied away from publicly supporting facial recognition (or disavowing it), the department has called for amendments to the legislation that address the privacy concerns of technology “while balancing the public safety concerns of our growing, international city.”
The ordinance will also restrict local police from sharing some information with federal agencies such as Immigration and Customs Enforcement, according to Matt Cagle, a technology and civil liberties attorney for the ACLU. Cagle said in a public hearing on the ordinance that the immigration agency responsible for deporting undocumented immigrants has previously requested information from the San Francisco Police Department. San Francisco is a sanctuary city, which means that, in most cases, it doesn’t cooperate with federal agencies like ICE to deport unauthorized immigrants.
However, restrictions on surveillance technology won’t apply at the San Francisco International Airport, where federal agencies such as the Transportation Security Administration and Customs and Border Protection have jurisdiction — and are free to use facial recognition systems and biometric scanners as they please. (Of course, the public is free to push back on this, and has.)
Several San Francisco residents at a recent public hearing were concerned that the measure would make it harder for local businesses to catch and deter shoplifters. The new ordinance allows private businesses and citizens to share security camera footage, including from tools that use facial recognition tech, with police to help investigations. However, it outlines procedures for how citizens can share that footage.
Another criticism of the legislation is that it stops local law enforcement from using facial surveillance technology to identify suspected terrorists at mass events, such as concerts and parades.
“Do we really want to say to every white supremacist — ‘Hey, San Francisco’s holding a Lunar Parade, but they’re restricting security cameras,’” said Frank Noto, president of STOP Crime SF, a grassroots group for criminal justice accountability, at a recent public hearing on the ordinance.
The spread to other cities
In many ways, San Francisco, and California in general, is setting a trend for urban areas across the nation that are increasingly demanding more oversight of surveillance technology.
Back in 2016, the ACLU started the “Community Control Over Police Surveillance” effort to provide local governments with a framework for passing legislation to increase oversight of police surveillance. Bay Area jurisdictions such as Santa Clara County, Oakland, and Berkeley were some of the first places to pass such legislation.
“Symbolically, it’s important that we’re creating most of this technology in the Bay Area, and now we’re putting regulations in place around it,” said Brian Hofer, chair of the City of Oakland Privacy Advisory Commission, who has been helping lead efforts to place a facial recognition ban there.
Just under a dozen US cities — including Seattle, Nashville, and Cambridge, Massachusetts — have passed laws using that framework to give their local officials more power to regulate the use of surveillance tools. And about 20 more cities are actively working on similar legislation. Meanwhile, many have called for federal legislation — including Microsoft, a major vendor of facial recognition technology. But at least in the short term, a patchwork of local regulation seems more likely and achievable, according to privacy researchers and advocates in the field.
American consumers may have willingly given up an expectation of digital privacy, as our appetite grows for always-listening smart devices, always-location-tracking mobile technology, and always-hackable social media apps. And so far, regulators have largely been slow and ineffective in curbing that addiction or regulating the privacy intrusions.
But the expectation of being able to cross the street without Big Brother knowing where you are is a civil liberty deeply ingrained in American culture, and one that makes government use of facial recognition technology ripe for regulation, before it’s too late, Hofer said.
“These kinds of technologies are spreading so fast in the private sector, but we still have the chance to keep the genie in the bottle with limiting government use of them,” he said.