American Cities Question Police Use of Facial Recognition Technology

American police departments are increasingly using face recognition computer programs as part of their fight against crime. But some lawmakers — and even some major technology companies — are showing resistance.

Are fears of an all-seeing, artificially intelligent security system understandable? Maybe in China, where huge networks of street cameras have helped officials track members of ethnic minority groups for signs of subversive behavior.

Visitors are tracked by facial recognition technology from state-owned surveillance equipment manufacturer Hikvision at the Security China 2018 expo in Beijing, China, October 23, 2018.

American police officials and their video surveillance industry partners say that will not happen in the United States. They are pushing back against a movement by cities, states and federal legislators to ban or limit use of the technology.

Just this year, the politically liberal California cities of San Francisco, Oakland and Berkeley enacted bans on the technology. So did the towns of Somerville and Brookline in the state of Massachusetts.

But traditionally more conservative areas, like the western Massachusetts city of Springfield, are also looking at possible restrictions. The small city has spent millions of dollars to deal with legal actions against police accused of violence and other wrongdoing.

Springfield police say they have no plans to deploy facial recognition technology. But some local officials still are taking action to block any future government use.

Springfield City Councilor Orlando Ramos spoke on the issue in a hearing in October. “It would only lead to more racial discrimination and racial profiling,” he said. He noted several studies that found higher rates of mistakes for facial recognition software used to identify women and people with darker skin color.

“I’m a black woman and I’m dark,” another Springfield councilor, Tracye Whitfield, told Cheryl Clapprood, head of the city’s police. Clapprood is white.

Whitfield explained, “I cannot approve something that’s going to target me more than it will target you.”

Clapprood defended the technology and asked the council to trust her to use it carefully. “The facial recognition technology does not come along and drop a net from the sky and carry you off to prison,” she said.

In this photo taken Tuesday, May 7, 2019, is a security camera in the Financial District of San Francisco.

The council has not yet acted. But Springfield’s mayor has threatened to veto the proposal that Ramos plans to present in January.

Similar debates across the country reflect racial concerns and conflicting understandings of the technology.

“I wish our leadership would look at the science and not at the hysteria,” said R. Rex Parris, who is mayor of Lancaster, California. The city is working to set up more than 10,000 streetlight cameras that Parris says could observe known child sex abusers and gang members. “There are ways to build in safeguards,” Parris said.

Research suggests that facial recognition systems can be highly accurate under good conditions. A federal government investigation of the leading systems found that they were more than 99 percent correct when matching high-quality photographs to a database of other frontal poses.

But trying to identify a face from a video feed can cause accuracy rates to sharply drop. The government experts found that recognition accuracy could fall below 10 percent when using ceiling cameras commonly found in stores and government buildings.

In October, California Governor Gavin Newsom signed a measure that temporarily barred police departments from using facial recognition technology with body cameras. Some other states have similar restrictions.

Brad Smith is president and chief legal officer of Microsoft. He reports that a California police agency asked the company for facial recognition software. The agency planned to use the software in police vehicle and body cameras.

Smith said Microsoft rejected the request. He said the technology would wrongly identify too many people, especially women and people of color.

Smith does not support a ban of all such technology. But he has warned that facial recognition with too few rules could lead to “mass surveillance on an unprecedented scale.”

Other companies, including Amazon, have shown fewer concerns about selling their technology to police. Some law enforcement agencies feed images from video surveillance into software that can search government databases or social media for possible matches.

I’m Caty Weaver.

And I’m Bryan Lynn.

The Associated Press reported this story. Caty Weaver adapted it for VOA Learning English. Ashley Thompson was the editor.


Words in This Story

artificial – adj. not natural or real: made, produced, or done to seem like something natural

hysteria – n. a situation in which many people behave or react in an extreme or uncontrolled way because of fear, anger, etc.

match – v. to make or see a connection or relationship between (two people or things)

pose – n. the position in which someone stands, sits, lies down, etc., especially as a model for a photograph, painting, etc.

accuracy – n. freedom from mistake or error: the quality or state of being accurate

unprecedented – adj. not done or experienced before

scale – n. the size or level of something, especially in comparison to something else
