Real-Time Surveillance Will Test the British Tolerance for Cameras


CARDIFF, Wales — A few hours before a recent Wales-Ireland rugby match in Cardiff, amid throngs of fans dressed in team colours of red and green, and sidewalk vendors selling scarves and flags, police officers popped out of a white van.

The officers stopped a man carrying a large Starbucks coffee, asked him a series of questions and then arrested him. A camera attached to the van had captured his image, and facial recognition technology used by the city identified him as someone wanted on suspicion of assault.

The presence of the cameras, and the local police’s use of the software, are at the heart of a debate in Britain that is testing the country’s longstanding acceptance of surveillance.

Britain has historically sacrificed privacy more than other Western democracies, mostly in the name of security. The government’s use of thousands of closed-circuit cameras and its ability to monitor digital communications have been shaped by domestic bombings during years of conflict involving Northern Ireland and by attacks since Sept. 11, 2001.

But now a new generation of cameras is beginning to be used. Like the one perched atop the Cardiff police van, these cameras feed into facial recognition software, enabling real-time identity checks and raising new concerns among public officials, civil society groups and residents. Some members of Parliament have called for a moratorium on the use of facial recognition software. The mayor of London, Sadiq Khan, said there was “serious and widespread concern” about the technology. Britain’s top privacy regulator, Elizabeth Denham, is investigating its use by the police and private businesses.

And this month, in a case that has been closely watched because there is little legal precedent in the country on the use of facial recognition, a British High Court ruled against a man from Cardiff, the capital of Wales, who sued to end the use of facial recognition by the South Wales Police. The man, Ed Bridges, said the police had violated his privacy and human rights by scanning his face without consent on at least two occasions — once when he was shopping, and again when he attended a political rally. He has vowed to appeal the decision.

“Technology is driving forward, and legislation and regulation follows ever so slowly behind,” said Tony Porter, Britain’s surveillance camera commissioner, who oversees compliance with the country’s surveillance camera code of practice. “It would be wrong for me to suggest the balance is right.”

Britain’s experience mirrors debates about the technology in the United States and elsewhere in Europe. Critics say the technology is an intrusion of privacy, akin to constant identification checks of an unsuspecting public, and has questionable accuracy, particularly at identifying people who aren’t white men.

In Cardiff, the largest city in Wales, vans carrying facial recognition cameras have become a common sight over the past year. On game days, the vehicles have taken the place of vans the police used to detain fans causing trouble, said Stephen Williams, 57, who volunteers for the Socialist Party at a table nearby. “On most occasions, if it’s a busy event, you’ll see a van there,” he said.

The South Wales Police said the technology was necessary to make up for years of budget cuts by the central government. “We are having to do more with less,” said Alun Michael, the South Wales police and crime commissioner. He said the technology was “no different than a police officer standing on the corner looking out for individuals and if he recognizes somebody, saying, ‘I want to talk to you.’”

The police said that since 2017, 58 people had been arrested after being identified by the technology.

New questions are being raised about facial recognition’s use extending beyond the police to private companies. This month, after a report was published by the Financial Times, a large London property developer acknowledged that it used the technology at Kings Cross, a commercial and transit hub.

Critics say there has been a lack of transparency about the technology’s use, particularly about the creation of watch lists, which are considered the backbone of the technology because they determine which faces a camera system is hunting for. In tests in Britain, the police often programmed the system to look for a few thousand wanted people, according to a research paper published in July. But the potential could be far greater: Another government report said that as of July 2016, there were over 16 million images of people who had been taken into custody in the country’s Police National Database that could be searchable with facial recognition software.

Silkie Carlo, the executive director of Big Brother Watch, a British privacy group calling for a ban on the technology’s use, said the murky way watch lists were created showed that police departments and private companies, not elected officials, were making public policy about the use of facial recognition.

“We’ve skipped some real fundamental steps in the debate,” Ms. Carlo said. “Policymakers have arrived so late in the discussion and don’t fully understand the implications and the big picture.”

Sandra Wachter, an associate professor at Oxford University who focuses on technology ethics, said that even if the technology could be proven to identify wanted people accurately, laws were needed to specify when the technology could be used, how watch lists were created and shared, and the length of time images could be stored.

“We still need rules around accountability,” she said, “which right now I don’t think we really do.”


