Look out: facial recognition has finally come to London

What you need to know — This week, the Metropolitan police began its first live deployment of the controversial technology in the capital. We headed to Stratford to see the rollout.

“It doesn’t fucking work,” screamed one bearded gentleman, as he scurried across the huge plaza at the entrance to Stratford station in east London. “I’m a bloody criminal and it hasn’t recognised me!”

A defiant mood was in the air, along with the thunderous gales of Storm Ciara, as the Metropolitan police began its first live deployment of facial recognition technology in the capital city this week.

Most passers-by had no idea that the blue van, equipped with two large facial recognition cameras and supported by a team of almost 30 police officers largely twiddling their thumbs, would be there. Some pedestrians covered their faces and others shouted abuse, but many were unaware of the huge controversy surrounding this biometric software.

“You do need some level of consent,” said Anisa, 16, who goes to college in the area. “I would never have known it was happening [if you had not said].”

Andy, a 25-year-old kitchen porter, was worried about the growing lack of privacy in London. “There’s already enough cameras here, it has the most CCTV in the world,” he said. “It’s like Big Brother.”

The launch in Stratford followed 10 trials around the city since 2016, mostly limited to public events such as concerts and football matches. But last month the Met announced that the trial period was over: the technology, it said, had been “tried and tested” for identifying people suspected of serious crimes and was ready to be rolled out permanently for everyday policing.

The system runs NeoFace Watch software from the Japanese company NEC, which can process thousands of faces per minute in real time, according to the company’s website. It can carry out “real-time surveillance” and “real-time large database search”, as well as “offline identification analysis of recorded video and static images”. If a match is made, a notification alert is sent to an app on police officers’ phones.

On the scene in Stratford, acting chief inspector Chris Nixon, who was accompanied by a press officer, said the cameras worked by “matching faces to a watchlist of about 5,000 people”, who were either wanted by the police or missing persons whose images were provided by family members. “We have people walking around the streets wanted for serious offences,” he said. “This can help us catch them.”
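In rough outline, the process the police describe boils down to scoring each face picked up by the cameras against a fixed watchlist and raising an alert if the best score clears a threshold. The sketch below is purely illustrative: the embedding size, threshold and function names are assumptions for the example, not details of NEC’s NeoFace Watch or the Met’s deployment.

```python
import numpy as np

WATCHLIST_SIZE = 5_000   # roughly the number of people the Met cited
EMBED_DIM = 128          # assumed size of a face embedding
MATCH_THRESHOLD = 0.6    # assumed cosine-similarity cut-off

rng = np.random.default_rng(0)

# Stand-in for pre-computed embeddings of watchlist photographs.
watchlist = rng.normal(size=(WATCHLIST_SIZE, EMBED_DIM))
watchlist /= np.linalg.norm(watchlist, axis=1, keepdims=True)

def check_face(face_embedding: np.ndarray) -> int | None:
    """Return the index of the best watchlist match above the threshold, else None."""
    face_embedding = face_embedding / np.linalg.norm(face_embedding)
    scores = watchlist @ face_embedding      # cosine similarity against all 5,000 entries
    best = int(np.argmax(scores))
    if scores[best] >= MATCH_THRESHOLD:
        return best                          # in the real system, this would trigger a phone alert
    return None

# Simulated face from the camera feed.
probe = rng.normal(size=EMBED_DIM)
print("alert" if check_face(probe) is not None else "no match")
```

Where the threshold is set matters: a lower cut-off flags more wanted people but also more innocent passers-by, which is exactly the trade-off behind the false-positive figures campaigners cite below.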

Yet human rights campaigners from Liberty and Big Brother Watch, which is part of a legal challenge against the Met over the technology, were in Stratford alongside London Assembly member Sian Berry to protest against its use.

Griff Ferris, a legal and policy officer at Big Brother Watch, raised concerns over the infringement of civil liberties, alleged racial bias, and the lack of a legal basis for the technology’s use. He cited the group’s research showing that 93 per cent of those stopped during the Met’s 10 public trials were wrongly identified; only eight arrests were made as a result of the scanning in three years of trials.

“They’ve disregarded people’s fundamental rights, they’ve disregarded people’s freedoms,” said Ferris. “We should be free to walk around public spaces without intrusive, authoritarian facial recognition technology. We don’t believe it would ever be proportionate to use this kind of technology.”

When questioned, Nixon said the force “hadn’t found any racial bias” during the trial period, and that Stratford, an area known for its high proportion of ethnic minority communities, had been chosen because of “gang issues and violence” reported to the police.

But although violent crime is rising across the country, there is a growing backlash against facial recognition technology as a means of tackling it. Last year it was revealed that cameras had been used covertly at King’s Cross in London, while an independent review commissioned by the Met itself warned that the technology was only 19 per cent accurate. Even the CEO of Google, Sundar Pichai, has supported the EU’s proposal for a temporary ban. This week the Scottish Parliament said there was no justification for using live biometrics, calling it a “radical departure” from the current practice of policing by consent.

Carly Kind, director of the Ada Lovelace Institute, an independent body that monitors the ethical use of AI, called for the use of facial recognition technology to be halted until the public have been consulted on it.

“The use of live facial recognition technology (LFR) by the Met Police in Stratford is out of step with the views of the British public, the majority of whom want to see the government impose restrictions on police use of this technology,” she said.

But for now, despite mounting legal and ethical concerns, the use of facial recognition technology in London will roll on.

Back in Stratford, Owais, 25, who lives in nearby Leyton, said he sympathised with the police but that other means should be used. “They’re doing their job,” he said. “But if it’s a systemic problem with the technology, then there are other ways.”

“It’s something new for me, it’s quite shocking,” said Kolbie, a 26-year-old who works in marketing. “It’s really disappointing to have that bias and I don’t think this is the most efficient way.”

The Met later confirmed that no arrests had been made during the day-long operation.

Follow Peter Yeung on Twitter
