Look out: facial recognition has finally come to London

What you need to know — This week, the Metropolitan police began its first live deployment of the controversial technology in the capital. We headed to Stratford to see the rollout.

“It doesn’t fucking work,” screamed one bearded gentleman, as he scurried across the huge plaza at the entrance to Stratford station in east London. “I’m a bloody criminal and it hasn’t recognised me!”

A defiant mood was in the air, along with the thunderous gales of Storm Ciara, as the Metropolitan police began its first live deployment of facial recognition technology in the capital city this week.

Most passers-by had no idea that the blue van equipped with two large facial recognition cameras, supported by a team of almost 30 police officers largely twiddling their thumbs, would be present. Some pedestrians covered their faces and others shouted abuse, but many were unaware of the huge controversy surrounding the biometric software.

“You do need some level of consent,” said Anisa, 16, who goes to college in the area. “I would never have known it was happening [if you had not said].”

Andy, a 25-year-old kitchen porter, was worried about the growing lack of privacy in London. “There’s already enough cameras here, it has the most CCTV in the world,” he said. “It’s like Big Brother.”

The launch in Stratford followed 10 trials around the city since 2016, mostly limited to public events like concerts and football matches. But last month the Met announced that the trial period had ended, declaring the technology “tried and tested” in identifying people suspected of serious crimes and ready to be rolled out permanently for everyday policing.

Deployed using NeoFace Watch software by the Japanese company NEC, the technology can process thousands of faces per minute in real time, according to the company’s website. It can carry out “real-time surveillance” and “real-time large database search”, as well as “offline identification analysis of recorded video and static images”. If a match is made, a notification alert is sent to an app on officers’ phones.
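NEC does not publish the internals of NeoFace Watch, but watchlist-matching systems of this kind generally work by reducing each face to a numeric “embedding” vector and flagging a match when a live capture’s similarity to a watchlist entry crosses a threshold. A minimal sketch of that general idea, with made-up names and toy four-dimensional vectors standing in for real embeddings:

```python
import numpy as np

def cosine_similarity(a, b):
    """Similarity between two embedding vectors, in [-1, 1]."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def match_against_watchlist(live_embedding, watchlist, threshold=0.8):
    """Return (name, score) of the best watchlist match above threshold, else None."""
    best = None
    for name, embedding in watchlist.items():
        score = cosine_similarity(live_embedding, embedding)
        if score >= threshold and (best is None or score > best[1]):
            best = (name, score)
    return best

# Toy watchlist: in a real system these vectors would come from a
# face-embedding model and be hundreds of dimensions long.
watchlist = {
    "person_a": np.array([0.9, 0.1, 0.3, 0.2]),
    "person_b": np.array([0.1, 0.8, 0.5, 0.4]),
}
capture = np.array([0.88, 0.12, 0.31, 0.19])  # a live camera capture
print(match_against_watchlist(capture, watchlist))
```

The choice of threshold governs the trade-off between false matches (innocent people flagged) and missed matches (wanted people walking past unnoticed).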

On the scene in Stratford, acting chief inspector Chris Nixon, who was accompanied by a press officer, said the cameras worked by “matching faces to a watchlist of about 5,000 people”, who were either wanted by the police or missing persons whose images were provided by family members. “We have people walking around the streets wanted for serious offences,” he said. “This can help us catch them.”

Yet human rights campaigners from Liberty and Big Brother Watch, which is part of a legal challenge against the Met over the technology, were in Stratford to protest against its use, alongside London Assembly member Sian Berry.

Griff Ferris, a legal and policy officer at Big Brother Watch, raised concerns over the infringement of civil liberties, alleged racial bias, and the lack of legal justification for using the technology. He cited the group’s research showing that 93 per cent of those stopped during the Met’s 10 public trials were wrongly identified. Only eight arrests were made as a result of scanning across three years of trials.

“They’ve disregarded people’s fundamental rights, they’ve disregarded people’s freedoms,” said Ferris. “We should be free to walk around public spaces without intrusive, authoritarian facial recognition technology. We don’t believe it would ever be proportionate to use this kind of technology.”

When questioned, inspector Nixon said the force “hadn’t found any racial bias” during the trial period and that the area in Stratford, known for its high proportion of ethnic minority communities, had been chosen because of “gang issues and violence” reported to the police.

But although violent crime is rising across the country, there is a growing backlash against facial recognition technology as a means of tackling it. Last year it was revealed that cameras had secretly been used at King’s Cross station in London, while an independent review commissioned by the Met itself warned that the technology was only 19 per cent accurate. Even the CEO of Google, Sundar Pichai, has supported the EU’s proposal for a temporary ban. This week the Scottish parliament said there was no justification for using live biometrics, calling it a “radical departure” from the current practice of policing by consent.

Carly Kind, director of the Ada Lovelace Institute, an independent body that monitors the ethical use of AI, called for facial recognition technology to stop being used until the public have been consulted on it.

“The use of live facial recognition technology (LFR) by the Met Police in Stratford is out of step with the views of the British public, the majority of whom want to see the government impose restrictions on police use of this technology,” she said.

But for now, despite mounting legal and ethical concerns, the use of facial recognition technology in London will roll on.

Back in Stratford, Owais, 25, who lives in nearby Leyton, said he sympathised with the police but that other means should be used. “They’re doing their job,” he said. “But if it’s a systemic problem with the technology, then there are other ways.”

“It’s something new for me, it’s quite shocking,” said Kolbie, a 26-year-old who works in marketing. “It’s really disappointing to have that bias and I don’t think this is the most efficient way.”

The Met later confirmed that no arrests had been made during the day-long operation.

Follow Peter Yeung on Twitter
