17 individuals in London arrested after AI facial recognition operation

March 26, 2024

Last week, in south London, the Metropolitan Police used live facial recognition cameras to assist in the arrest of 17 individuals.

The arrests were made during operations conducted in Croydon on March 19 and 21 and in Tooting on March 21.

Among those arrested was a 23-year-old man flagged by the facial recognition system because of an outstanding warrant for his arrest. He was found in possession of two rounds of blank ammunition, and police went on to seize further ammunition, stolen mobile phones, and cannabis from a property linked to him.

Currently, the technology is designed to identify individuals featured on a “bespoke watchlist,” which includes people with outstanding arrest warrants. The Metropolitan Police says this enables it to carry out “precision policing.”
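
For illustration, the sketch below shows how watchlist-based matching of this kind generally works: a detected face is converted into an embedding and compared against reference embeddings for people on the watchlist, and anything below the match threshold is discarded. This is a generic, hypothetical example; the embedding model, similarity threshold, and data handling here are assumptions, not details of the Met’s actual system.

```python
# Illustrative sketch only: a generic watchlist-matching loop, NOT the Met's
# actual system. The similarity threshold and data handling are assumptions.
import numpy as np

SIMILARITY_THRESHOLD = 0.6  # assumed operating point; real deployments tune this


def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))


def screen_face(face_embedding: np.ndarray,
                watchlist: dict[str, np.ndarray]) -> str | None:
    """Compare one detected face against a bespoke watchlist of embeddings.

    Returns the watchlist ID of the best match above the threshold,
    or None, in which case the biometric data would be discarded.
    """
    best_id, best_score = None, 0.0
    for person_id, ref_embedding in watchlist.items():
        score = cosine_similarity(face_embedding, ref_embedding)
        if score > best_score:
            best_id, best_score = person_id, score
    if best_score >= SIMILARITY_THRESHOLD:
        return best_id  # alert: an officer would then verify the match
    return None  # no match: the embedding is deleted rather than retained
```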

This follows 42 arrests made with the same technology in February, though, according to inquiries by BBC News, it remains unclear how many of those arrested have been charged.

Arrests covered a broad spectrum of offenses, including sexual offenses, assault, theft, fraud, burglary, racially aggravated harassment, and breaches of anti-social behavior orders (ASBOs).

Following the operation, the police said they offered communities “information and reassurance” about their actions.

Facial recognition in policing is controversial

Last year, members of the UK’s House of Lords and House of Commons called on the police to reevaluate live facial recognition technology after the policing minister suggested that police forces could gain access to a database of 45 million passport images.

Michael Birtwistle of the Ada Lovelace Institute concurred with their skepticism, saying: “The accuracy and scientific basis of facial recognition technologies is highly contested, and their legality is uncertain.”

Civil rights advocacy group Big Brother Watch also highlighted that 89% of UK police facial recognition alerts fail to produce a correct match.
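
A failure rate like that is easier to interpret with base rates in mind: when cameras scan large crowds in which very few people are actually on a watchlist, even a system with a low false-positive rate can generate mostly incorrect alerts. The figures below are purely hypothetical and chosen only to illustrate the arithmetic; they are not taken from any police deployment.

```python
# Hypothetical base-rate illustration; none of these numbers come from a real
# deployment. It shows how most alerts can be false even if the system itself
# is fairly accurate.
crowd_size = 100_000          # faces scanned in a day (assumed)
watchlisted_present = 10      # watchlisted people actually in the crowd (assumed)
true_positive_rate = 0.90     # chance a watchlisted face triggers an alert (assumed)
false_positive_rate = 0.001   # chance an ordinary passer-by triggers an alert (assumed)

true_alerts = watchlisted_present * true_positive_rate                    # 9
false_alerts = (crowd_size - watchlisted_present) * false_positive_rate   # ~100
share_of_failed_alerts = false_alerts / (true_alerts + false_alerts)

print(f"{false_alerts:.0f} of {true_alerts + false_alerts:.0f} alerts are false "
      f"({share_of_failed_alerts:.0%})")  # roughly 92% under these assumptions
```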

Lindsey Chiswick, the Metropolitan Police intelligence director, sought to dispel privacy concerns. She told the BBC, “We do not keep your data. If there is no match, your data is immediately and automatically deleted in seconds.” Chiswick also asserted that the technology has been “independently tested” for reliability and bias. 

Others contest that. For instance, Madeleine Stone from Big Brother Watch expressed concerns about AI surveillance, labeling it “Orwellian.” 

Stone continued, “Everyone wants dangerous criminals off the street, but papering over the cracks of a creaking policing system with intrusive and Orwellian surveillance technology is not the solution. Rather than actively pursuing people who pose a risk to the public, police officers are relying on chance and hoping that wanted people happen to walk in front of a police camera.”

Big Brother Watch also raised the alarm about a further operation taking place in Catford on March 26.

How the UK police use AI facial recognition

UK police forces started testing facial recognition technologies in 2018, deploying camera-equipped vans to capture footage in public places.

An example of a facial recognition van at an early trial in central London. Source: X

A recent Freedom of Information request directed at the Metropolitan Police Service (MPS) sought clarification about whether AI is used to screen individuals automatically and how that data is processed. 

The MPS disclosed that it uses AI technologies such as Live Facial Recognition (LFR) and Retrospective Facial Recognition (RFR) within specific operations.

However, the MPS refused to respond to the bulk of the inquiry, citing exemptions under the Freedom of Information Act 2000 related to “national security,” “law enforcement,” and “the protection of security bodies.”

Specifically, the MPS argued that divulging details about the covert use of facial recognition could compromise law enforcement tactics.

The police’s response states: “Confirming or denying that any information relating to any possible covert practice of Facial Recognition would show criminals what the capacity, tactical abilities, and capabilities of the MPS are, allowing them to target specific areas of the UK to conduct/undertake their criminal/terrorist activities.”

Lessons from the past

While predictive policing was designed to make communities safer, it’s led to some troubling outcomes, including the wrongful arrest of several individuals. 

These aren’t just isolated incidents but rather a pattern that reveals a critical flaw in relying too heavily on AI for police work.

Robert McDaniel in Chicago, despite having no violent history, was targeted by police as a potential threat simply because an algorithm placed him on a list. 

His story isn’t unique. Across the US, there have been instances where people were wrongly accused and arrested based on faulty facial recognition matches. 

Nijeer Parks’s story is a stark example. Accused of crimes he had nothing to do with, Parks faced jail time and a hefty legal bill – all because of an incorrect match by facial recognition technology.

Facial recognition technology has repeatedly been shown to be less accurate for darker-skinned individuals, particularly black women. While facial recognition for white faces can be accurate in over 90% of cases, reported accuracy can be as low as 35% for black faces.

On current evidence, marginalized groups stand to lose the most from inaccurate algorithmic policing strategies. 

Wrongful arrests aren’t just distressing for those directly involved; they also cast a long shadow over the communities affected. 

When people are arrested based on predictions rather than concrete actions, it shakes the very foundation of trust between law enforcement and the public. 

Indeed, public trust in the police is at rock bottom both in the UK and the US. AI threatens to erode this further if poorly managed. 


Sam Jeans

Sam is a science and technology writer who has worked in various AI startups. When he’s not writing, he can be found reading medical journals or digging through boxes of vinyl records.
