Police scanned Beyoncé concert for pedophiles and terrorists

November 13, 2023

Welsh police have admitted that they used facial recognition to scan Beyoncé concertgoers in May this year.

The police were apparently scanning the crowd in Cardiff to see if they could find matches to a watch list of suspected terrorists and pedophiles.
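
Live facial recognition systems of this kind generally work by converting each face detected in the camera feed into a numerical embedding and comparing it against embeddings of the people on the watch list, raising an alert only when the similarity clears a threshold. The sketch below is purely illustrative, assuming nothing about the actual South Wales Police system; the 128-dimensional embeddings, the 0.6 threshold, and the suspect IDs are all hypothetical.

```python
# Illustrative sketch only: how watch-list matching with face embeddings
# generally works. Not the actual police system; the 128-dimensional
# embeddings, the 0.6 threshold, and the IDs below are all hypothetical.
from typing import Optional

import numpy as np


def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Cosine similarity between two face-embedding vectors."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))


def match_against_watchlist(face_embedding: np.ndarray,
                            watchlist: dict,
                            threshold: float = 0.6) -> Optional[str]:
    """Return the watch-list ID most similar to the detected face,
    or None if no similarity clears the threshold (i.e. no alert)."""
    best_id, best_score = None, threshold
    for person_id, ref_embedding in watchlist.items():
        score = cosine_similarity(face_embedding, ref_embedding)
        if score > best_score:
            best_id, best_score = person_id, score
    return best_id


# Toy usage: random vectors stand in for embeddings from a real face model.
rng = np.random.default_rng(0)
watchlist = {"suspect_001": rng.normal(size=128), "suspect_002": rng.normal(size=128)}
detected_face = rng.normal(size=128)
print(match_against_watchlist(detected_face, watchlist))  # Almost certainly None
```

The threshold is where the misidentification problem discussed later comes in: set it too low and innocent faces trigger alerts, set it too high and genuine matches slip through.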

South Wales Police and Crime Commissioner Alun Michael said that since the Manchester Arena bombing in 2017, it had become the norm for police to look for terrorist suspects at concerts.

Michael said that the reason why they were also looking for pedophiles is that “there would be very large numbers of young girls attending that concert”.

Michael was one of four crime commissioners giving evidence to a parliamentary inquiry into how police forces tackle crime.

While real-time facial recognition has been heavily criticized, Michael said its use at the concert “seemed to me entirely sensible.” He also said that its use at the concert “was announced in advance and reported to me, it wasn’t secretive.”

Naturally, it wasn’t announced in advance to the people who actually planned to attend. As they filed into the concert venue, they unwittingly passed the van that housed the facial recognition tech.

Michael didn’t say whether anyone on the list was actually caught, but said the scanned footage was kept for a maximum of 31 days. That’s cold comfort for concertgoers who value their privacy.

It’s happened before

At a Taylor Swift concert in 2019, a booth playing video clips of Swift doubled as a facial recognition tool to scan for her stalkers. And Rolling Stone reported that Madison Square Garden used facial recognition last year to kick attorneys out of concerts if their firms had litigation against the venue.

Rage Against The Machine cofounders Tom Morello and Zack De La Rocha joined a host of other artists in signing a pledge not to play at venues that use facial recognition.

If the police are using the tech secretly, then their pledge won’t mean a whole lot.

It would be great if police could catch more terrorists and pedophiles, but the largely unregulated use of AI facial recognition feels overly intrusive for a lot of people. If the tech worked better and we heard a lot of success stories then it might be an easier sell.

As it is, facial recognition has earned its negative perception with a running list of biased misidentifications leading to wrongful arrests.

In light of this, 65 British politicians called earlier this year for a temporary halt to the deployment of live facial recognition technology by police.

If police can prove that facial recognition is having tangible results, then maybe the privacy cost-benefit discussion is worth having. In the absence of that, it just feels like Orwellian overreach.

Eugene van der Watt

Eugene comes from an electronic engineering background and loves all things tech. When he takes a break from consuming AI news you'll find him at the snooker table.
