A drug trafficker in New York was arrested after an AI-powered surveillance system flagged his driving patterns as suspicious.
David Zayas was arrested back in March 2022, but the use of AI surveillance only recently came to light when his lawyer filed a FOIA request. The Westchester County Police Department has been using Rekor Scout software, which takes data from automatic license plate recognition (ALPR) cameras and checks for suspicious driving behavior.
Without the AI, it’s unlikely that Zayas would have been arrested. At the time, he was driving his car in a manner that would not have attracted any attention from law enforcement. But the AI picked up on a pattern that no human officer could.
The 480 ALPR cameras that the Westchester police use scan approximately 16 million license plates a week. Rekor’s software combed through 2 years’ worth of that data and flagged the vehicle Zayas was traveling in as suspicious.
It identified that over a period of 10 months in 2020 and 2021, the vehicle had made 9 trips along a known drug trafficking route.
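The underlying logic, counting how often a plate appears on a watched route and flagging plates that cross a threshold, can be sketched in a few lines. This is a hypothetical illustration only, not Rekor’s actual algorithm; the function name, data shape, and route labels are all invented for the example.

```python
from collections import Counter

# Hypothetical sketch of route-frequency flagging: count how often
# each plate is sighted on a watched route and flag plates that
# meet or exceed a trip threshold. Not Rekor's actual algorithm.

def flag_frequent_travelers(sightings, watched_route, threshold=9):
    """sightings: list of (plate, iso_timestamp, route) tuples."""
    counts = Counter(
        plate for plate, _, route in sightings if route == watched_route
    )
    return {plate for plate, n in counts.items() if n >= threshold}

# Toy data with made-up plates and routes.
sightings = [
    ("ABC1234", "2020-06-01T10:00", "I-87"),
    ("ABC1234", "2020-07-15T09:30", "I-87"),
    ("XYZ9876", "2020-06-02T11:00", "I-287"),
]
print(flag_frequent_travelers(sightings, "I-87", threshold=2))
# prints {'ABC1234'}
```

The privacy concern discussed below follows directly from this design: the flag is only possible because every sighting of every plate is retained, not just the ones belonging to suspects.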
When the cops acted on the AI tipoff and pulled him over, they found drugs, a gun, and thousands of dollars in the vehicle. Busted.
AI surveillance raises privacy issues
Zayas pleaded guilty, and AI’s involvement in his arrest wouldn’t have made the news had it not been for his lawyer, Ben Gold, who filed a motion to suppress the evidence.
In his motion, Gold said, “This is the systematic development and deployment of a vast surveillance network that invades society’s reasonable expectation of privacy.”
ALPR cameras aren’t new tech. They’ve been around for years, checking license plates against a list of flagged plates and alerting police when they spot one. What is new is what AI software like Rekor Scout can do with that information.
The software tracks every car by its plate, color, and model, and it can reconstruct where each vehicle has been for as long as the police retain the data. Identifying a driving pattern as suspicious drug activity sounds good, but the means of getting that data feels invasive.
It’s like doing a stop-and-frisk of everyone walking down a street to catch one criminal. We want the bad guy caught, but not if it means we all have to turn out our pockets.
State institutions have been in a rush to regulate what they perceive as AI dangers, but they seem less concerned about the ethics of using AI in law enforcement.
As more civil rights groups publicize how AI watches us on the subway or while we’re driving, it’s becoming clear that AI has been watching us for longer than we realized. And it’s unlikely to slow down.