AI surveillance company Flock Safety might be having a banner year as it aggressively expands its dragnet across the US, but ordinary people aren’t impressed.
For four hours on Tuesday, residents of Longmont, Colorado — where Flock has at least 23 AI-powered cameras and license plate readers — expressed their outrage to city councilors over the city's contract with the company. According to local magazine Yellowscene, nearly every seat at the hearing was full, and 90 percent of attendees were there to voice transparency and privacy concerns about Flock.
“Longmont’s website states the community’s safety cameras do not perform predictive analytics or facial recognition,” software engineer and Longmont resident Andrew Gentry told the council, per the outlet. “That same assurance is absent from the Flock page, leaving me to believe that this privacy standard may have been quietly discarded with Flock’s adoption.”
“Flock’s retention policy doubles the length of [data] retention used for community canvas, from 14 days to 30 days, once again signaling to me that the Flock program has been a convenient way to stretch existing privacy standards,” he added.
When all was said and done, the city council voted 5-1 in favor of rejecting any future expansion of its contract with the company. While the fate of the 23 existing cameras is still up in the air, it’s a pretty decisive win for critics of the tech in the Boulder exurb.
And they’re far from alone. Throughout the country, communities and activists are rising up in anger against the cameras, which have been installed largely at the behest of municipal police departments. The company’s bread-and-butter automated license plate recognition (ALPR) programs have been the subject of heated debate, as its untested AI recognition tech has led to numerous false positives.
Just as troubling are the cases where it does work, like those of the Atlanta police chief who used Flock’s ALPR to stalk and harass people, or the Texas cop who used data from 83,000 ALPRs to track a woman suspected of seeking an out-of-state abortion.
Citizens in areas like Yakima, Washington, Cleveland, Ohio, and Eugene, Oregon have mounted municipal campaigns against the cameras, which each function as nodes in an interconnected network. Others have organized public protests, ironically enough becoming likely subjects of police surveillance via Flock’s AI facial-recognition cameras.
Not everyone is going through official channels, either. Earlier this week, a man was accused of using vise grips to rip down 13 Flock cameras throughout Suffolk, Virginia. Meanwhile, an open-source project called DeFlock has taken a crowdsourcing approach to mapping ALPRs across the country. (Though Flock sent the activist behind DeFlock a cease-and-desist letter, he has so far defied the legal threat, with help from the Electronic Frontier Foundation.)
Time will tell what tactics end up being the most effective, but given the sheer number of anti-Flock activities, there should be no shortage of data to pull from.
More on surveillance: AI Surveillance Startup Caught Using Sweatshop Workers to Monitor US Residents