From Anna Washenko's 5-19-25 ENGADGET article entitled "New Orleans Police Secretly Used Facial Recognition on Over 200 Live Camera Feeds":
New Orleans' police force secretly used constant facial recognition to seek out suspects for two years. An investigation by The Washington Post discovered that the city's police department was using facial recognition technology on a privately owned camera network to continually look for suspects. This application appears to violate a city ordinance passed in 2022 that required the NOLA police to use facial recognition only to search for specific suspects of violent crimes, and to provide details about the scans' use to the city council. However, WaPo found that officers did not reveal their reliance on the technology in the paperwork for several arrests where facial recognition was used, and none of those cases were included in mandatory city council reports.
"This is the facial recognition technology nightmare scenario that we have been worried about,” said Nathan Freed Wessler, an ACLU deputy director. "This is the government giving itself the power to track anyone — for that matter, everyone — as we go about our lives walking around in public." Wessler added that the is the first known case in a major US city where police used AI-powered automated facial recognition to identify people in live camera feeds for the purpose of making immediate arrests.
Police use and misuse of surveillance technology has been thoroughly documented over the years. Although several US cities and states have placed restrictions on how law enforcement can use facial recognition, those limits won't do anything to protect privacy if they're routinely ignored by officers.
To read Washenko's entire article, click HERE.
From Ashley Belanger's 5-19-25 ARS TECHNICA article entitled "New Orleans Called Out for Sketchiest Use of Facial Recognition Yet in the US":
New Orleans Police Department superintendent Anne Kirkpatrick told the Post that she would be conducting a review of the program and turning off all automated alerts until she is "sure that the use of the app meets all the requirements of the law and policies."
The ACLU is demanding a stronger response, asking for a full investigation into how many arrests were made and urging NOPD to permanently stop using the AI-enhanced feeds. In a statement sent to Ars, Alanah Odoms, the executive director of the ACLU of Louisiana, said that without a full investigation, there would be no way to know the extent of potential harms of the secret AI surveillance to the community.
"We cannot ignore the real possibility of this tool being weaponized against marginalized communities, especially immigrants, activists, and others whose only crime is speaking out or challenging government policies," Odoms said. "These individuals could be added to Project Nola's watchlist without the public’s knowledge and with no accountability or transparency on the part of the police departments."
The cameras in New Orleans are operated by Project Nola, a nonprofit founded by a former cop, Bryan Lagarde, who wanted to help police more closely monitor the city's "crime-heavy areas," the Post reported. Configured to scan live footage for people "on a list of wanted suspects," the camera network supposedly assisted in at least 34 arrests since 2023, Project Nola has claimed in social media posts. But the Post struggled to verify that claim, since "the city does not track such data and the nonprofit does not publish a full accounting of its cases."
To read Belanger's entire article, click HERE.