Violation of civil liberties.
Violence.
Death.
And no accountability whatsoever.
https://scheerpost.com/2022/08/26/defund-the-police-algorithms/
Norris was fortunate. As a civil rights attorney at the Ella Baker Center for Human Rights, he cleared up the situation relatively quickly. (The Walnut Creek Police Department did not respond to a request for comment.) But he saw how algorithms that capture and process plate numbers can lead to confrontations that easily turn deadly. He joined the campaign to ban Oakland police from using license-plate readers—not just because the machines are inaccurate but because they make policing more invasive. Algorithmic technology, he said, was one facet of an entrenched system of police surveillance. Plate readers were “just a different tool layered on to inequality and discrimination and driving while Black,” he told me.
Algorithms and artificial intelligence have dramatically expanded the ability of law-enforcement institutions to identify, track, and target individuals or groups. And civil rights activists say the new technologies erode privacy and due process. Community groups are beginning to understand the ramifications of AI for privacy, discrimination, and social movements and are pushing back. Across the country, a grassroots movement is emerging to resist the secrecy of police algorithms and to demand that lawmakers ban the most intrusive surveillance schemes.
Through facial-recognition programs, for instance, an officer can grab an image of a face from a surveillance video of a protest and then instantly cross-check it against a photo database. A “faceprint” can also be used in a “face analysis” to try to extrapolate demographic characteristics, such as gender, race, or even sexual orientation, according to vendor claims analyzed by the Electronic Frontier Foundation. Beyond just identifying an individual, face-tracking software can be used in tandem with other algorithmic technologies to trace the movements of a demonstrator as they travel home from a rally.
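To make that matching step concrete, here is a minimal, hypothetical sketch of the underlying idea: a face image is reduced to a numeric embedding vector (the "faceprint"), and identification is a similarity search against a database of stored vectors. Everything in it (the function names, the four-dimensional vectors, the 0.8 threshold) is invented for illustration; it does not represent any vendor's actual system.

```python
# Minimal sketch of faceprint matching -- a toy illustration, not any
# vendor's real system. A "faceprint" is modeled as a numeric embedding
# vector; real systems derive these from a neural network applied to a
# face image. All data and thresholds below are made up.
import numpy as np

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Similarity between two embedding vectors, in [-1, 1]."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def match_faceprint(probe: np.ndarray,
                    database: dict[str, np.ndarray],
                    threshold: float = 0.8) -> list[tuple[str, float]]:
    """Return identities whose stored faceprint is at least `threshold`
    similar to the probe faceprint, best match first (hypothetical API)."""
    hits = [(identity, cosine_similarity(probe, stored))
            for identity, stored in database.items()]
    return sorted((h for h in hits if h[1] >= threshold),
                  key=lambda h: h[1], reverse=True)

# Toy database of 100 random 4-dimensional "faceprints"; real embeddings
# have hundreds of dimensions.
rng = np.random.default_rng(0)
database = {f"person_{i}": rng.normal(size=4) for i in range(100)}

# A noisy re-capture of person_42's face, e.g. from surveillance video.
probe = database["person_42"] + rng.normal(scale=0.1, size=4)
print(match_faceprint(probe, database))
```

Running this toy version typically reports several spurious hits alongside person_42, because low-dimensional random vectors collide by chance. Real embeddings are far higher-dimensional, but the same threshold tradeoff governs them: loosen it and false matches rise, tighten it and true matches are missed, which is roughly where misidentifications of the kind described below come from.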
The technology is error-prone and often discriminatory. A recent study by the National Institute of Standards and Technology found that facial-recognition software misidentified Black and Asian faces 10 to 100 times more frequently than it did white faces.