Artificial Intelligence in Policing Is the Focus of Encode Justice – Teen Vogue

Nijeer Parks was bewildered when he was arrested and taken into custody in February 2019. Apparently, he'd been accused of shoplifting and attempting to hit a police officer with a car at a Hampton Inn, as the New York Times reported. But Woodbridge, New Jersey, where the crime had taken place, was 30 miles from his home, and Parks had neither a car nor a driver's license at the time, according to NBC News. Court documents indicated that he had no idea how he'd been implicated in a crime he knew he didn't commit until he discovered that the case against him was based solely on a flawed facial-recognition match. According to a December report by the Times, this was the third known instance of a wrongful arrest caused by facial recognition in the U.S. All three of those victims were Black men.

Algorithms failed Parks twice: First, he was mistakenly identified as the suspect; then, he was robbed of due process and jailed for 10 days at the recommendation of a risk assessment tool used to assist pretrial release decisions. These tools have been adopted by courts across the country despite evidence of racial bias and a 2018 letter signed by groups like the ACLU and NAACP cautioning against their use. At one point, Parks told the Times, he even considered pleading guilty. The case was ultimately dropped, but he's now suing the Woodbridge Police Department, the city of Woodbridge, and the prosecutors involved in his wrongful arrest.

These are the costs of algorithmic injustice. We're approaching a new reality, one in which machines are weaponized to undermine liberty and automate oppression with a pseudoscientific rubber stamp; in which opaque technology has the power to surveil, detain, and sentence, but no one seems to be held accountable for its miscalculations.


U.S. law enforcement agencies have embraced facial recognition as an investigative aid in spite of a 2018 study from MIT that found software error rates ranging from 0.8% for light-skinned men to 34.7% for dark-skinned women. In majority-Black Detroit, the police chief estimated a 96% error rate in his department's software last year (though the company behind the software told Vice it doesn't keep statistics on the accuracy of its real-world use), yet he still opposes a ban.

Artificial intelligence (AI) works by supplying a computer program with historical data so it can deduce patterns and extrapolate from them to make predictions on its own. But this often creates a feedback loop of discrimination. For example, so-called "predictive policing" tools purport to identify future crime hot spots and optimize law enforcement resource allocation, but because training data can reflect racially disparate levels of police presence, they may simply keep flagging Black neighborhoods irrespective of the true crime rate. This is exactly what Minority Report warned us about.
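The feedback loop described above can be illustrated with a deliberately simplified toy model. This is a hypothetical sketch, not any vendor's actual system: it assumes two neighborhoods with identical true crime rates, where recorded crime depends partly on how heavily an area is patrolled, and where the "predictive" step reallocates patrols based on those records.

```python
import random

def simulate_feedback_loop(true_crime_rate, patrol_share, rounds=50, seed=0):
    """Toy model of a predictive-policing feedback loop (illustrative only).

    Each round, recorded crime in a neighborhood scales with both its true
    crime rate and its share of patrols (more observation, more records).
    The model then reallocates patrols in proportion to recorded crime,
    feeding its own output back in as next round's input.
    """
    rng = random.Random(seed)
    recorded = [0.0, 0.0]
    share = list(patrol_share)  # fraction of patrols sent to each neighborhood
    for _ in range(rounds):
        for i in range(2):
            # Recorded crime reflects observation (patrol share), not crime alone.
            recorded[i] += true_crime_rate[i] * share[i] * rng.uniform(0.9, 1.1)
        total = recorded[0] + recorded[1]
        share = [recorded[0] / total, recorded[1] / total]
    return share

# Identical true crime rates, but neighborhood A starts with more patrols.
final_share = simulate_feedback_loop(true_crime_rate=[1.0, 1.0],
                                     patrol_share=[0.6, 0.4])
```

Even though both neighborhoods have the same underlying crime rate, the initial patrol disparity never washes out: the neighborhood watched more heavily keeps generating more records, which keeps justifying more patrols.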

Princeton University sociologist Ruha Benjamin has sounded the alarm about a "new Jim Code," a reference to the Jim Crow laws that once enforced segregation in the U.S. Others have alluded to a "tech-to-prison pipeline," making it crystal clear that mass incarceration isn't going away; it's just being warped by a sophisticated, high-tech touch.

That's not to say that AI can't be a force for good. It has revolutionized disease diagnosis, helped forecast natural disasters, and uncovered fake news. But the misconception that algorithms are some sort of infallible silver bullet for all our problems ("technochauvinism," as data journalist Meredith Broussard put it in her 2018 book) has brought us to a place where AI is making high-stakes decisions that are better left to humans. And in the words of Silicon Valley congressman Ro Khanna (D-CA), the technological illiteracy of most members of Congress is embarrassing, precluding effective governance.
