Prominent thinkers in the fields of artificial intelligence say that predictive policing tools are not only ‘useless,’ but may be helping to drive mass incarceration.
In a letter published earlier this month, the experts, from MIT, Harvard, Princeton, NYU, UC Berkeley and Columbia, spoke out on the topic in an unprecedented showing of skepticism toward the technology.
‘When it comes to predicting violence, risk assessments offer more magical thinking than helpful forecasting,’ wrote AI experts Chelsea Barabas, Karthik Dinakar and Colin Doyle in a New York Times op-ed.
Predictive policing tools, or risk assessment tools, are algorithms designed to predict the likelihood that someone will commit a crime in the future.
With rapid advances in artificial intelligence, the tools have begun to find their way into the everyday processes of judges, who deploy them to determine sentencing, and police departments, which use them to allocate resources and more.
While the technology has been positioned as a way to combat crime preemptively, experts say its capabilities have been vastly overstated.
Among the arenas most affected by the tools, they say, is pretrial detention, in which people awaiting trial may be held based on their assessed risk of committing a crime.
‘Algorithmic risk assessments are touted as being more objective and accurate than judges in predicting future violence,’ write the researchers.
‘Across the political spectrum, these tools have become the darling of bail reform. But their success rests on the hope that risk assessments can be a valuable course corrector for judges’ faulty human intuition.’
Experts say the tools have a tendency to overestimate accused people's risk of violence when, in fact, the likelihood of crimes being committed while awaiting trial is small.
According to the op-ed, 94 percent of people accused of a crime in Washington D.C. are released, and only 2 percent of those people are arrested for a violent crime afterward.
By contrast, the researchers point out, it's not uncommon for states to detain 30 percent of people awaiting trial.
‘[The tools] give judges recommendations that make future violence seem more predictable and more certain than it actually is,’ write the researchers.
‘In the process, risk assessments may perpetuate the misconceptions and fears that drive mass incarceration.’
One of the most prominent tools used by judges is called the Public Safety Assessment, which, like many other tools, crunches numbers based on criminal history and personal characteristics.
The tool flags a person as a candidate for ‘new violent criminal activity’ or not.
For the technology to truly be accurate, experts say, it would have to predict that almost all people are at essentially zero risk, given the low statistical likelihood of pretrial violence.
‘Instead, the P.S.A. sacrifices accuracy for the sake of making questionable distinctions among people who all have a low, indeterminate or incalculable likelihood of violence,’ say experts.
To better prevent crime, the researchers suggest easing reliance on algorithms and putting resources into more holistic measures.
‘Policy solutions cannot be limited to locking up the ‘right’ people,’ they write.
‘They must address public safety through broader social policies and community investment.’
5 thoughts on “A.I. Predictive Policing Could Lead to Mass Incarceration”
people should start publicly shaming the pigs ..
try to get as many of the youth in this country awake to the lack of need for these goons
They’ll awaken. When it’s too late.
But with privatized prisons – isn’t this the point? At, near, or ABOVE maximum capacity.
Seems like it
“While the technology has been positioned as a way to combat crime preemptively, experts say its capabilities have been vastly overstated.”
Especially since it’s NOT designed to ‘combat crime’, but to incarcerate political dissidents (such as Trenchers).
Like Henry says… take the long shot – shoot the suit.
Not just Trenchers. When the Noahiders really take over (I know I know, stop constantly squawking about Noahide! 😉 ) they’ll use it to bring under the Sanhedrin anyone wearing a cross not upside down… and bear in mind, didn’t the 14th amendment or something (or Act of 1871?) re-institute slavery of a certain kind? After all, gotta have slave labor in all those private prisons they are building, and a lot of folks are gonna need jobs as jailers (unless all the correction officers are robots of course)… And they did make the movie “Minority Report” for a reason, starring Scientologist Tom Cruise… Are those AI “Thetans” gone yet, Tommy boy? Bwahahahah!