I read a news report today (via Drudge) claiming that software capable of identifying and/or predicting those most likely to commit crimes was to be rolled out soon.
Developed by Richard Berk, a professor at the University of Pennsylvania, the software is already used in Baltimore and Philadelphia to predict which individuals on probation or parole are most likely to murder and to be murdered.
. . .
If the software proves successful, it could influence sentencing recommendations and bail amounts.
"When a person goes on probation or parole they are supervised by an officer. The question that officer has to answer is 'what level of supervision do you provide?'" said Berk.
It used to be that parole officers used the person's criminal record, and their good judgment, to determine that level.
"This research replaces those seat-of-the-pants calculations," he said.
. . .
Beginning several years ago, the researchers assembled a dataset of more than 60,000 various crimes, including homicides. Using an algorithm they developed, they found a subset of people much more likely to commit homicide when released on parole or probation. Instead of finding one murderer in 100, the UPenn researchers could identify eight future murderers out of 100.
Berk's software examines roughly two dozen variables, from criminal record to geographic location. The type of crime, and more importantly, the age at which that crime was committed, were two of the most predictive variables.
. . .
Scientifically, Berk's results are "very impressive," said Shawn Bushway, a professor of criminal justice at the State University of New York at Albany who is familiar with Berk's research.
Predicting rare events like murder, even among high-risk individuals, is extremely difficult, said Bushway, and Berk is doing a better job of it than anyone else.
But Berk's scientific answer leaves policymakers with difficult questions, said Bushway. If one group of people is labeled high risk and monitored with increased vigilance, there should be fewer murders, which potential victims should welcome.
It also means that those high-risk individuals will be monitored more aggressively. For inmate rights advocates, that is tantamount to harassment, "punishing people who, most likely, will not commit a crime in the future," said Bushway.
There's more at the link. Very interesting reading.
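The article doesn't describe Berk's actual algorithm (his published work reportedly uses machine-learning ensembles), but the general shape of such a tool is easy to sketch: combine a handful of case variables into a risk score, rank everyone under supervision, and flag the top scorers for closer monitoring. Here's a minimal illustration in Python; the feature names and weights are entirely invented for the example, not Berk's, though the article does say age at first offense and type of crime were among the most predictive variables.

```python
# Toy sketch of a risk-ranking tool. Feature names and weights are
# hypothetical; the real system reportedly uses roughly two dozen
# variables and a trained statistical model, not hand-set weights.

def risk_score(case):
    """Combine a few case variables into a single risk number."""
    score = 0.0
    # Younger age at first offense -> higher risk (the article notes
    # age at which the crime was committed is highly predictive).
    score += max(0, 30 - case["age_at_first_offense"]) * 0.1
    # Violent priors weigh more heavily than non-violent ones.
    score += case["violent_priors"] * 1.0
    score += case["nonviolent_priors"] * 0.2
    return score

def flag_high_risk(cases, top_fraction=0.1):
    """Rank all supervised individuals and flag the top scorers
    for intensified supervision."""
    ranked = sorted(cases, key=risk_score, reverse=True)
    cutoff = max(1, int(len(ranked) * top_fraction))
    return ranked[:cutoff]

cases = [
    {"id": 1, "age_at_first_offense": 16, "violent_priors": 3, "nonviolent_priors": 2},
    {"id": 2, "age_at_first_offense": 28, "violent_priors": 0, "nonviolent_priors": 1},
    {"id": 3, "age_at_first_offense": 19, "violent_priors": 1, "nonviolent_priors": 4},
    {"id": 4, "age_at_first_offense": 35, "violent_priors": 0, "nonviolent_priors": 0},
]
flagged = flag_high_risk(cases, top_fraction=0.25)
print([c["id"] for c in flagged])  # the single highest-risk case
```

The "eight in 100 instead of one in 100" figure from the article is exactly this kind of concentration effect: the flagged subset has a much higher base rate of future offenders than the population as a whole, which is also why everyone else in that subset ends up under heavier supervision despite never going on to offend.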
Part of me is very concerned about the implications of this software for civil and human rights. I hate the thought that someone who's reformed, and who may be a very low risk for re-offending, might be highlighted by this software as a major risk, and therefore be subjected to much tighter and more intrusive monitoring than he would otherwise receive. Indeed, he might get so annoyed at the intrusion that he commits another crime almost as a way of 'getting his own back' on a system he regards as oppressing him. I think this is a very real danger.
On the other hand, I've been a prison chaplain. The staff and I routinely saw prisoners released who we knew - knew, beyond a shadow of a doubt - were still extremely dangerous to society. Their conduct behind bars, their attitude . . . everything about them screamed a warning to us: but we couldn't keep them incarcerated. They'd done their time, and no matter how sure we were that they still represented a danger to others, we weren't allowed to keep them behind bars.
If this computer system can somehow be used to independently identify such people, and the knowledge of corrections staff can be linked to its predictions, the combination might be a very effective tool to ensure that likely re-offenders are monitored so closely that their recidivism becomes almost impossible. Will this represent an invasion of their civil liberties? Yes, I'm afraid it will. Can it be justified? On strictly Constitutional grounds, no, I daresay it can't. Is it therefore wrong to employ such means? Only the courts can answer that . . . but I'd like to point out that in many states, a convicted felon automatically loses certain rights (e.g. the right to vote). Given such restrictions, should a felony conviction also carry with it the loss of the right to privacy, at least as far as likely re-offenders are concerned? I can't answer that . . . but I can recall seeing hardened offenders let go, and later reading about their arrest for committing the most violent, vicious and horrific crimes after their release.
What say you, readers? Is there any circumstance in which the use of such predictive software can be allowed to trump, or at least limit, one's human and Constitutional rights? If so, what circumstance(s)? If not, what alternative can you suggest to minimize the risk of re-offense? Let's hear your views in Comments.