Cops, judges use computer algorithm to profile "future crime"

Legion Troll

A fine upstanding poster
When historians look back at the turmoil over prejudice and policing in the U.S. over the past few years, they’re unlikely to dwell on the case of Eric Loomis. Police arrested Loomis for driving a car that was used in a drive-by shooting. Loomis was sentenced to six years in prison plus five years of probation.

Loomis’s story marks an important point in a quieter debate over the role of fairness and technology in policing. Before his sentence, the judge in the case received an automatically generated risk score that determined Loomis was "likely to commit violent crimes in the future".

Risk scores, generated by algorithms, are an increasingly common factor in sentencing. Computers crunch data—arrests, type of crime committed, and demographic information—and a risk rating is generated.
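Vendors don't publish the internals of these tools, but the basic idea is a weighted model that maps a defendant's features to a score, which is then bucketed into the low/medium/high bands a judge sees. A minimal sketch, with entirely hypothetical feature names and weights:

```python
import math

# Hypothetical weights -- illustrative only, not any real tool's model.
WEIGHTS = {
    "prior_arrests": 0.35,
    "age_at_first_arrest": -0.04,
    "current_charge_violent": 1.2,
}
BIAS = -2.0

def risk_score(features):
    """Logistic model: combine weighted features into a score in (0, 1)."""
    z = BIAS + sum(WEIGHTS[name] * value for name, value in features.items())
    return 1.0 / (1.0 + math.exp(-z))

def risk_band(score, low=0.33, high=0.66):
    """Bucket the continuous score into the bands presented at sentencing."""
    if score < low:
        return "low"
    if score < high:
        return "medium"
    return "high"

# Example: four prior arrests, first arrested at 19, current charge violent.
score = risk_score({"prior_arrests": 4,
                    "age_at_first_arrest": 19,
                    "current_charge_violent": 1})
band = risk_band(score)
```

The "punished for the crimes of others" criticism falls out of this structure directly: the weights are fitted to outcomes of past defendants, so an individual's band reflects how people with similar records and demographics behaved, not anything specific to them.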

Similar tools are used to decide which blocks police officers should patrol, where to put inmates in prison, and who to let out on parole.

Loomis is a surprising fulcrum in this controversy: He’s a white man. But when Loomis challenged the state’s use of a risk score in his sentence, he cited the fundamental criticism of the tools: that they punish people for the crimes of others.

Last week the Wisconsin Supreme Court ruled against Loomis.

As Richard Berk, the criminologist profiled in the Bloomberg piece linked below, puts it: “The policy position that is taken is that it’s much more dangerous to release Darth Vader than it is to incarcerate Luke Skywalker.”


http://www.bloomberg.com/features/2016-richard-berk-future-crime/
 