ACLU to take on racist artificial intelligence

The COMPAS tool assigns defendants scores from 1 to 10 that indicate how likely they are to reoffend based on more than 100 factors, including age, sex and criminal history. Notably, race is not used.

We reanalyzed data collected by ProPublica on about 5,000 defendants assigned COMPAS scores in Broward County, Fla. (See the end of the post, after our names, for more technical details on our analysis.) For these cases, we find that scores are highly predictive of reoffending. Defendants assigned the highest risk score reoffended at almost four times the rate of those assigned the lowest score (81 percent vs. 22 percent).
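The kind of tally behind those numbers is simple to sketch: group defendants by risk score and compute the fraction that reoffended at each score level. This is a minimal illustration, not ProPublica's or the authors' actual code, and the toy records below are invented:

```python
# Hedged sketch: reoffense rate per COMPAS risk score (1-10).
# Each record is (risk_score, reoffended_within_followup) -- field
# names and toy data are assumptions for illustration only.

def recidivism_rate_by_score(records):
    """Return {score: fraction of defendants with that score who reoffended}."""
    counts = {}  # score -> [num_reoffended, num_total]
    for score, reoffended in records:
        tally = counts.setdefault(score, [0, 0])
        tally[0] += int(reoffended)
        tally[1] += 1
    return {s: reo / total for s, (reo, total) in sorted(counts.items())}

# Toy example: three defendants scored 1, three scored 10.
toy = [(1, False), (1, False), (1, True),
       (10, True), (10, True), (10, False)]
print(recidivism_rate_by_score(toy))
```

With real data, a score is "predictive" in the sense used above when this per-score rate climbs steeply from the lowest to the highest score.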

The overall recidivism rate for black defendants is higher than for white defendants (52 percent vs. 39 percent).

Basically, the algorithm is doing its job. Progressives, as usual, are assuming that disparity automatically means discrimination.
 
They want a programmable progressive drone? They already have so many.
 
One of the most offensively racist predictions I ever encountered: Black males have a two-thirds higher likelihood of being diagnosed with prostate cancer than white males.

On what basis did "science" pull these numbers out of its bigoted ass?

(Pun unintended.)
 