The algorithms are the real racists: Technology is "biased" too.

Funny how no one here has actually addressed the points made in the article

Also interesting that everyone ignored the part where one of the people interviewed conceded that the algorithm might produce politically incorrect results even from unbiased data
First of all, that article is fundamentally wrong because it is an article about technology... written by a woman. Second of all, the company already gave a response, explaining the data.

Bet you feel stupid now, huh?
 
Code:
switch(race) {
    case "black":
    case "latino":
    case "muslim":
        discriminate();
        break;
    default:
        privilege++;
        enjoy(privilege);
}

Can anyone code review my racist code?

That should run just fine lol
 
Definitive proof that Google (and all liberals) are the real racists:



Seriously though, maybe they didn't put enough black people in the training set. The bias is in how you select your training data, not in the outcomes that the algorithm predicts.
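A quick sketch of that training-set point, with every number made up for illustration: if one group contributes far fewer training examples, the model's estimate of that group's features is much noisier, and noisier estimates mean more mislabeling.

Code:
#include <cmath>
#include <iostream>
#include <random>

// Toy illustration (assumed numbers): estimate a group's feature mean
// from n samples drawn from the same standard normal distribution.
// The underrepresented group's estimate comes out roughly 10x noisier.
int main() {
    std::mt19937 rng(7);
    std::normal_distribution<double> feature(0.0, 1.0);

    auto estimate_error = [&](int n_samples) {
        // Average absolute error of the estimated mean over 1000 re-draws
        // (the true mean is 0 by construction).
        double total = 0;
        for (int trial = 0; trial < 1000; ++trial) {
            double sum = 0;
            for (int i = 0; i < n_samples; ++i) sum += feature(rng);
            total += std::abs(sum / n_samples);
        }
        return total / 1000;
    };

    std::cout << "well-represented group (5000 samples): "
              << estimate_error(5000) << "\n";  // ~0.011
    std::cout << "underrepresented group (50 samples):   "
              << estimate_error(50) << "\n";    // ~0.11
}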


The name of the person tweeting is out of control, though.
 
Code:
switch(race) {
    case "black":
    case "latino":
    case "muslim":
        discriminate();
        break;
    default:
        privilege++;
        enjoy(privilege);
}

Can anyone code review my racist code?


Code:
class race {
public:
    virtual double jail_sentence(const Crime& crime) = 0;
};

class white : public race {
public:
    double jail_sentence(const Crime& crime) override {
        return crime.MIN_SENTENCE;
    }
};

class black : public race {
public:
    double jail_sentence(const Crime& crime) override {
        return crime.MAX_SENTENCE;
    }
};

Can someone help with my code? All the black people keep getting harsher sentences.
 
Upturn is an activist organization focused on technology. They are fully in the camp that the data is biased. I don't think "conceded" is the word you wanted to use.
I said "conceded" because he admits that even the most accurate prediction could produce a socially undesirable result from unbiased data; he just doesn't seem to think that's what's happening here, and he believes that more often than not there's some bias, conscious or unconscious, in the data being fed to these machines.
First of all, that article is fundamentally wrong because it is an article about technology... written by a woman. Second of all, the company already gave a response, explaining the data.

Bet you feel stupid now, huh?
Sure do, if I knew it was written by a woman I wouldn't have read it at all.
 
Funny how no one here has yet to actually address the points made in the article
The problem of possibly biased input data isn't really related to the technology. It's the same discussion with or without those programs.
It's not like systemic racism in the US is an undeniable, well-established fact. It's discussed in politics and we'd have the same debate here.

As far as mislabeling of individuals in a high-risk group compared to individuals in a lower-risk group goes, that seems to be common sense to me.
If you had a portable device that scans your surroundings and provides real-time safety tips based on all available data, how would it handle a situation in which you approach a black guy on your side of the street while a white dude walks the sidewalk on the opposite side? It would tell you to switch to the other side of the street, which is not politically correct but probably statistically correct, since it mathematically improves your chance of not ending up as a victim of the violent crimes in which blacks are overrepresented. A single, random black individual is statistically more likely to commit certain crimes than a single, random white individual.
A good black guy who won't do nuffin walking down the street is more likely to be unnecessarily avoided on the advice of your portable risk-assessment computer than a good (or even a bad) white guy.
This effect should shrink as more criteria like previous offenses, age, education, employability, family status, and addiction are taken into consideration, but that doesn't mean it will be mathematically eliminated.
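A minimal sketch of that mislabeling effect, with every number assumed for illustration: both groups get the same noisy individual risk signal, but the model adds a prior boost for the group with the higher base rate, so safe members of that group cross the "avoid" threshold more often.

Code:
#include <iostream>
#include <random>

// Toy model (assumed parameters): identical "safe" individuals, but one
// group's score carries a base-rate prior boost, so more of its safe
// members get flagged -- a higher false-positive rate for that group.
int main() {
    std::mt19937 rng(42);
    std::normal_distribution<double> signal_safe(0.0, 1.0);

    const double threshold   = 1.0;  // assumed "avoid" cutoff
    const double prior_boost = 0.5;  // assumed boost for the high-base-rate group
    const int    n           = 100000;

    int flagged_high = 0, flagged_low = 0;
    for (int i = 0; i < n; ++i) {
        if (signal_safe(rng) + prior_boost > threshold) ++flagged_high;
        if (signal_safe(rng) > threshold)               ++flagged_low;
    }

    std::cout << "safe, high-base-rate group flagged: "
              << 100.0 * flagged_high / n << "%\n";  // ~31%
    std::cout << "safe, low-base-rate group flagged:  "
              << 100.0 * flagged_low / n << "%\n";   // ~16%
}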
 
The problem of possibly biased input data isn't really related to the technology. It's the same discussion with or without those programs.
It is related to the technology, because this technology is used to reduce human error in the justice system; if prior bias and error aren't accounted for, it can multiply that error, or at the very least give it an objective veneer.
It's not like systemic racism in the US is an undeniable, well-established fact. It's discussed in politics and we'd have the same debate here.
True, and it's worth discussing here instead of pretending the claim is that algorithms are racist. The actual claim is that these algorithms are being fed data that, for one reason or another, is itself already biased by human imperfection at each level of its creation, and that the algorithms end up reproducing those biases.
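A minimal sketch of that reproduction effect (every number assumed): two districts with identical true crime rates, but one starts with more patrols, so more of its crime gets recorded. Feed the recorded counts back into a slightly superlinear allocation rule and the initial human skew grows round after round, while looking like an objective, data-driven decision.

Code:
#include <cmath>
#include <iostream>

// Toy feedback loop (assumed numbers): recorded crime depends on how
// hard you look, and next round's patrols follow recorded crime, so the
// initial 60/40 skew amplifies even though the true rates are equal.
int main() {
    const double true_rate = 0.05;         // identical in both districts
    double patrols_a = 60, patrols_b = 40; // initial human bias

    for (int round = 1; round <= 5; ++round) {
        double recorded_a = true_rate * patrols_a;
        double recorded_b = true_rate * patrols_b;

        // Allocation rewards "hot" districts superlinearly (exponent assumed).
        double wa = std::pow(recorded_a, 1.5), wb = std::pow(recorded_b, 1.5);
        patrols_a = 100 * wa / (wa + wb);
        patrols_b = 100 * wb / (wa + wb);

        std::cout << "round " << round << ": district A " << patrols_a
                  << " patrols, district B " << patrols_b << "\n";
    }
}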
As far as mislabeling of individuals in a high-risk group compared to individuals in a lower-risk group goes, that seems to be common sense to me.
If you had a portable device that scans your surroundings and provides real-time safety tips based on all available data, how would it handle a situation in which you approach a black guy on your side of the street while a white dude walks the sidewalk on the opposite side? It would tell you to switch to the other side of the street, which is not politically correct but probably statistically correct, since it mathematically improves your chance of not ending up as a victim of the violent crimes in which blacks are overrepresented. A single, random black individual is statistically more likely to commit certain crimes than a single, random white individual.
A good black guy who won't do nuffin walking down the street is more likely to be unnecessarily avoided on the advice of your portable risk-assessment computer than a good (or even a bad) white guy.
This effect should shrink as more criteria like previous offenses, age, education, employability, family status, and addiction are taken into consideration, but that doesn't mean it will be mathematically eliminated.
Sure, that's fair, and as one of the people quoted in the article says, even the most accurate prediction could produce a socially undesirable result from unbiased data. However, we shouldn't assume that because these predictions are spit out by an algorithm they're inherently unbiased. There are people involved at multiple levels here, and their human imperfections mean the data won't always be perfect.
 
Private companies are allowed to develop these programs without releasing them to the public. Do the fucking math.
 
Private companies are allowed to develop these programs without releasing them to the public. Do the fucking math.

Fucking algebra is racist, dude!

And that, of course, means we should get rid of algebra, the chancellor of California's community college system told NPR this week.

Algebra is one of the biggest hurdles to getting a high school or college degree, particularly for students of color and first-generation undergrads.
It is also the single most failed course in community colleges across the country. So if you're not a STEM major (science, technology, engineering, math), why even study algebra?
That's the argument Eloy Ortiz Oakley, chancellor of the California community college system, made today in an interview with NPR's Robert Siegel.
http://www.washingtonexaminer.com/algebra-is-not-racist-dont-get-rid-of-it/article/2629339
 
True, and it's worth discussing here
OK. It's a myth, and the claim is a result of willful ignorance. The more criteria you take into consideration when evaluating the role of race in the criminal justice system, the more obvious it becomes that there are no significant unexplainable differences.

A new study, released in September 2016, looks at the state of Delaware. It was published by the Department of Criminology at the University of Pennsylvania and submitted to the Chief Justice of the Delaware Supreme Court and the Delaware Access to Justice Commission's Subcommittee on Fairness in the Adult Criminal Justice System.

Initial Situation:
"African Americans are overrepresented in the criminal justice system of Delaware. African Americans represent 22% of the population of residents in Delaware. Data [...] show that African Americans represent roughly 42% of arrests, 42% of criminal convictions, and 51% of incarceration sentences.
[...]approximately 57% of inmates serving time in the Delaware Department of Correction (DOC)’s facilities were African American."

What they did:
"This study relies on administrative records to track individuals from their arrest to their corresponding court dispositions, and examines the extent to which observed racial disparities in incarceration sentences and sentence lengths remain after statistically controlling for current case and charge characteristics, contextual factors, and criminal history. This study also details what racial disparities in incarceration commitments and sentences lengths would look like if White defendants had similar characteristics as African American defendants."

Findings:
"The results from this study indicate there are significant disparities in incarceration sentences and sentence lengths for African Americans relative to Whites. These disparities decreased to levels that were practically small for sentences to incarceration in a multivariate regression model that controlled for current case and charge characteristics, criminal history, and contextual factors (gender, age,county location, public defender,and pretrial detention), as well as a multivariate reweighting method that compared African Americans to similarly situated Whites. In both comparisons the unexplained differences in the probability of receiving an incarceration sentence were less than 1% (0.5%). The disparities in effective length of incarceration sentences decreased overall to levels (under 40 days) that were no longer statistically significant in the multivariate regression model, as well as the reweighting method that compared African Americans to similarly situated Whites."
http://courts.delaware.gov/supreme/docs/DE_DisparityReport.pdf
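For anyone wondering what that reweighting method does, here's a minimal sketch reduced to a single binary covariate (prior record). All counts are invented for illustration, not taken from the study: white defendants are reweighted so their prior-record mix matches the African American group's, and only then are incarceration rates compared.

Code:
#include <iostream>

// Toy reweighting comparison (invented counts): [0] = no prior record,
// [1] = prior record. The raw gap shrinks once whites are reweighted to
// the black group's covariate mix, i.e. "similarly situated" defendants.
struct Cell { double n, incarcerated; };

int main() {
    Cell black[2] = {{400, 80}, {600, 300}};
    Cell white[2] = {{700, 120}, {300, 140}};

    double black_n = black[0].n + black[1].n;
    double raw_black = (black[0].incarcerated + black[1].incarcerated) / black_n;
    double raw_white = (white[0].incarcerated + white[1].incarcerated)
                     / (white[0].n + white[1].n);

    // Weight each white cell by the black group's share of that covariate.
    double reweighted_white = 0;
    for (int c = 0; c < 2; ++c) {
        double black_share = black[c].n / black_n;
        double white_rate  = white[c].incarcerated / white[c].n;
        reweighted_white  += black_share * white_rate;
    }

    std::cout << "raw rate, black:               " << raw_black << "\n";        // 0.38
    std::cout << "raw rate, white:               " << raw_white << "\n";        // 0.26
    std::cout << "white reweighted to black mix: " << reweighted_white << "\n"; // ~0.35
}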

Sure, that's fair, and as one of the people quoted in the article says, even the most accurate prediction could produce a socially undesirable result from unbiased data. However, we shouldn't assume that because these predictions are spit out by an algorithm they're inherently unbiased. There are people involved at multiple levels here, and their human imperfections mean the data won't always be perfect.
Depends on which political philosophy you subscribe to. If you believe that ethical decision-making and justice can only be evaluated at the individual level, something I'd agree with, then somebody's freedom shouldn't depend on the correctness and accuracy of predictive modeling.
If you believe, like most people on the left side of the contemporary political scale do, that some sort of common good not only exists but also tops individual rights, it's far less problematic.
Sure, some individuals have to bite the bullet, but the prediction will be accurate more often than not; that's how statistics work.

Philosophically, I believe something like this shouldn't be used at all to decide somebody's future.
From a pragmatic/consequentialist standpoint, the alternative is probably the (un)educated guess of some fat official who graduated in the bottom 50% of his class and whose judgment depends on what he had for breakfast.
Much of the critique just seems to be a misunderstanding of what 'algorithm', 'bias', or 'correctness' mean.


Private companies are allowed to develop these programs without releasing them to the public. Do the fucking math.
They obviously should be open source. And the article should focus on that.
 
Fucking algebra is racist, dude!

And that, of course, means we should get rid of algebra, the chancellor of California's community college system told NPR this week.

Algebra is one of the biggest hurdles to getting a high school or college degree, particularly for students of color and first-generation undergrads.
It is also the single most failed course in community colleges across the country. So if you're not a STEM major (science, technology, engineering, math), why even study algebra?
That's the argument Eloy Ortiz Oakley, chancellor of the California community college system, made today in an interview with NPR's Robert Siegel.
http://www.washingtonexaminer.com/algebra-is-not-racist-dont-get-rid-of-it/article/2629339
Non-STEM majors in two-year programs should probably have the option of taking intermediate algebra plus a stats course in place of a full college algebra course, imo, but that's not relevant here.
 
They obviously should be open source. And the article should focus on that.
Agreed. And I would want to know who the cock christ governments are entrusting the inputs to. It's a phenomenally stupid and lazy system, hugely susceptible to error and abuse.
 
Shit like technology and algorithms is white people shit so of course it's racist lmao are you guys seriously trying to argue otherwise

#woke
[image: racistcomputers.jpg]
 
Computer algorithms use statistics to draw racist conclusions with zero subjectivity

Those conclusions don't agree with my leftist egalitarian values, therefore the computers are flawed
 
Computer algorithms use statistics to draw racist conclusions with zero subjectivity

Those conclusions don't agree with my leftist egalitarian values, therefore the computers are flawed
No, duncepot, people are questioning the algorithms, which are created by humans. The computers didn't create the algorithms.
 
Code:
switch(race) {
    case "black":
    case "latino":
    case "muslim":
        discriminate();
        break;
    default:
        privilege++;
        enjoy(privilege);
}

Can anyone code review my racist code?

Version 2 should include sexism!
 
Non-STEM majors in two year programs should probably have the option of taking intermediate algebra plus a stats course in place of a full college algebra course, imo, but that's not relevant here.

I think if you're too stupid to pass algebra (which my terrible-at-math ass passed in 10th grade), then you don't belong in college.
 
Code:
switch(race) {
    case "black":
    case "latino":
    case "muslim":
        discriminate();
        break;
    default:
        privilege++;
        enjoy(privilege);
}

Can anyone code review my racist code?

Looks good, but in this day and age race should be an array of some kind. This'll do though. Ship it.
 