I'm curious: does this "unconscious bias" affect only white people? Usually this sort of thing is trotted out to condemn white men who would dare to deny that they're racist. So to convince them, you tell them they really are racist, they're just not aware of it...

As for your question, I would argue that such "biases" are fundamental to the human condition. Black, white, brown, etc. will almost always assume things about other races and cultures. The best we can do is try to teach our kids to respect one another and judge all men by the content of their character. Making people attend "diversity" meetings and such will only further the divide, IMHO.