Answer:

The institution of slavery embedded racism in American society by dividing people along racial lines: Black people were enslaved while white people were slave-owners. This direct association of Black people as subordinate to white people, so close in the nation's history, has given many white people and white organizations grounds to claim that such a hierarchy is simply the American way. Not only did much of white America come to associate Black people with slavery, but even after slavery was outlawed and segregation was dismantled, that mindset has persisted.