Racism in the United States dates back to the colonial era; the deep-rooted belief that people of colors other than white are inferior rests on nothing more than skin pigmentation and first impressions. European colonization of Africa began in earnest in the 1880s, building on the transatlantic slave trade that started in the 15th century.