Colonialism
Colonialism is defined as the state-organized occupation of a territory and the domination of its population by force. Between the 15th and 20th centuries, European and, subsequently, US actors colonized large parts of Africa, Asia, the Americas, and Australia in order to exploit their people and resources. Local communities were oppressed, enslaved, and killed, and this was justified through racist ideologies of supposed white superiority. The German Empire also held colonies. Following World War II, decolonization processes got underway; however, they were often delayed by protracted negotiations, political resistance, or violent conflicts. Well into the 1970s, many European states continued to undermine the independence of formerly colonized countries, and structural inequalities rooted in colonialism persist to this day.