What is the meaning of Colonialism?
1. The policy of a country seeking to extend or retain its authority over other people or territories, generally with the aim of economic dominance.
2. Any form of foreign influence seen as undesirable.
3. A colonial word, phrase, concept, or habit.
4. Colonial life.
Source: wiktionary.org