Colonization affects how colonized peoples see things to this day. Colonizers forced them into signing over their rights and land, and if they refused, the colonizers declared war, which the colonizers almost always won. Once they took over, they usually enslaved the people who originally lived there and forced them to work. The colonizers would also install a new government so they could hold all the power.