The native people where, exactly? Usually the Europeans arrive somewhere and claim they own it because it's their "right," then they force their religion onto the natives and tell them to abandon their culture. They take everything and bring diseases and foreign laws with them. They call the natives "primitive" even though those peoples are extremely intelligent (for example, Hawaiians sailing canoes across the open ocean without compasses, or peoples in Africa navigating supposedly "barren" lands without any modern devices). It's just sad. The natives often welcomed the Europeans, and then everything was taken from them.