What Does Manifest Destiny Mean?

Answer

Manifest Destiny was a belief widely held in the United States during the 19th century. According to this doctrine, white Christian Americans claimed the right to take whatever land they wished because God had chosen them to do so. The idea was used to cast Native Americans as inferior to white Christians and to justify extending American territory.
1 Additional Answer
Manifest Destiny (noun): the belief or doctrine, held chiefly in the middle and latter part of the 19th century, that it was the destiny of the U.S. to expand its territory over the whole of North America and to extend and enhance its political, social, and economic influences.
Source: Dictionary.com