from The American Heritage® Dictionary of the English Language, 4th Edition
- n. A policy of imperialistic expansion defended as necessary or benevolent.
- n. The 19th-century doctrine that the United States had the right and duty to expand throughout the North American continent.
from Wiktionary, Creative Commons Attribution/Share-Alike License
- n. The political doctrine or belief held by the United States of America, particularly during its expansion, that the nation was destined to expand toward the west.
- n. The political doctrine or belief held by many citizens of the United States of America that their system is best and that all humans would like to become Americans.
- n. The belief that God supports the expansion of the United States of America throughout the entire North American continent.
The phrase was used primarily by Jacksonian Democrats in the 1840s to promote the annexation of much of what is now the Western United States. (Wiktionary)