American imperialism is a term referring to the economic, military, and cultural influence of the United States on other countries. The concept of an American Empire was first popularized during the presidency of James K. Polk, who led the United States into the Mexican–American War of 1846–1848 and the eventual annexation of territories such as California; the Gadsden Purchase of 1853 later extended these acquisitions.