Western United States

The Western United States, commonly referred to as the American West or simply "the West," traditionally refers to the region comprising the westernmost states of the United States. Because the U.S. expanded westward after its founding, the meaning of the West has evolved over time. Prior to about 1800, the crest of the Appalachian Mountains was seen as the western frontier. Since then, the frontier has moved farther west, and the Mississippi River is now generally cited as the easternmost possible boundary of the West.

The West mostly comprises arid to semi-arid plateaus and plains, together with forested mountains.

In the 21st century, the states stretching from the Rocky Mountains and the Great Basin to the West Coast are generally considered to comprise the American West.
