
Dictionary Meaning Of West



Formerly, that part of the United States west of the
Alleghany Mountains; now, commonly, the whole region west of the
Mississippi River; esp., that part which is north of the Indian
Territory, New Mexico, etc. Usually with the definite article.





Posted By KellyChi On 02:01 Sun, 21 Jan 2018
