
How did the definition of west change in the years 1800 to 1860?

Answers (1)
    The "West" began as any land between the Appalachians and the Mississippi River. After the Louisiana Purchase in 1803, the West was now any of the new territory beyond the Mississippi and north of Spanish territory in the south (Mexico, Texas, etc). By the time of the Civil War, the United States' territory extended to the Pacific and most of the landmass was what we now recognize as the continental USA.