5 May, 01:23

Why did many Americans feel it was important for the United States to gain control of Florida?

Answers (1)
  1. 5 May, 02:17
    Florida was still under foreign control (Spain, and Britain before that), and it served as a refuge for enslaved people escaping from the South, so many Americans saw it as a threat on their border and wanted the United States to take it over.