1 August, 18:50

How did America get Florida?

Answers (1)
  1. 1 August, 21:34
    In 1763 the Treaty of Paris, signed by Britain, France, and Spain, gave Britain the Florida Territory. When Britain formally recognized the colonies' independence as the United States in 1783, Florida was returned to Spain without a clear definition of its boundaries. The United States finally acquired Florida when Spain ceded it under the Adams-Onis Treaty of 1819, which took effect in 1821.