2 August, 20:26

How did the United States gain the territory of Florida?

Answers (1)
  1. 3 August, 00:08
    In 1763, the Treaty of Paris was signed by England, France, and Spain, and it resulted in England gaining the Florida Territory. When England formally recognized the colonies' independence (as the United States) in 1783, the Florida Territory was returned to Spain without a clear definition of its boundaries. The United States ultimately acquired Florida from Spain through the Adams-Onís Treaty, signed in 1819 and ratified in 1821.