16 May, 10:21

Did Britain gain control of the West after the French and Indian War?

Answers (1)
  1. 16 May, 12:24
    Yes. The French and Indian War began in 1754 and ended with the Treaty of Paris in 1763. The war provided Great Britain with enormous territorial gains in North America, including French territory east of the Mississippi River, but disputes over subsequent frontier policy and paying the war's expenses led to colonial discontent, and ultimately to the American Revolution. - Google