25 July, 08:22

What did the war do to the relationship between the American colonies and England?

Answers (1)
  1. 25 July, 10:34
The American colonies eventually became independent, creating the United States of America and establishing their own laws.