28 September, 07:14

What was the role of America after WWI?

Answers (1)
  1. 28 September, 10:51
    Under President Woodrow Wilson, the United States remained neutral until 1917 and then entered the war on the side of the Allied powers (the United Kingdom, France, and Russia). After the armistice, the US Senate rejected the Treaty of Versailles and membership in the League of Nations, and the country turned toward isolationism. The experience of World War I had a major impact on US domestic politics, culture, and society.