
How did World War I change the lives of American women?

Answers (1)
  1. World War One changed the lives of Americans socially, politically, and economically. The war had a massive impact on almost every aspect of society, particularly on women, workers, and minorities. The American public felt a great sense of nationalism and patriotism during the war, as they were unified against a foreign threat.