6 March, 05:13

Did the war change the role of women in American society?

Answers (1)
  1. 6 March, 06:59
    Yes. During WWI (1914-1918), large numbers of women were recruited into jobs vacated by men who had gone to fight in the war, and new jobs were also created as part of the war effort, for example in munitions factories.