31 July, 10:42

How did World War I change women's roles in the United States?

Answers (1)
  1. 31 July, 11:20
    Women became highly valued for their ability to work: with so many men serving overseas, jobs for women became easier to find, and their pay increased.