25 November, 01:09

How did World War 1 change women's roles in the United States?

Answers (2)
  1. 25 November, 01:43
    World War 1 gave women more jobs and a chance not to be just "stay-at-home moms"; they got to work in factories, taking over the "men's" jobs.
  2. 25 November, 04:53
    World War 1 changed women's roles by giving them the ability to take jobs other than working at home. Women not only gained some of the rights that men had, but they were also able to work if they wanted to. Typically, women were expected to stay at home and handle household tasks such as cooking, cleaning, and taking care of the kid(s), but the war opened opportunities for women to get jobs and do things outside the typical household.