16 May, 23:08

How did life change for women in the United States after World War I started?

Answers (2)
  1. 16 May, 23:51
    Because so many men were sent off to war, women took over the men's positions in the workplace. As a result, after the war women held more non-traditional roles at work.
  2. 17 May, 02:42
    It changed women's lives a great deal. During the war, most men were drafted, which opened up many jobs for women. Women were now able to volunteer or work for the Marines, Army, and Navy and to take the jobs left behind by men. So their lives changed because the war gave them jobs and other opportunities.