16 March, 03:31

How did World War I change women's roles in the United States?
  1. Women received greater educational opportunities.
  2. Women fought alongside men in the military.
  3. Women replaced men in the workforce.
  4. Women earned more money than men.

Answers (1)
  1. 16 March, 07:15
    During World War I, women became more involved in forming organizations that brought relief to war-torn areas, especially countries in Europe, and they dedicated themselves to supporting and expanding the war effort. So, if I have to choose among the options, I'll go with the third choice: women replaced men in the workforce.