27 June, 14:24

How did the role of women in the United States change during and after World War II?

Answers (1)
  1. 27 June, 16:26
    World War II provided unprecedented opportunities for American women to enter jobs that had never before been open to women, particularly in the defense industry ... After the war, many women were fired from factory jobs. Nevertheless, within a few years, about a third of women older than 14 worked outside the home.