28 January, 09:48

How did women's roles in countries such as the United States and Britain change after World War I? Check all that apply.

Answers (1)
  1. 28 January, 13:34
    Society became more open, and women experienced greater freedom.

    Women began to seek out new careers.

    Women challenged old traditions by doing things such as changing their clothing style.