29 July, 02:17

What social changes took place in the United States after World War II? What role did the war play in those changes?

Answers (1)
  1. 29 July, 03:31
    Two major social changes took place in the United States after World War II as a result of the war. First, women gained far more opportunities, especially in the workforce, which brought them greater economic independence and an improvement in their rights. Second, people of other races, particularly African Americans, also gained rights and broader opportunities, again most notably in the workplace. The main reason these changes happened was that so many men had been sent to war, creating a large labor shortage. To keep the economy running and growing, employers began hiring the workers who were available in large numbers: women and African Americans.