15 June, 10:45

What changes began happening for America once World War II was over?

+4
Answers (2)
  1. 15 June, 14:03
    0
The Red Scare and Cold War came after WWII. Anti-communist sentiment grew quickly, leading to witch hunts for communists in the government. Famous examples include the Hollywood Ten and the rise of McCarthyism.
  2. 15 June, 14:14
    0
Economic prosperity was the most significant change for the USA after the war. Other changes included minority groups, such as African Americans, Latinos, and women, beginning to fight for their civil rights.