1 February, 17:11

How have changes in women's rights affected American society? Consider the family structure, economic health, and the strength of the workforce. Use specific moments in history as examples. In your opinion, has greater gender equity improved American society? Consider how the changing role of women has changed the identity of American society in Americans' eyes and the world's.

(answer has to be at least a paragraph)

Answers (1)
  1. 1 February, 17:34
    In the early 1900s, women in the United States had almost no rights at all: they could not vote, and many could not even hold day jobs alongside their husbands. That changed on August 18, 1920, when the 19th Amendment was ratified, guaranteeing women the right to vote. The entire country changed that day, and women everywhere were overjoyed. Many men, however, opposed the amendment; they saw themselves as the dominant gender and believed women should not have those rights. Today women make up about fifty percent of voters in the U.S., roughly 162 million women.