7 July, 02:50

Which event finally brought the United States into World War II?

a. Japan's attack on Pearl Harbor

b. Germany's invasion of France

c. Britain's attack on Gibraltar

d. Italy's invasion of Greece

Answers (1)
  1. 7 July, 06:22
    It was "a. Japan's attack on Pearl Harbor" that finally brought the United States into World War II, since this was a direct attack on a United States military establishment, which was an indisputable act of war.