26 July, 03:10

Hawaii became a US territory after:

A. American planters overthrew the Hawaiian government.

B. the United States seized the islands from Spain.

C. the United States bought the islands from the royal family.

Answers (1)
  1. 26 July, 05:37
    The correct answer is A. American planters overthrew the Hawaiian government.

    Explanation:

    The territory of Hawaii was officially annexed by the U.S. in 1898. This was possible only because Hawaii's previous government, a monarchy, had been overthrown. That overthrow occurred in 1893, when planters on the islands, mainly Americans along with some natives, organized to end the monarchy and force Queen Liliuokalani from the throne because they opposed her policies. After the monarchy ended, a new government was established in Hawaii, and some years later the U.S. approved a treaty to annex the territory, which was considered a strategically valuable location. Thus, Hawaii became a US territory after American planters overthrew the Hawaiian government.