59.6k views
25 votes
What is your opinion about Hawaii being annexed by the United States? Explain.

by TheBook (5.3k points)

2 Answers

2 votes

Answer:

At the time of annexation, American imperial policy was to expand throughout the Pacific. As bad as imperialism is, Hawaii proved to be strategic territory during WW2 and could easily have fallen into Japanese hands had the US not annexed it. Today Hawaii is one of the most economically prosperous and most visited states. So, as bad as it sounds, the US was in the right to annex the Hawaiian islands, and the annexation benefited both the country and the state in the long run.

by Dogsgod (4.8k points)
8 votes
I don’t think Hawaii should be annexed by the United States, because Hawaii is a tropical place and a getaway for US citizens. Many of our natural resources come from the United States, and as a result
by Tao Zhyn (5.1k points)