Answer:
At the time of annexation in 1898, American imperial policy was to expand throughout the Pacific. However one judges imperialism, Hawaii proved to be a strategically vital territory during WW2, and it could easily have fallen under Japanese control if the US had not annexed it decades earlier. Today Hawaii is one of the most economically prosperous and most visited states. So, as harsh as it may sound, the US acted in its own interest by annexing the Hawaiian Islands, and the move benefited both the country and the state in the long run.