Answer:
Hawaii became an American territory in 1898, at the time of the Spanish-American War.
Step-by-step explanation:
Driven by the spirit of Manifest Destiny, the United States built an informal empire in the Pacific after the war. The Philippines became its largest colony, and Hawaii, lying much closer to home, was annexed as well. Hawaii held strategic geographic value and would eventually become an economically prosperous state.