Final answer:
The United States did not practice imperialism in Hawaii.
Step-by-step explanation:
The United States did not practice imperialism in option D, Hawaii, in the same way it did elsewhere. While the United States pursued imperialist ventures in Africa, Latin America, and Asia, Hawaii was not part of those efforts. Instead, the United States annexed Hawaii in 1898 and incorporated it directly as a U.S. territory rather than governing it as an overseas colonial possession.