The concept of the American Empire was first popularized as a result of the Spanish-American War of 1898. The last decades of the nineteenth century were a period of rapid economic growth for the United States. The largest American corporations, not content with their dominant position in the country's domestic economy, sought to penetrate foreign markets and bring neighboring territories under their influence. The United States was particularly interested in the Caribbean and Latin America, seeking to weaken the position of the European powers in these regions.