During the early 20th century, imperialism began to take hold in America. Imperialism was a country's act of expanding its power and influence over foreign lands. As in many other eras of American history, people divided over the nation's new direction. Anti-imperialists opposed spreading American policy, culture, and power into foreign lands because they strongly believed in isolationism, mainly to avoid the conflicts that could arise from entangling the country in unfamiliar regions. Some also believed it was unjust to dictate rules to and exert control over other parts of the world. Instead, they encouraged the country to focus on domestic improvement and to limit immigration.