The ideological goals of the fascist powers in Europe and Germany's growing aggression in the years leading up to and during World War II led many Americans to fear for the security of their nation and to call for an end to the US policy of isolationism. After World War II, the US became fully interventionist.