Answer:
The Vietnam War had a profound effect on America. Domestically, it drastically eroded Americans' trust in their political leaders. In foreign policy, the U.S. suffered from the so-called Vietnam Syndrome: a reluctance to commit to foreign ground wars out of fear that they would become long, bloody stalemates with no foreseeable end.