Which country has influenced foreign policy in the United States the most since the end of World War II?

Japan
Russia
France
Britain

asked by Jay Askren

1 Answer

Japan
answered by Wojo