Which country has influenced foreign policy in the United States the most since the end of World War II?

Japan
Russia
France
Britain

asked by Jay Askren

1 Answer

Japan
answered by Wojo