12.0k views
0 votes
Which country has influenced foreign policy in the United States the most since the end of World War II?

Japan
Russia
France
Britain

asked by Jay Askren (8.1k points)

1 Answer

4 votes
Russia. As the Soviet Union, it shaped US foreign policy throughout the Cold War, driving the containment doctrine, the founding of NATO, and the nuclear arms race.
answered by Wojo (7.3k points)

