24 votes
True or False?

The United States and Japan were still enemies AFTER World War II.

asked by Saad Ali (5.2k points)

1 Answer

9 votes

Answer: False. After World War II, the two nations forged a strong alliance.

Step-by-step explanation:

Following Japan's surrender in 1945, the United States occupied and helped rebuild Japan. The 1951 Treaty of San Francisco formally ended the war, and the U.S.-Japan Security Treaty signed the same year established a lasting alliance between the two countries.

answered by EgyEast (5.4k points)