True or False?

The United States and Japan were still enemies after World War II.

1 Answer

Answer: False. After World War II, the two nations forged a strong alliance.

Step-by-step explanation:

Following Japan's surrender in 1945, the United States occupied Japan and helped rebuild its government and economy. The 1951 Treaty of San Francisco and the U.S.-Japan Security Treaty then formalized a close alliance between the two countries, which continues to this day.
