America fought alongside Japan and the other Allied Powers in the First World War, and it emerged from that war as the most powerful country in the world, with major influence over international foreign policy. Its influence grew even stronger after the shock of the Second World War.