It was terrible.
Before the war, it wasn't all that bad. They weren't enemies, but they weren't exactly allies either. But after Japan attacked Pearl Harbor, relations between the two were completely severed, since the attack was seen as an act of betrayal.
After the atomic bombings of Hiroshima and Nagasaki, Japan admitted defeat and surrendered. I mean... who wouldn't? The two bombs left the nation shaken and unable to bounce back.
After WW2, their relations improved when the Allied nations (which Japan was NOT part of) helped the Japanese rebuild and reform.
Now, the United States and Japan are considered among each other's closest allies.