What ended the war between the United States and Japan?

The war between the United States and Japan ended after the atomic bombings of Hiroshima and Nagasaki in August 1945, which led to Japan's surrender. In the years after World War II, the two countries improved their relations and became interdependent economic partners.

The atomic bombs that were dropped on Hiroshima and Nagasaki.

Fat Man and Little Boy, the atomic bombs dropped on Nagasaki and Hiroshima.
