Qammunity.org
After the American war, the United States emerged as a __ __ after the victory over Spain
asked Jun 24, 2021 · 186k views · 1 vote
History · college
asked by Nathan Merrill (8.3k points)
1 Answer · 2 votes
Answer:
The United States emerged as a world power as a result of victory over Spain in the Spanish American War. The United States gained possession of the Philippines, Guam, and Puerto Rico.
answered Jun 29, 2021 by Ambrosia (7.2k points)