3 votes
What happened to American politics after Barack Obama became president?

Republicans in Congress respected the wishes of the elected president.
Congress tipped entirely in favor of the Democratic Party.
The two political parties became even more divided.
Obama and Congress worked out a number of important compromises.

2 Answers

7 votes

Final answer:

After Barack Obama became president, American politics became more divided, with Republicans in Congress often resisting his wishes. However, Obama was able to work out some important compromises with Congress.

Step-by-step explanation:

After Barack Obama became president, American politics became even more divided. Republicans in Congress often resisted Obama's wishes, leading to political gridlock and difficulty passing important legislation. Despite these challenges, Obama was able to work out some important compromises with Congress, such as the December 2010 deal that extended the Bush-era tax cuts in exchange for extending unemployment benefits.

by User Deerox (8.8k points)
1 vote

Final answer:

The two political parties became even more divided after Barack Obama became president, resulting in political gridlock that hindered legislative progress despite significant accomplishments such as the 2009 economic stimulus and Obamacare.

Step-by-step explanation:

After Barack Obama became president, American politics entered an era of heightened partisan division and political gridlock. After the 2010 midterm elections, Republicans took control of the House of Representatives, and the Democrats' reduced majority in the Senate made it difficult to secure major legislative victories.

Both parties became more ideologically polarized, with moderate members marginalized and both sides increasingly unwilling to compromise.

Despite this, significant legislation such as the 2009 economic stimulus and the Affordable Care Act (Obamacare) was enacted. Governing with a divided Congress remained profoundly difficult, requiring constant negotiation and often ending in stalemate on major issues.

by User Rmmoul (8.2k points)