Answer: This question is a matter of opinion. Personally, I don't think the United States is any more racist than other places in the world, but there are people, mostly in the media, who want it to be. Honestly, answer me this: walking down the street, do you see a black person and think, "There's a black person!"? And if you go to a store and the person who helps you is Mexican, do you leave because of that? If a single person says yes to either of those, they are unlike anyone I have ever met. America isn't racist, but if the newscaster says it is, then everyone will think so; that's the job of a newscaster. So no, I don't think America would be less racist if black people had more political power; I think it would be less racist if the media decided to focus on something else. I hope I am not alone in this opinion, and if this is not the answer you are looking for, you can rate it low or delete it, but I hope we are not so lost that we can't agree on something as simple as this.