Some people are informed about this issue, yet others still believe that health care isn't a basic human right. Why is it that countries like Canada, France, and Denmark provide their citizens with free health care while carrying less debt than the United States? The Declaration of Independence states that all men are created equal, but that ideal isn't evident today. Why is it that the wealthy have access to good health care while the working class does not?