In the United States, health care is not recognized as a basic human right, largely because neither the U.S. Constitution nor the Bill of Rights establishes such a right. Nevertheless, the government has made efforts to expand access to health care, particularly through legislative action.