5 votes
Is healthcare an American right?

Asked by Nechoj (4.3k points)

2 Answers

7 votes

Answer:

No, it is not a right.

Step-by-step explanation:

Under the American system, you have a right to health care only if you can pay for it, i.e., if you can earn it through your own action and effort. But no one has a right to the services of any professional individual or group simply because they want or desperately need them.

Answered by ZecKa (4.2k points)
3 votes

Answer:

No, healthcare is not a right; only health insurance is.


Answered by Hurb (4.2k points)