91.2k views
0 votes
What do you believe is the role of the government in creating a healthier America?

asked by Hjblok (8.9k points)

1 Answer

2 votes
The government's efforts to make a healthier America are actually making things worse, both in schools and in certain types of restaurants. The same applies to the living conditions of animals raised for slaughter in confinement houses or on farms: the animals are not kept in healthy environments, unlike a wild animal or a free-range beef cow, which has more flavor and is healthier to eat.
answered by Jensph (9.0k points)