gmwilliams asks

As an American, do you believe that healthcare should be left to the individual, or should healthcare be institutionalized, mandating that everyone pay taxes to subsidize it? Is American healthcare better left to the individual, or should the government be increasingly involved in healthcare?



bradmasterOCcal says


3 weeks ago
