I'm not entirely sure how it became an employer's responsibility to provide health insurance. It makes some sense - healthy employees are more productive - but a requirement of operating a business? Isn't that up to the individual, and possibly the government?
No other industrialized nation has this arrangement... and I think that if we are going to fix healthcare in this country, we need to examine the bizarre way it is set up - the disconnected yet mandatory relationships between the patient, the doctor, the insurance company, the employer, and the government... Have I left anyone out? Oh yes, the lawyers, who can sue the doctor, the insurer, or the employer.
I hope I'm not coming across as bitter, but it's a complicated problem, and that won't change unless we restructure the whole system.