Should Businesses be Required to Pay for Health Care Insurance – No

Should businesses be required to pay for health care insurance?

Absolutely not. Is health care insurance unbelievably expensive? Absolutely. Are ridiculous lawsuits driving up malpractice insurance, which in turn drives up premiums? Absolutely. But why should a business owner be the one responsible for this part of an employee’s life and have to carry that economic burden? And why stop there? Why not have them pay for life insurance, or better yet, the college education of their employees’ children?

Many business owners recognize the value of health benefits and strive to be able to afford offering this perk to their employees. As a new business owner, I look forward to the day when my company is big enough to warrant a few employees. However, if the government were to mandate that I offer health insurance to my employees, I would have to put off hiring, and with it the growth of my company. The domino effect of that decision is widespread: I would hire no employees, or only one; my business would stagnate for lack of help; the people I would have hired would have to find other jobs; their economic position would stagnate; and the list goes on.

Now, flip that around and allow me to choose whether or not I offer health insurance. The market would still require me to offer some kind of benefits; whether in the form of higher pay or health insurance, the package would have to match what my potential employee could receive elsewhere. My options would be simple: pony up for the extra benefits, or hire an employee who is not as well educated, experienced, or generally suited to the job. The latter is not the option I would want to take.