Do Companies Have to Provide Health Insurance? – No

In order to answer the question “Should businesses be required to pay for health care insurance?” we need to backtrack a little and look at the history of commerce and government in our country. The very minute English settlers traded products in the original 13 colonies, American business and capitalism were born. The idea that one was free to determine his or her own future with his or her own unique abilities and hard work is what has set America apart from the rest of the world for more than two centuries.

As America has grown up, our government has grown bigger and more powerful. In the past, the idea of paving your own way, of hard work being the path to success, was the majority opinion. As that government grew, so did welfare. As that government continued to flex its muscles, it began to give more and more handouts. As these handouts became commonplace, our attitude as a nation began to change as well. Yes, for the most part, we still believe in American exceptionalism. We believe that, if we put our minds to it, Americans can accomplish almost anything. Unfortunately, years and years of welfare from an ever-growing government have also helped turn many Americans into dependents of the state.

We now expect things we don’t earn. We believe that it is our right to receive services…just for being American citizens. Even though we pretend it’s not the case, for the most part we elect the politician who promises us the most.

This was very much at the forefront during 2010. The health care debate raged on and on throughout the summer, and both sides were heated. Those in favor of the new, massive health care bill spoke of the need for government to step in and help those less fortunate. Those against it argued that the bill would allow the federal government to shrink the private sector, thereby tearing down what has made this country a great economic power…private business owners, small businesses, and the free market.

So, the question has a more clearly defined root. We now live in a society that expects to be given things. We believe that, because we live here in this country, those who have are morally required (and should be legally required) to give up some of what they have earned in order to provide for the “less fortunate” of our society.

Businesses are here to make money. Period. A private, for-profit business has one goal that stands above the rest: make a profit. Call it greedy, call it terrible, call it immoral, but that is why a business begins. Someone has an idea or talent and decides to make a living with it. They use their own capital and resources, they take risks and chances that could negatively affect them if they don’t succeed, and, if all the planets align, they reap the benefits of their talent/capital/risks. Here is something I want you all to read and burn into your brains:

BUSINESSES ARE NOT AROUND TO PROVIDE JOBS OR BENEFITS

Say it to yourself 15 times and remember this. Jobs are a wonderful consequence of a profitable business. Businesses that provide benefits such as health insurance can do so because they have enough profit to justify the expenditure. A business should never be required to give anything other than a place to work, unless it is profitable and successful enough that it feels able to offer more. Anything extra a business offers is gravy. The minute a business is required to take less profit to pay for things such as health insurance is the minute you see companies go out of business, lay off workers, or dramatically increase the prices of their goods to compensate for the increased benefit cost.

Once we realize, as a society, what business is here to do, we will be much better off. Businesses should never be “required” to provide health insurance coverage. Hopefully, they are profitable enough to do so…