Currently, all employers are required to offer employees certain mandated benefits, such as Social Security deductions, disability benefits, unemployment insurance, Medicare deductions, and Workers' Compensation. Additional requirements may apply depending on the size of the employer. Do you think all employers, regardless of size, should be required to provide these benefits and more? Starting next year, certain employers will be required to provide health insurance. Do you agree with this? What benefits, if any, do you think should be legally mandated?
Based on the structure of your questions, I assume you are asking for an informed opinion.
I will provide two perspectives:
If I agreed with the conservative stance, I would argue that we are at a happy medium without mandated health coverage. The items currently required, Social Security, disability, unemployment, Medicare, and Workers' Compensation, benefit both the employee and the employer by transferring risk off of the employer and onto the government or insurer. In addition, these items are well established, and companies are used to paying them. Adding health insurance to the mix is risky, as it is cost-intensive and can hamper job growth and progress. Further, it has a ...
This short write-up provides a liberal and a conservative opinion on government-mandated, employer-paid payroll items such as health care, disability insurance, unemployment insurance, and Social Security. It is informed opinion and contains no sources.