What is the job of an employer?
Hopefully this topic is not controversial. I was listening to Mike Rowe (the host of Dirty Jobs) the other night on Huckabee. They were talking about the need to raise the minimum wage and provide healthcare to all workers so that employees can live their own version of a contented lifestyle. Mike Rowe stated plainly that he does not think the employer should be obligated to provide for the basic necessities of the employee. So I guess the employer's job is to provide pay for an honest day's work, and retirement and insurance should be handled by the employee in their own life.
I have very little to say from experience because I have never had health insurance provided by an employer, though I have had taxes taken out for Medicare and the like. I am starting my own retirement fund with a Roth IRA, so I don't know the other side of the coin. I've only worked part-time because I am an undergrad, and therein lies the problem with part-time work. It seems that part-time work was intended for people who already had stable financial footing and wanted some extra money, not as something meant to make up a person's entire financial plan. But part-time jobs are currently more prevalent than full-time ones because of the poor economy, so some people seem to expect the same benefits from part-time work as from full-time work.
So is it the job of an employer to provide retirement and insurance? Are those costs already taken out of the paycheck and put into a fund? Should the employee have full responsibility for their own retirement and insurance plans? Is the employer only obligated to provide a safe environment, appropriate hours, and a paycheck?