What really has me irked is that the insurance companies have embarked on a mission of disinformation and, in many cases, outright lies. And yet Americans are believing it. What have the insurance companies done for you lately? Doesn't it matter that a good third or more of your monthly paycheck goes to them? I don't get it. Come on, people: look beyond the lies and base your opinions on fact.
It has been shown that if every American contributed 1% of their income to a government healthcare plan, every American would be covered. The majority of European countries, along with Canada, have national healthcare plans that ensure every one of their citizens can rest assured that a health emergency won't leave them broke.
In a modern society, adequate healthcare coverage is the right of every man, woman, and child. No longer should people have to decide whether to pay the rent, buy groceries, or heat their homes at the expense of purchasing prescriptions.
You may not agree, but sooner or later you too will face a medical emergency. When that happens, wouldn't it be nice to know you won't have to make those choices?
