Sunday, September 13, 2009

Mandated Health Insurance

Definition: Mandated health insurance is a legal requirement that individuals purchase health insurance. The term appears frequently in the health insurance reform legislation proposed by the Obama administration.
The idea behind mandated health insurance is that requiring every citizen to purchase coverage would reduce overall health care costs, for several reasons. People with health insurance tend to get preventive care and see a primary care physician, and preventive care often heads off more expensive treatment later. In addition, visiting a primary care physician instead of the emergency room cuts costs dramatically, since an emergency room visit is far more expensive.
Also Known As: shared responsibility, required health insurance
Examples: Leslie did not have health insurance, so she often went to the emergency room when she was sick. The costs were usually not paid by Leslie but absorbed by the hospital and passed on to other patients, which over time raised the cost of health care. If the government mandated that Leslie buy health insurance, she would probably visit a primary care physician instead of the emergency room, and health care costs would eventually stabilize.
