• @Empricorn@feddit.nl
    9 months ago

    True, but their job is to not pay claims! Literally. Maybe someone can find the video of the practicing doctor who deals with insurance companies every day. He reminds us that insurance companies are a for-profit business. Despite the veneer their marketing creates, not one of them wants to make you healthier. There's no profit in that. They want to extract as much money from you as possible, until you are dead.

    I will admit that there's greed in other places as well. Being uninsured in America is even worse in many ways. So can we please have the option that costs both the country and us less and provides better outcomes: universal healthcare?

    • Boozilla
      9 months ago

      I agree. The only time insurance cares about your health is when a cheaper preventative can avoid an expensive claim later. And even then, they micromanage medical practices for their own selfish interests. Some things should simply not be profit-driven: healthcare, education, infrastructure, national defense, first responders, etc.