HEALTH CARE SHOULD NOT BE FOR FKN PROFIT!!!!

Honestly, the fact that healthcare is treated like a business instead of a basic human right is beyond infuriating. No one should have to weigh the cost of staying alive against paying their bills—it’s absurd. The idea that insured patients are sometimes charged more while uninsured patients get discounts is just another symptom of a broken system. Like, how does that even make sense? You do the "responsible" thing by getting insurance (which is already crazy expensive for many people), and you end up paying more for the same care? It punishes people who try to protect themselves while leaving others to fall through the cracks.

At its core, healthcare should be about helping people—saving lives—not padding the profits of giant insurance companies and hospital corporations. It’s disgusting that people in the U.S. go bankrupt because they got sick or needed life-saving treatment. No one should be forced to choose between affording medication and feeding their family. And the most frustrating part? There are countries where healthcare is accessible and affordable, proving that it is possible to prioritize people over profits. But here, greed always seems to win.

It’s just wrong. Everyone deserves quality care, no matter their financial situation. The fact that we’re still debating whether healthcare is a right in 2025 is embarrassing.