Our Sick Healthcare System

Our healthcare system is both a source of pride and embarrassment. It is undeniably the most advanced healthcare system in the world. Our doctors are the world’s finest; our hospitals provide a level of healthcare services unequaled by any other country; and our Centers for Disease Control and Prevention and National Institutes of Health are the envy of the public health agencies of all other nations. Nevertheless, our healthcare system is also an embarrassment because it is ranked relatively low among those of other developed nations even though our healthcare expenditures per capita far exceed theirs. Why is this, and how did the nation that was the first to harness nuclear energy and the first to put a man on the moon come to fail so badly in attending to the health of its citizens?

The Facts

There is no shortage of ratings of national healthcare systems, and those rankings fluctuate from year to year. National rankings of healthcare systems vary greatly because healthcare systems differ greatly by country and region, and each rating agency uses different criteria and weightings in determining its rankings. Among the current national rankings of healthcare systems is the one for 2024 published by CEOWorld Magazine, which ranks the U.S. healthcare system 15th in the world. An even more negative assessment was recently issued by the Commonwealth Fund, which accords the U.S. healthcare system an overall ranking of 69th.

Clearly, there are wide differences among nations when it comes to the number of hospitals, doctors and nurses per capita. For example, the U.S. ranks 31st in hospital beds per capita and 3rd in the number of ICU beds per capita. It also ranks 41st in terms of the number of doctors per capita and 9th in terms of the number of nurses per capita. These differences greatly influence how long a patient must wait before receiving medical care. In the U.S. the speed and manner in which a patient is treated is also largely a function of the patient’s wealth and connections.

One area in which the U.S. has no rival is the percentage of its GDP that is expended on healthcare. The U.S. spends 17.8% of its GDP on healthcare, virtually twice the percentage spent by other developed nations. More importantly, Americans spend $12,555 per capita for their healthcare needs. By comparison, Switzerland, which ranks second highest, spends $8,049 per capita on the healthcare needs of its residents, and the remaining developed nations have average per capita healthcare expenditures of approximately $6,000.

Despite the U.S.’s higher healthcare expenditures, its health outcomes are assessed to be no better than those in other developed countries. Indeed, most rating agencies find that the U.S. actually performs worse in some common healthcare metrics like life expectancy, infant mortality and unmanaged diabetes. These findings, however, assume that all populations are essentially the same and that any differences in outcomes are attributable to the effectiveness of the country’s healthcare system. That, however, may not always be the case. For example, the roughly three-year shorter life expectancy and higher rate of diabetes experienced in the U.S. may be largely attributable to our high level of gun violence and our consumption of sugar-laden and highly processed foods.

Nevertheless, healthcare experts attribute most of the poor results achieved by the U.S. healthcare system to their findings that as much as 30% of U.S. healthcare expenditures represent unnecessary, ineffective, overpriced and wasteful services. That’s largely because our high costs of healthcare services generally render them unavailable to roughly 28% of adult Americans.

The Symptoms and The Sources of the Problems

The symptoms of deficiencies in the U.S. healthcare system are generally the very ones cited by the organizations that rank the world’s healthcare systems, and all have their roots in our higher costs of those services. Those higher costs, in turn, are the products of our history, our culture and our political system, which strongly favors free-market activity. As discussed below, it is highly questionable whether our essentially free-market system is even compatible with the effective delivery of healthcare services.

As in other countries, healthcare in the U.S. began with individual doctors. In the early 18th century hospitals began to spring up, many of which were established by the Catholic Church. Later, not-for-profit and governmental entities began opening hospitals. Today, the U.S. has over 6,000 hospitals, the vast majority of which are located in metropolitan areas, largely because thinly populated areas cannot economically support hospitals fortified with a full array of medical technology and doctors with a wide range of specialized knowledge. This disparity accounts for some of the poor results bedeviling our healthcare system.

Healthcare services around the world were initially paid for by the individuals seeking them. Because healthcare costs can be economically ruinous to individuals with serious medical problems, in the 1930s private insurance companies began to offer healthcare insurance. By 1940, approximately 9% of Americans had purchased some form of private health insurance. Although the growth in the healthcare insurance market began slowly, it got a big boost in the U.S. during World War II, when wage and price restrictions were imposed by the Roosevelt administration in an attempt to prevent runaway inflation.

Because so many young men had joined the war effort, all businesses experienced hiring difficulties. Unable to raise the level of wages they were paying, many sought to entice new employees by offering healthcare insurance, which did not violate the wartime wage and price controls. This quirk of history allowed the U.S. health insurance industry to grow into a powerful economic and political force which, combined with Americans’ predilection toward a free-market economy, effectively cemented private health insurance into our nation’s healthcare system. By contrast, the European nations recovering from the ravages of the war took another path and chose to establish government-controlled or government-financed systems that provided uniform healthcare coverage for all citizens. Our overwhelming reliance on privately issued healthcare insurance thus became the single biggest factor allowing the healthcare systems of other countries to operate at lower costs and achieve superior overall (but not necessarily individual) results.

Another important factor raising the costs and limiting the availability of healthcare services in the U.S. is the impact of medical malpractice litigation. More so than any other country, the U.S. has a large and highly profitable army of attorneys ready, willing and able to seek justice on behalf of persons who may have been injured by others. Among the favored targets of these attorneys has been the medical profession, in which the quality of practice was very uneven. This made it easy to prove that a doctor or hospital that did not apply the knowledge possessed, or exercise the care employed, by their peers was guilty of malpractice and was liable to their patients for the damages they had suffered and would suffer.

Although these lawsuits were a plague tormenting our medical profession, they also had two important side effects: (a) they caused the quality and costs of healthcare services to be greatly increased, and (b) they ignited a need for a wide range of better diagnostic equipment. This latter by-product of medical malpractice litigation also changed the way medicine is practiced in the U.S., prompting doctors to spend less time questioning and examining their patients and to rely more heavily on medical tests to formulate their diagnoses.

While the impact of medical malpractice litigation continues, the danger of runaway verdicts awarded by lay jurors has largely been brought under control. At least 27 states and the District of Columbia have now established a requirement that all medical malpractice cases first be reviewed by a state approved medical malpractice board. Other states have mandated mediation of medical malpractice claims.

The Problem Areas

Unnecessary Remedial Actions. The high cost of healthcare in the U.S. has a number of causes. At the top of the list is the way medicine is now being practiced as a result of the threat of medical malpractice claims. You might think of it as “defensive medicine,” where the efforts of medical personnel appear to be more focused on protecting themselves and the organizations they serve from liability claims than on helping their patients return to good health. Not only do doctors allow their fears of ruinous malpractice claims to impact their professional judgment, but hospitals and other healthcare facilities allow their own self-interests to affect the way their patients are served. They not only tolerate, but encourage, their staffs to generate additional revenues by undertaking unnecessary procedures and performing excessive tests. Moreover, healthcare insurers compound this problem by computing their payments for healthcare services on the basis of the number and nature of the specific tasks performed, thereby creating an economic incentive for efforts which do little to enhance the effectiveness of the care provided to patients.

Excess Administrative Tasks. The U.S. annually spends over $900 per patient on healthcare administrative costs — four times more than the average of other developed countries. At the heart of this problem is the highly fractured nature of our system of healthcare funding which is provided by over 1,000 healthcare insurers in addition to the federal government’s Medicare and Obamacare programs and state governments’ Medicaid programs. Each of these funding agencies has its own rules, mandated procedures and paperwork requirements which not only cause wasteful efforts on the part of those who administer healthcare practices but also frequently cause delays in providing the care needed by their patients. A single-payer system like those utilized in almost all of the other developed countries would greatly reduce the administrative burdens placed on healthcare providers in the U.S.

Healthcare Profiteering. Healthcare in the U.S. was traditionally provided by independent doctors and the nurses and administrators they hired to enable them to expand their services. Even when doctors joined together to create medical practice groups, they were the sole owners with an economic interest in making their practices profitable. Similarly, early hospitals were formed by religious, non-profit or governmental entities, essentially eliminating any motive to increase their net economic results.

That has now changed. Today, of the slightly more than 6,000 hospitals in the U.S., approximately 20% are owned by for-profit entities. With respect to the slightly more than 4,000 Medicare-enrolled hospitals, the percentage owned by for-profit entities is 36% and is increasing as many governments gradually dispose of their hospitals. While for-profit hospitals tend to be more efficiently operated than those that are government-owned, the profits they generate for themselves can more than offset that advantage and lead to many Americans not being served.

In addition, a little over a third of the for-profit hospitals are now owned by private equity groups which acquired them in “leveraged buyout” transactions (i.e., acquisitions financed with high levels of debt secured by the hospital’s equipment and facilities). This places their hospitals under a lot of pressure to maximize their net revenues so that they can service their high levels of indebtedness. As a result, the prices they charge for their services are increased and the level and quality of those services are reduced. These trends have been the subject of a number of studies. One such study, led by researchers at Harvard Medical School and published in December 2023, confirms that “patients are more likely to fall, get new infections, or experience other forms of harm during their stay in a hospital after it is acquired by a private equity firm.”

High Costs of Prescription Drugs. Another factor which increases the costs of healthcare in the U.S. is the cost of prescription drugs, which represents roughly 11% of U.S. healthcare costs. This is far more than prescription drug expenditures in other countries. For example, in 2020 retail prices for 20 selected brand-name prescription drugs in the U.S. were found to be two to four times higher than in Australia, Canada, and France. This explains why many Americans seek to order their prescription drugs from Canadian pharmaceutical distributors.

At the root of this problem is the large number of healthcare funding organizations in the U.S. This leaves those organizations with little individual bargaining power in negotiating prescription drug prices with drug manufacturers. That problem is compounded by the fact that the Medicare system, which is the nation’s largest purchaser of prescription drugs, has been prohibited by law from negotiating drug prices. This is a problem not faced in other developed countries, which have single-payer healthcare systems and place limits on the prices of prescription drugs. Unlike foreign governments, the U.S. Congress has been unwilling to institute ceilings on drug prices. This is due both to (a) the Congress’ preference to leave commercial markets unencumbered by governmental regulation and (b) the extraordinary lobbying power of the U.S. pharmaceutical industry. Unfortunately, there are three other factors that make this situation even worse.

First, foreign developers of pharmaceuticals (of which there are many), in an effort to maximize their sales in the lucrative U.S. healthcare market, tend to license the drugs they develop to U.S. pharmaceutical manufacturers. This eliminates potential price competition from foreign drug manufacturers and reduces the amount of competition a U.S. pharmaceutical company will have to face from other drugs designed to achieve the same results.

Second, the U.S. places a limit of 20 years on the patent protection of products. Patents give pharmaceutical companies a window of freedom from competition from generic drug manufacturers, which is necessary to encourage them to expend the vast sums of money needed to develop a new drug. Even though patents technically remain in effect for 20 years, for prescription drugs their effective life is generally only 10-12 years, as the first 8-10 years are consumed seeking to have the drug approved for sale by the FDA. Once a patent expires, generic drug manufacturers flood the market with copycat products, forcing the price of the original patented product down to below 10% of the price that was being charged when its patent expired. This relatively short period in which to recoup development costs places pressure on drug manufacturers to raise their prices as fast as the market will bear. This also accounts for why there are so many advertisements for pharmaceutical products on TV.

As one might expect, pharmaceutical companies blame their high costs of bringing their drugs to market (over $100 million per drug) for the high prices which they charge. This argument, however, doesn’t justify the high profit margins they enjoy, which on average are nearly twice as high as those of non-pharmaceutical U.S. companies. This was the conclusion of a cross-sectional study comparing the profits achieved by 35 large pharmaceutical companies with those of 357 large non-pharmaceutical companies. That study found that the median pre-tax profit (as a percentage of revenues) of U.S. drug manufacturers was 13.8% compared with 7.7% for non-pharmaceutical companies.

A third factor raising the price of prescription drugs is the insertion of pharmacy benefit managers (PBMs) into the process of distributing prescription drugs. PBMs serve as intermediaries between healthcare insurers and drug manufacturers, negotiating the price of drugs on behalf of healthcare insurers and developing the formulary of drugs to be covered by each insurer. They also negotiate with drug distributors and pharmacies as to the prices they will charge the public. Three PBMs represent nearly 80% of the drugs sold in the U.S. This places them in a strong bargaining position, which they apparently don’t always use for the benefit of the health insurers they represent or the ultimate consumers of prescription drugs. This was pointed out in a recent New York Times article.

There’s hope that a reduction in prescription drug prices may be coming. Included in the recently enacted Inflation Reduction Act (IRA) is a provision that allows Medicare to use its considerable bargaining power to reduce the prices of a limited number of prescription drugs. In addition, the IRA permits the president to penalize drug manufacturers that raise drug prices faster than the prevailing rate of inflation. This week President Biden invoked that provision with respect to 64 prescription drugs.

The Future of U.S. Healthcare

I confess that I am an economic determinist which means that I believe that, just as gravity forces water to seek its lowest level, humans constantly strive to reduce their costs for necessary items. This makes me optimistic that the U.S. will ultimately (but not necessarily quickly) find ways to reduce its per capita healthcare costs and their related adverse effects relating to the availability and quality of healthcare services.

The ability of the Medicare program to negotiate the price of a limited number of prescription drugs is certainly a sign that change is coming to the way healthcare is being provided in the U.S. There are also signs of change in the way healthcare services are being delivered. Hospitals are rapidly acquiring the practices of medical specialists. This means that cost-cutting efforts and economies of scale will be employed to reduce medical costs and make healthcare services more efficient and more affordable. The corporations running hospitals and satellite medical practices are now doing for healthcare services what Henry Ford did for car manufacturing. They are reorganizing their doctors and support personnel into teams, with each team having specific functions and all members of the same team being interchangeable. They are also employing expensive technology not only to help diagnose patient problems but also to better coordinate the services they provide and to communicate with their patients.

Another approach to reducing healthcare costs and improving outcomes is placing a greater emphasis on preventive care. This is an approach that professional malpractice insurers have successfully employed in trying to reduce the frequency and severity of malpractice claims against lawyers, accountants, architects and engineers. Similarly, greater emphasis on preventive care could both lower the costs and enhance the outcomes of the practice of medicine. While preventive care would likely lead to the discovery of medical problems not previously detected by patients, dealing with them at an early stage should prove far more cost-effective than allowing them to fester into serious conditions. It would also have the side benefit of making patients more productive in their everyday endeavors.

These measures, however, will only have a relatively small impact in reducing the costs of healthcare services. Major reductions will require replacing our multitude of healthcare insurers with a single-payer system like those adopted by virtually every other developed nation. Such a system would offer universal coverage so that all Americans would receive the healthcare they need. This change would also greatly reduce the administrative burden placed on healthcare providers and eliminate roughly 20% of healthcare costs represented by the monies absorbed by healthcare insurers.

To accomplish this change will require the American public to overcome the pressure (and cries of “socialism”) that U.S. legislators will surely receive from the nation’s healthcare insurers, hospital corporations and pharmaceutical manufacturers. Even though overcoming industry resistance will be a monumental task, it could be accomplished if the American public were made more aware of the costs of healthcare. That could be achieved by simply mandating that every hospital patient receive a copy of their hospital’s itemized bill. These are frightening documents containing a myriad of excess charges, such as charges of literally hundreds of dollars for placing a single band-aid on a cut finger. While this proposal would also face heavy political opposition, it would be an important first step to building the type of popular consensus necessary to transition healthcare in the U.S. to a single-payer system providing universal coverage.
