The Observer

Health outside the hospital: Medicine in the 21st century

Healthcare policy occupied a central position in the national discourse during the reform efforts of 2009 and 2010. That it has maintained this position more than a year later reflects a variety of factors.

First, no one was truly satisfied with the compromise that culminated in the Affordable Care Act. Liberals bemoaned the lack of any real push for a single-payer system, or even a government-run public insurance option. Conservatives decried the bill as an entrenchment of the status quo — a ruinously expensive entitlement plan that does nothing to fundamentally address the skewed incentives that are driving healthcare costs ever upwards in a seemingly unstoppable march.

Second, in addition to the acrimony over the ACA, Medicare (and healthcare more generally) figures prominently in the new national preoccupation — the national debt.

Healthcare is an incredibly tricky issue to deal with on the policy front. Would free market mechanisms reduce healthcare costs? Most likely. But healthcare resists the alluringly passive logic of market mechanics. Many of us have an innate feeling that healthcare is an inalienable right, something to which all citizens, especially of a developed nation, deserve access. We reject the notion that patients should have to decide between bankruptcy and the best cancer treatment available. We hope that our culture has grown refined and civil enough to value making sure people can at least enjoy their health, the most basic yet most important pillar of happiness.

And so we are stuck. There's no perfect policy prescription waiting to be plucked out of the ether. Stanford University physician Walter Bortz goes so far as to say in his new book "Next Medicine" that current medicine is "irrelevant," and claims that there exists a "basic mismatch between human biology and capitalism."

Quotes such as these make the situation seem quite grim indeed. All sorts of solutions have been proposed by technocrats — accountable care organizations (ACOs) to better coordinate care between specialists and to put an end to the ruinous incentives engendered by the fee-for-service system; greater training of primary care physicians; insurance exchanges. Yet each of these solutions leaves a vague feeling of dissatisfaction. You can put lipstick on a pig, but it's still a pig.

There is one cause for great hope amidst the bleakness: information technology.

But how can information technology possibly bend the cost curve, especially when expensive technologies are among the main contributors to rising costs? The answer lies in the prevention of hospital visits and, more broadly, in the prevention of illness itself. Developments in information technology promise not only to improve care in hospitals, but to keep patients out of hospitals in the first place. This is the goal of 21st century medicine: prevention. In this paradigm, going to the hospital at all is a failure of the healthcare system, no matter how superbly coordinated the care once there.

This is a grand idea: We're talking not only about the typical refrains of information technology in healthcare (electronic medical records), but also about using IT to fundamentally alter the way we interact with sickness and health.

Imagine that you feel sick. If the discomfort is severe enough, you'll eventually go to the hospital. The doctor will see you, diagnose the problem and recommend treatment, and you'll leave, minus $100 or more if you visited the emergency room (most of the bill you never see, since you deal only with the insurance co-pay).

Inject information technology into the equation. You type your symptoms into an app on your smartphone or computer. Computers immediately analyze gigantic data sets of similar patient presentations (Columbia University is already attempting to bring IBM's "Watson" technology, of "Jeopardy!" fame, into the clinic), and doctors remotely give a preliminary diagnosis. For some cases, this will be very straightforward: The doctor sends an electronic prescription, you go to the pharmacy, you pick up your medicine and that's that. No hospital visit required. There's already a name for this movement: mobile health, or mHealth.
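To make that workflow concrete, here is a minimal sketch of what the triage step might look like in software. It is purely illustrative: the symptom dataset, the similarity measure and the review threshold are all hypothetical, and nothing here reflects how Watson or any actual clinical system works.

```python
# Minimal sketch of an mHealth triage flow (illustrative only; the dataset,
# thresholds and conditions below are hypothetical, not clinical guidance).

# A toy stand-in for a "gigantic data set" of prior presentations:
# condition -> typical symptoms.
PRIOR_PRESENTATIONS = {
    "seasonal allergies": {"sneezing", "itchy eyes", "runny nose"},
    "strep throat": {"sore throat", "fever", "swollen glands"},
    "migraine": {"headache", "nausea", "light sensitivity"},
}

REVIEW_THRESHOLD = 0.5  # below this similarity, recommend an in-person visit


def jaccard(a: set, b: set) -> float:
    """Overlap between two symptom sets, from 0 (disjoint) to 1 (identical)."""
    return len(a & b) / len(a | b) if (a | b) else 0.0


def triage(reported_symptoms: set) -> str:
    """Return a preliminary suggestion for a clinician to confirm remotely."""
    best_condition, best_score = None, 0.0
    for condition, typical in PRIOR_PRESENTATIONS.items():
        score = jaccard(reported_symptoms, typical)
        if score > best_score:
            best_condition, best_score = condition, score

    if best_condition and best_score >= REVIEW_THRESHOLD:
        return (f"Flag for remote review: possible {best_condition} "
                f"(similarity {best_score:.2f})")
    return "Low-confidence match: recommend an in-person visit"


if __name__ == "__main__":
    print(triage({"sore throat", "fever"}))         # straightforward case
    print(triage({"dizziness", "blurred vision"}))  # falls back to a visit
```

In any real deployment the match would only queue a suggestion for a physician to confirm; the point of the sketch is simply that the routine first pass can happen before anyone sets foot in a hospital.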

This conception of healthcare is very attractive. But the above example is still reactive: you get sick first and then take action. The real seduction of information technology is its capability to prevent illness in the first place. Bell's law and Moore's law predict that computers will continually become smaller and more powerful.

Researchers at the University of Michigan recently designed the first complete millimeter-scale computer: a pressure sensor for glaucoma patients that can be implanted in the eye. It contains a radio that communicates with the outside world, letting clinicians know when pressure in the eye reaches high levels. Imagine this paradigm applied to other areas of health: tiny, wirelessly connected devices that recognize the early signatures of disease inside our bodies. Before we even consciously realize we are sick, our computer, wirelessly connected to the devices inside our body, tells us something is wrong. Prevention becomes a whole lot easier.
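As a simplified illustration of that alert pattern, the companion software for such a sensor might run logic along these lines. The readings, the pressure cutoff and the data layout are assumptions made for the sake of the example, not details of the Michigan device.

```python
# Sketch of an alert check for an implanted pressure sensor's wireless readings
# (values and threshold are illustrative, not taken from the actual device).

from dataclasses import dataclass

PRESSURE_ALERT_MMHG = 21.0  # hypothetical "high" intraocular pressure cutoff


@dataclass
class Reading:
    timestamp: str
    pressure_mmhg: float


def check_readings(readings: list) -> list:
    """Return alert messages for any reading above the cutoff."""
    alerts = []
    for r in readings:
        if r.pressure_mmhg > PRESSURE_ALERT_MMHG:
            alerts.append(
                f"{r.timestamp}: pressure {r.pressure_mmhg:.1f} mmHg exceeds "
                f"{PRESSURE_ALERT_MMHG:.1f} mmHg; notify clinician"
            )
    return alerts


if __name__ == "__main__":
    sample = [
        Reading("08:00", 15.2),
        Reading("12:00", 23.8),  # would trigger an alert
        Reading("16:00", 18.9),
    ]
    for alert in check_readings(sample):
        print(alert)
```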

Medicine is a funny profession. Its ultimate goal is to make itself irrelevant: to eliminate sickness and cure disease. Think about the inherent oddness of that. It's one of the few occupations whose practitioners actively strive for, and hope for, the day their craft will be unnecessary. It's an idea as masochistic as it is noble. Medicine is currently at a bizarre point in its history, its future densely clouded with uncertainty.

Some analysts foresee a coming shortage of physicians as the population ages and counsel medical schools to expand enrollment. Others believe that advances in biotechnology and nanotechnology will essentially end medicine as we know it; Ray Kurzweil predicts that in the 2030s we will have tiny "nanobots" coursing through our bloodstream and keeping us healthy. Still others think that we will eventually be able to swallow a few pills of robotic parts that can assemble inside our bodies and perform surgery.

These ideas are both highly speculative and controversial, but they are indicative of a larger uncertainty in the medical community. Where are we going? Will medicine bankrupt us? Where will advances in biotechnology and robotics take us? What will doctors do in fifty years?

Regardless of the larger questions surrounding where medicine is headed, true reform should strive for boldness, for a visionary rethinking of the way we deal with health and sickness. The system we have in place is a product of the 20th century, a time when diagnostics were in their nascent phases, the intricacies of disease were mostly shrouded and "internet" was an unknown word. We are moving rapidly into 21st century biomedicine, with further refinements in our control of genetics, further advances in both the accessibility and scale of computing and further precision in our diagnostic tools. These developments will allow us to conceptualize medicine not as a battle against sickness, but as a battle to maintain health. Using expensive 21st century tools in a 20th century system of getting sick and then going to the hospital is unsustainable. As technology advances, so too must medicine. This crucial paradigm shift is necessary to extend equitable and affordable care to everyone. Call it the Affordable Care Age.

Edward Larkin is a senior majoring in biological sciences and classical civilization. He can be reached at elarkin1@nd.edu

The views expressed in this column are those of the author and not necessarily those of The Observer.

