
Monday, February 14, 2022

Bringing Medicine Into The 21st Century

Breaking Healthcare's Rules: Selecting The Best Technology


In the 21st century, every major U.S. industry but one has used information technology (IT) to cut costs, increase access to products and services, and improve quality.

Healthcare is the lone exception. For decades, medical costs have risen faster than inflation—with spending now above $4 trillion annually. For patients, accessing medical care is both time-consuming and burdensome. Meanwhile, U.S. healthcare lags other wealthy nations in nearly all measures of quality, including life expectancy and childhood mortality.

Modern technologies could help solve these problems. So, why haven’t they?

One answer involves the technology itself. Take the electronic health record (EHR), which has become a symbol of what’s wrong with tech in medicine. Though EHRs can improve collaboration among doctors, give patients fuller access to their medical data and reduce clinical errors, they rarely do. Instead, these systems are cumbersome and clunky, and they sit (literally) between doctors and patients. Year after year, the Medical Economics survey of “things ruining medicine for physicians” rates EHR usability at or near the top of the list.

But form and function aren’t the only barriers to widespread tech adoption in healthcare. Also standing in the way is an unwritten rule that governs the relationship between doctors and technology—a rule that has held firm for centuries. 

This article, part of a newsletter series called Breaking The Healthcare Rules, explains this rule and offers a viable solution.

Rule 3: The best technology preserves the status of the doctor

The expression “lay hands on the sick and they will recover” dates back to biblical times when the hands of healers were believed to have curative powers. In the millennia that followed, physicians embraced the tradition of laying hands on patients.

By the 18th century, doctors took great pride in their ability to assess a patient’s temperature using only their hands. This skill took years of training to master, helped distinguish doctors as experts and boosted the prestige of the entire profession.

Around that same time, Daniel Fahrenheit invented the mercury thermometer, a device that could measure body temperature to within one-tenth of a degree.

What happened next was a seminal moment in medical history. Rather than welcoming Fahrenheit’s technological wonder with open arms, doctors dismissed it as clunky, cumbersome and painfully slow to calibrate. Indeed, the first-gen version was all those things. But those design flaws don’t explain why physicians ignored—and outright denied—the thermometer’s potential to help patients.

In reality, doctors saw the device as a threat to their professional status and relative importance. If just anyone could accurately determine a patient’s temperature without years of hands-on training, then physicians would lose a big part of what makes them special. To preserve their status, doctors spent the next 130 years fighting to keep the thermometer out of the exam room.

Wanted: Technology that elevates the doctor’s status

In the centuries since, doctors have given preference to technologies that boost their reputation.

Consider the industry-wide obsession with operative robots. These multimillion-dollar machines look like space-age command centers with doctors (and only doctors) sitting in the captain’s chair, directing the movements of several large robotic arms.

It’s easy to see the appeal: These machines are incredibly cool, and the surgeons who use them are seen as rock stars on the cutting edge. Medical journals overflow with descriptions of new and interesting applications for these technologies. It’s therefore no surprise that the surgical robotics market is projected to grow by 42% annually over the next decade.

Here’s the problem: Independent research from 39 clinical studies has determined that robot-assisted surgeries have only modest clinical advantages over other approaches. They have so far failed to extend life expectancy or significantly reduce surgical complications.

Looking objectively at the impact this technology has on patients, the operative robot is a dud. But for the reputation of physicians using it, the machine is a megahit.

Good for patients, bad for physicians?  

In sharp contrast to surgical robotics, there are several modern technologies that could positively and powerfully transform patient care. Yet, most generate lukewarm to negative reactions from physicians. Here are two examples.

Telemedicine

Prior to the pandemic, only 1 in 10 patients had experienced a virtual visit with a doctor. That changed at the onset of Covid-19, when physician offices were forced to close.

Suddenly, telehealth accounted for 70% of all visits and—to the surprise of doctors and patients alike—the experience was resoundingly positive. Physicians resolved patient problems faster and more effectively than before. Patients, meanwhile, enjoyed the added convenience and most (75%) expressed high satisfaction with virtual care.

Yet, in the months that followed, telemedicine usage receded to almost pre-pandemic levels, accounting for just over 10% of patient visits today (not including virtual mental health).

The problem isn’t the technology. It’s what the technology represents. Telehealth constitutes a threat to the physician’s office, a place where the doctor’s prestige is on full display. Physicians take great pride in seeing their names on the front door, embossed in bold letters. Even the “waiting room” communicates the importance of the doctor’s time.

Telemedicine strips these status symbols from the doctor-patient experience.

And so, even though virtual care offers patients greater convenience with no evidence of quality issues, doctors undervalue and underuse it. Unlike what we’ve seen with research on surgical robotics, you won’t find journal articles in which clinicians attempt to push the boundaries of telehealth.

AI and data analytics

Computing power continues to double every couple of years—a phenomenon known as Moore’s Law—and it means that tools like artificial intelligence (AI) and data analytics are becoming smarter and more capable of transforming healthcare delivery.

Already, AI has been shown to interpret certain imaging studies (such as mammograms and chest X-rays for pneumonia) more accurately than skilled radiologists. In the future, computers with machine-learning capabilities have the potential to make diagnostic readings both better and faster than humans can.

Meanwhile, data analytics (which inform evidence-based algorithms) have the power to dramatically improve physician performance. When doctors consistently follow science-based guidelines, they achieve far better clinical outcomes than on their own. With these tools, physicians have the opportunity to lower mortality rates from heart attacks, stroke and cancer by double digits. But, as with the thermometers of centuries before, you won’t find physicians clamoring for these tools, either.

Instead, you’ll hear doctors from every specialty denounce the use of computerized checklists and algorithmic solutions as “cookbook medicine,” just some recipe to be followed. They argue that data analytics and AI will make every doctor average, ignoring the fact that the “new average” would be vastly better than today’s.

No matter how much better the results, technologies that tell doctors what to do are seen as a threat to the profession. Invariably, physicians reject them.

Selecting the best tech with forced transparency

Transparency is the best first step toward breaking the outdated rule of technology in healthcare. Here’s how it might look.

In partnership with a highly respected agency like the National Institutes of Health (NIH), scientists would analyze the scientific merits of various healthcare technologies. The list might include the surgical robot, along with telemedicine, AI, proton-beam accelerators, wearable heart monitors, PET scanners and chatbots for self-diagnosis, among others.

Researchers would review published data, analyze each technology and publish a cost-benefit rating, similar to what you’d find in Consumer Reports.

Though this exploratory body wouldn’t have regulatory power—the way the FDA has authority over drug approvals—it would nonetheless serve an important function. This process would provide an unbiased evaluation of the most promising tools for patients.

To improve healthcare in the areas of cost, access and quality, we must measure technologies by their impact on the health of patients, not their impact on the status of medical professionals.
