The year is 578 A.D.
The place: Constantinople.
The patient: Justin II, emperor of the Eastern Roman Empire.
The great and feared ruler has a stone wedged in his urethra. Crying out in pain, Justin II begs the physicians of his court to do something, anything! The doctors gather around him, nervous, and rightly so: their emperor has a long history of murdering his enemies.
The physicians, worried they’ll lose their heads if something goes wrong, hatch an ingenious plan. They insist that the emperor himself hand over the scalpel, thereby signifying his consent to proceed with the operation.
Medical historians point to this moment as the birth of “informed consent,” the process of acquiring a patient’s permission before starting treatment.
Informed consent today: Bad form
One and a half millennia later, informed consent still plays a part in the doctor-patient relationship, but not a meaningful one.
Doctors today view informed consent as yet another administrative burden, heaped on a pile of other bureaucratic and regulatory tasks that serve only to slow them down. In the 21st century, informed consent is most often a boilerplate document, rife with legalese and a litany of potentially negative patient outcomes, up to and including death.
In reality, these consent forms aren’t used to educate patients about the real risks they face, nor do they exist to make patients equal partners (or even participants) in the healing process. The form’s perceived lack of importance is exemplified in many academic medical centers by how often the task of acquiring the patient’s consent is relegated to the lowly intern.
Informed consent could be a powerfully effective document, used to help patients understand their disease and increase their commitment to healthy living. These measures could improve clinical outcomes and, in doing so, increase physician satisfaction and fulfillment, too.
But first, doctors must recognize what’s standing in the way of success. This article, the seventh in a series, spotlights an outdated and unwritten rule of healthcare, one physicians have followed for decades.
The rule: Doctors tell patients what to do (and patients should comply)
For most of the 20th century, the doctor-patient relationship constituted a series of simple and straightforward exchanges. When a patient broke a bone, the doctor reduced the fracture and cast it. When a patient came in with strep throat, the doctor prescribed penicillin. For more complicated medical matters, there was little a physician could do.
That changed in the 1970s, ’80s and ’90s, thanks to rapid advances in science and technology. As physicians accrued ever-greater medical expertise, the knowledge gap between doctors and patients widened. Before long, the balance of power tilted heavily in the physician’s favor.
Over these decades, the public grew increasingly reliant on (and deferential to) physicians. Patients looked to the doctor to decide what was best. And upon rendering a decision, physicians expected patients to comply, a viewpoint that persists today in the American Medical Association’s Journal of Ethics: “In many fields (e.g., law, education, economics), it is generally accepted that decisions are best made by experts.” And so, for doctors, “Utilizing paternalism selectively in decision making is not only necessary but obligatory.”
Then along came the internet
With the online information boom of the late-20th century, patients began using the internet to research their own medical problems, weigh their treatment options and—more than ever—question the authority of their doctors.
These days, patients hesitate to just comply with their doctor’s orders. They prefer, instead, to hear what the doctor thinks and then decide for themselves whether or not to adhere. Quite often, patients don’t. As much as half of the time, Americans don’t take their medications as prescribed. Likewise, up to 75% of physical therapy patients choose not to complete their treatment plans. Among people with psychiatric illnesses, rates of adherence to medications and counseling have now fallen to dangerously low levels.
These gaps in care have serious consequences: Americans are among the sickest people in the developed world with the highest rates of chronic disease and the lowest life expectancies. Doctors, meanwhile, are frustrated, fatigued and dissatisfied (a phenomenon known as burnout).
Updating informed consent for the 21st century
Patients may be more demanding and consumer-driven than ever, but that doesn’t mean they’re better informed at the point of care. In fact, 1 in 3 patients with a chronic disease don’t understand their own illness because they struggle to comprehend what their doctor tells them.
Closing the knowledge gap while improving the doctor-patient relationship will require better communication and greater trust. Neither will happen until there is a more equal balance of power in the exam room.
Though it won’t be easy to convince doctors to adopt such a power-sharing model (and no regulatory body can mandate it), physicians may be persuaded by the probability of better clinical outcomes. Here’s a means to that end.
Introducing the ‘informed commitment’ process
Leaders at a World Health Organization symposium presented evidence that patients are more likely to adhere to a treatment plan when they are involved in its creation, fully informed about the details and part of the solution.
So, rather than approaching the informed consent process as a bureaucratic task, doctors and patients would benefit from a three-step process I’m calling informed commitment:
1. Inform, then listen. Research demonstrates that patients often misunderstand their doctors. The medical profession has made strides this century toward better educating patients about their disease, the risks and benefits of treatment, and potential complications—all while striving to weed out medical jargon. But it’s not enough for doctors to inform. They also must listen, asking patients to explain what they’ve heard and what will happen next. This allows the physician to clarify any misunderstandings.
2. Get the patient to commit on paper. Research has shown that people are 42% more likely to achieve their goals when they put them on paper. Therefore, the second step involves the patient writing down (a) what treatment they’re consenting to and (b) what they themselves must do to maximize their health going forward.
3. Review the plan together. The final step would be for the physician to review the information the patient has provided, both verbally and on paper, and discuss any outstanding gaps in understanding.
Doctors follow the paternalistic approach, in part, because they believe it saves time. They assume that telling patients what to do is the fastest and easiest path. In reality, paternalism is a prescription for poor adherence, one that leads to preventable complications.
The “informed commitment” process will demand more of the doctor’s upfront time. But 20 minutes spent helping patients understand their illness, and their role in the healing process, will pay off significantly. That’s because doctors will save hours of time not having to address the consequences of medication nonadherence or their patients’ failure to follow treatment plans. And with better outcomes, physicians will regain the professional and personal satisfaction that comes with helping people live longer, healthier lives.