Making Health Decisions in the Face of Uncertainty


Red pill or blue pill?

Source: Pietro Jeng/Pexels

Health research published today varies widely in quality. By “quality” I mean two things: (1) the reliability of the results (if someone else did a similar study, would they arrive at the same conclusions?) and (2) the validity of the study (was it designed in such a manner that the findings likely reflect a real relationship, or are they likely an artifact of sampling bias, measurement error, etc.?).

Whenever researchers publish their findings in a scientific journal, they are asked to provide some context about the quality of their study. This context usually appears in a section of the manuscript called “strengths and limitations.” Sometimes critics of health research will cite these limitations as evidence that a study is of poor quality.

But consider for a minute what this text actually means: the authors are identifying the weaknesses of their own work for everyone to see. In what other setting, besides scientific research, do you routinely see such public displays of self-scrutiny? Do restaurants note the least-tasty dishes on their menus? Do sports teams discuss their weakest plays? Of course not. Chefs and coaches surely know what those weaknesses are, but they don’t (publicly) acknowledge them in the same way (thus spawning food critics and sports analysts).

Publicly acknowledging weakness is not a flaw of science: It is its greatest strength. These acknowledgments help inform the types of studies needed in the future, like passing the baton in a relay race—except scientific progress isn’t linear and this race doesn’t have a finish line.

But the fact that health research is uncertain is frustrating, particularly if you are trying to eat “better,” become “happier,” live “longer,” or pursue any of the other goals that such research can potentially inform.

In my own life, I use a risk/benefit framework for evaluating whether to change my behavior in response to scientific research in pursuit of becoming healthier/happier/older. That is, I ask myself: What is the risk of adopting some new behavior versus the potential benefit?

Last month in JAMA Oncology, I wrote about my personal experience of being diagnosed with early-onset colorectal cancer.* Like many people who have experienced cancer, I crave certainty that it will never return, despite knowing that such certainty is not achievable. So I try to do things—things that are based on evidence—to reduce the risk of recurrence. 

Since the rising incidence of colorectal cancer in young adults is a fairly new phenomenon, there aren’t evidence-based guidelines yet for what people can do to reduce the likelihood of recurrence in cases like mine. So I read studies with the knowledge that they are an imperfect representation of my situation, and decide—with advice from my doctor—if I should adopt some behavior.

This came up recently when I was deciding whether to start taking vitamin D supplements. Numerous observational studies (i.e., studies where researchers don’t randomly assign people to take a supplement or not, but instead compare people who happen to take a supplement with people who happen not to) have found evidence that vitamin D protects against the development of various cancers. However, other types of studies, including those that account for genetic variation in vitamin D metabolism, have indicated no such protective effect. But these are studies of new cancers—I’ve already checked that box. I want to know if vitamin D might protect against a recurrence.

There’s nothing uncertain about how adorable this face is.

Source: Sharon McCutcheon/Pexels

Lucky for me, a few months ago two large, well-conducted randomized controlled trials (called AMATERASU and SUNSHINE) were published on this very question. Such trials are the “gold standard” of scientific inference in medical research. I would have my answer at last, right? Not quite.

Here’s an excerpt from the abstract of the JAMA article that reported the results of the AMATERASU trial—with my interpretation after each piece of information:

“The 5-year relapse-free survival was 77% with vitamin D vs 69% with placebo (hazard ratio [HR] for relapse or death, 0.76; 95% CI, 0.50-1.14; P = .18).”

On average, people randomized to take vitamin D were 8 percentage points more likely to live five years without a recurrence. But this apparent benefit could have occurred by chance.

“The 5-year overall survival in the vitamin D vs placebo groups was 82% vs 81% (HR for death, 0.95; 95% CI, 0.57-1.57; P = .83).”

Most people in the trial, regardless of what they were assigned to take, were alive five years later; the one-percentage-point difference between groups could easily be due to chance.

So up to this point, there’s no evidence that vitamin D reduced the likelihood of cancer recurrence or lengthened lives.
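
For readers who want that logic spelled out, here is a minimal sketch (in Python, purely illustrative, and not the trial’s analysis code) of the rule of thumb I use when reading numbers like these: the hazard ratio points in a direction, and the 95% confidence interval tells you whether chance alone could plausibly explain the difference.

    # Illustrative only: a plain-language reading of a hazard ratio (HR) and its 95% CI.
    # An HR below 1 points toward benefit, but if the CI includes 1, the result is
    # also consistent with no effect at all.
    def interpret_hr(label, hr, ci_low, ci_high):
        direction = "points toward lower risk with vitamin D" if hr < 1 else "points toward no risk reduction"
        chance = "could be chance (CI includes 1)" if ci_low <= 1.0 <= ci_high else "unlikely to be chance alone (CI excludes 1)"
        print(f"{label}: HR {hr} (95% CI {ci_low}-{ci_high}) -> {direction}; {chance}")

    # The overall AMATERASU results quoted above:
    interpret_hr("Relapse or death", 0.76, 0.50, 1.14)
    interpret_hr("Death (overall survival)", 0.95, 0.57, 1.57)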

But now, the kicker:

“In the subgroup of patients with baseline serum 25(OH)D levels between 20 and 40 ng/mL, the 5-year relapse-free survival was 85% with vitamin D vs 71% with placebo (HR for relapse or death, 0.46; 95% CI, 0.24-0.86; P = .02; P = .04 for interaction).”

People vary in how much vitamin D they have in their bodies before they start the trial. We might expect that any benefit of vitamin D supplementation would only be observed for people who had lower levels of vitamin D in their bodies to start with.

Indeed, for people who had lower vitamin D levels before starting the trial, 85% of those who took the supplement did not have a recurrence or die, compared to 71% of those who took the placebo. For these people, the difference in relapse-free survival was larger than would be expected by chance alone.
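
Running the subgroup numbers through the same illustrative check makes the contrast visible: here the confidence interval excludes 1 (with the usual caution that subgroup findings are easier to over-interpret than a trial’s primary result).

    # Subgroup with baseline 25(OH)D between 20 and 40 ng/mL:
    interpret_hr("Relapse or death, baseline 25(OH)D 20-40 ng/mL", 0.46, 0.24, 0.86)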

The SUNSHINE trial, which tested whether high-dose vs. low-dose vitamin D supplementation would protect against recurrence of advanced colorectal cancer, also reported ambiguous results, which I won’t go into here.

Overall, these trials provide no clear evidence that vitamin D supplementation protects against recurrence of colorectal cancer, on average. But they also provide no evidence that vitamin D supplementation increases the risk of death or other major adverse outcomes.

Source: Pixabay

So what did I do? 

Well, I spoke with my doctor. We discussed the risks of vitamin D supplementation.

This is important: Supplement makers generally don’t report “side effects” of their products, only nebulous benefits like “provides immune support.” FDA-approved medications have to describe potential side effects. This is what (ostensibly) allows doctors and patients to make informed decisions regarding whether the expected benefit of a medication is worth the potential risks. But supplement makers don’t have to disclose potential risks because supplements aren’t subject to the same FDA approval process as medications. Since they don’t have to, they don’t: Just like the chef won’t tell you that their alfredo sauce—as the polite hosts of The Great British Baking Show would say—is “not worth the calories.”

So I talked with my doctor, had my vitamin D levels tested, and the rest of the story is safely stored in my HIPAA-protected medical record.

Uncertain is not a synonym for invalid. Openly acknowledging uncertainty is the fuel of scientific progress. But that is cold comfort when patients and clinicians have to make decisions about action (or inaction) in the face of this uncertainty. High-quality research is the foundation of the risk/benefit healthcare discussions that we all have to navigate, even if that research rarely, if ever, provides certainty.

*I also wrote about the importance of distinguishing the search for factors that explain individual-level differences versus those that explain population-level changes in the distributions of disease—a topic for next time!

