Does walking 10 minutes per day extend your life? (Hint: No.)

Either I completely dozed for the past ten years, or the author of a recent news article pulled data from some farcical study. Let’s take a closer look. 


Have you heard of the all-new fountain of youth? It was all over the New York Times: “Walking Just 10 Minutes a Day May Lead to a Longer Life.” I have to be honest, I was excited the moment my eyes caught the title of this article; I felt like a kid on Christmas morning, ready to tear the wrapping off a big present. Perhaps not in the way most people would think, but because the blazing red sirens going off in my head were telling me that I was in for another game of “tear the article apart.” This was going to be fun. Either I completely dozed for the past ten years, or the author of this news article pulled data from some farcical study. How in the world does walking 10 minutes a day extend my life? But I try to keep an open mind, so I read the study by Saint-Maurice et al. (2021)… and I do have a couple of things to say.

I understand why the media does this.

Before I begin, I want to acknowledge that I understand why the media does this. They like to sensationalize these studies because it helps attract readers; after all, it caught my attention. Moreover, this is a health study that aims to improve people’s lives, so maybe the author also has an altruistic motive. The flashier the title, the more eyes it draws. The more eyes it draws, the more people will want to become physically active. And hopefully, this results in longer lives. I get it. The problem is that without a critical understanding of the limitations of observational studies, the author of the news article failed to recognize major flaws in the study. If results from scientific papers are not thoroughly scrutinized and the media further sensationalizes the findings, such articles run the serious risk of exacerbating misinformation and causing more harm than good.

About the study.

The NYT article was based on a study suggesting a dose-dependent relationship between increasing daily physical activity by 10-30 minutes and a lower risk of death among US adults, regardless of baseline activity level. From 2003 to 2006, 4,840 NHANES participants aged 40-85 years were evaluated based on accelerometer data recorded for 7 days. Let me repeat that: the data were recorded for only 7 days. A total of 1,165 deaths were reported in the mortality follow-up (mean of 10 years) completed in 2015. The participants were categorized into 8 groups according to the amount of moderate-to-vigorous physical activity (MVPA) they performed, as measured by accelerometers (e.g., total daily activity of 20-39 minutes, 40-59 minutes, and so on). In a previous publication, the authors had found that accelerometer measurements surpassing a certain threshold correlated well with sustained VO2 expenditure equivalent to MVPA. Activities such as gardening, mowing the lawn, and walking are sufficient to count as MVPA.
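To make the grouping concrete, here is a minimal sketch of how such binning might look. The 20-minute-wide bins below are an assumption extrapolated from the examples quoted above (20-39, 40-59, and a top group of 140+ minutes mentioned later); they are not cut-points I have verified against the paper.

```python
# Hypothetical sketch: bin average daily MVPA minutes into activity groups.
# Assumes evenly spaced 20-minute bins with a top group of >=140 minutes,
# based on the examples quoted above -- not verified against the study.
def mvpa_group(avg_daily_minutes: float) -> str:
    if avg_daily_minutes >= 140:
        return ">=140 min"
    lower = int(avg_daily_minutes // 20) * 20
    return f"{lower}-{lower + 19} min"

for minutes in (5, 25, 75, 150):
    print(minutes, "->", mvpa_group(minutes))
# 5 -> 0-19 min, 25 -> 20-39 min, 75 -> 60-79 min, 150 -> >=140 min
```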

To estimate the number of annual deaths prevented by increasing MVPA, the authors essentially multiplied the number of U.S. annual deaths among those aged 40-84 in 2003 by the population attributable fraction (PAF). The PAF, in turn, was calculated by combining the prevalence of each of the 8 MVPA groups with its hazard ratio (HR, a measure of the relative risk of death), adjusted for covariates including age, sex, race, education level, BMI, diet, alcohol use, smoking status, comorbidities (e.g., diabetes, heart disease, cancer, chronic diseases), mobility limitations, and general health. In other words, these values approximate the likelihood of death given a person’s physical activity level, after accounting for demographic factors and disease. From this calculation, the authors concluded that increasing physical activity by 10, 20, or 30 minutes per day was associated with a decrease in U.S. annual deaths of 6.9% (111,174), 13.0% (209,459), and 16.9% (272,297), respectively.
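To make the arithmetic concrete, here is a minimal sketch of a PAF-style calculation of the kind described above. Every input is invented for illustration: the prevalences, hazard ratios, and shift scenario are my assumptions, not the study's actual values (the total-deaths figure is simply back-calculated from the 6.9% / 111,174 pair quoted above).

```python
# Toy PAF calculation: what fraction of deaths would be "prevented"
# if every participant moved up one 10-minute MVPA increment?
# All prevalences and hazard ratios are invented for illustration.

baseline_prev = [0.20, 0.18, 0.16, 0.14, 0.12, 0.10, 0.06, 0.04]
# Counterfactual: each group shifts up one bin; the top bin absorbs overflow.
shifted_prev  = [0.00, 0.20, 0.18, 0.16, 0.14, 0.12, 0.10, 0.10]
# Adjusted hazard ratios relative to the most active group (made up).
hazard_ratio  = [2.0, 1.8, 1.6, 1.4, 1.3, 1.2, 1.1, 1.0]

baseline_risk = sum(p * hr for p, hr in zip(baseline_prev, hazard_ratio))
shifted_risk  = sum(q * hr for q, hr in zip(shifted_prev, hazard_ratio))
paf = (baseline_risk - shifted_risk) / baseline_risk

annual_deaths = 1_611_000  # ~111,174 / 0.069, per the figures quoted above
print(f"PAF = {paf:.1%}, deaths 'prevented' = {paf * annual_deaths:,.0f}")
```

Notice how the headline number is driven entirely by the assumed prevalences and hazard ratios: nudge either input and the “deaths prevented” figure swings by tens of thousands. That sensitivity is exactly why stacking estimate upon estimate is so fragile.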

Allow me to restate this. Essentially, by combining one estimate (MVPA prevalence) with another estimate (adjusted HR) to create a third estimate (PAF) applied to mortality statistics, the study claims that approximately 110,000 annual deaths could be prevented if US adults aged 40-85 years increased their moderate-to-vigorous activities by 10 minutes a day. Just 10 minutes?! I have no doubt that exercise and MVPA improve our health and potentially our longevity, as there is no shortage of evidence to support this, but it is absurd to leap to the claim that an additional 10 minutes of physical activity a day can save 110,000 lives annually. There is a litany of issues with this study, but let me share a select few that may help you understand why observational studies like these produce such outrageous findings.

Issue #1: Only 7 days of data?!

One positive aspect of the study was that it relied on data from accelerometers for physical activity measurements, a far more accurate strategy than using questionnaires, which rely on participant memory. However, accelerometer recording lasted for only 7 days. The authors have essentially taken one week’s worth of physical activity information and extrapolated it to the participant’s lifetime. In other words, in order to calculate the HRs between physical activity groups, they assume that participants who had an average daily MVPA of 0-19 minutes would have remained just as sedentary over the many years of follow-up (or until their deaths) – and that participants who logged over 140 minutes of MVPA per day would remain just as active.

This assumption is patently invalid. How many people reduce their physical activity as they age? After a major injury? How many people increase their activity because they joined a sports league or wanted to improve their health? Further, the one-week monitoring period itself may have marked a significant deviation from participants’ typical weekly activity level as a result of the Hawthorne effect – the phenomenon in which participants alter their behavior in response to the knowledge that they are being observed. The mere fact that participants wore activity trackers makes it likely that some altered their activity levels accordingly.

Extrapolation from one week to a full lifetime is also a concern for the plethora of covariates included in the HR analysis. The data were originally gathered from the NHANES 2003-2006 studies, which for the most part consisted of a 1-day questionnaire, with possible follow-up on certain questions after one week. But there is no way that a survey taken at a single time point accurately reflects what will happen years later. Take diet, for example: how often does someone maintain their exact diet for more than a decade? Ridiculous.

Issue #2: Healthy user bias

Another major concern for this study is healthy user bias, which occurs when recipients of a treatment are healthier because of factors other than the treatment’s effect. Put more simply, people who are health-conscious in one area (e.g., exercise) are more likely to be health-conscious in others (e.g., diet). In this study, physical activity plays the role of the treatment, so those who are more active may be healthier for other reasons, such as choosing a healthy diet or abstaining from alcohol. Though the authors state that they factored many covariates into their analysis, no amount of statistical correction can ever account for every possible variable (pollution exposure and sleep habits, for example, may both impact health but were ignored in this analysis). These other variables may partially or even fully account for the observed differences in mortality across activity groups.

Other important examples of such variables are age and illness: older participants and those who suffer from illness are less likely to be very active than young, healthy participants, but they are also more likely to die during the course of the study due to factors largely unrelated to activity level, such as pre-existing disease. Because numerous factors affect overall health, it is difficult to isolate the effect of physical activity alone without conflating it with the effects of other covariates.
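To see how an unmeasured variable can manufacture an “effect” out of thin air, consider this toy simulation, entirely of my own construction. In it, physical activity has zero direct effect on mortality, yet the active group still dies far less often, because a single unmeasured trait drives both behaviors.

```python
# Toy simulation of healthy user bias. Activity has NO direct effect on
# mortality here; an unmeasured trait drives both activity and survival.
import random

random.seed(0)
deaths = {"active": 0, "sedentary": 0}
counts = {"active": 0, "sedentary": 0}

for _ in range(100_000):
    health_conscious = random.random()           # unmeasured confounder
    active = random.random() < health_conscious  # more conscious -> more active
    died = random.random() < 0.20 * (1 - health_conscious)  # trait-driven only
    group = "active" if active else "sedentary"
    counts[group] += 1
    deaths[group] += died

for g in counts:
    print(g, f"mortality = {deaths[g] / counts[g]:.1%}")
# Prints roughly 6.7% for active vs. 13.3% for sedentary -- a 2x mortality
# "benefit" of activity that, by construction, activity did not cause.
```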

Issue #3: Correlation vs. causation

Finally, one of the biggest problems with this and nearly all observational studies is the question of correlation vs. causation, a source of virtually limitless misinterpretation and misrepresentation of scientific results. Correlation simply implies there is some association between two variables; they tend to vary together. Causation, on the other hand, implies that one variable directly affects another. Determining causation usually requires interventional studies, especially of a randomized design. In the case of the effect of activity level on mortality, this would ideally involve random assignment of individuals to either a control group (in which participants maintain their normal daily activity level) or treatment groups (in which they must add 10-30 minutes of daily MVPA above baseline), continuing these practices until the end of the follow-up period. Comparing mortality rates between the control and treatment groups would then permit us to draw some conclusions about the causality of any potential effect.
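To round out the toy example from the healthy-user-bias section: if we instead assign activity by coin flip, as a randomized design would, the confounder is balanced across arms and the spurious mortality gap vanishes. Again, this is an illustrative sketch under my invented assumptions, not a description of any real trial.

```python
# Same toy world as before, but activity is now assigned at random,
# mimicking an RCT. Randomization severs the link between activity
# and the unmeasured confounder.
import random

random.seed(0)
deaths = {"treatment": 0, "control": 0}
counts = {"treatment": 0, "control": 0}

for _ in range(100_000):
    health_conscious = random.random()                # confounder, as before
    group = "treatment" if random.random() < 0.5 else "control"  # coin flip
    died = random.random() < 0.20 * (1 - health_conscious)
    counts[group] += 1
    deaths[group] += died

for g in counts:
    print(g, f"mortality = {deaths[g] / counts[g]:.1%}")
# Both arms land near 10%: with no true effect and randomized assignment,
# the "benefit" seen in the observational toy disappears.
```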

This study instead provided a statistical model based only on observed correlations; the authors never tested whether there was a direct relationship between physical activity and mortality. In other words, this study involved no intervention to establish causality, and it is therefore incorrect to state that 10 minutes of physical activity can prevent 110,000 deaths.

The bottom line.

With all the limitations of this publication, I’m shocked that it made its way through scientific peer review, but worse still is how the media twisted it further. The New York Times article titled “Walking Just 10 Minutes a Day May Lead to a Longer Life” is a gross misinterpretation of what the study revealed. There is no evidence for the cause and effect that the title implies. The study simply found a correlation between higher MVPA and fewer deaths. Furthermore, its HR calculations rest on flawed estimates, and extrapolating the results of this small study to the entire U.S. population compounds the inaccuracy.

So what are we to do when we see these enticing articles with flashy headlines claiming that doing X will result in Y? As the saying goes: if it seems too good to be true, it probably is. So instead of blindly trusting what the author writes, use these opportunities to hone your skill in reading such pieces with a more critical eye. (If you’re unsure where to start, consider reviewing my previous newsletter series on studying studies or AMA #30.) Ask whether the methods are solid and can actually address the question the study poses. Be wary of confounding variables the authors overlooked. Look for evidence that condition X directly causes condition Y. These skills will help you sort through the piles of articles selling snake oil, and eventually, identifying their flaws can even be fun. So go ahead: tear the wrapping.

Disclaimer: This blog is for general informational purposes only and does not constitute the practice of medicine, nursing or other professional health care services, including the giving of medical advice, and no doctor/patient relationship is formed. The use of information on this blog or materials linked from this blog is at the user's own risk. The content of this blog is not intended to be a substitute for professional medical advice, diagnosis, or treatment. Users should not disregard, or delay in obtaining, medical advice for any medical condition they may have, and should seek the assistance of their health care professionals for any such conditions.