In a world where the population aged 60 or over has doubled in the last 30 years, and is expected to double again by 2050, how’s that for a sensational headline? The story, which ran in the journal Nature, refers to a study published in Aging Cell. Amazingly, only 137 of you sent it to me within the first 24 hours of its release.
Nine healthy men, given a cocktail of human growth hormone (hGH), metformin, DHEA, vitamin D3, and zinc for one year, shed about 2.5 years off their biological age, according to an analysis of their epigenomes.
As a result of this study, I’ve had more people than usual ask the following questions:
Should I be taking hGH? Should I be taking metformin? Should I be taking DHEA?
Addressing these questions (and others) will be a bit of an undertaking, so I’m breaking this topic up into at least two emails. In today’s email, I want to explain the study’s purpose, how it was done, what it found, as well as some of the nuts and bolts behind it, and—most importantly—propose a framework for evaluating studies in general. I’ve covered a lot of the groundwork in the Studying Studies series, so I may sound a little like a broken record in places. That said, if you are tired of being held hostage by the media’s interpretation of science, you will need to buck up and learn this stuff. The Studying Studies series is the starting point. I realize it may seem like Groundhog Day for you to see more prose from me about how to think about studies rather than the tactical bits we think we can immediately extract and employ from them. Just remember, it’s better to learn how to fish than to be given … you get it.
On to the study.
The stated purpose was to investigate whether hGH, in a population of men in their 50s and early 60s, can prevent or reverse the gradual, age-related deterioration of the immune system (i.e., immunosenescence). The trial’s name, Thymus Regeneration, Immunorestoration, and Insulin Mitigation, or TRIIM, reveals its aims. (Note that nothing in the initial aim of the study dealt with assessing the impact of the hormone/drug cocktail on the epigenome, which is what has generated all the attention.)
The thymus, a gland located in the middle of the upper chest, converts white blood cells from bone marrow into T-cells, which play a central role in the immune response. (The “T” in T-cells stands for thymus.) As it turns out, the thymus reaches its maximum size by the end of the first year of life. After that, the thymus decreases in size and activity, particularly after puberty, in a process referred to as thymic involution. Along with this decrease in the thymus’s size and activity comes an associated functional decline in the immune system. The lead investigator of the study, Greg Fahy, wanted to see if he could regenerate the thymus and restore immune system function using hGH.
All things equal, a more youthful immune system would suggest greater longevity. But there was a catch with using hGH. The investigators worried that using hGH to regenerate the thymus might induce hyperinsulinemia (high insulin), and they noted a “diabetogenic” effect of growth hormone. Hyperinsulinemia and diabetes are obviously not desired side effects, regardless of how much thymic regeneration takes place. So Fahy and his colleagues added metformin and DHEA to try to counter these potential effects. Vitamin D3 and zinc were also added, as a hedge against cancer and against inactive thymulin, respectively, according to Fahy (personal communication, email).
It’s not a surprise that the investigators chose metformin as a drug that can aid in “Insulin Mitigation” (the “IM” in TRIIM; for a nice overview of why, revisit the interview with Nir Barzilai), but DHEA? This was news to me. After doing a little, I mean a lot of digging, I would say there is not much in the way of evidence supporting the use of DHEA as an insulin-lowering agent. According to a related article, it appears Fahy was working off his own hypothesis. Young people have higher growth hormone without an increase in insulin, and Fahy believed this to be due to their higher levels of DHEA. Fahy tested this on himself by taking hGH alone for a week and found his insulin levels elevated by 50%. He then added DHEA and the increase was apparently reversed.
In the TRIIM study, 9 men, ages 51-65, first took hGH alone (0.015 mg/kg, or ~3 IU for a person weighing 70 kg) 3-4 times per week for one week and then added DHEA (50 mg) the next week, similar to Fahy’s n=1. The week after that, the same doses of hGH and DHEA were combined with metformin (500 mg). At the start of the fourth week, the doses were individualized based on each participant’s particular responses. (To put the hGH dosing into context, while it’s individualized, athletes using it for performance enhancement may take 10-25 IU 3-4 times a week and “longevity” clinics may prescribe somewhere in the ballpark of 1-2 IU/day.) The goal of this titration approach was to maximize IGF-1 and minimize insulin by varying each of the hormones and drugs. The study didn’t reveal what effect DHEA may have had after week 2, so we contacted Fahy to check. He wrote that the results with DHEA were qualitatively the same but quantitatively different, with each person having their own specific response (personal communication, email).
It’s important to highlight that not only was this study multifaceted in the number of independent variables introduced (i.e., hGH, metformin, DHEA, vitamin D3, zinc), it was also personalized, since the subjects did not all receive the same dose of each agent. It’s possible (actually, likely) that all nine subjects were consuming a different cocktail in terms of the dosing of hGH, DHEA, and metformin. Also, the sample size was very small and there was no control group: just 9 healthy (see Supplement 2 for exclusion criteria) 51-65-year-old men.
So why, you might (rightly) ask, all the media hype for a very small, not especially well-controlled preliminary/exploratory study?
The investigators reported a mean “epigenetic age” approximately 1.5 years less than baseline after the 1-year intervention. In other words, their epigenetic age got 1.5 years younger while their chronological age obviously went up another year. For example, let’s say “John” entered the trial with a chronological and epigenetic age of 60. After the trial his chronological age is 61 and his epigenetic age is 58.5. Presumably, he increased his life expectancy (LE) by ~2.5 years, or got ~2.5 years younger biologically, depending on how you look at it. And it’s exactly for this reason that this study is being talked about at all.
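The arithmetic behind the headline claim can be made explicit. A minimal sketch, using the hypothetical “John” from the article (the function name and structure are mine, for illustration only):

```python
# Hypothetical illustration of the arithmetic behind the "2.5 years younger"
# claim. "John" is a made-up participant; the numbers come from the example
# above, not from any individual in the trial.

def epigenetic_age_change(baseline_epi_age, final_epi_age,
                          baseline_chrono_age, final_chrono_age):
    """Return (absolute change in epigenetic age,
               change relative to chronological time elapsed)."""
    absolute = final_epi_age - baseline_epi_age
    elapsed = final_chrono_age - baseline_chrono_age
    relative = absolute - elapsed
    return absolute, relative

absolute, relative = epigenetic_age_change(60, 58.5, 60, 61)
print(absolute)  # -1.5: epigenetic age fell 1.5 years from baseline
print(relative)  # -2.5: 2.5 years "younger" than expected after one year lived
```

The “~2.5 years” figure is thus the 1.5-year drop in epigenetic age plus the 1 year of chronological aging that would ordinarily have accrued.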
Which brings us to the framework I would suggest you apply to every study you read or attempt to evaluate. In a study like this, lacking a control group and utilizing a surrogate outcome (i.e., something other than actual morbidity or mortality), such an analysis is essential. Let’s walk through the possible outcomes with respect to the intervention (the independent variables) and biological aging using the epigenetic clocks (the dependent variable). So now consider a 2×2 matrix of the following scenarios:
(i) the dependent variable (the clock) is a correct (i.e., representative) output measurement versus it is not.
(ii) the independent variables (the cocktail of inputs) did versus did not lead to the outcome we saw.
Again, the former question is necessary whenever evaluating a study with surrogate (i.e., not “hard”) outcomes and the latter question is essential in the absence of a control group.
The exercise, then, is to evaluate each of the 4 quadrants in this matrix and ultimately decide, for yourself, which one has the highest probability of being correct. This is the scientific method. It is not absolute. There are no “proofs.” It’s all about probabilities. Let’s start with the assumption that there was no foul play by anyone involved in the study. In this case, either:
1. The intervention accounted for the improvement, or
2. Something other than the intervention accounted for the improvement.
There are many scenarios within each case. In the first, it may be that the metformin alone accounted for the improvement, or the hGH alone, or there was a synergistic effect between the hGH, DHEA, and metformin, or perhaps one compound in the cocktail was detrimental, but the other compounds more than made up for it. And, remember, not only was there no control group, there was no consistency in the intervention. Everyone got their own signature cocktail. In the second case, it could be the Hawthorne effect at play. This is a type of bias where individuals change aspects of their behavior in response to knowing that they’re being observed. Maybe the participants changed their eating, sleeping, or exercising, for example, which confounded the experiment.
So this tells us how to consider the inputs to the study, but what about the output? Next, we consider whether there was some sort of epigenetic clock malfunction. Here are the next two scenarios:
3. The clock estimate accurately represents biological age, or
4. The clock estimate is inaccurate.
Either we’re not being fooled and the clock is accurately picking up a change in mortality risk in this study or we’re being fooled and the clock is malfunctioning for some reason. We’ll pick this up next week (or the week after) to assess the likelihood of each matrix quadrant.
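The 2×2 matrix described above can be sketched in a few lines of code. This is purely illustrative scaffolding for the framework (the variable names are mine), not anything from the study itself:

```python
from itertools import product

# Hypothetical sketch of the 2x2 matrix: each quadrant pairs one answer about
# the output (is the clock a representative measure?) with one answer about
# the inputs (did the cocktail account for the improvement?).
clock_valid = [True, False]      # dependent variable: clock representative, or not
cocktail_caused = [True, False]  # independent variables: intervention responsible, or not

quadrants = list(product(clock_valid, cocktail_caused))
for clock_ok, cocktail_ok in quadrants:
    print(f"clock valid: {clock_ok!s:5} | intervention responsible: {cocktail_ok}")

# The exercise is to assign each of the 4 quadrants a probability and decide
# which is most plausible -- the method yields probabilities, not proofs.
```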
Oh, and I almost forgot: what may have gotten lost in the shuffle is whether the treatment showed promise for thymus regeneration and immunorestoration, the intended aim of the study. After one year of treatment, there was “highly significant” evidence of a restoration of thymic functional mass along with improvements in age-related immunological parameters, based on MRI imaging and favorable changes in monocyte and T-cell populations. Insulin levels were reportedly controlled, so as far as preliminary studies go, it’s an intriguing finding, with certainly a lot more to learn.
One thing that is necessary in order to evaluate a study reportedly reversing epigenetic aging is understanding epigenetic clocks, the central tool used in this study. Steve Horvath, a researcher at UCLA, developed an algorithm bearing his name, the Horvath Clock, that shows a relationship between DNA methylation (DNAm) and aging. As cells age, there appears to be a relatively predictable pattern of alterations in DNAm, which are epigenetic changes (meaning they do not involve alterations in the DNA sequence, but rather changes in the methyl groups attached to DNA).1 The first epigenetic clock was developed by Horvath and the results were published in 2013. There are generally three steps in the formation of an epigenetic clock. First, DNA methylation levels of CpG sites are measured. Second, a weighted average is formed using machine learning. Third, the average is transformed into units that correspond to “years,” in order to compare “epigenetic age” to chronological age.
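The three steps above can be sketched as follows. This is a toy illustration with made-up weights and a made-up calibration, not Horvath’s actual model (his real coefficients were learned by penalized regression on thousands of samples):

```python
import numpy as np

# Toy sketch of the three steps of an epigenetic clock -- NOT Horvath's
# actual model. Weights and calibration constants here are invented.
rng = np.random.default_rng(0)

# Step 1: methylation levels (beta values, 0 to 1) at a panel of CpG sites.
n_cpgs = 353                       # the original Horvath clock used 353 CpGs
betas = rng.uniform(0, 1, n_cpgs)  # one person's (simulated) methylation profile

# Step 2: a weighted average of the CpG levels; in practice the weights
# come from machine learning, here they are random stand-ins.
weights = rng.normal(0, 1, n_cpgs)
intercept = 0.5
raw_score = intercept + betas @ weights

# Step 3: transform the raw score onto a "years" scale so it can be
# compared with chronological age (a simple linear calibration here).
epigenetic_age_years = 20.0 * raw_score + 40.0
print(round(epigenetic_age_years, 1))
```

The key point for interpreting the study is step 3: the clock’s output is a statistical estimate mapped onto a years scale, not a direct physical measurement of age.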
Today, there are several different clocks that make different predictions (e.g., time to cardiovascular disease, mortality) and some combine other biomarkers with DNAm into a composite biomarker. As Horvath and colleagues write in the publication introducing one of these clocks, DNAm GrimAge is technically a mortality risk estimator; “metaphorically speaking, it estimates biological age.”2 When Fahy and colleagues open their paper with the line that epigenetic clocks “can now surpass chronological age in accuracy for estimating biological age,” in essence, they’re saying this clock can more accurately tell you when you’re going to die than your chronological age can.
This is the first time Horvath has seen a reversal in epigenetic clocks in a human trial. You may be asking yourself: can humans really reverse their biological age and did the study actually show this? According to this article, “Humans Can Reverse Their Biological Age, Shows a ‘Curious Case’ Study,” the answer to your two-part question is yes and yes.
However, I have a different interpretation.
In order to obtain the DNAm data, the investigators used frozen samples of peripheral blood mononuclear cells (PBMCs). So the clocks only looked at PBMCs, consisting of lymphocytes and monocytes, to determine epigenetic age. We don’t know what happened to any of the other cells and tissues (e.g., liver, muscle, adipose tissue) in the body. According to Horvath, the concordance of epigenetic age acceleration between PBMCs and other cells and tissues is only weak to moderate: the average correlation of age acceleration is ~0.3 for many cross-tissue comparisons (Steve Horvath, personal communication, email).
This may (or may not) be an important issue. The clock may be accurate in estimating the epigenetic age of lymphocytes, but what about all of the other cells and tissues in the body? Perhaps the lymphocytes have a more youthful phenotype after one year, but this does not mean the rest of the body followed suit, which would likely be a prerequisite of reversing “global” aging. Henry Ford is said to have scavenged junkyards to find out which components had outlasted the useful life of his cars, then insisted that in new models these components be replaced with cheaper materials. Likewise, imagine if we extend the life of the epithelial cells in our intestines, but our neurons and cardiomyocytes give out long before the stem cell reserves in our intestines. The point here is that it’s important to make the distinction between impacting one component of a process and impacting the process as a whole.
Which brings up another issue. Epigenetic alterations are considered to be one of nine hallmarks of aging. If the treatment in the study did in fact modulate this component of aging and increase longevity, that’s profound. But we should not lose sight of the fact that aging is considered to be a process with more mechanisms than DNAm. For example, rapamycin is considered by many to be the most effective and reproducible compound to directly target the aging process, but according to Horvath and his colleagues, none of their clocks detected an anti-aging effect of rapamycin. This suggests that at least one method of known life extension may evade detection by a DNAm clock that purportedly estimates changes in biological age.3 We may need to revisit this topic as I’m hearing and reading about some conflicting views on rapamycin and its effects on epigenetic age. There are some anecdotes of people seeing significant epigenetic aging reversal taking rapamycin based on commercially available tests and at least one study suggesting a “slowing” of the epigenetic clock with rapamycin. But by and large, most experts we have spoken with tell us the experimental data, which is sadly often left unpublished, point in the other direction—no change at all.
Remember, this study was aimed at regenerating the thymus. Through MRI, the investigators reported an overall increase in the thymic fat-free fraction (TFFF) and an increase (to a lesser degree) of bone marrow fat-free fraction (BMFFF). The changes are said to be consistent with a specific reversal of thymic involution. After puberty, much of the functional tissue of the thymus is replaced by adipose tissue. Essentially, it appeared that the treatment resulted in some adipose tissue in the thymus being replaced by more functional tissue.
The investigators also measured the number of different immune cell populations (i.e., monocytes, lymphocytes); however, they did not measure their functionality, which is critical. The number of immune cells may be less important than the functionality of those cells.
I’m left wondering if the reversal of an epigenetic clock that estimated DNAm patterns based on PBMCs is a result of thymic regeneration, but I can only wonder since the study lacked a control group.
Last point: methylation of DNA is part of the story, but it does not tell us about transcription (making RNA from DNA) or translation (making protein from RNA). So there are many things that need to happen to translate changes in DNA methylation into a phenotype—good or bad—which we can’t measure. In fact, we’re not sure what the predicted phenotype is from this change in methylation.
Let’s now put all of this behind us and assume the epigenetic clocks work perfectly.
What is the probability that the intervention accounted for the improvement in the aging clocks? Asked another way, what are the odds you, too, should be on this cocktail of hGH, metformin, and DHEA? Conversely, what are the chances that something (or some things) other than the intervention contributed to the improvement?
It’s virtually impossible to answer these questions because there was no control group. A control group would help isolate the effect of the cocktail from any other variables. What are the chances that these individuals were engaging in other behaviors that may have accounted for some or all of the improvement? We know these nine guys were healthy, but what might be most telling about their health status is their baseline epigenetic age.
Remember John, our hypothetical example of a participant who entered the trial with a chronological and epigenetic age of 60? It turns out that while his chronological age is representative of the participants in the trial, his epigenetic age is not, based on some of the estimations of the epigenetic clocks.
In fact, if you look at the study population at baseline (Table 1), the average difference between epigenetic age and chronological age, based on one of the clocks (DNAm PhenoAge), was -17.5 years. This means that John’s epigenetic age entering the study was more likely to be 42.5 years old. This certainly seems to indicate that these individuals are healthier (or younger) than the average person.
Table 1 (abridged). Epigenetic aging characteristics of the study population at baseline (Fahy et al., 2019).
EA = epigenetic age; A = chronological age; 0 = at zero months (trial onset); results given in years
Presumably, these nine participants are engaging in (or avoiding) behaviors that have already slowed or reversed their epigenetic clocks. The fact that after 12 months their epigenetic age was lower than baseline could be attributed to whatever it was that they were doing prior to the intervention in this study that made them younger in the first place. Again, a control group is absolutely essential to mitigate some of this bias, and it was unfortunately missing in this trial. If we’re looking at exceedingly healthy people doing exceedingly healthy things in the first place, it’s possible we still would have seen an improvement in epigenetic age if instead of metformin, hGH, and DHEA, we gave them Fruit Loops, Lucky Charms, and Frosted Flakes.
The conclusions for me therefore are:
1) While it may seem like I’m dismissive, I believe these epigenetic clocks are a very interesting tool and a boon to the study of aging. I just think we need to be careful in our analysis and interpretation, and ideally use them in triangulation with other signatures of aging, including functional immune assays.
2) This study does not convince me to put even one of my patients on the cocktail of hGH, metformin, and DHEA used in this study, despite how many have asked. This is a small, short, and uncontrolled preliminary study that can’t determine cause-and-effect between the many independent variables in the study and a reversal of aging as represented by epigenetic clocks (which estimate it from one specific blood cell type in isolation). Time may prove me overly cautious, which would be wonderful, but not today.