May 24, 2020

Understanding science

The importance of red teams

“The first principle is that you must not fool yourself—and you are the easiest person to fool.” —Richard Feynman


“The first principle is that you must not fool yourself—and you are the easiest person to fool.” This quote from Richard Feynman’s 1974 Caltech commencement address is a welcome earworm repeating in my head whenever I’m thinking about probabilities and decision-making.

At the time of this writing, at least as it pertains to science, most of us are only thinking about SARS-CoV-2 and COVID-19. What policies will do more good than harm to individuals and society in the short- and long-term? You are only fooling yourself if you don’t think there’s a high degree of uncertainty about the best path forward.

In these times, alternative and opposing opinions on the problems and solutions surrounding the pandemic need to be heard, not silenced. Yet, popular platforms may not be allowing this to happen. YouTube’s CEO, for example, at one point reportedly said that “anything that goes against WHO recommendations” on the COVID-19 pandemic would be “a violation of our policy,” and those videos would be removed. Twitter also updated its policy, broadening its definition of harmful content to include “content that goes directly against guidance from authoritative sources of global and local public health information.” Facebook updated its site, stating it was “Removing COVID-19 content and accounts from recommendations, unless posted by a credible health organization.”

I completely understand these positions and, on balance, they probably do more good than harm; however, they may come at a cost down the line (or even right now). If history is any guide, censoring opinions that contradict institutional authorities or the conventional wisdom often doesn’t end well. However, as the German philosopher Hegel put it, the only thing we learn from history is that we learn nothing from history. While COVID-19 presents us with a particularly thorny case of decision-making based on scientific uncertainty, this issue is perennial in science. (If you want to read a good article arguing for debate and alternative viewpoints specific to the case of COVID-19, check out this one co-authored by Vinay Prasad and Jeffrey Flier in STAT, the former of whom I am scheduled to interview in the coming months.)

We are our own worst enemies when it comes to identifying any shortcomings in our hypotheses. We are victims of confirmation bias, groupthink, anchoring, and a slew of other cognitive biases. The worst part is that we are often unaware of our biases, which is why we’re the easiest people to fool. As painful as it seems, considering problems and solutions from a perspective that contradicts our own is one of the best ways to enhance our decision-making. But thinking this way, deliberately and methodically, is a practice, and though it’s really hard, it is necessary in order to sharpen our cognitive swords.

In the early 19th century, the Prussian army adopted war games to train its officers. One group of officers developed a battle plan, and another group assumed the role of the opposition, trying to thwart it. They played on a tabletop game called Kriegsspiel (literally “wargame” in German), resembling the popular board game Risk. Blue game pieces stood in for the home team—the Prussian army—since most Prussian soldiers wore blue uniforms. Red blocks represented the enemy forces—the red team—and the name has stuck ever since.

Today, red teaming refers to a person or team that helps an organization—the blue team—improve, by taking an adversarial or alternative point of view. In the military, it’s war-gaming and real-life simulations, with the red team as the opposition forces. In computer security, the red team assumes the role of hackers, trying to penetrate the blue team’s digital infrastructure. In intelligence, red teams test the validity of an organization’s approach by considering the possibility of alternative hypotheses and performing alternative analyses. A good red team exposes ways in which we may be fooling ourselves.

“In science we need to form parties, as it were, for and against any theory that is being subjected to serious scrutiny,” wrote the philosopher of science Karl Popper in 1972. “For we need to have a rational scientific discussion, and discussion does not always lead to a clear-cut resolution.” Seeking evidence that contradicts our opinion is a sine qua non in science.

Popper pointed out that a scientist’s theory is an attempted solution in which she has invested great hopes, so she is often biased in favor of it. If she’s a genuine scientist, however, it’s her duty to try to falsify her theory. Yet she will inevitably defend it against falsification. It’s human nature. Popper actually found this desirable, because the resulting contest helps distinguish genuine falsifications from illusory ones. A good blue team keeps the red team honest.

Generally, the more we can introduce and consider opposing views in our thinking, the more we can rely on the knowledge we’re trying to build. But—and this is a very, very big BUT—not all opposing views are equal. Recognizing the difference between scientific claims (worthy of debate, though often still proven incorrect over time) and pseudoscientific claims (not worthy of debate, as the very foundations on which they sit are not drawn from the discipline of science or the scientific method) is crucial; failing to do so makes the following exercise futile. At no time has this distinction between “good” science worthy of debate and “junk” science worthy of the skip/delete button been simultaneously more important, and more difficult, to appreciate than it is today, when the barriers to propagating both signal (i.e., good science) and noise (i.e., bad science, or pseudoscience) are essentially non-existent. How do we differentiate between reasonable and baseless views? This is trickier than it seems, because if we’re not vigilant, we may dismiss opposing views as quackery simply because they happen to contradict our own opinion.

 

§

 

In his 2016 Caltech commencement address, Atul Gawande highlighted five hallmark traits of pseudoscientists: (1) conspiracy, (2) cherry-picking the data, (3) producing fake experts, (4) moving the goalposts, and (5) deploying false analogies and other logical fallacies. “When you see several or all of these tactics deployed,” said Gawande, “you know that you’re not dealing with a scientific claim anymore.” Learning how to dismiss some ideas while embracing others is a topic that deserves far more ink than spilled here, but now, more than ever, we’re awash with ideas and opinions that shouldn’t be taken seriously.

Examples of this are legion, and I would suggest you pick one and go through the exercise. Let’s consider here the claim that the Apollo moon landings were hoaxes, staged by the U.S. government and NASA. How many of the five boxes above get checked in the effort to explain this claim?

  1. Conspiracy: The moon landings were faked because of the space race with the Soviet Union, NASA funding and prestige, and/or distracting public attention away from the Vietnam War.
  2. Cherry-picking: Any apparent photographic or film oddity is evidence of a hoax, while rebuttals can be ignored or dismissed since they’re obscuring the “truth.” (Check out this slideshow of several iconic “hoax photos.”)
  3. Fake experts: Amateurs examine the pictures, seeking (and finding) anomalies whose evaluation requires comprehensive knowledge of photography and lunar terrain, which they lack.
  4. Moving the goalposts: NASA should be able to provide pictures of the Apollo landing sites to confirm the event—but when such pictures surface, NASA must’ve faked them.
  5. Logical fallacies: If any of the footage (or any other evidence of the moon landings) appears faulty, it must be fabricated—no other possibilities exist.

Alternatively, the tens of thousands of individuals who worked on, or were involved with, the Apollo program did not all conspire to fake six crewed moon landings between 1969 and 1972. The supposed oddities of photographs and film can be logically explained, and the totality of the evidence is consistent with genuine moon landings. These explanations, and this evidence, come from people skilled and knowledgeable in their respective fields (i.e., experts). No moving of goalposts or logical fallacies required. (I wrote about my fascination with conspiracy theorists and the moon landing in a previous email.)

 

§


What are some things you can do to incorporate red teaming into your mental models? Deliberately assigning people to a red team, or even red-teaming your own opinion, is a way of “gamifying” adversarial ideas that otherwise may seem too intellectually painful to confront. Getting into the habit of performing a premortem on your ideas—envisioning what could go wrong before you start—is another effective way to test them. Reading literature and consuming media related to good (and bad) critical thinking and reasoning helps. (I included a list of some of my favorite reading materials in this post.) Participating in a journal club can help you consider views alternative to your own and see things in a new light. (We do this internally and are considering filming them on Zoom for our podcast subscribers.)

Red teaming is more of a mindset to maintain tonically than an obscure tactic to pull out for special occasions. Charlie Munger, Warren Buffett’s right-hand man, encapsulated this mental model during his 2007 USC Law School commencement address: “I’m not entitled to have an opinion on [a] subject unless I can state the arguments against my position better than the people do who are supporting it.”

 

– Peter

Disclaimer: This blog is for general informational purposes only and does not constitute the practice of medicine, nursing or other professional health care services, including the giving of medical advice, and no doctor/patient relationship is formed. The use of information on this blog or materials linked from this blog is at the user's own risk. The content of this blog is not intended to be a substitute for professional medical advice, diagnosis, or treatment. Users should not disregard, or delay in obtaining, medical advice for any medical condition they may have, and should seek the assistance of their health care professionals for any such conditions.

11 Comments

  1. I like the idea of a systematic approach for distinguishing science from pseudoscience, and I am not a conspiracy theorist, but there are still times we have to be careful. Regarding points 2 and 3, cherry-picking and fake experts: sometimes, in a world of controlled or limited release of information, all one can do is cherry-pick as an amateur. Consider 5G: when the SMEs are also the ones who have a vested interest in making 5G ubiquitous, it may require a layman to research it. At least until the layman has generated enough interest that other SMEs or knowledgeable people are exposed to the alternative hypothesis and research it themselves. Wilful ignorance is easy but dangerous. I can be wilfully ignorant of theoretical problems with 5G or any other ubiquitous technology/pollution/food source, etc., but this is dangerous if any turn out to be harmful. The 5 suggestions made for evaluating claims are a great starting point; I suppose we also have to ask *why* is someone cherry-picking? *Why* are they moving the goalposts?

    Just maybe they have a valid reason.

  2. I really enjoyed this article. I think I am going to call the fellas over for a game of Risk! (With masks on, of course.)

  3. This is a very thought-provoking and welcome discussion.
    I have been increasingly concerned about the censorship that I have seen occurring in our culture, particularly during this period of time. I would so welcome debate on the best path forward, but it seems so socially unacceptable to have opinions, or even wonderings, that do not conform to CNN’s daily message. That in itself is troubling. Also concerning is how different organizations (educational, media, medical, scientific, and political) are being funded, and one wonders if any conflicts of interest are present. There must be other people like me who are thinking about these issues, but there seems to be no place for debate, as there is only one acceptable narrative. Thank you, Peter and team, for all you are doing.

  4. Yes, it is good to debate or look at all sides of an issue, and to be suspicious of your own biases. However, the problem with over-thinking and playing the devil’s advocate is that it can freeze you into a state of inertia. There are always minutiae to examine, and who knows what research tomorrow will bring? Why not wait just a few more days to write that article or get out and protest? Navel gazing can be as bad as running amok and this article, although correct on the surface, seems to be muddying the waters, not clearing them. This is particularly important with COVID-19. When world events are moving very fast, it is necessary to keep pace, even though it doesn’t allow time to pull out the board games. Because this is definitely not a game. Although conspiracy is the first item in the list, I think the author really means conspiracy theory, which is more derogatory. It’s a term that has come to suggest that anyone who holds the unpopular view is irrational, even if solid evidence to the contrary is available. A weaponized term to silence the opposition when, in fact, there actually might be a conspiracy going on. Yesterday’s conspiracy theory is today’s history.

  5. Simple – have the FCC require that all posts be cited and that content producers provide said citations – just like research papers back in college. The issue here isn’t control of information; rather, it is the framework that enables the abuse of disinformation and the clear harm it HAS caused and will continue to cause. The mere fact that people actually believe COVID-19 is a hoax is of serious concern. We are failing to address a key rule of credible arguments – the argument has to be credible.
    Example – the High Intensity Health guy, making claims that the current SIPs are population control against people’s rights – a grossly inaccurate and misinformed statement – ask ANYONE who has studied law.

    • Question 1: If someone is deceived, do they know it?

      Question 2: What is the best way to present the truth?

  6. Excellent article, Peter. We have used red-teaming at our fund for about a year now and it certainly works to sharpen the saw and have better decision trees. But frankly one of the biggest unexpected benefits to me was that it gets all of us acclimated to hearing excellent counter-arguments on a regular basis. Much like how a deadlift session feels when you haven’t lifted in months, criticism of your idea or theory is something that takes regular conditioning to hear well. Once everyone understands the process, and the job we are counting on them to do as the red team, it can also improve morale and buy-in of the entire group. Critique, testing, deconstructing, criticism, steelmanning…these should all be part of scientific rigor. Show me the leader that invites and feeds off of constructive criticism and fresh perspectives, and I’ll show you one who is getting the right answers faster and more often than his/her peers.

    • Andy, excellent addition to a succinct and timely blog post. Thank you, both.

  7. This is the perfect webpage for anyone who wishes to understand this topic. You realize a whole lot; it’s almost hard to argue with you (not that I personally will need to… haha). You certainly put a brand new spin on a topic that has been written about for decades. Wonderful stuff, just excellent!

  8. Filming/recording Attia Medical journal club meetings? Yes, please! (I’d prefer that you put these out as podcasts, with the slides in the show notes; I’ll never bring myself to sit in front of a screen and passively watch the session).
