The Importance of Red Teams


“The first principle is that you must not fool yourself—and you are the easiest person to fool.” This quote from Richard Feynman’s 1974 Caltech commencement address is a welcome earworm repeating in my head whenever I’m thinking about probabilities and decision-making.

At the time of this writing, at least as it pertains to science, most of us are only thinking about SARS-CoV-2 and COVID-19. What policies will do more good than harm to individuals and society in the short and long-term? You are only fooling yourself if you don’t think there’s a high degree of uncertainty about the best path forward.

In these times, alternative and opposing opinions on the problems and solutions surrounding the pandemic need to be heard, not silenced. Yet, popular platforms may not be allowing this to happen. YouTube’s CEO, for example, at one point reportedly said, “anything that goes against WHO recommendations” on the COVID-19 pandemic would be “a violation of our policy,” and those videos would be removed. Twitter also updated its policy, broadening its definition of harmful content to include, “content that goes directly against guidance from authoritative sources of global and local public health information.” Facebook updated its site, “Removing COVID-19 content and accounts from recommendations, unless posted by a credible health organization.”

I completely understand these positions and, on balance, they probably do more good than harm. However, they may come at a cost down the line (or even right now). If history is any guide, censoring opinions that contradict institutional authorities or the conventional wisdom often doesn't end well. Then again, as the German philosopher Hegel put it, the only thing we learn from history is that we learn nothing from history. While COVID-19 presents us with a particularly thorny case of decision-making under scientific uncertainty, this issue is perennial in science.

We are our own worst enemies when it comes to identifying any shortcomings in our hypotheses. We are victims of confirmation bias, groupthink, anchoring, and a slew of other cognitive biases. The worst part is that we are often unaware of our biases, which is why we’re the easiest people to fool. As painful as it seems, considering problems and solutions from a perspective that contradicts our own is one of the best ways to enhance our decision-making. But thinking this way, deliberately and methodically, is a practice, and though it’s really hard, it is necessary in order to sharpen our cognitive swords.

In the early 19th century, the Prussian army adopted war games to train its officers. One group of officers developed a battle plan, and another group assumed the role of the opposition, trying to thwart it. Using a tabletop game called Kriegsspiel (literally “wargame” in German), resembling the popular board game Risk, blue game pieces stood in for the home team—the Prussian army—since most Prussian soldiers wore blue uniforms. Red blocks represented the enemy forces—the red team—and the name has stuck ever since.

Today, red teaming refers to a person or team that helps an organization—the blue team—improve, by taking an adversarial or alternative point of view. In the military, it’s war-gaming and real-life simulations, with the red team as the opposition forces. In computer security, the red team assumes the role of hackers, trying to penetrate the blue team’s digital infrastructure. In intelligence, red teams test the validity of an organization’s approach by considering the possibility of alternative hypotheses and performing alternative analyses. A good red team exposes ways in which we may be fooling ourselves.

“In science we need to form parties, as it were, for and against any theory that is being subjected to serious scrutiny,” wrote the scientific philosopher Karl Popper in 1972. “For we need to have a rational scientific discussion, and discussion does not always lead to a clear-cut resolution.” Seeking evidence that contradicts our opinion is a sine qua non in science.

Popper pointed out that a scientist's theory is an attempted solution in which she has invested great hopes. A scientist is often biased in favor of her theory. If she's a genuine scientist, however, it's her duty to try to falsify her theory. But she will inevitably defend it against falsification. It's human nature. Popper actually found this desirable, because a spirited defense helps distinguish genuine falsifications from illusory ones. A good blue team keeps the red team honest.

Generally, the more we can introduce and consider opposing views in our thinking, the more we can rely on the knowledge we're trying to build. But—and this is a very, very big BUT—not all opposing views are equal. Recognizing the difference between scientific claims and pseudoscientific claims is crucial, and a failure to do so makes the following exercise futile. At no time has the distinction between "good" science worthy of debate and "junk" science worthy of the skip/delete button been simultaneously more important, and more difficult, to appreciate than it is today, when the barrier to propagating both signal (e.g., good science) and noise (e.g., bad science, or pseudoscience) is essentially non-existent. How do we differentiate between reasonable and baseless views? This is trickier than it seems, because if we're not vigilant, we may simply dismiss opposing views as quackery just because they happen to contradict our opinion.

In his 2016 Caltech commencement address, Atul Gawande highlighted five hallmark traits of pseudoscientists: (1) conspiracy, (2) cherry-picking the data, (3) producing fake experts, (4) moving the goalposts, and (5) deploying false analogies and other logical fallacies. “When you see several or all of these tactics deployed,” said Gawande, “you know that you’re not dealing with a scientific claim anymore.” Learning how to dismiss some ideas while embracing others is a topic that deserves far more ink than spilled here, but now, more than ever, we’re awash with ideas and opinions that shouldn’t be taken seriously.

What are some things you can do to incorporate red teaming into your mental models? Deliberately assigning people to a red team, or even red-teaming your own opinion, is a way of “gamifying” adversarial ideas that otherwise may seem too intellectually painful to confront. Getting into the habit of performing a premortem on your ideas—envision what can go wrong before the start—is another effective way to test them. Reading literature and consuming media related to good (and bad) critical thinking and reasoning helps. Participating in a journal club can help you consider alternative views from yours, and see things in a new light.

Red teaming is more of a mindset to maintain tonically, rather than an obscure tactic to pull out for special occasions.

Charlie Munger, Warren Buffett’s right-hand man, encapsulated this mental model during his 2007 USC Law School commencement address: “I’m not entitled to have an opinion on [a] subject unless I can state the arguments against my position better than the people do who are supporting it.”

Peter Attia, MD

After receiving his medical degree from Stanford University, Attia spent five years at Johns Hopkins as a general surgery resident and two years at the National Cancer Institute at NIH in Bethesda, Maryland, as a surgical oncology fellow. He co-founded and served as president of the Nutrition Science Initiative before founding Attia Medical, PC, a practice focusing on the applied science of longevity and optimal performance.
