January 10, 2018

Understanding science

Why we’re not wired to think scientifically (and what can be done about it)

What is it about being human that conflicts with being scientific?

Read Time 8 minutes

Author’s note: This post was originally published in May 2014. It has been updated to reflect my current thinking on the topic. Perhaps the best addition, by popular demand, is Rik’s coffee recipe (see the first footnote).

§

In 2012, I was having dinner with a good friend, Rik Ganju, who is one of the smartest people I know. And one of the most talented, too—a brilliant engineer, a savant-like jazz musician, a comedic writer, and he makes the best coffee I've ever had.[1] I was whining to him about my frustration with what I perceived to be a lack of scientific literacy among people from whom I "expected more." Why was it, I asked, that a reporter at a top-flight newspaper couldn't understand the limitations of a study he was reporting on? Are they trying to deliberately mislead people, or do they really think that a study showing an association between such-and-such somehow implies X?

[1] Here is the coffee recipe, courtesy of Rik. I make this often and the typical response is, "Why are you not making this for a living?" Look for Vietnamese cinnamon, also known as Saigon cinnamon; you need two big dashes, if that. You need real vanilla (be careful to avoid the cheap versions with added sugar). Best is vanilla dissolved in ethanol; if that doesn't work for you, get the dried stick and scrape the pods. Then find a spice store and get chicory root (I'm a bit lazy and get mine on Amazon). You'll want to replace coffee beans with ~10% chicory on a dry-weight basis. If you're on a budget, cut your coffee with Trader Joe's organic Bolivian. But do use at least 50% of your favorite coffee by dry weight: 50-40-10 (50% your favorite, 40% TJ Bolivian, 10% other ingredients [chicory root, cinnamon, vanilla, amaretto for an evening coffee]) would be a good mix to start. Let it sit in a French press for 6 minutes, then drink straight or with cream, but very little cream; 1 tablespoon is the max. The Rik original was done with "Ether" from Philz as the base.

Rik just looked at me, kind of smiled, and asked the question in another way. “Peter, give me one good reason why scientific process, rigorous logic, and rational thought should be innate to our species?” I didn’t have an answer. So as I proceeded to eat my curry, Rik expanded on this idea. He offered two theses. One, the human brain is oriented to pleasure ahead of logic and reason; two, the human brain is oriented to imitation ahead of logic and reason. What follows is my attempt to reiterate the ideas we discussed that night, focusing on the second of Rik’s postulates—namely, that our brains are oriented to imitate rather than to reason from first principles or think scientifically.

One point before jumping in: this post is not meant to disparage those who don't think scientifically. Rather, it's meant to offer a plausible explanation. If nothing else, it's a way for me to capture an important lesson I need to remember in my own journey through life. I'm sure some will find a way to be offended by this, which is rarely my intention in writing, but I nevertheless think there is something to learn in telling this story.

The evolution of thinking

Two billion years ago, we were just cells acquiring a nucleus. A good first step, I suppose. Two million years ago, we left the trees for caves. Two hundred thousand years ago we became modern man. No one can say exactly when language arrived, because its arrival left no artifacts, but the best available science suggests it showed up about 50,000 years ago.

I wanted to plot the major milestones, below, on a graph, but even using a log scale it's almost unreadable. The information is easier to see in this table:

Milestone                                      Approximate time
Cells acquire a nucleus                        ~2 billion years ago
Leaving the trees                              ~2 million years ago
Anatomically modern humans                     ~200,000 years ago
Language                                       ~50,000 years ago
Formal logic (Aristotle)                       ~2,500 years ago
Scientific method (Francis Bacon)              ~400 years ago
Codified clinical trials (Canon of Medicine)   ~1,000 years ago
Statistical significance (Ronald Fisher)       ~100 years ago

Formal logic arrived with Aristotle 2,500 years ago; the scientific method was pioneered by Francis Bacon 400 years ago. Shortly following the codification of the scientific method—which defined exactly what "good" science meant—the Royal Society of London for Improving Natural Knowledge was formed. So, not only did we know what "good" science was, but we had an organization that expanded its application, including peer review, and existed to continually ask the question, "Is this good science?"

While the Old Testament makes references to the earliest clinical trial—observing what happened to those who did or did not partake of the “King’s meat”—the process was codified further by 1025 AD in The Canon of Medicine, and formalized in the 18th century by James Lind, the Scottish physician who discovered, using randomization between groups, the curative properties of oranges and lemons—vitamin C, actually—in treating sailors with scurvy. Hence the expression, “Limey.”
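
(A purely illustrative aside for the programmers in the audience: the essence of Lind's approach, random allocation between groups before comparing outcomes, fits in a few lines of Python. The sailor names and group sizes below are invented, and this is a sketch of the general idea, not a reconstruction of Lind's actual protocol.)

```python
import random

def randomize_two_arms(subjects, seed=42):
    """Shuffle subjects and split them into two equal-sized arms.

    Random allocation means any pre-existing differences between the arms
    are due to chance alone, which is exactly what significance tests are
    later asked to quantify.
    """
    rng = random.Random(seed)
    shuffled = list(subjects)
    rng.shuffle(shuffled)
    half = len(shuffled) // 2
    return shuffled[:half], shuffled[half:]

sailors = [f"sailor_{i:02d}" for i in range(12)]  # hypothetical roster
citrus_arm, control_arm = randomize_two_arms(sailors)
print("citrus arm: ", citrus_arm)
print("control arm:", control_arm)
```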

The concept of statistical significance is barely 100 years old, thanks to Ronald Fisher, the British statistician who popularized the use of the p-value and proposed the limits of chance versus significance.
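
(Another illustrative aside: to see what "the limits of chance versus significance" means in practice, here is a minimal sketch of a p-value calculation. The outcome numbers are made up, and this uses a simple permutation test rather than Fisher's original methods, but the core question is the same: how often would shuffling the group labels produce a difference at least as large as the one observed?)

```python
import random
from statistics import mean

def permutation_p_value(group_a, group_b, n_permutations=10_000, seed=0):
    """Two-sided p-value for a difference in group means via label shuffling."""
    rng = random.Random(seed)
    observed = abs(mean(group_a) - mean(group_b))
    pooled = list(group_a) + list(group_b)
    n_a = len(group_a)
    extreme = 0
    for _ in range(n_permutations):
        rng.shuffle(pooled)  # re-deal the group labels at random
        if abs(mean(pooled[:n_a]) - mean(pooled[n_a:])) >= observed:
            extreme += 1
    return extreme / n_permutations

# Hypothetical outcome scores for two randomized groups.
treated = [7.1, 6.8, 7.4, 6.9, 7.2, 7.0]
control = [6.2, 6.5, 6.1, 6.7, 6.4, 6.3]
print(f"p = {permutation_p_value(treated, control):.4f}")
```

A small p-value only says the observed difference would be surprising if chance alone were at work; on its own it says nothing about effect size, bias, or whether the finding matters, which is exactly the distinction so much reporting misses.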

The art of imitation

Consider that for 2 million years we have been evolving—making decisions, surviving, and interacting—but for only the last 2,500 years (0.125% of that time) have we had “access” to formal logic, and for only 400 years (0.02% of that time) have we had “access” to scientific reason and understanding of scientific methodologies.
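
(For anyone who wants to check the arithmetic, the percentages are simply each tool's age divided by the roughly 2-million-year span above; the round numbers are the ones used in this post, not precise dates.)

```python
HUMAN_SPAN_YEARS = 2_000_000  # the post's round figure for "human" time

for tool, age_years in [("formal logic", 2_500), ("scientific method", 400)]:
    print(f"{tool}: {age_years / HUMAN_SPAN_YEARS:.3%} of the span")

# formal logic: 0.125% of the span
# scientific method: 0.020% of the span
```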

Whatever a person was doing before modern science—however clever it may have been—it wasn't actually science. In the same vein, how many people were practicing logical thinking before logic itself was invented? Perhaps some were doing so prior to Aristotle, but it was certainly rare compared to the period following its codification.

Options for problem-solving are limited to the tools available. The arrival of logic added a major tool. So, too, did the arrival of the scientific method, clinical trials, and statistical analyses. Yet for the first 99.98% of our existence on this planet as humans—literally—we had to rely on other options—other tools, if you will—for solving problems and making decisions.

So what were they?

We can make educated guesses. If it’s 3,000 BC and your tribemate Ugg never gets sick, all you can do to try to not get sick is hang out where he hangs out, wear similar colors, drink from the same well—replicate his every move. You are not going to figure out anything from first principles because that isn’t an option, any more than traveling by jet across the Pacific Ocean was an option. Nothing is an option until it has been invented.

So we’ve had millions of years to evolve and refine the practice of:

Step 1: Identify a positive trait (e.g., access to food, access to mates),

Step 2: Mimic the behaviors of those possessing the trait(s),

Step 3: Repeat.

Yet, we've only had a minute fraction of that time to learn how to apply formal logic and scientific reason to our decision making and problem solving. In other words, evolution has hardwired us to be followers, copycats if you will, so we must go very far out of our way to unlearn those inborn (and highly refined) instincts if we want to think logically and scientifically.

Recently, neuroscientists (thanks to the advent of functional MRI, or fMRI) have been asking questions about the impact of independent thinking (something I think we would all agree is “healthy”) on brain activity. I think this body of research is still in its infancy, but the results are suggestive, if not somewhat provocative.

To quote the authors of this work: "if social conformity resulted from conscious decision-making, this would be associated with functional changes in prefrontal cortex, whereas if social conformity was more perceptually based, then activity changes would be seen in occipital and parietal regions." Their study suggested that non-conformity produced an associated "pain of independence." In the study subjects, the amygdala became most active in times of non-conformity, suggesting that non-conformity—doing exactly what we didn't evolve to do—produced emotional distress.

From an evolutionary perspective, of course, this makes sense. I don’t know enough neuroscience to agree with their suggestion that this phenomenon should be titled the “pain of independence,” but the “emotional discomfort” from being different—i.e., not following or conforming—seems to be evolutionarily embedded in our brains.

Good, solid thinking is really hard to do, as you no doubt realize. How much easier is it to economize on all this and just "copy & paste" what seemingly successful people are doing? Furthermore, we may be wired to experience emotional distress when we don't copy our neighbor! And while there may have been only 2 or 3 Uggs in our tribe 5,000 years ago, as our societies evolved, so too did the number of potential Uggs (those worth mimicking). This would be great (more potential good examples to mirror) if we were naturally good at thinking logically and scientifically, but we've already established that's not the case. Amplifying this problem even further, the explosion of mass media has made it virtually, if not entirely, impossible to identify those truly worth mimicking versus those who are charlatans, or simply lucky. Maybe it's not so surprising that the one group of people we'd all hope could think critically—politicians—seems to be as useless at it as the rest of us.

So we have two problems:

  1. We are not genetically equipped to think logically or scientifically; such thinking is a very recent tool of our species, one that must be learned and used, with great effort, to "overwrite" our default wiring. Furthermore, it's likely that we are programmed to identify and replicate the behavior of others rather than think independently, and independent thought may actually cause emotional distress.
  2. The signal-to-noise ratio (truly valuable behaviors worth mimicking versus all the unworthy ones) is so low today, virtually zero, that the folks who have not been able to "overwrite" their genetic tendency for problem-solving are doomed to confusion and, likely, poor decision making.

As I alluded to at the outset of this post, I often find myself getting frustrated at the lack of scientific literacy and independent, critical thought in the media and in the public arena more broadly. But is this any different from being upset that Monarch butterflies are black and orange rather than yellow and red? Marcus Aurelius reminds us not to be surprised by buffoonery from buffoons: "You might as well resent a fig tree for secreting juice."

While I'm not at all suggesting that people unable to think scientifically or logically are buffoons, I am suggesting that expecting this kind of thinking as the default behavior from people is tantamount to expecting rhinoceroses not to charge or dogs not to bark. Sure, it can be taught with great patience and pain, but it won't happen quickly.

Furthermore, I am not suggesting that anyone who disagrees with my views or my interpretations of data frustrates me. I have countless interactions with folks whom I respect greatly but who interpret data differently from me. This is not the point I am making, and these are not the experiences that frustrate me. Healthy debate is a wonderful contributor to scientific advancement. Blogging probably isn't. My point is that critical thought, logical analysis, and an understanding of the scientific method are completely foreign to us, and acquiring them requires deliberate action and time.

What can we do about it?

I’ve suggested that we aren’t wired to be good critical thinkers, and that this poses problems when it comes to our modern lives. The just-follow-your-peers-or-the-media-or-whatever-seems-to-work approach simply isn’t good enough anymore.

But is there a way to overcome this?

I don’t have a “global” (i.e., how to fix the world) solution for this problem, but the “local” (i.e., individual) solution is quite simple provided one feature is in place: a desire to learn. I consider myself scientifically literate. Sure, I may never become one-tenth a Richard Feynman, but I “get it” when it comes to understanding the scientific method, logic, and reason. Why? I certainly wasn’t born this way. Nor did medical school do a particularly great job of teaching it. I was, however, very lucky to be mentored by a brilliant scientist, Steve Rosenberg, both in medical school and during my post-doctoral fellowship. Whatever I have learned about thinking scientifically I learned from him initially, and eventually from many other influential thinkers. And I’m still learning, obviously. In other words, I was mentored in this way of thinking just as every other person I know who thinks this way was also mentored. One of my favorite questions when I’m talking with (good) scientists is to ask them who mentored them in their evolution of critical thinking.

Relevant aside: Take a few minutes to watch Feynman at his finest in this video—the entire video is remarkable, especially the point about "proof"—but the first minute is priceless and a spot-on explanation of how experimental science should work.

You may ask: is learning to think critically any different from learning to play an instrument? Learning a new language? Learning to be mindful? Learning a physical skill like tennis? I don't think so. Sure, some folks may be predisposed to be better than others, even with equal training, but virtually anyone can get "good enough" at a skill if they want to put the effort in. The reason I can't play golf is that I don't want to, not that I lack some ability to learn it.

If you’re reading this, and you’re saying to yourself that you want to increase your mastery of critical thinking, I promise you this much—you can do it if you’re willing to do the following:

  1. Start reading (see starter list, below).
  2. Whenever confronted with a piece of media claiming to report on a scientific finding, read both the actual study and the media coverage, in that order. See if you can spot the mistakes in the reporting.
  3. Find other like-minded folks with whom to discuss scientific studies. I'm sure you're rolling your eyes at the idea of a "journal club," but it doesn't need to be that formal at all (though years of formal weekly journal clubs did teach me a lot). You just need a good group of peers who share your appetite for sharpening critical thinking skills. In fact, we have a regularly occurring journal club on this site (starting in January 2018).

I look forward to seeing the comments on this post, as I suspect many of you will have excellent suggestions for reading materials for those of us who want to get better in our critical thinking and reasoning. I’ll start the list with a few of my favorites, in no particular order:

  1. Anything by Richard Feynman (In college and med school, I would not date a girl unless she agreed to read “Surely You’re Joking, Mr. Feynman”)
  2. The Transformed Cell, by Steve Rosenberg
  3. Anything by Karl Popper
  4. Anything by Frederic Bastiat
  5. Bad Science, by Gary Taubes
  6. The Structure of Scientific Revolutions, by Thomas Kuhn
  7. Risk, Chance, and Causation, by Michael Bracken
  8. Mistakes Were Made (but not by me), by Carol Tavris and Elliot Aronson
  9. Thinking, Fast and Slow, by Daniel Kahneman
  10. "The Method of Multiple Working Hypotheses," by T.C. Chamberlin

I’m looking forward to other recommendations.

Disclaimer: This blog is for general informational purposes only and does not constitute the practice of medicine, nursing or other professional health care services, including the giving of medical advice, and no doctor/patient relationship is formed. The use of information on this blog or materials linked from this blog is at the user's own risk. The content of this blog is not intended to be a substitute for professional medical advice, diagnosis, or treatment. Users should not disregard, or delay in obtaining, medical advice for any medical condition they may have, and should seek the assistance of their health care professionals for any such conditions.

295 Comments

  1. I suggest you take a look at two works of Richards J. Heuer, Jr. They are, respectively, The Psychology of Intelligence Analysis, and Analysis of Competing Hypotheses. The former is an excellent treatise on how an intelligence analyst should form and test hypotheses. The latter is a procedural program (done with the Palo Alto Research Center – PARC) that allows some automation of the process.

  2. This is wildly off the subject, but what is the function of the almond milk in your ice cream recipe?

  3. I learned a lot about how deluded us humans are by reading the book “You Are Not So Smart” by David McRaney

  4. I loved the post and all the comments, discovered some excellent videos about Feynman, and his book is just awesome. Actually, I loved the post because my wife and I have been struggling with this a lot recently.
    People will just replicate mainstream ideas regardless of whether they are true (politics, rumors about politicians, nutrition, etc.), and on all fronts we do stuff differently (really, no bread? really? your dog eats raw meat AND bones? etc.)
    You can even bring a package of solutions for their problems, if those have surfaced already (overweight, bad skin, smelly dog), with all the science behind it, and they will just go on with what they are doing already. It is so incredibly hard to convince people using the scientific way; they just don't read the literature and it just confuses them.
    Enough of the complaining….

    One important thing I need to mention: the pain of independence is definitely experienced by my wife. This seems to be a real phenomenon, and I know some others who are impacted. It does not impact me that much; I have always been a free thinker and I have the warrior gene (which may reduce all kinds of fears, and maybe that type of pain as well). It would be very interesting to have those people genotyped.

    Just as in nutrition, in thinking and decision making there could be a huge genetic impact on how people act on information.

    Finally, this may well be coupled with the Dunning–Kruger effect (https://en.wikipedia.org/wiki/Dunning%E2%80%93Kruger_effect), the tendency to overestimate what you know if you are inept, and a bunch of other biases (like the availability bias).

  5. I subscribed to and read Scientific American for several years. Lots of great stuff in there, and a wide range of topics. And while maybe not strictly scientific, I really love all of Douglas Hofstadter’s books.

    It’s hard to grasp the numbers in your table. And while my numbers may be off a little, have a look at this picture. Here you can see the difference between 1000 years, 10,000 years, and 2.5 million years.
    https://captaincheeto.com/2.5.html

  6. I started down this path of the study of nutrition after reading GCBC by Taubes. The book was recommended to me by my son (we were both educated as engineers) about 5 years after it was first printed. I was stunned by what I read, and I actually read it all again to make sure I absorbed as much as possible.

    So looking back, a well-educated person like me, skilled in science and R&D, was just eating the standard American diet, trying to consume good portions of veggies and fruits and whole grains. Why not? After all, my doctor told me to eat well and to avoid saturated fats, and the media just confirmed all this and added the stuff about avoiding salt and too much red meat, eggs, etc. Why would I believe that my doc and the media were wrong? I was not an expert in the science of nutrition, and professional nutritionists kept repeating the conventional wisdom. Why question it? I could have, I was smart enough, but why? Where does the spark come from to question scientific advice? If I had been unhealthy, obese, diabetic, on drugs that made me feel bad, maybe I would have been more motivated to question, but I was none of these.

    In my case I was fortunate that my son lent me his copy of GT’s book. It opened my eyes to the likely deception, and started me on a mission of self-education and diet change. It was not a lack of ability or even curiosity which inhibited me from questioning. It was the total and naive belief in the medical profession. I never questioned the advice I received because it came from what I believed to be reputable and credible sources, and that is the real problem. Our personal medical practitioners are not well educated on the subject of nutrition so they just naively regurgitate the conventional wisdom. The media is in the business of selling information. Why should they be concerned about the validity of what they are reporting, especially since it comes from bona fide research reports and helps sales? As you mentioned in your blog, there are forces at work which tend to influence the kind of research that is done and the way the results are reported. Money and politics have been influencing our existence and welfare for a long time. A big lie has been perpetrated on us and continues to be, and the vast majority of the population are not skilled enough to realize it. Even those who are skilled will likely not question it. Also, the young and still healthy cohort has no incentive to question. It has almost nothing to do with emotional stress, our evolution, or our genes. Most people just naturally believe what the (medical) authority figures are telling them. Only when the odd (very likely scientifically educated) person with a reason to be curious starts to question the wisdom does the situation change.

    Yes, we all learn to copy our neighbours when we observe a possible benefit to us, because in the absence of any valid and good reason to do otherwise, that is our best course of action. However, when it comes to complicated issues such as nutrition, who will you copy? Your neighbour? Or will you just follow the advice of your medical professional who is, after all, an expert? My experience tells me that it will be the latter for most of us.

    Probably everyone who reads this blog is an outlier and has learned to reason and question in the search for truth. What is it that we all have in common? Maybe the answer lies there.

    Peter, great blog in general (not just this specific issue) and one of my top two.

  7. I suggest The Fatal Conceit by F.A. Hayek as a counterpoint to logic and reason leading to the best conclusions.

  8. Hi Peter,

    Been catching up with your work via Gary Taubes. All very interesting and useful.

    You’re right to say ‘We’re not wired for science’. In fact, we’re pretty much not wired for anything that makes us distinctly human. What we have is the benefit of history and society to overcome these failings of individual thought. History, because we can stand on the shoulders of giants, as Newton put it, through science and the scientific method. Society, because we have a shared desire to try to come to the truth – or at least, the best version of the truth we can come by given our material and intellectual development. But there are some important social changes happening which tend to undermine those two foundations of scientific development.

    One such change is relativism or post-modernism or whatever you call it, which says that multiple points of view can be in some sense ‘right’ and that ‘overarching metanarratives’ are somehow oppressive. This might have been kind of interesting for a while in saying that what currently goes for how we live or how we organise society might not be the best way of doing things, that we should be tolerant, live and let live. But it bleeds into every kind of intellectual discourse and it’s very unhelpful, with researchers both within and between spheres of research operating in ‘silos’ and engaging in ‘group think’, with little desire to reach beyond them.

    Another is the rise of scientism. We forego the scepticism of science in favour of the expertise of scientists. This is crucial at a time when other sources of authority have been discredited. Politicians rarely seem to say a thing now without claiming it is ‘evidence based’, and pull out a scientific expert to justify their views. Even worse is the way that scientists have cottoned on to this, and have become activists. The world of obesity research is among the worst for this kind of thing. ‘We need to ban X’ or ‘We need to change the guidelines to Y’, rather than really grappling with causes for problems and resolving them. As Taubes effectively points out, Ancel Keys was the godfather of activist science, and it has had a baleful influence on actual research in many respects.

    Rob

    • Rob: two obvious examples that leap to mind when you discuss “relativism” are the anti-vaccination advocates (who believe their “data” are just as valid as that of the establishment scientists) and the climate change deniers.

    • Somehow, both groups believe that the scientific consensus is “oppressive” to their views.

  9. I have been trying to find out what you mean by ‘the scientific method’. I cannot find a definition, but it seems that it has to do with interpreting statistics in the right manner. For me the scientific method is something else:

    The scientific method is a way to ask and answer scientific questions by making observations and doing experiments.
    The steps of the scientific method are to:
    Ask a Question
    Do Background Research
    Construct a Hypothesis
    Test Your Hypothesis by Doing an Experiment
    Analyze Your Data and Draw a Conclusion
    Communicate Your Results
    It is important for your experiment to be a fair test. A “fair test” occurs when you change only one factor (variable) and keep all other conditions the same.

    from https://www.sciencebuddies.org/science-fair-projects/project_scientific_method.shtml

    • Marijke, you can change more than one variable at a time. The art of doing this is called DOE (Design of Experiments) if you want to search on this. The important thing is to change variables in a way that does not cause confounding when you want to compute effects. The test or tests will still be fair. I teach beginner to advanced courses on this, and you are welcome to send your email if you have questions.

      • Rick, I am interested in what you teach as I am conducting some experiments in acupuncture (attempting to arrive at first principles there). How do I learn more about your work?

    • Hi, Peter et al.,

      There are many great thoughts, posts, and comments here, but for the sake of doing justice to science (especially when communicating it to the public), it is perhaps wise to make sure that we do not just focus on test by *experimentation. Experiments can be great, and I think we can all agree that if done appropriately, they are effective ways to test hypotheses. However, regarding “the scientific method” mentioned above, we often test hypotheses without doing experiments. The geologist cannot make a mountain just as the astrophysicist cannot make a star; but this does not mean that we cannot test hypotheses about historical events. When we relegate science to a process that “does experiments” then we automatically push any question about *history to the edge of where non-science begins. In this way, it is perhaps better to emphasise that science is a systematic method of empirically-based hypothesis testing, without the explicit emphasis (or requirement?) that we necessarily perform experiments. Also, you are likely familiar with this, but here is a good ref to illustrate that science is not the linear process that it is often made out to be: https://undsci.berkeley.edu/article/scienceflowchart. No surprise, each element here has its own history, too; they have all come together to make science (today) all of what we know and love.

      Cheers,
      Brian

  10. Surely hunter-gatherers and early agrarians had some cognitive abilities beyond just “mimicking” their successful neighbors. What about following signs and tracks on a hunt? Inventing and refining tool usage? Navigation across land or water? Studying the behavior of other people or animals, inferring motivations, and predicting future behavior? They may not have conceived of it in formal, logical terms, but it all seems more complex than just mimicry.

    • Sure, but I guess the question is what proportion of those in said societies were doing the problem solving and what proportion were following? I don’t know the answer, of course, but I’d guess a small minority were leading and the majority were going along for the ride.

    • Chuck, the claim is that the brain was and still is oriented towards imitation over establishing facts from first principles. “Oriented towards” is much different from “destined to” or “limited to”. That claim does not say people lacked observation skills or creative potential, or that they could not make cause-and-effect links.

      I still stand by this: the world spent $503 billion on advertising last year – how many of those ads featured logical arguments? How many featured celebrity endorsements (a subtle invitation to imitate)? If logic were a powerful, primary force, why not appeal to it?

      Secondly, ‘refining’ and imitation are not that much different. Adding jalapenos to hummus and calling it jalapeno hummus is refining, yes; one can focus on the creative twist, or one can also see the imitation at the root of it.

      There is a lot going on in imitation, or in getting in sync with each other, that is not conscious: why do British speakers of English speak with a British accent, Americans with their American accent, Australians with theirs? No one has to.

  11. I found a talk Leonard Susskind did on Richard Feynman – https://www.youtube.com/watch?v=hpjwotips7E. He concludes his talk by saying: “how should we honor Richard Feynman? By getting as much bologna out of our sandwiches as we can.” Even though the low-carb community might argue that we should get as much sandwich out of our bologna as we can, I thought it was a decent metaphor, and there were some interesting anecdotes.

  12. Hunting 101

    I’m quite the hunter and being quite the hunter – I plan my hunts – I plan like a madman – crazed – sniffing the air for the slightest tinge of prey –

    But first I look at all the ads in the paper (all the stores I go to send me their weekly specials) – this is how a real hunter does it

    Then I saddle up my 88 Chevy Celebrity and I ride like wind – ya – I ride like the wind knowing that – that Penzoil Ultra Platinum will keep my pony going –

    By the time I stroll to the store door I’m in a killing frenzy – nothing can stop my pure bloodlust – so I stealthily enter and head to the lettuce aisle where I bag 6 heads of lettuce at 2 for a dollar –

    It was a good hunt – I head home and feast on my lettuce dipped in Valentino’s hot sauce or mustard or most likely – both

    I live for these hunts – riding like the wind in my 88 Chevy Celebrity with the Penzoil Ultra Platinum and my high quality Chevron gas keeping me going –

    Being a hunter – No – being thee Hunter I know bad gas and bad oil are for lesser mortals –

    Pretty soon the urge to hunt again will come and while I’m at it – I’ll stop at the Tanning Booth: not only am I a great hunter – No – thee Hunter – I’m also tanned up – and head on down to Winco and get my Baker’s Chocolate and Grapefruit Diet Pop – and then ride like the wind – ya – I ride like the wind in my 88 Chevy Celebrity with Penzoil Ultra Platinum and Chevron gas in it – till home – where the gas furnace with the cover door off lets me stare at the camp fire (the pilot light) –

  13. Peter,
    I understand that each person is an individual, so declarations of specific amounts of carbs, proteins, and fats are not to be taken as universal law, but is there a ratio of fat to protein to carbs that should be followed to reach and maintain ketosis? In one of your posts I read that more than 150 grams of protein was too much and 50 grams of carbs was too much, but I have not been able to find a breakdown of intake that includes fat intake. Could you also talk about sources of fat that should be used as well as those that need to be avoided? (I think the plant-based ones are out of bounds, but I have not found a list that says where olive oil should be in a diet.)

    Thanks

    JJustice

  14. Dear Peter,
    My scientific curiosity has been sparked by a new lecture on YouTube by Dr. Douglas C. Wallace called “A Mitochondrial Etiology of Metabolic and Degenerative Diseases, Cancer and Aging”. (Sorry, don’t know how to put in a link!)
    There is lots of engaging observational scientific information which I think could be very relevant for fans of your approach to nutrition. Most intriguing of all is a quick reference Dr Wallace makes to ketogenic diet during the Q & A.
    I only stumbled on the lecture because my interest in metabolic problems is personal; I am delighted this blog gives me a chance to recommend such an incredibly academic lecture with big-picture thinking to like-minded people.

  15. Good post.
    My recommendation: building on Bastiat, I highly recommend Human Action by Ludwig Von Mises.

  16. Hi Peter, great post as usual. While we are on the topic of critical thinking, there’s a fundamental premise in this discussion on health for which I haven’t found much evidence. I wonder if you have stopped to ponder the science behind the “healthy” weight. There are people with heart disease, diabetes, cancer, etc. who are at a “healthy” or “normal” medically defined weight. Further cause for pause is the 2013 study in JAMA suggesting that being slightly overweight actually reduces your mortality risk. Perhaps a certain amount of fat storage is a good thing when it comes to health? Or perhaps consuming fat, even when not in ketosis, has some health benefits? The body certainly resists losing those final 10-15 pounds much more than the initial 10-15 pounds in an obese/overweight individual.
    p.s. My dad got me reading Feynman growing up, his way of thinking outside the box has always stuck with me.

    • Hi Caitlin,
      are a few of us healthier because of being overweight, or are a few of us leaner because of illness?

      This is a letter (by Walter C. Willett, MD et al) to the editor of the JAMA article:

      To the Editor: In their meta-analysis of BMI and mortality “to inform decision making in the clinical setting,” Dr Flegal and colleagues¹ found that mortality was not increased up to a BMI of less than 35.

      We believe their study is flawed. Their comparison group (BMI of 18.5-<25) contains persons who are lean and active, heavy smokers, frail and elderly, and seriously ill with weight loss due to their disease, as well as Asian populations historically undernourished and burdened by infectious diseases.

      In my opinion, if the flaw is real, that study shouldn't have been published.

    • Hi Peter,
      the authors say that “overweight was associated with significantly lower all-cause mortality” (association) but Caitlin wrote that “being slightly overweight actually reduces your mortality risk” (causality).

      Maybe I am wrong about that, but the authors may be right about the association while causality has not been proven (maybe people with an increased mortality risk tend to have a lower weight, and not the other way around). The letter to the editor I quoted may be interpreted in that way: if the authors of the JAMA article included in the low-BMI group people “seriously ill with weight loss due to their disease, as well as Asian populations historically undernourished and burdened by infectious diseases”, what you would see is not that being overweight is protective but instead that ill people tend to weigh less than healthy people.

      Moreover, ill people contribute to the “being lean is bad” account, while bodybuilders (high BMI, and maybe healthier than the standard guys) contribute to the “being overweight is good” account. A lot of confusion that doesn’t lead me to believe that being overweight is good (causality), not even when we talk about a slight overweight. Does the study really tell us anything?

      (please excuse my english as my mother tongue is spanish and not english)

    • Vicente, thank you for the additional info on the JAMA study. My point was not that there was causation but that we use this biomarker too heavily. Thinking outside the finer points of the study, it doesn’t change the fact that a slender, non-smoking marathon runner in his early 40s can have a heart attack, or that Dwight Howard, not only a person of “healthy” weight but a pro athlete, can be prediabetic (https://sports.yahoo.com/blogs/nba-ball-dont-lie/prior-doctor-intervention-dwight-howard-eating-equivalent-24-215009519–nba.html). We tend to think in terms of “if some is good, then more is better,” especially when it comes to losing weight. Where did this concept of a “healthy,” “normal” weight come from? It strikes me as interesting that someone like Howard could be viewed as healthier (super athlete!) than someone who was slightly overweight but with no other biomarkers of disease. My main thought is that sometimes our public discussion on health focuses too much on weight loss and not enough on what is actually beneficial for the body.

    • Hi Caitlin,
      from reading “Wheat Belly” I got the idea that visceral fat tissue was a real danger to our health. Maybe there is not a specific section in the book related to visceral fat, but I think the message was here and there.

      I have the feeling a lot of people assume being fat is not that bad because they think it is impossible to change their weight: they do the officially-right thing and they gain weight. When you see that your goal is impossible, you change your goal: “I can’t lose weight, therefore being fat is not that bad”.

      But, as I see the world, nobody really wants to be fat. Am I wrong? When I see fat people I don’t think it’s their choice, I think they are told to eat the wrong way.

      For me it is OK to tell people: “try to be lean because it is good for your health. There is a real chance for you: forget about carbs (except vegetables) and processed foods. But if eating right doesn’t work for you, then sure, you should try to be happy with what you have”.

    • I think that’s the interesting thing about the ketogenic diet. It’s not the traditional “eating right” diet. You do cut out a lot of obviously ‘bad’ food, but you also eat a LOT of fat. Personally, I did lose weight, but I also felt better all around: higher energy levels, better sleep, clearer thoughts. I did not do the ketogenic diet strictly to lose weight, but to curb a B12 deficiency that was severely impacting my memory and mood. Previously, I ate “healthy” by mainstream standards (low-fat, lots of fruits and veggies) but I was really struggling from a true “health” perspective. Sometimes, I think the overweight are the lucky ones because they have a glaring reason to search for answers. Should someone who’s overweight reach for a cigarette to control their appetite? Is being a “healthy” weight desirable at all costs? I think not… and in the case of ketosis, we still don’t know a lot about it. Is it best to be as skinny as possible by being in ketosis continuously? Maybe. Maybe not. My point is, weight is only one biomarker. We don’t know for sure what the long-term cause-and-effect is of merely being slender, but we do know that fat, as a macronutrient, is great for the body and the brain. I think about the people without weight as a biomarker and see how they struggle with health… what if ketosis solves a lot more health problems than just obesity?

  17. This is an especially thoughtful essay which has some fascinating overlap with classic contrarian philosophy. Thank you for your insights!

  18. Nitpick: the chart likely misplaces the appearance of language. You say that the best evidence suggests that language first arose “about 50,000 years ago”, but the source you linked to says that fully developed language was in place by “at least 50,000 years ago”, and possibly much earlier. The time of the first thing that the article describes as a possible sign of language use was 2.4 million years ago.

    • Alex, I agree with you. I remember something from physical anthropology that the anatomy of homo sapiens shows adaptation for speech. 50,000 years even for “fully developed language” sounds way off, not even an educated guess.

  19. I was actually just wondering about this issue. Since about the age of 14, I’ve noticed I seemed to approach things much differently than others. I think genetically I am predisposed to be a critical thinker, but a point
    I would like to raise is reading. I’ve heard that the human brain, in the process of reading a story, will actually associate itself with what is happening – basically, in a way, your mind is shaped just as though the story happened to you. Not sure if there is any scientific research to back that up.

    While in itself not relevant, what happens when you step into the shoes of a critical thinker through story? Coincidentally, the age of 14 is when I really got into reading, specifically books that usually had a very observant but neutrally biased major character. Books like Stranger in a Strange Land (an incredibly interesting book), Vertical Run, Robinson Crusoe, etc.

  20. Relative Risk (mathematical puke)
    There are a couple things about this post that bother me –

    I do not care for the idea that I evolved from a monkey – I don’t like monkeys – they’re ugly – they’re stupid – they have tiny penises and for some unfathomable reason scientists like to study them

    on the other hand I’m okay with evolving from a horse – I’m okay with being hung like a horse – but hey – that’s just me and for that matter every other male on the planet who’s fine being a male –

    Aside from these distractions – something else is on my mind –

    and it’s Relative Risk or Mathematical Crap – because that is what it is – it has zero useful function – zero reason for existing –

    and yet it is used in almost every scientific study done – why – why – Why for God’s sake – Why?

    Because apparently scientists who do studies are really Idiots –

    So if smart people cannot Think Critically – where does that leave the rest of us?

    • we did not evolve from monkeys! they are a completely separate branch in the primate family tree!!! Argh.
