January 10, 2018

Understanding science

Why we’re not wired to think scientifically (and what can be done about it)

What is it about being human that conflicts with being scientific?

Read Time 8 minutes

Author’s note: This post was originally published in May 2014. It has been updated to reflect my current thinking on the topic. Perhaps the best addition, by popular demand, is Rik’s coffee recipe (see footnote 1).

§

In 2012, I was having dinner with a good friend, Rik Ganju, who is one of the smartest people I know. And one of the most talented, too—a brilliant engineer, a savant-like jazz musician, a comedic writer, and he makes the best coffee I’ve ever had.[1] I was whining to him about my frustration with what I perceived to be a lack of scientific literacy among people from whom I “expected more.” Why was it, I asked, that a reporter at a top-flight newspaper couldn’t understand the limitations of a study he was reporting on? Are they deliberately trying to mislead people, or do they really think this study, which showed an association between such-and-such, somehow implies X?

[1] Here is the coffee recipe, courtesy of Rik. I make this often and the typical response is, “Why are you not making this for a living?” Look for Vietnamese cinnamon, also known as Saigon cinnamon; you need two big dashes, if that. You need real vanilla (be careful to avoid the cheap versions with added sugar). Best is vanilla dissolved in ethanol; if that doesn’t work for you, get the dried stick and scrape the pods. Then find a spice store and get chicory root (I’m a bit lazy and get mine on Amazon). You’ll want to replace coffee beans with ~10% chicory on a dry-weight basis. If you’re on a budget, cut your coffee with Trader Joe’s organic Bolivian, but do use at least 50% of your favorite coffee by dry weight: 50-40-10 (50% your favorite, 40% TJ Bolivian, 10% other ingredients [chicory root, cinnamon, vanilla, amaretto for an evening coffee]) is a good mix to start. Let it sit in a French press for 6 minutes, then drink it straight or with cream, but very little (max 1 tablespoon). The Rik original was made with “Ether” from Philz as the base.

Rik just looked at me, kind of smiled, and asked the question in another way. “Peter, give me one good reason why scientific process, rigorous logic, and rational thought should be innate to our species?” I didn’t have an answer. So as I proceeded to eat my curry, Rik expanded on this idea. He offered two theses. One, the human brain is oriented to pleasure ahead of logic and reason; two, the human brain is oriented to imitation ahead of logic and reason. What follows is my attempt to reiterate the ideas we discussed that night, focusing on the second of Rik’s postulates—namely, that our brains are oriented to imitate rather than to reason from first principles or think scientifically.

One point before jumping in: This post is not meant to be disparaging to those who don’t think scientifically. Rather, it’s meant to offer a plausible explanation. If nothing else, it’s a way for me to capture an important lesson I need to remember in my own journey through life. I’m positive some will find a way to be offended by this, which is rarely my intention in writing, but nevertheless I think there is something to learn in telling this story.

The evolution of thinking

Two billion years ago, we were just cells acquiring a nucleus. A good first step, I suppose. Two million years ago, we left the trees for caves. Two hundred thousand years ago we became modern man. No one can say exactly when language arrived, because its arrival left no artifacts, but the best available science suggests it showed up about 50,000 years ago.

I wanted to plot the major milestones, below, on a graph. But even using a log scale, it’s almost unreadable. The information is easier to see in this table:

Approximate time         Milestone
2 billion years ago      Cells acquire a nucleus
2 million years ago      Our ancestors leave the trees for caves
200,000 years ago        Anatomically modern humans
~50,000 years ago        Language
2,500 years ago          Formal logic (Aristotle)
~1,000 years ago         Clinical trials codified (The Canon of Medicine, 1025 AD)
400 years ago            Scientific method (Francis Bacon); the Royal Society soon after
18th century             Randomization between groups (James Lind, scurvy)
~100 years ago           Statistical significance (Ronald Fisher, the p-value)

Formal logic arrived with Aristotle 2,500 years ago; the scientific method was pioneered by Francis Bacon 400 years ago. Shortly following the codification of the scientific method—which defined exactly what “good” science meant—the Royal Society of London for Improving Natural Knowledge was formed. So, not only did we know what “good” science was, but we had an organization that expanded its application (including peer review) and existed to continually ask the question, “Is this good science?”

While the Old Testament makes reference to the earliest clinical trial—observing what happened to those who did or did not partake of the “King’s meat”—the process was codified further by 1025 AD in The Canon of Medicine, and formalized in the 18th century by James Lind, the Scottish physician who discovered, using randomization between groups, the curative properties of oranges and lemons—vitamin C, actually—in treating sailors with scurvy. Hence the expression, “Limey.”

The concept of statistical significance is barely 100 years old, thanks to Ronald Fisher, the British statistician who popularized the use of the p-value and proposed the limits of chance versus significance.
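
To make “statistical significance” a bit more concrete, here is a minimal sketch (in Python; the groups, sizes, and numbers are invented for illustration and are not from any study mentioned here) of the logic behind a p-value, using a simple permutation test (a close cousin of Fisher’s methods): shuffle the group labels many times and ask how often chance alone produces a difference at least as large as the one actually observed.

```python
import random

# Hypothetical example (not from the post): a "treatment" group and a "control"
# group of 20 subjects each, with slightly different true means. We ask: if the
# group labels were meaningless, how often would shuffling them produce a mean
# difference at least as large as the one we observed? That frequency is the
# (permutation) p-value.

random.seed(42)
treated = [random.gauss(7.1, 1.5) for _ in range(20)]
control = [random.gauss(6.2, 1.5) for _ in range(20)]

observed_diff = sum(treated) / len(treated) - sum(control) / len(control)

pooled = treated + control
extreme = 0
trials = 10_000
for _ in range(trials):
    random.shuffle(pooled)                  # pretend the labels carry no information
    diff = sum(pooled[:20]) / 20 - sum(pooled[20:]) / 20
    if abs(diff) >= abs(observed_diff):     # a difference as extreme as the real one
        extreme += 1

p_value = extreme / trials
print(f"observed difference: {observed_diff:.2f}")
print(f"permutation p-value: {p_value:.4f}")
```

A p-value near Fisher’s conventional 0.05 cutoff says only that a difference this large would be unusual under pure chance; it says nothing, by itself, about whether the study was well designed.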

The art of imitation

Consider that for 2 million years we have been evolving—making decisions, surviving, and interacting—but for only the last 2,500 years (0.125% of that time) have we had “access” to formal logic, and for only 400 years (0.02% of that time) have we had “access” to scientific reason and understanding of scientific methodologies.
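
Those percentages are just simple ratios of the round numbers in the text; here is a quick back-of-the-envelope check (a sketch in Python, using the same figures):

```python
# Approximate spans taken from the paragraphs above.
human_span_years = 2_000_000       # "for 2 million years we have been evolving"
formal_logic_years = 2_500         # since Aristotle
scientific_method_years = 400      # since Francis Bacon

print(f"formal logic:      {formal_logic_years / human_span_years:.3%}")          # 0.125%
print(f"scientific method: {scientific_method_years / human_span_years:.3%}")     # 0.020%
print(f"time without it:   {1 - scientific_method_years / human_span_years:.2%}") # 99.98%
```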

Whatever a person was doing before modern science—however clever it may have been—it wasn’t actually science. In the same vein, how many people were practicing logical thinking before logic itself was invented? Perhaps some were doing so prior to Aristotle, but certainly it was rare compared to the time following its codification.

Options for problem-solving are limited to the tools available. The arrival of logic was a major tool. So, too, was the arrival of the scientific method, clinical trials, and statistical analyses. Yet for the first 99.98% of our existence on this planet as humans—literally—we had to rely on other options—other tools, if you will—for solving problems and making decisions.

So what were they?

We can make educated guesses. If it’s 3,000 BC and your tribemate Ugg never gets sick, all you can do to try to not get sick is hang out where he hangs out, wear similar colors, drink from the same well—replicate his every move. You are not going to figure out anything from first principles because that isn’t an option, any more than traveling by jet across the Pacific Ocean was an option. Nothing is an option until it has been invented.

So we’ve had millions of years to evolve and refine the practice of:

Step 1: Identify a positive trait (e.g., access to food, access to mates),

Step 2: Mimic the behaviors of those possessing the trait(s),

Step 3: Repeat.

Yet, we’ve only had a minute fraction of that time to learn how to apply formal logic and scientific reason to our decision making and problem solving. In other words, evolution has hardwired us to be followers, copycats if you will, so we must go very far out of our way to unlearn those inborn (and highly refined) instincts if we want to think logically and scientifically.

Recently, neuroscientists (thanks to the advent of functional MRI, or fMRI) have been asking questions about the impact of independent thinking (something I think we would all agree is “healthy”) on brain activity. I think this body of research is still in its infancy, but the results are suggestive, if not provocative.

To quote the authors of this work, “if social conformity resulted from conscious decision-making, this would be associated with functional changes in prefrontal cortex, whereas if social conformity was more perceptually based, then activity changes would be seen in occipital and parietal regions.” Their study suggested that non-conformity produced an associated “pain of independence.” In the study subjects, the amygdala became most active in times of non-conformity, suggesting that non-conformity—doing exactly what we didn’t evolve to do—produced emotional distress.

From an evolutionary perspective, of course, this makes sense. I don’t know enough neuroscience to agree with their suggestion that this phenomenon should be titled the “pain of independence,” but the “emotional discomfort” from being different—i.e., not following or conforming—seems to be evolutionarily embedded in our brains.

Good, solid thinking is really hard to do, as you no doubt realize. How much easier is it to economize on all this and just “copy & paste” what seemingly successful people are doing? Furthermore, we may be wired to experience emotional distress when we don’t copy our neighbor! And while there may have been only 2 or 3 Uggs in our tribe 5,000 years ago, as our societies evolved, so too did the number of potential Uggs (those worth mimicking). This would be great (more potential good examples to mirror) if we were naturally good at thinking logically and scientifically, but we’ve already established that’s not the case. Amplifying this problem even further, the explosion of mass media has made it virtually, if not entirely, impossible to identify those truly worth mimicking versus those who are charlatans, or simply lucky. Maybe it’s not so surprising that the one group of people we’d all hope could think critically—politicians—seems to be as useless at it as the rest of us.

So we have two problems:

  1. We are not genetically equipped to think logically or scientifically; such thinking is a very recent tool of our species that must be learned and that must, with great effort, “overwrite” our genetic tendencies. Furthermore, it’s likely that we are programmed to identify and replicate the behavior of others, rather than think independently, and independent thought may actually cause emotional distress.
  2. The signal-to-noise ratio (truly valuable behaviors worth mimicking versus all the unworthy ones) is so low—virtually zero—today that the folks who have not been able to “overwrite” their genetic tendency for problem-solving are doomed to confusion and likely poor decision making.

As I alluded to at the outset of this post, I often find myself getting frustrated at the lack of scientific literacy and independent, critical thought in the media and in the public arena more broadly. But is this any different than being upset that Monarch butterflies are black and orange rather than yellow and red? Marcus Aurelius reminds us that you must not be surprised by buffoonery from buffoons: “You might as well resent a fig tree for secreting juice.”

While I’m not at all suggesting people unable to think scientifically or logically are buffoons, I am suggesting that expecting this kind of thinking as the default behavior from people is tantamount to expecting rhinoceroses not to charge or dogs not to bark—sure, it can be taught with great patience and pain, but it won’t happen quickly.

Furthermore, I am not suggesting that anyone who disagrees with my views or my interpretations of data frustrates me. I have countless interactions with folks whom I respect greatly but who interpret data differently from me. This is not the point I am making, and these are not the experiences that frustrate me. Healthy debate is a wonderful contributor to scientific advancement. Blogging probably isn’t. My point is that critical thought, logical analysis, and an understanding of the scientific method are completely foreign to us, and if we want to possess these skills, it requires deliberate action and time.

What can we do about it?

I’ve suggested that we aren’t wired to be good critical thinkers, and that this poses problems when it comes to our modern lives. The just-follow-your-peers-or-the-media-or-whatever-seems-to-work approach simply isn’t good enough anymore.

But is there a way to overcome this?

I don’t have a “global” (i.e., how to fix the world) solution for this problem, but the “local” (i.e., individual) solution is quite simple provided one feature is in place: a desire to learn. I consider myself scientifically literate. Sure, I may never become one-tenth a Richard Feynman, but I “get it” when it comes to understanding the scientific method, logic, and reason. Why? I certainly wasn’t born this way. Nor did medical school do a particularly great job of teaching it. I was, however, very lucky to be mentored by a brilliant scientist, Steve Rosenberg, both in medical school and during my post-doctoral fellowship. Whatever I have learned about thinking scientifically I learned from him initially, and eventually from many other influential thinkers. And I’m still learning, obviously. In other words, I was mentored in this way of thinking just as every other person I know who thinks this way was also mentored. One of my favorite questions when I’m talking with (good) scientists is to ask them who mentored them in their evolution of critical thinking.

Relevant aside: Take a few minutes to watch Feynman at his finest in this video—the entire video is remarkable, especially the point about “proof”—but the first minute is priceless and a spot-on explanation of how experimental science should work.

You may ask, is learning to think critically any different than learning to play an instrument? Learning a new language? Learning to be mindful? Learning a physical skill like tennis? I don’t think so. Sure, some folks may be predisposed to be better than others, even with equal training, but virtually anyone can get “good enough” at a skill if they want to put the effort in. The reason I can’t play golf is because I don’t want to, not because I lack some ability to learn it.

If you’re reading this, and you’re saying to yourself that you want to increase your mastery of critical thinking, I promise you this much—you can do it if you’re willing to do the following:

  1. Start reading (see starter list, below).
  2. Whenever confronted with a piece of media claiming to report on a scientific finding, read both the actual study and the media coverage, in that order. See if you can spot the mistakes in the reporting.
  3. Find other like-minded folks to discuss scientific studies. I’m sure you’re rolling your eyes at the idea of a “journal club,” but it doesn’t need to be that formal at all (though years of formal weekly journal clubs did teach me a lot). You just need a good group of peers who share your appetite for sharpening their critical thinking skills. In fact, we have a regularly occurring journal club on this site (starting in January, 2018).

I look forward to seeing the comments on this post, as I suspect many of you will have excellent suggestions for reading materials for those of us who want to get better in our critical thinking and reasoning. I’ll start the list with a few of my favorites, in no particular order:

  1. Anything by Richard Feynman (In college and med school, I would not date a girl unless she agreed to read “Surely You’re Joking, Mr. Feynman”)
  2. The Transformed Cell, by Steve Rosenberg
  3. Anything by Karl Popper
  4. Anything by Frederic Bastiat
  5. Bad Science, by Gary Taubes
  6. The Structure of Scientific Revolutions, by Thomas Kuhn
  7. Risk, Chance, and Causation, by Michael Bracken
  8. Mistakes Were Made (but not by me), by Carol Tavris and Elliot Aronson
  9. Thinking, Fast and Slow, by Daniel Kahneman
  10. “The Method of Multiple Working Hypotheses,” by T.C. Chamberlin

I’m looking forward to other recommendations.

Disclaimer: This blog is for general informational purposes only and does not constitute the practice of medicine, nursing or other professional health care services, including the giving of medical advice, and no doctor/patient relationship is formed. The use of information on this blog or materials linked from this blog is at the user's own risk. The content of this blog is not intended to be a substitute for professional medical advice, diagnosis, or treatment. Users should not disregard, or delay in obtaining, medical advice for any medical condition they may have, and should seek the assistance of their health care professionals for any such conditions.

295 Comments

  1. Very interesting! Thanks for the list and links within the article – a lot of reading to do! And please post the coffee recipe 🙂

    • Here is the coffee recipe, courtesy Rik. I make this often and the typical response is, “Why are you not making this for a living?”

      • Look for Vietnamese cinnamon, also known as Saigon cinnamon; you need two big dashes if that.
      • You need real vanilla (be careful there is no added sugar). Best is dissolved in ethanol; if that doesn’t work for you get the dried stick and scrape the pods.
      • Then find a spice store and get chicory. Amazon also sells.
      • You’ll want to replace coffee beans with ~10% chicory on a dry weight basis.
      • Then cut your coffee with Trader Joe’s organic Bolivian. But do use at least 50% of your favorite coffee by dry weight. 50-40-10 (50% Ether, 40% TJ Bolivian, 10% other ingredients [cinnamon, vanilla, amaretto for an evening coffee]) would be a good mix to start.
      • Let it sit in a French press for 6 min then drink straight or with cream, but very little–max is 1 tablespoon of cream.
      • Best base coffee is “Ether” from Philz

    • Does the list of resources contain a good practical book or paper on interpreting stats? Misinterpreting stats is a huge part of the problem. If not, might you recommend one? Grad school is a faint blur. Also, you can order Café du Monde Coffee and Chicory blend online from the New Orleans establishment by the same name. It works well in a Vietnamese coffee maker.

    • I will supplement with a suggestion that you look into home roasting for your beans. The quality is unbelievably good.

      I purchased a roaster for ~$200 a year ago and I haven’t drunk coffee more than two days “stale” since. Most cities have a few coffee shops that sell green coffee beans. Greens are typically less than half the price per weight of roasted beans. Even with the ~30% of weight lost during roasting, you’re still paying $7-$10 per pound for the end product. Roast in small batches so it’s always fresh. Roasting takes 10-14 minutes.

      • I’ve been noodling this for a while, also on the rec of a friend. I may need to try it. One more thing to obsess over…great.

    • A cheap way to try roasting coffee at home is with a hot air popcorn popper. Buy some green coffee online and throw it in there and roast to desired level. It is interesting because it does pop (first and second crack) which helps you figure out how roasted it is.

  2. A few that leap to mind:

    “The Art of Scientific Investigation,” by WIB Beveridge.

    “An Introduction to the Study of Experimental Medicine,” by Claude Bernard.

    “Do We Really Know What Makes Us Healthy?” by Gary Taubes in the New York Times Sunday Magazine.

  3. Excellent piece! I’ve been thinking about this a lot, especially the part about how there’s nothing ABnormal about people who have not learned to think critically. I run into a lot of blather (from all sides of the nutrition wars) wondering how people can be so gullible, as though being a free-thinking iconoclast was the natural state. It isn’t.

    I’ve also run across a lot of this:

    “Smart people believe weird things because they are skilled at defending beliefs they arrived at for non-smart reasons.” – Michael Shermer

  4. A few more:

    “Strong Inference,” an article by JR Platt.

    “Mistakes Were Made (But Not by Me),” by Elliot Aronson and Carol Tavris.

    “Fooled by Randomness,” “The Black Swan,” and “Antifragile,” by Nassim Nicholas Taleb.

  5. Another thing we need to realize about humans and science is that the idea that scientists do science by following the scientific method is a myth. Gary Taubes often quotes John Ziman, who said that in physics, textbook science may be about 90 percent right, whereas the primary literature is probably 90 percent wrong. I researched and wrote about how science gets done in the real world. https://gpinzone.blogspot.com/2014/05/the-myth-of-scientific-method.html

    • I agree that a lot of published science is incorrect–how much is up for debate–but I don’t think this fact implies the scientific method is broken. Rather, the current incentives for science are. Getting grants, peer review, publication, etc., is more likely the problem, not the ideas conceived by Bacon and others.

  6. For more on why it’s difficult to think in a scientific manner and why humans (even scientists) are terrible intuitive statisticians, I highly recommend Daniel Kahneman’s amazing book, Thinking, Fast and Slow. The fast and slow types of thinking he refers to are the intuitive leaps we tend to make vs. the slower, more measured thinking we do when deliberately conducting an experiment (which we are generally loath to do, being lazy).

    • Beat me to it with “Thinking Fast and Slow”, but I’ll double the recommendation for you. I will admit to being a little impatient while reading that book at Kahneman’s consistent bias toward System 2 — both types of thinking are necessary; catching yourself using one of the two where the other would be more appropriate is a very real, and useful, skill of self-reflection. It’s also damned hard to do!

      Another comment about what passes for science much of the time — in Chapter 16 of “Brave New World” Aldous Huxley gives Mustapha Mond an almost throwaway line that is a real gem:

      “I was a pretty good physicist in my time. Too good–good enough to realize that all our science is just a cookery book, with an orthodox theory of cooking that nobody’s allowed to question, and a list of recipes that mustn’t be added to except by special permission from the head cook. I’m the head cook now. But I was an inquisitive young scullion once. I started doing a bit of cooking on my own. Unorthodox cooking, illicit cooking. A bit of real science, in fact.” He was silent.

    • I second Kahneman’s Thinking Fast & Slow. You can’t “educate” yourself out of your innate & predictable irrationalities, because System 1 (limbic system) is indeed more resilient and active than System 2 (lateral prefrontal cortex and posterior parietal cortex), which is quite lazy and tires quickly. Everyone will, when faced with a tough System 2 problem, leap to what Kahneman calls “substitution” – that is, unconsciously switching the problem to something System 1 can handle – and all you can do is try to back yourself up to engage System 2.

      Except in Kahneman’s experiments even trained experts, statisticians, & logicians often failed to notice when they had “substituted.” It’s a very difficult problem, to go back and force yourself to engage System 2. You need to devise a really robust process to help you remember that you are likely substituting and then to go back to the decision and try to circumvent that.

      The truth of Kahneman’s experiments has been demonstrated experimentally many times, and fMRI has shown the situation in action! A very nice such demo with lovely pix can be found in the classic paper, “Separate Neural Systems Value Immediate and Delayed Monetary Rewards,” by McClure et al., in Science, Oct. 15, 2004, pp. 503-7.

    • Peter, Colin refers to a different Huxley. ‘Darwin’s Bulldog’ was Sir Thomas Huxley, not Aldous of “BNW” fame. Or perhaps I was misreading a bit of humor on your part?

    • Hi Anu,

      I highly second Thinking, Fast and Slow.

      By the way, are you from the east bay, CA?

      Henry
      Oakland

  7. On your list of major milestones you did not include the development of agriculture 10-12,000 years ago. You may want to reflect on why you did not include that.

    • I consider that a technical achievement more than a “thinking” achievement. Also, the real breakthrough in agriculture was 1940s to 1950s for reasons we can discuss another time.

  8. Peter, when you speak of the small percentage of naturally lean people who can eat seemingly anything, are you aware of the growing problem referred to as TOFI (thin outside, fat inside)? Dr. Lustig has some data on this and the percentage (which I don’t have in front of me) is really shocking! He says there isn’t currently a good understanding of how it happens that many folks display little subcutaneous fat accumulation but have very significant and unhealthy visceral fat nonetheless. As a fitness and nutrition trainer, I don’t rely solely on visual or even caliper measurements any longer. I advise “clean,” relatively low-carb, low-sugar eating for everyone over 40 (and preferably even for younger).

    • Yes, and I was going to make this caveat, but figured it would only complicate an already cumbersome point! But I’m glad you brought it up, since it’s important for folks to know.

    • Saw this post a few days ago:

      “A lot of people are convinced that if they are naturally slim they are naturally healthy, unfortunately their fat is inside, in and around their organs.”

      Was drunk and in the mood for an online rant response:

      “More than that, insulin resistance affects other organs – especially the brain – but you won’t really notice it for decades.

      People who get fat are lucky, they have a vanity and outward facing health reason to tackle poor diet.

      Thin folk who eat junk tend to find their moods terrible, energy peak and trough, anger and depression, then as 40-50s approach rapidly declining health that requires medication and treatment that takes up huge amounts of time and financial resources.

      But hey, at least they’re thin.”

  9. Bad Science is by the great Ben Goldacre; Taubes wrote Good Calories, Bad Calories. Feynman has written too much of great importance to recommend just one item!

    • Ben Goldacre has written a book called Bad Pharma and had a blog called Bad Science. Taubes has written a book called “Bad Science: The Short Life and Weird Times of Cold Fusion”.

    • Just to clear up the confusion, Gary Taubes and Ben Goldacre both wrote books called Bad Science (different sub-titles), and both are very much worth reading. Goldacre followed his book with Bad Pharma and he has a blog called Bad Science.

  10. Hi Peter,
    the problem I see is that, even if we wanted to, very often we (ordinary people) lack the background to understand the implications of what we read. I don’t see myself as a moron (I have an engineering Ph.D.), but I can’t think critically if I don’t understand what I read. Therefore, in nutrition, for me the most important readings are critical analyses (books, blogs, etc.) from people I trust.

    So my point is that in this field, at least for me, trust is an important factor (not quite scientific, I know, more like a hunch), and critical thinking is severely limited without experts who pre-digest the information. And then it is limited by those experts.

    • I don’t know, Vicente, a PhD in EE isn’t very “ordinary” in my book! I think you can figure out any of this. But more importantly, for those without PhDs, it gets to the community thing. You read a paper. Your buddy reads a paper (not the blog to start, the actual paper). Start with the methods section. Then look at the figures. Then go back and read from end to end. After a while you’ll see there are a few things that pique your interest with respect to the rigor of the experiment.
      Once you do this, I think you’ll be less trusting of everyone (me included!), when you read their interpretation.

  11. Great insights, Peter!

    However, even within the scientific community, where the scientific method and critical thinking are fundamental to its endeavors, there is still a general reluctance to embrace findings that contradict established paradigms, even when they are pretty conclusive. Furthermore, many subjects of research are excluded from the mainstream scientific community simply because they don’t fit with accepted worldviews, and are dismissed as pseudoscience not because of the methodology, but because of the subject of study.

    My question is, does practicing critical thinking and the scientific method really allow us to break free from our conformist nature, or could there be other factors at play?

    PS, not about nutrition, but pretty convincing research that is an example of what I’m talking about: (https://www.youtube.com/watch?v=_u7RqklxNnA).

    • Agree with your concerns Jack. I think the problem you describe is really the “herd mentality” issue which I very loosely touch on with that fMRI study–the so-called “pain of independent thought.”

  12. I somehow get the impression that the medical sciences are riddled with people who think that their world view is more important than evidence. Nutritionists seem to enjoy telling people what to do, and so many times I, as an educated layman, cannot believe what some of these people are trying to tell me on TV shows, for example (laws of thermodynamics, physical activity, the lipid hypothesis, etc.).
    I think we should all try to think more like children, with a will to honest inquiry and a sense of curiosity (I think you have this).
    We should be concerned not about what the truth is, but that it is the truth.

    I recommend this article in The Atlantic about bad medical science:
    https://www.theatlantic.com/magazine/archive/2010/11/lies-damned-lies-and-medical-science/308269/

    • Good points, Al. I’d add one more quality we should aspire to (which is really hard sometimes): Saying “I don’t know” when we don’t.

  13. https://www.amazon.com/dp/B00IX279SI

    The Objective Bible: WESTERN CIVILIZATION’S STRUGGLE for PHILOSOPHIC LIBERATION from a Herd-Mentality and Pagan Mysticism

    The Objective Bible investigates the entire sweep of Western philosophy to display the pattern of mystical dualism that has lured man throughout the ages into a trance of group-think. This research examines the evolution of Western civilization by integrating different aspects of the humanities: social psychology, philosophy, and history. The result is a comprehensive illumination of the disparity between the unconscious, herding instinct that regulates social mood and the individual’s cognitive mind. Unveiling the role of these forces in the cultural evolution of the West sheds important light on the worst atrocities of the twentieth-century.
    Even more significantly, comprehending the operative forces of the communal dynamics of destruction equips readers to critically assess what is currently besetting societies in order to not repeat history.
    In ancient polytheistic religions, numerous chaotic, supernatural forces were believed to manipulate the otherwise stagnant physical world. Defying this primeval mysticism, the monotheistic revolution of Abraham launched the radical cosmology of a natural, orderly, and lawful universe guided by a benevolent Creator. The belief in one, transcendent God affirms the fundamental uniformity of the natural laws that govern the universe. Through these laws, the Creator directs the evolution and progress of creation, generating meaning and purpose—ultimately through the emergence of the ethical man. God created man in his image and likeness, so that his conscious mind would bring a moral meaning to nature.
    The liberty and objectivity inherent in monotheistic ethics is under assault—as much now as it has been throughout the ages. In Western philosophy, this biblical worldview has been gradually neglected, denigrated, and even abandoned since the Enlightenment. The results of this are evident in countless universities, where postmodernists today systematically attack the rational and ethical foundations of the West with moral relativism and social subjectivism, fueling a herd mentality and the irrational worship of outright nihilism.
    By shattering the idols of pagan mysticism, the Bible’s objective ethics emancipate man from tribal collectivism and empower individuals to pursue liberty and prosperity.

  14. Dear Peter,

    Interesting post… so many things I would like to add, but I am in another city, 3 hours away, to give a seminar (on mathematical methods to estimate reaction rates in metabolic networks) and the slides are not finished just yet. So let me send you the following link:

    https://www.motherjones.com/politics/2014/04/inquiring-minds-john-hibbing-physiology-ideology

    Notice also the section “IF YOU LIKED THIS, YOU MIGHT ALSO LIKE…” with other interesting links and book recommendations.

    By the way, there is another book called “Bad Science” by Ben Goldacre that I would also recommend. In the same vein, and to understand why we get poor media coverage generally, there is the book “Flat Earth News” by Nick Davies.

    Take care,

    I.

  15. A great impediment to thinking more scientifically and being more critical of research studies is the vocabulary. To someone who has many other interests and occupations, learning all those technical terms is onerous. Anyone who didn’t have a good scientific education in school and doesn’t use this vocabulary in daily life can’t comprehend the material, much less read it critically. I read the findings of reports on brain studies, diabetes research, and social issues, but it’s hard. The statistics look like computer programming language. I’m largely dependent on secondary sources, and they don’t always interpret the data correctly. When I click through to the original study to read more details, I’m often lost in gibberish.

    The legal profession is gradually learning to write in Plain English and the sciences need to do the same. I can’t expect everyone to understand how to think critically as a writer or an artist. Why should scientists expect everyone to be able to think critically like a scientist?

    • The first step should be that scientists act as scientists. Assuming that they do is a mistake. Have you read Denise Minger’s reviews of the China Study book? (It is a must.)

      I agree with your point about “vocabulary”: maybe non-MDs can understand the scientific method, but when articles include words like hormone, thyroid, oxidation, hemoglobin, ferritin, etc., things get complicated. Statistics are also out of reach for many.

  16. Nice article (as always). I see Taleb already made the list. His Incerto series (FBR, TBS, AF) is a must. Also, be sure to check out his technical companion, Silent Risk, if you can stomach the math… https://www.fooledbyrandomness.com/FatTails.html

    One other recommendation would be the below essay by David Bentley Hart. Dr. Hart is a philosopher & theologian of all things but he hits the nail on the head. “…only when a method is conscious of what it cannot explain can it maintain a clear distinction between the knowledge it secures and the ideology it obeys.”
    https://www.firstthings.com/web-exclusives/2011/09/lupinity-felinity-and-the-limits-of-method

  17. On the development of agriculture not being a major milestone in thinking. The idea of staying in one place and creating food instead of foraging when no other animal was doing this, I would consider as important as the development of the printing press, which was a technical invention. It may have led to a leap in literacy and knowledge, but so did agriculture. It produced towns and cities. Concentrated intelligence and specialization. None of this would be possible in a foraging or nomadic culture.

    The next great leap may be in developing more collaborative societies and governments. And less war and competition. Greater transparency. Awareness that society is composed of individuals that must be regarded as equivalent in order to build a strong and nourishing society developed with scientific thinking.
