A few years back, after getting my hands on Charles Duhigg’s life-changing book The Power of Habit, I became more interested in studying the intricacies of the human mind. Specifically, I wanted to familiarize myself with the logical flaws and fallacies we tend to fall prey to. This is how I found out about Prof. Steven Novella of Yale University, a clinical neurologist and a born skeptic.
Your Deceptive Mind was the first work of Prof. Novella that I voraciously digested. I listened and relistened to the program a number of times over the past few years. Then I got the companion book for this program – a course about the tricks our brains play on us, along with fight-back strategies to help improve our critical thinking abilities.
Since I read interactively, I collected about 70 quotes, notes, and takeaway messages from this course book. I will share some of them here in the hope that they may help you as they have, so profoundly, helped me.
Not only do I better understand my own susceptibility to self-deception, but I am also more aware of the fallacies of the folks around me, the flaws of research studies, and the cognitive biases of researchers – on top of the financial implications and their conflicts of interest.
Logical Fallacies and Innate Innumeracy
We rely upon our memories as if they were accurate recordings of the past, but the evidence shows that we should be highly suspicious of even the most vivid and confident memories. 
As harsh as it may sound, memories cannot and should not be blindly trusted – yet we trust them every day, from the moment we wake up until we go to sleep.
The process of encoding information into the brain is very complex. It is a combination of multiple sensory inputs, emotion, and higher cortical processes. Subsequent experiences and thoughts may alter that information. Unless there is some objective outside verification system (written facts, recollections from other people, recordings, etc), you should be very careful when basing your decisions solely on your memories.
Carol Tavris provides reasonable arguments for this in the book Mistakes Were Made (but Not by Me). Part of the book covers legal case studies and dubious medical and law-enforcement practices that are laden with erroneous recollections of facts and memories.
Prof. Novella also mentions the research of Elizabeth Loftus of U.C. Irvine. Her work covers cognitive and behavioral psychology, human memory, eyewitness testimony, and more.
Part of her research reveals how we incorporate misleading information into our visual memory. One example, as Prof. Novella highlights, is how a false memory may be constructed by asking someone:
How fast were you going when you crashed into that car?
The question implies you were speeding – which may not necessarily be true.
As Novella puts it:
There are numerous ways in which human memory is flawed. Far from being a passive recording of events, memory is constructed, filtered through our beliefs, and subjected to contamination and morphing over time. Memories can even be fused or entirely fabricated. It’s naive to implicitly trust our memories, and it’s important to recognize that we need to be realistic and humble about the limitations and flaws of human memory. 
Moreover, many of us fail to differentiate between rationalization and reasoning.
Humans possess logic, but we are not inherently logical creatures. In addition to being logical, we are also highly emotional creatures; we tend to follow our evolved emotions and rationalizations. Our thoughts tend to follow a pathway of least resistance, which is not always the optimal pathway. 
Through rationalization, we start from a conclusion and then look for arguments to support it. Example:
A high-fat diet works so well for me! Let me find out why it can work for everyone else. (by searching for evidence to support my theory)
Reasoning is the opposite of rationalization: one uses logic to reach conclusions and, most often, actively seeks evidence that could disprove one’s theories. Example:
A high-fat diet works so well for me. Can any other type of diet work equally well, if not better?
Once we emotionally invest in a conclusion, humans are very good at twisting and rationalizing facts and logic in order to fit that desired conclusion. Instead, we should invest in the process and be very flexible when it comes to any conclusions. 
Considering alternatives and letting go of beliefs in the face of compelling contradictory evidence is a marker of critical thinking. When someone proves me wrong, I should at least consider their explanations.
We are generally very good at pattern recognition—so good that we often see patterns that are not actually there. 
N. N. Taleb’s Fooled by Randomness is the uncut, raw manifesto for this. We often romanticize the nonexistent patterns we construct (rationalize) into our reality.
See that face on the Moon on a clear night?! Do you see Jesus appearing in the clouds or in the bark of a tree? Can you track the trends in the stock market (retrofitting)?! What about economic cycles?!
Sadly, most of these examples can be, more or less, associated with a phenomenon called visual pareidolia. Some consider it an evolutionary adaptation, as part of our brain is devoted to facial recognition. Collaboration and social skills may have been decisive in the survival of the human species.
False dichotomy: A logical fallacy in which multiple choices are reduced artificially to only a binary choice, or where a continuum is reduced to its two extremes. 
Fats or carbs? Creationist or Darwinist (or Dawkinsist)? Runner or marathoner? High-frequency strength training or The Big 5?!
Why not both, and/or thousands of variations in between? Why limit yourself?
Moving on to the research field!
Essentially, anecdotes are experiences that we personally have in our everyday lives that are not part of a controlled or experimental condition, but we use them as a method for estimating probability. 
Research is littered with anecdotes. Many folks base their lives on anecdotes – and sadly, often with no successful outcome. What should it all be about, then?
Legitimate scientists endeavor to disprove their own theories and only give provisional assent once a theory has survived dedicated attempts at proving it wrong. They also consider alternate theories—not just their own theory. 
At the other end of the spectrum we meet the quacks and the pseudoscientists. They live in a different land. Most often, their purpose is to prove their claims correct by looking for confirming evidence (confirmation bias ftw!) while avoiding or dismissing disconfirming evidence altogether.
Anecdotes + out-of-context information + recklessness + innate innumeracy (poor statistical skills) make for the finest pseudoscience and quackery.
Good Practices – Domesticating the Brain
These are only a few of the factory flaws of our organic software – there are many more!
As you get a good grasp of them, you can start experimenting with sounder and healthier thinking – the good stuff, a.k.a. critical thinking.
To build a superior mind, one should also engage in metacognition – the process of thinking about thinking. To illustrate:
Falling prey to one of the logical fallacies above and then becoming aware of the trap reflects metacognition. It allows you to step back from the situation and possibly avoid getting trapped in similar circumstances in the future.
We compensate for all of these flaws in our brain’s functioning by using metacognition, or thinking about thinking itself. A process called scientific skepticism involves systematic doubt— questioning everything that you think, the process of your thinking, and everything that you think you know. 
For increased comprehension, some formal definitions are required:
Critical thinking: Applying systematic logic and doubt to any claim or belief; thinking carefully and rigorously. 
Critical thinking means carefully building arguments and using reasoning to reach conclusions. When better arguments lead to different conclusions, critical thinkers should be willing to change their own. Consequently:
To be a critical thinker is to be comfortable with uncertainty and with the limits of human knowledge and to be aware of all the many flaws and limitations of human intelligence—and, therefore, to be flexible in the face of new ideas or information but to not be afraid to acknowledge that some ideas are objectively better than others. 
Professor Novella brings up Bayesian analysis in support of critical thinking:
Reaching a conclusion from the best current knowledge and arguments and acknowledging the possible limitations of the process. Having the openness to update the conclusion when new and appropriate evidence becomes available.
With a Bayesian approach, one is always willing to update knowledge and conclusions instead of irrationally defending obsolete, invalid, and outdated ones.
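A toy calculation can make the Bayesian update concrete. The numbers below are my own illustration, not from the book: a prior belief is revised in proportion to how well it predicts the new evidence, via Bayes' theorem.

```python
def bayes_update(prior, p_evidence_if_true, p_evidence_if_false):
    """Return the posterior probability of a claim given new evidence,
    using Bayes' theorem: P(claim|evidence) =
    P(evidence|claim) * P(claim) / P(evidence)."""
    numerator = prior * p_evidence_if_true
    denominator = numerator + (1 - prior) * p_evidence_if_false
    return numerator / denominator

# Illustrative numbers: a test that catches a condition 99% of the time,
# with a 5% false-positive rate, applied to a condition with 1% prevalence.
# A positive result raises the probability from 1% to only about 17% --
# far less than intuition suggests.
posterior = bayes_update(prior=0.01,
                         p_evidence_if_true=0.99,
                         p_evidence_if_false=0.05)
print(round(posterior, 3))  # -> 0.167
```

The point is the mindset, not the arithmetic: the conclusion is held only as strongly as the evidence warrants, and a new piece of evidence triggers another update rather than a defense of the old number.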
Cognitive dissonance: An unpleasant emotion generated by the simultaneous existence of mutually exclusive beliefs. 
My mantra – ketogenic diets for everyone. Carbs are evil. But there’s this apparently good study showing potential benefits of carbohydrates. Arrrghhh, I feel so cognitively dissonant right now!!!
Cognitive dissonance often comes on the same plate as confirmation bias.
Confirmation bias: A cognitive bias to support beliefs we already hold, including the tendency to notice and accept confirming information while ignoring or rationalizing disconfirming information. 
I wrote more about the confirmation bias – cognitive dissonance duo here.
To make matters worse, enter innate innumeracy:
Humans are terrible at probability. Our brains are very good at certain tasks, such as pattern recognition, but we have a horrible innate sense of probability. We especially have difficulty dealing with large numbers. We appear to have evolved an intuitive sense of small numbers but can only deal with large numbers in the abstract language of mathematics.
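A classic demonstration of this poor intuition (my example, not the book's) is the birthday problem: in a group of just 23 people, there is already a better-than-even chance that two share a birthday, which most people find hard to believe until they see the math.

```python
def shared_birthday_probability(n, days=365):
    """Probability that at least two of n people share a birthday,
    assuming birthdays are uniformly distributed over `days` days."""
    p_all_distinct = 1.0
    for i in range(n):
        # The (i+1)-th person must avoid the i birthdays already taken.
        p_all_distinct *= (days - i) / days
    return 1 - p_all_distinct

print(round(shared_birthday_probability(23), 3))  # -> 0.507
```

Our intuition expects the answer to stay small until the group approaches 365 people; the formal calculation shows it crosses 50% at 23.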
Novella exemplifies this with cold reading and the retrofitting of evidence.
He reminds us of Nostradamus and the fact that whenever he made specific predictions (giving specific names, dates, and locations), all of them failed miserably. Many such quacks inhabit the Earth today.
Retrofitting, or the process of looking – after time has passed – for some kind of pattern recognition as a way of mining an inadvertently large data set. 
A good weapon against innumeracy is metacognition (you don’t say?!) – understanding and acknowledging the flaws of our thinking and substituting formal, mathematical analysis for our naive senses. 
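Retrofitting is easy to demonstrate with a quick simulation (my own illustration, not Novella's): give enough people a chance to guess randomly, and in hindsight a few will look brilliantly prescient.

```python
import random

random.seed(0)  # fixed seed so the run is reproducible

# Simulate 1,000 imaginary "stock gurus", each making 10 up/down calls
# by flipping a fair coin. Purely by chance, roughly 11 of them are
# expected to get 9 or more calls right (1000 * 11/1024).
gurus = [sum(random.randint(0, 1) for _ in range(10)) for _ in range(1000)]
prescient = sum(1 for correct in gurus if correct >= 9)
print(f"{prescient} of 1000 coin-flippers called 9+ of 10 moves correctly")
```

Retrofitting is what happens when we then single out those lucky few after the fact and declare that they must have a system.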
And here comes one of my favorites, massive delusions.
In addition to our evolved tendencies and personal quirks, we are strongly influenced by those around us and our culture. We can even get caught up in group or mass delusions…
We respond to the beliefs of others, to our social group, and to the broader culture… It’s important to keep your critical thinking active and not to surrender to these group dynamics. 
Religion, diets (of all kind), politics…fill in the blanks…
Yuval Noah Harari’s book Sapiens is sobering for this purpose.
Sound Practices in Times of Information Overload
There are many ways to improve your reasoning abilities.
First of all, you may want to protect yourself from the built-in flaws of your own mind.
Then you’d want to guard against those of other people.
Then you’d want to efficiently filter through the information you’re being bombarded with every second you’re awake…
and I could go on indefinitely…
So, from Professor Novella (and me, in a small part):
- Before taking any piece of information for granted, try to verify it against multiple reliable and independent sources, if possible. Do not trust any one source as definitive.
- Look for disconfirming evidence and contrary opinions about the topic, subject, or specific piece of information you are dealing with.
- Use search engines with specific terminology:
Searching for the topic of interest with key words such as scam, skeptics, skeptical, or fraud can help. See what all sides are saying about an issue before deciding who has the strongest case. 
I’d also add words like quacks, quackery, and sham to the list.
- Most importantly, let your personal experience be a solid argument in your decision-making process.
For example (diet illustration):
If a very-low-fat diet works extremely well for you – if you feel good, and your blood markers and health parameters prove it – it doesn’t matter how others seem to lose fat consuming oil coffee and plain butter (my favorite line)! Keep doin’ whatcha’ doin’!
Some of the most avid skeptics question everything and anything that crosses their minds. I personally find it unproductive and mildly pathological.
I strive to include a healthy dose of skepticism into most of my cortical processes. Some anecdotes may slip through though. I don’t mind.
I don’t have to filter and verify all my thoughts against double-blind controlled trials (the gold standard of research).
If something seems to work for me – fine! I’ll go with it. Thank you!
However, if I am not satisfied with my decisions and their outcomes, or if I try to extrapolate to other people, skepticism and critical thinking will kick in.
I don’t know what I don’t know. Checking my beliefs with other people and different outside sources, as objectively as possible, increases the chances that the flaws in my thinking will be uncovered. Thanks, Prof. Novella, for the reminder!
Once we accept that we cannot trust what we think we remember, we become humble in the face of our experiences and knowledge. Then, we are open to the dire need for a systematic approach to knowledge—methods to compensate for all the many flaws of our brains’ function. 
In an ideal world, people think.
Limiting the use of prejudices, rationalization, and zombie-like thinking (auto-pilot mind) will reduce gullibility and may drastically improve the quality of life.
This can be the real world – our real world, provided that you and I make a conscious effort toward it.
- Professor Steven Novella
- Duhigg, C. (2012). The Power of Habit. Cornerstone Digital.
- Novella, S. (2013). Your Deceptive Mind: A Scientific Guide to Critical Thinking Skills. The Great Courses.
- Tavris, C., & Aronson, E. (2015). Mistakes Were Made (but Not by Me). Mariner Books.
- Loftus, E. (1996). Eyewitness Testimony. Harvard University Press.
- Taleb, N. N. (2005). Fooled by Randomness. Random House.
- Harari, Y. N. (2015). Sapiens. Harper.
Further Reading (from Prof. Novella):
- Burton, R (2009). On Being Certain: Believing You Are Right Even When You’re Not. New York: St. Martin’s Griffin.
- Flew, A. (1998). How to Think Straight: An Introduction to Critical Reasoning. Amherst: Prometheus Books.
- Gilovich, T. (1991). How We Know What Isn’t So. New York: The Free Press.
- Hines, T. (2002). Pseudoscience and the Paranormal. Amherst: Prometheus Books.
- Kida, T. E. (2006). Don’t Believe Everything You Think: The 6 Basic Mistakes We Make in Thinking. Amherst: Prometheus Books.
- Novella, S. (2000). “Anatomy of Pseudoscience.”
- Paulos, J. A. (1990). Innumeracy: Mathematical Illiteracy and Its Consequences. New York: Vintage.
- Vos Savant, M. (1997). The Power of Logical Thinking. New York: St. Martin’s Griffin.