
System 1 and System 2 Thinking

What is System 1 and System 2 thinking?

System 1 thinking is a near-instantaneous process; it happens automatically, intuitively, and with little effort. It’s driven by instinct and our experiences. System 2 thinking is slower and requires more effort. It is conscious and logical.


The Basic Idea


When commuting to work, you always know which route to take without having to consciously think about it. You automatically walk to the subway station, habitually get off at the same stop, and walk to your office while your mind wanders. It’s effortless. However, the subway line is down today.

While your route to the subway station was intuitive, you now find yourself spending some time analyzing alternative routes to work in order to take the quickest one. Are the buses running? Is it too cold outside to walk? How much does a rideshare cost?

Our responses to these two scenarios demonstrate the differences between our slower thinking process and our instantaneous one. 

However, even when we think that we are being rational in our decisions, our System 1 beliefs and biases still drive many of our choices. Understanding the interplay of these two systems in our daily lives can help us become more aware of the bias in our decisions – and how we can avoid it.

The automatic operations of System 1 generate surprisingly complex patterns of ideas, but only the slower System 2 can construct thoughts in an orderly series of steps. – Daniel Kahneman in Thinking, Fast and Slow

System 1 Thinking:  Our brains’ fast, automatic, unconscious, and emotional response to situations and stimuli. This can be in the form of absentmindedly reading text on a billboard, knowing how to tie your shoelaces without a second thought, or instinctively hopping over a puddle on the sidewalk.

System 2 Thinking:  The slow, effortful, and logical mode in which our brains operate when solving more complicated problems. For example, System 2 thinking is used when looking for a friend in a crowd, parking your vehicle in a tight space, or determining the quality-to-value ratio of your take-out lunch.

Automatic Thinking: An unconscious and instinctive process of human thinking. This term can be used interchangeably with System 1 thinking.

Reasoning:  Consciously using existing information to logically make a decision or reach a conclusion.

Dual Process Model:  A theory in psychology that distinguishes two thought processes in humans by describing them as unconscious and conscious, respectively.


Philosophers and psychologists have long differentiated instinctive thinking from conscious reasoning, starting as early as the 17th century with Descartes’ mind-body dualism.

William James, an American psychologist, advanced this idea in the late 19th century. In his book, Principles of Psychology, James proposed that associative reasoning and true reasoning formed the two ways of thinking. 1,2 Associative knowledge was derived only from past experiences, whereas true reasoning was used in new, unfamiliar scenarios. James’s ideas laid the groundwork for System 1 and System 2 thinking.

In 1975, psychologists Michael Posner and Charles Snyder developed the dual-process model of the mind in their book, Attention and Cognitive Control. The dual-process model was a more polished version of James’s ideas, distinguishing the two ways of thinking by describing them as automatic and controlled, respectively. 3

As the theory developed, automatic processes were characterized by four conditions:

  • They are elicited unintentionally;
  • They require only a small amount of cognitive resources;
  • They cannot be stopped voluntarily; and
  • They happen unconsciously.

Likewise, controlled processes were characterized by four conditions:

  • They are elicited intentionally;
  • They require a considerable amount of cognitive resources;
  • They can be stopped voluntarily; and
  • They happen consciously.

However, in 1992, John Bargh challenged these rigid characteristics and suggested that it was virtually impossible for any process to satisfy all four characteristics. 4

Fast forward to 2011, and Daniel Kahneman published his bestselling book,  Thinking, Fast and Slow , popularizing the distinction between automatic and conscious thought processes. 5  In this book, Kahneman incorporated the terms System 1 and System 2 to describe the two processes, first coined by psychologists Keith Stanovich and Richard West in 2000. 6

Daniel Kahneman

A renowned psychologist in the field of behavioral economics who was influential in topics such as judgment and decision-making. Kahneman’s 2011 book, Thinking, Fast and Slow, popularized the concepts of System 1 and System 2.

William James

An American psychologist, philosopher, and historian who is credited with laying the initial groundwork for two different types of thinking in the late 19th century. His work would go on to influence formal literature on the dual process model in the late 20th century. At Harvard University, James was one of the very first educators to offer a psychology course in the United States. 7

Michael Posner

An American psychologist who, along with Charles Snyder, was one of the first to formally introduce the dual process model. Posner and Snyder’s book,  Attention and Cognitive Control , described the two forms of thinking as automatic and controlled, respectively.


Case 1: Marketing

The concepts of System 1 and System 2 have become highly influential in the world of marketing. In a world where consumers have more options than ever, brands often rely on the automatic, feelings-driven processes of System 1 to sell their products. Advertising seeks not just to communicate information about a product, but also to establish certain emotional associations around it that will stick in customers’ heads and drive them to purchase it without extra thought.

The power of System 1 thinking means that overhauled and refreshed marketing campaigns may not be as effective as initially thought. As competition grows fiercer, many brands are attempting bold, radically new campaigns. However, overhauled campaigns may discard the valuable, distinctive features that shape consumers’ automatic perception of the brand’s image. 8 By leveraging a brand’s distinctive image to increase its resonance in consumers’ System 1 thinking, a greater return on investment can be created in the short and long term. 8

That doesn’t mean that System 2 doesn’t play an important role in consumer decisions. For expensive purchases, consumers tend to make decisions based on System 1 beliefs in addition to a more careful and rational thought process driven by System 2. 8 Brands can use their knowledge of System 2 to provide a powerful justification, reinforcing consumers’ System 1 beliefs with details, facts, or statistics. 8

Case 2: Financial Planning

Governments can also take advantage of dual-system thinking to develop effective behavioral interventions. Recognizing System 1 thinking’s automatic preference for the default has led to effective interventions addressing issues such as insufficient retirement savings. 9

In the United States, behavioral economists recognized that even when workers received a raise, few would actually take action to increase their savings rate. They concluded that the lack of action was a sign of an overreliance on System 1 thinking.

In this case, the default option kept the savings rate the same, unless a worker took action to increase it. To tackle the problem, behavioral economists designed an intervention that automatically increased a worker’s savings rate whenever they received a raise. The automatic increase was able to take advantage of workers’ System 1 thinking to increase savings rates in the US. 9
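The logic of this kind of default-based intervention (popularized in behavioral economics as the "Save More Tomorrow" idea) can be sketched as a simple rule. The function name, rates, step size, and cap below are illustrative assumptions, not figures from the actual program:

```python
# Sketch of a default-based escalation rule: without auto-escalation,
# the default is inertia (the rate stays put unless the worker acts,
# and System 1's preference for the default means most never do);
# with auto-escalation, the same inertia works in favor of saving.
# All numbers are illustrative, not the program's actual figures.

def next_savings_rate(current_rate, got_raise, auto_escalate, step=1, cap=10):
    """Return a worker's savings rate (in percent) after a pay period."""
    if auto_escalate and got_raise:
        # The raise triggers an automatic bump, capped at a maximum rate.
        return min(current_rate + step, cap)
    # The default: unchanged unless the worker opts to act.
    return current_rate

print(next_savings_rate(3, got_raise=True, auto_escalate=False))  # 3
print(next_savings_rate(3, got_raise=True, auto_escalate=True))   # 4
```

The design choice is the point: the intervention changes no incentives and restricts no options; it only redirects what happens when the worker does nothing.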


The concepts of System 1 and System 2 thinking have become common in mainstream culture. The transition from academia to popular culture has resulted in the original theory losing some of its nuance and depth, replaced by simplifications of human thought processes. Three common misconceptions have emerged. 5

First is the idea that System 1 and System 2 thinking literally represents our brain structure. This is false, and Kahneman even says that “there is no part of the brain that either of the systems would call home.” 10


Second is the idea that System 1 thinking occurs first, followed by System 2 thinking if necessary. Kahneman explains that the dual-system approach combines both forms of reasoning as almost all processes are a mix of both systems. Though difficult scenarios may rely more on System 2, both systems work together. Emotions from our unconscious System 1 processes influence and complement our logical System 2 thinking, and our brain integrates the two to enable us to make purposeful decisions. 5

Finally, popular culture tends to incorrectly label System 1 as the source of bias and System 2 as the logical correction to those biases. In fact, both systems are susceptible to biases and mistakes, such as confirmation bias. 5 For example, we may notice information only when it supports our existing System 1 beliefs, and then use System 2 to analyse new information in ways that justify those beliefs. 5

In 1995, the popularity of M&M’s, the multi-colored chocolate candy, was decreasing. BBDO, an advertising agency, was recruited in an attempt to revitalize the brand. Then-creative director, Susan Credle, had a small budget to work with compared to other iconic brands, like Pepsi or Coke. However, Credle’s approach was highly successful: she made each colour of M&M candy into a character – a ‘spokescandy’. 11  BBDO introduced Red (the sarcastic one), Yellow (the happy one), Blue (the cool one), and Green (the seductive one).


This move resulted in the creation of M&M retail stores and multiple M&M line extensions. 11 The characters became so popular that, in an attempt to prevent consumers from losing interest, BBDO experimented with occasionally removing them from television advertisements. In response, consumers would ask where the characters had gone. 11 The characters were eventually reinstated, and today remain easily identifiable.

By developing memorable characters, BBDO was able to ingrain the M&M brand in consumers’ System 1 thinking. This was achieved on a sustainable, mass scale by creating distinctive brand assets. This not only deepens M&M’s resonance in consumers’ System 1 thinking, it also creates more return on investment in the short and long run. 11

Related TDL Content

Automatic Thinking

The Decision Lab takes a closer look at automatic thinking by considering its history, in addition to the consequences and controversies it is associated with.

How to Protect An Aging Mind From Financial Fraud

Although aging is inevitable, financial fraud in old age isn’t. Elderly individuals in the US alone lose an estimated $3 billion a year to financial scams. System 1 thinking can play a part in this, and research by The Decision Lab offers insights into how this reality can be avoided.

  • Dual-process model . (n.d.). Oxford Reference.  https://www.oxfordreference.com/view/10.1093/oi/authority.20110803095732808
  • Dual-process models . (n.d.). Psychology Wiki. Retrieved October 12, 2021, from  https://psychology.wikia.org/wiki/Dual_process_models
  • Gawronski, B., & Creighton, L. A. (2013). Dual process theories. In D. E. Carlston (Ed.),  The Oxford handbook of social cognition  (pp. 282–312). Oxford University Press.
  • Bargh, J. A. (1992). The ecology of automaticity: Toward establishing the conditions needed to produce automatic processing effects.  The American Journal of Psychology ,  105 (2), 181.  https://doi.org/10.2307/1423027
  • System 1 and System 2 Thinking . (n.d.). The Marketing Society.  https://www.marketingsociety.com/think-piece/system-1-and-system-2-thinking#_ftn1
  • Stanovich, K. E., & West, R. F. (2000). Individual differences in reasoning: Implications for the rationality debate?  Behavioral and Brain Sciences ,  23 (5), 645-665.  https://doi.org/10.1017/s0140525x00003435
  • William James . (n.d.). Department of Psychology.  https://psychology.fas.harvard.edu/people/william-james#:~:text=In%201875%20James%20taught%20one,Stanley%20Hall%20in%201878
  • What is ‘System 1’ thinking—and why do you need to learn it?  (2017, September 19). Observer.  https://observer.com/2017/09/what-is-system-1-thinking-and-how-do-you-do-it/
  • Zheng, J. (2012, February 22).  The benefits of being in two minds . The Conversation.  https://theconversation.com/the-benefits-of-being-in-two-minds-5388#
  • Kahneman, D. (2011).  Thinking, Fast and Slow . Doubleday Canada.
  • O’Reilly, L. (2016, March 26).  How 6 colorful characters propelled M&M’s to become America’s favorite candy . Business Insider.  https://www.businessinsider.com/the-story-of-the-mms-characters-2016-3

About the Author


Joshua is a former content creator with a passion for behavioral science. He previously created content for The Decision Lab, and his insights continue to be valuable to our readers.


Kahneman fast and slow thinking explained


Excerpt: this is a reference page. Here you can find the fundamentals of Kahneman’s breakthrough work on human decision making. Firstly, it will address his discovery of fast and slow thinking. Secondly, the importance of our unconscious mind in making decisions and influencing behaviour will be discussed.

1. Kahneman Fast and Slow Thinking

On this page, we want to give you a quick guide to Daniel Kahneman’s groundbreaking work about decision making. Maybe you’ve already heard of system 1 and system 2. Or you’ve heard Kahneman was the first psychologist to win the Nobel prize for economics in 2002. Could be you’ve heard about cognitive biases and heuristics. Enough to be intrigued. He is one of our heroes and the godfather of behavioural economics. We’ll give you the highlights of Kahneman’s thinking which he published in his best-selling book ‘ Thinking Fast and Slow .’

Therefore, this isn’t so much an article as a reference page that you can consult whenever you want to know more, or reread, about Kahneman. To make your life a bit easier, we have created page sections so you can easily jump to the subject that is of particular interest to you. We also have included shortcut links for this page, as well as links to more detailed information if you want to dive a bit deeper. The page sections:

  • System 1 and 2
  • The power of your subconscious mind
  • Heuristic: definition and meaning
  • Cognitive bias

System 1 and system 2

Most importantly, the groundbreaking research of Daniel Kahneman showed that our brain has two operating systems, which he called system 1 and system 2. These are the differences between the two systems of our brain:

System 1:

  • DEFINING CHARACTERISTICS: unconscious, automatic, effortless
  • WITHOUT self-awareness or control: “What you see is all there is.”
  • ROLE: assesses the situation, delivers updates
  • Does 98% of all our thinking

System 2:

  • DEFINING CHARACTERISTICS: deliberate and conscious, effortful, controlled mental process, rational thinking
  • WITH self-awareness and control, logical and skeptical
  • ROLE: seeks new/missing information, makes decisions
  • Does 2% of all our thinking

System 2 is a slave to our system 1

To summarize, you could say that our system 2 is a slave to our system 1. Our system 1 sends suggestions to our system 2 which then turns them into beliefs. Do you want to know more about the differences between system 1 and 2? We’ve created a more elaborate overview of the main characteristics of system 1 and 2. Or maybe you’d like to hear Daniel Kahneman himself explain the concept of system 1 and 2? This is a good video to watch and is only 6.35 minutes long.


The power of your subconscious mind

Kahneman’s additional discovery of the bandwidth of each system was what made this research so significant. It was a breakthrough into the lack of reasoning in human decision-making. He showed how the two thought systems arrive at different results, even though they are given the same inputs. Foremost, however, he revealed the power of the subconscious mind; where we all tend to think we’re rational human beings who think about our decisions and about the things we do. Kahneman demonstrated that we’re (almost) completely irrational. But that’s a good thing. It’s our survival mechanism.

35,000 decisions a day

On average we all have about 35,000 decisions to make each day. These differ in difficulty and importance. It could be taking a step to your left or right when talking. Or deciding to take the stairs or elevator. But they all hit you on a daily basis. If you had to consciously process all these decisions your brain would crash. Your automatic system’s primary task is to protect your system 2 in order to prevent cognitive overload.

There are a few ways our automatic system lightens the load on our deliberate system. First, it takes care of our more familiar tasks by turning them into autopilot routines, also known as habits. But what system 1 primarily does is rapidly sift through information and ideas without you even noticing it by prioritising whatever seems relevant and filtering out the rest by taking shortcuts. These shortcuts are also called heuristics. We’ll explain them in the next section.

We are all irrational human beings

Above all, we all have to accept that we are irrational human beings almost all the time. Even if you think you’re not. Somehow we can accept our irrationality, or at least understand it when it’s explained to us, but we keep making the same mistake with others. When trying to influence someone, we tend to forget they are irrational too. We often try to convince somebody with rational arguments or facts. We love to tell someone about the benefits of our products or services or ideas.

Decisions are based on short-cuts

However, the decision of the person you’re trying to convince isn’t based on this rational information. It’s based on system 1 shortcuts. Kahneman’s work demonstrates that people struggle with statistics and cannot reason the probable outcomes of their decisions. A second very important insight from his work is that our decisions are driven by heuristics and biases. We’ll dive deeper into those in the next two sections.

Heuristic: definition and meaning

The shortcuts our system 1 makes are heuristics. The definition of a heuristic, as found on Wikipedia, is:

Any approach to problem-solving, learning, or discovery that employs a practical method not guaranteed to be optimal, perfect, logical, or rational, but instead sufficient for reaching an immediate goal. Where finding an optimal solution is impossible or impractical, heuristic methods can be used to speed up the process of finding a satisfactory solution. Heuristics can be mental shortcuts that ease the cognitive load of making a decision.

A heuristic is our automatic brain at work

If we bring it back to Kahneman’s thinking, a heuristic is simply a shortcut our automatic (system 1) brain makes to save the mental energy of our deliberate (system 2) brain. This is our survival mechanism at play. You’re probably already familiar with the experience of heuristics. We sometimes refer to them as a gut feeling, guesstimate, common sense, or intuition. We use heuristics for problem-solving that isn’t a routine or habit. The way we ‘build’ heuristics is by reviewing the information at hand and connecting that information to our experience. Heuristics are strategies derived from previous experiences with similar problems. The most common heuristic is trial and error, trying to solve a problem based on experience instead of theory.

The availability heuristic

Another example is the so-called availability heuristic. When making a decision, this heuristic provides us with a mental shortcut that relies on immediate cases that come to mind. Put more simply: we value information that springs to mind quickly as more significant. So, when we have to make a decision, we automatically think about related events or situations. As a result, we might judge those events as being more frequent or more probable than others, and therefore tend to overestimate the probability of similar things happening in the future.
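The mechanism can be caricatured in a toy model: if memorability inflates how readily examples come to mind, the judged frequency of an event drifts away from its true frequency. The event names and every number below are made up purely for the sketch, not real statistics:

```python
# Toy model of the availability heuristic: frequency is judged by how
# easily examples come to mind, so vivid but rare events get overweighted.
# All event names and numbers are illustrative assumptions.

true_frequency = {"car crash": 1000, "plane crash": 10}  # actual occurrences
memorability = {"car crash": 1, "plane crash": 50}       # vividness / media coverage

# What "springs to mind": occurrences weighted by how memorable they are.
recalled = {event: n * memorability[event] for event, n in true_frequency.items()}
total_recalled = sum(recalled.values())
judged_share = {event: r / total_recalled for event, r in recalled.items()}

# Plane crashes are 100x rarer (true share under 1%), yet the judged
# share based on what is easily recalled comes out at one third.
print(judged_share["plane crash"])
```

The point of the sketch is only the direction of the distortion: whatever is easier to recall feels more frequent than it is.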

Heuristics can be wrong: biased

The problem with heuristics is that sometimes they’re wrong. They are nothing more than mental shortcuts that usually involve focusing on one aspect of a complex problem and ignoring others. Therefore, heuristics affect our decision-making and, subsequently, our customer’s behaviour.


Cognitive bias

With all this in mind, you could say that Kahneman discovered something very interesting about our cognitive abilities as human beings. To be clear about the meaning of cognition, let’s take a look at how the dictionary defines it.

“The mental action or process of acquiring knowledge and understanding through thought, experience, and the senses.”

What Kahneman discovered is truly paradigm shifting. It is breakthrough thinking that can even hurt egos. We are far less rational and far less correct in our thinking than we’d like to give ourselves credit for. The side-effect of heuristics is that we all suffer from cognitive bias. A cognitive bias refers to a systematic pattern of deviation from norm or rationality in judgment, whereby inferences about other people and situations may be drawn in an illogical fashion. Individuals create their own ‘subjective social reality’ from their perception of the input.

List of cognitive biases

There are a lot of cognitive biases. You can take a look at Wikipedia’s extensive list of cognitive biases, or check out an overview we made of the most common ones. The most important thing to remember is that we all base our decisions on heuristics, and we are all influenced by our cognitive biases. By being aware of the most common biases, you can anticipate them.

Cognitive bias in recruitment

To round things up, here is an example that ties together all the concepts of Kahneman discussed in this post. Think about recruitment. If you interview a candidate for a position on your team or in your organisation, research suggests that the chance of this person getting hired is largely established in the first 10 minutes. What happens? A person steps into the room and your system 1 makes a fast, mostly unconscious judgment based on heuristics. This leads to certain biases in your judgment. If the person is similar to you, your system 1 instantly likes him or her (liking bias). If the person wears glasses, your system 1 thinks he or she is smart (stereotyping bias). It all happens fast.

Lowering mental stress

In conclusion, your system 1 has sent these suggestions to your system 2 without you even noticing it, and your system 2 turns those into beliefs. For the rest of the interview, your system 2 looks for affirmation of the system 1 suggestions. To recap, our brain simply loves consistency: it lowers our mental stress or cognitive overload. And there you go. You base your final judgment on the two operating systems of your brain, helped by heuristics and skewed by cognitive bias. We do this all day, in all kinds of situations.

BONUS: How to become a better recruiter by understanding your biases

Especially for you, we’ve created a free cheat card to make sure you avoid these biases in HR situations. It’s for you to keep at hand, so you can start using the insights from Kahneman whenever you want. A little gift from us to you.

To sum it up

To sum it up, by understanding Kahneman you can understand human decision-making. And if you understand human decision-making, you can understand human or customer behaviour. You can see how we are predictably irrational. Dan Ariely wrote a beautiful book with this title, which we highly recommend. However, we just have to accept our own irrationality and understand that the people we want to convince or nudge into a certain behaviour are just as irrational.

Would you like to know more?

We have created a brochure telling you all about the details of the Behavioural Design Sprint, such as the set-up, the investment, the time commitment, and more. Please feel free to contact us any time should you have any further questions. We are happy to help!

Go ahead, there are no strings attached!

How do you do. Our name is SUE.

Do you want to learn more?

Suppose you want to learn more about how influence works. In that case, you might want to consider joining our Behavioural Design Academy , our officially accredited educational institution that already trained 2500+ people from 45+ countries in applied Behavioural Design. Or book an in-company training or one-day workshop for your team. In our top-notch training, we teach the Behavioural Design Method© and the Influence Framework©. Two powerful tools to make behavioural change happen in practice.

You can also hire SUE to help you bring an innovative perspective to your product, service, policy or marketing. In a Behavioural Design Sprint, we help you shape choice and desired behaviours using a mix of behavioural psychology and creativity.

You can download the Behavioural Design Fundamentals Course brochure, contact us here or subscribe to our Behavioural Design Digest. This is our weekly newsletter in which we deconstruct how influence works in work, life and society.

Or maybe, you’re just curious about SUE | Behavioural Design. Here’s where you can read our backstory.





June 15, 2012

17 min read

Of 2 Minds: How Fast and Slow Thinking Shape Perception and Choice [Excerpt]

In psychologist Daniel Kahneman's recent book, he reveals the dual systems of your brain, their pitfalls and their power

By Daniel Kahneman

To survive physically or psychologically, we sometimes need to react automatically to a speeding taxi as we step off the curb or to the subtle facial cues of an angry boss. That automatic mode of thinking, not under voluntary control, contrasts with the need to slow down and deliberately fiddle with pencil and paper when working through an algebra problem. These two systems that the brain uses to process information are the focus of Nobelist Daniel Kahneman's new book, Thinking, Fast and Slow (Farrar, Straus and Giroux, LLC., 2011). The following excerpt is the first chapter, entitled "The Characters of the Story," which introduces readers to these systems. (Used with permission.)

Understanding fast and slow thinking could help us find more rational solutions to problems that we as a society face. For example, a commentary in the March issue of the journal Nature Climate Change outlined how carbon labeling that appeals to both systems could be more successful than previous efforts to change consumer habits. ( Scientific American is part of Nature Publishing Group.) Understanding how we think can also guide more personal decisions. Last month, Kahneman highlighted in a lecture given at the National Academy of Sciences "The Science of Science Communication" conference how realizing the limitations of each system can help us catch our own mistakes.  

To observe your mind in automatic mode, glance at the image below.


Your experience as you look at the woman’s face seamlessly combines what we normally call seeing and intuitive thinking. As surely and quickly as you saw that the young woman’s hair is dark, you knew she is angry. Furthermore, what you saw extended into the future. You sensed that this woman is about to say some very unkind words, probably in a loud and strident voice. A premonition of what she was going to do next came to mind automatically and effortlessly. You did not intend to assess her mood or to anticipate what she might do, and your reaction to the picture did not have the feel of something you did. It just happened to you. It was an instance of fast thinking.

Now look at the following problem:

17 × 24

You knew immediately that this is a multiplication problem, and probably knew that you could solve it, with paper and pencil, if not without. You also had some vague intuitive knowledge of the range of possible results. You would be quick to recognize that both 12,609 and 123 are implausible. Without spending some time on the problem, however, you would not be certain that the answer is not 568. A precise solution did not come to mind, and you felt that you could choose whether or not to engage in the computation. If you have not done so yet, you should attempt the multiplication problem now, completing at least part of it.

You experienced slow thinking as you proceeded through a sequence of steps. You first retrieved from memory the cognitive program for multiplication that you learned in school, then you implemented it. Carrying out the computation was a strain. You felt the burden of holding much material in memory, as you needed to keep track of where you were and of where you were going, while holding on to the intermediate result. The process was mental work: deliberate, effortful, and orderly—a prototype of slow thinking. The computation was not only an event in your mind; your body was also involved. Your muscles tensed up, your blood pressure rose, and your heart rate increased. Someone looking closely at your eyes while you tackled this problem would have seen your pupils dilate. Your pupils contracted back to normal size as soon as you ended your work—when you found the answer (which is 408, by the way) or when you gave up.  
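Kahneman's "orderly series of steps" can be taken literally: the schoolbook multiplication routine you retrieved from memory is an explicit algorithm, and the intermediate results you had to hold are visible in it. A minimal sketch (the function name is our own, for illustration):

```python
# The schoolbook "cognitive program for multiplication" written out as an
# orderly series of steps: one partial product per digit of the multiplier,
# each shifted by its place value, then summed. Holding the partials is the
# working-memory load that makes this feel like effortful System 2 work.

def long_multiply(a, b):
    digits = [int(d) for d in str(b)]             # e.g. 24 -> [2, 4]
    partials = []
    for place, digit in enumerate(reversed(digits)):
        partials.append(a * digit * 10 ** place)  # intermediate result to hold
    return partials, sum(partials)

partials, result = long_multiply(17, 24)
print(partials)  # [68, 340]: 17*4, then 17*2 shifted one place
print(result)    # 408
```

Nothing in the loop is clever; the effort lies entirely in tracking where you are and keeping the partial products in mind until the final addition.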


Psychologists have been intensely interested for several decades in the two modes of thinking evoked by the picture of the angry woman and by the multiplication problem, and have offered many labels for them. I adopt terms originally proposed by the psychologists Keith Stanovich and Richard West, and will refer to two systems in the mind, System 1 and System 2.

• System 1 operates automatically and quickly, with little or no effort and no sense of voluntary control.
• System 2 allocates attention to the effortful mental activities that demand it, including complex computations. The operations of System 2 are often associated with the subjective experience of agency, choice, and concentration.

The labels of System 1 and System 2 are widely used in psychology, but I go further than most in this book, which you can read as a psychodrama with two characters.

When we think of ourselves, we identify with System 2, the conscious, reasoning self that has beliefs, makes choices, and decides what to think about and what to do. Although System 2 believes itself to be where the action is, the automatic System 1 is the hero of the book. I describe System 1 as effortlessly originating impressions and feelings that are the main sources of the explicit beliefs and deliberate choices of System 2. The automatic operations of System 1 generate surprisingly complex patterns of ideas, but only the slower System 2 can construct thoughts in an orderly series of steps. I also describe circumstances in which System 2 takes over, overruling the freewheeling impulses and associations of System 1. You will be invited to think of the two systems as agents with their individual abilities, limitations, and functions.

In rough order of complexity, here are some examples of the automatic activities that are attributed to System 1:

• Detect that one object is more distant than another.
• Orient to the source of a sudden sound.
• Complete the phrase “bread and . . .”
• Make a “disgust face” when shown a horrible picture.
• Detect hostility in a voice.
• Answer to 2 + 2 = ?
• Read words on large billboards.
• Drive a car on an empty road.
• Find a strong move in chess (if you are a chess master).
• Understand simple sentences.
• Recognize that a “meek and tidy soul with a passion for detail” resembles an occupational stereotype.

All these mental events belong with the angry woman—they occur automatically and require little or no effort. The capabilities of System 1 include innate skills that we share with other animals. We are born prepared to perceive the world around us, recognize objects, orient attention, avoid losses, and fear spiders. Other mental activities become fast and automatic through prolonged practice. System 1 has learned associations between ideas (the capital of France?); it has also learned skills such as reading and understanding nuances of social situations. Some skills, such as finding strong chess moves, are acquired only by specialized experts. Others are widely shared. Detecting the similarity of a personality sketch to an occupational stereotype requires broad knowledge of the language and the culture, which most of us possess. The knowledge is stored in memory and accessed without intention and without effort.

Several of the mental actions in the list are completely involuntary. You cannot refrain from understanding simple sentences in your own language or from orienting to a loud unexpected sound, nor can you prevent yourself from knowing that 2 + 2 = 4 or from thinking of Paris when the capital of France is mentioned. Other activities, such as chewing, are susceptible to voluntary control but normally run on automatic pilot. The control of attention is shared by the two systems. Orienting to a loud sound is normally an involuntary operation of System 1, which immediately mobilizes the voluntary attention of System 2. You may be able to resist turning toward the source of a loud and offensive comment at a crowded party, but even if your head does not move, your attention is initially directed to it, at least for a while. However, attention can be moved away from an unwanted focus, primarily by focusing intently on another target.

The highly diverse operations of System 2 have one feature in common: they require attention and are disrupted when attention is drawn away. Here are some examples:

• Brace for the starter gun in a race.
• Focus attention on the clowns in the circus.
• Focus on the voice of a particular person in a crowded and noisy room.
• Look for a woman with white hair.
• Search memory to identify a surprising sound.
• Maintain a faster walking speed than is natural for you.
• Monitor the appropriateness of your behavior in a social situation.
• Count the occurrences of the letter a in a page of text.
• Tell someone your phone number.
• Park in a narrow space (for most people except garage attendants).
• Compare two washing machines for overall value.
• Fill out a tax form.
• Check the validity of a complex logical argument.

In all these situations you must pay attention, and you will perform less well, or not at all, if you are not ready or if your attention is directed inappropriately. System 2 has some ability to change the way System 1 works, by programming the normally automatic functions of attention and memory. When waiting for a relative at a busy train station, for example, you can set yourself at will to look for a white-haired woman or a bearded man, and thereby increase the likelihood of detecting your relative from a distance. You can set your memory to search for capital cities that start with N or for French existentialist novels. And when you rent a car at London’s Heathrow Airport, the attendant will probably remind you that “we drive on the left side of the road over here.” In all these cases, you are asked to do something that does not come naturally, and you will find that the consistent maintenance of a set requires continuous exertion of at least some effort.

The often-used phrase “pay attention” is apt: you dispose of a limited budget of attention that you can allocate to activities, and if you try to go beyond your budget, you will fail. It is the mark of effortful activities that they interfere with each other, which is why it is difficult or impossible to conduct several at once. You could not compute the product of 17 × 24 while making a left turn into dense traffic, and you certainly should not try. You can do several things at once, but only if they are easy and undemanding. You are probably safe carrying on a conversation with a passenger while driving on an empty highway, and many parents have discovered, perhaps with some guilt, that they can read a story to a child while thinking of something else.

Everyone has some awareness of the limited capacity of attention, and our social behavior makes allowances for these limitations. When the driver of a car is overtaking a truck on a narrow road, for example, adult passengers quite sensibly stop talking. They know that distracting the driver is not a good idea, and they also suspect that he is temporarily deaf and will not hear what they say.

Intense focusing on a task can make people effectively blind, even to stimuli that normally attract attention. The most dramatic demonstration was offered by Christopher Chabris and Daniel Simons in their book The Invisible Gorilla. They constructed a short film of two teams passing basketballs, one team wearing white shirts, the other wearing black. The viewers of the film are instructed to count the number of passes made by the white team, ignoring the black players. This task is difficult and completely absorbing. Halfway through the video, a woman wearing a gorilla suit appears, crosses the court, thumps her chest, and moves on. The gorilla is in view for 9 seconds. Many thousands of people have seen the video, and about half of them do not notice anything unusual. It is the counting task—and especially the instruction to ignore one of the teams—that causes the blindness. No one who watches the video without that task would miss the gorilla. Seeing and orienting are automatic functions of System 1, but they depend on the allocation of some attention to the relevant stimulus. The authors note that the most remarkable observation of their study is that people find its results very surprising. Indeed, the viewers who fail to see the gorilla are initially sure that it was not there—they cannot imagine missing such a striking event. The gorilla study illustrates two important facts about our minds: we can be blind to the obvious, and we are also blind to our blindness.  


The interaction of the two systems is a recurrent theme of the book, and a brief synopsis of the plot is in order. In the story I will tell, Systems 1 and 2 are both active whenever we are awake. System 1 runs automatically and System 2 is normally in a comfortable low-effort mode, in which only a fraction of its capacity is engaged. System 1 continuously generates suggestions for System 2: impressions, intuitions, intentions, and feelings. If endorsed by System 2, impressions and intuitions turn into beliefs, and impulses turn into voluntary actions. When all goes smoothly, which is most of the time, System 2 adopts the suggestions of System 1 with little or no modification. You generally believe your impressions and act on your desires, and that is fine—usually.

When System 1 runs into difficulty, it calls on System 2 to support more detailed and specific processing that may solve the problem of the moment. System 2 is mobilized when a question arises for which System 1 does not offer an answer, as probably happened to you when you encountered the multiplication problem 17 × 24. You can also feel a surge of conscious attention whenever you are surprised. System 2 is activated when an event is detected that violates the model of the world that System 1 maintains. In that world, lamps do not jump, cats do not bark, and gorillas do not cross basketball courts. The gorilla experiment demonstrates that some attention is needed for the surprising stimulus to be detected. Surprise then activates and orients your attention: you will stare, and you will search your memory for a story that makes sense of the surprising event. System 2 is also credited with the continuous monitoring of your own behavior—the control that keeps you polite when you are angry, and alert when you are driving at night. System 2 is mobilized to increased effort when it detects an error about to be made. Remember a time when you almost blurted out an offensive remark and note how hard you worked to restore control. In summary, most of what you (your System 2) think and do originates in your System 1, but System 2 takes over when things get difficult, and it normally has the last word.

The division of labor between System 1 and System 2 is highly efficient: it minimizes effort and optimizes performance. The arrangement works well most of the time because System 1 is generally very good at what it does: its models of familiar situations are accurate, its short-term predictions are usually accurate as well, and its initial reactions to challenges are swift and generally appropriate. System 1 has biases, however, systematic errors that it is prone to make in specified circumstances. As we shall see, it sometimes answers easier questions than the one it was asked, and it has little understanding of logic and statistics. One further limitation of System 1 is that it cannot be turned off. If you are shown a word on the screen in a language you know, you will read it—unless your attention is totally focused elsewhere.  

Figure 2 is a variant of a classic experiment that produces a conflict between the two systems. You should try the exercise before reading on.

You were almost certainly successful in saying the correct words in both tasks, and you surely discovered that some parts of each task were much easier than others. When you identified upper- and lowercase, the left-hand column was easy and the right-hand column caused you to slow down and perhaps to stammer or stumble. When you named the position of words, the left-hand column was difficult and the right-hand column was much easier.

These tasks engage System 2, because saying “upper/lower” or “right/left” is not what you routinely do when looking down a column of words. One of the things you did to set yourself for the task was to program your memory so that the relevant words (upper and lower for the first task) were “on the tip of your tongue.” The prioritizing of the chosen words is effective and the mild temptation to read other words was fairly easy to resist when you went through the first column. But the second column was different, because it contained words for which you were set, and you could not ignore them. You were mostly able to respond correctly, but overcoming the competing response was a strain, and it slowed you down. You experienced a conflict between a task that you intended to carry out and an automatic response that interfered with it.

Conflict between an automatic reaction and an intention to control it is common in our lives. We are all familiar with the experience of trying not to stare at the oddly dressed couple at the neighboring table in a restaurant. We also know what it is like to force our attention on a boring book, when we constantly find ourselves returning to the point at which the reading lost its meaning. Where winters are hard, many drivers have memories of their car skidding out of control on the ice and of the struggle to follow well-rehearsed instructions that negate what they would naturally do: “Steer into the skid, and whatever you do, do not touch the brakes!” And every human being has had the experience of not telling someone to go to hell. One of the tasks of System 2 is to overcome the impulses of System 1. In other words, System 2 is in charge of self-control.  

To appreciate the autonomy of System 1, as well as the distinction between impressions and beliefs, take a good look at figure 3.

This picture is unremarkable: two horizontal lines of different lengths, with fins appended, pointing in different directions. The bottom line is obviously longer than the one above it. That is what we all see, and we naturally believe what we see. If you have already encountered this image, however, you recognize it as the famous Müller-Lyer illusion. As you can easily confirm by measuring them with a ruler, the horizontal lines are in fact identical in length.

Now that you have measured the lines, you—your System 2, the conscious being you call “I”—have a new belief: you know that the lines are equally long. If asked about their length, you will say what you know. But you still see the bottom line as longer. You have chosen to believe the measurement, but you cannot prevent System 1 from doing its thing; you cannot decide to see the lines as equal, although you know they are. To resist the illusion, there is only one thing you can do: you must learn to mistrust your impressions of the length of lines when fins are attached to them. To implement that rule, you must be able to recognize the illusory pattern and recall what you know about it. If you can do this, you will never again be fooled by the Müller-Lyer illusion. But you will still see one line as longer than the other.

Not all illusions are visual. There are illusions of thought, which we call cognitive illusions. As a graduate student, I attended some courses on the art and science of psychotherapy. During one of these lectures, our teacher imparted a morsel of clinical wisdom. This is what he told us: “You will from time to time meet a patient who shares a disturbing tale of multiple mistakes in his previous treatment. He has been seen by several clinicians, and all failed him. The patient can lucidly describe how his therapists misunderstood him, but he has quickly perceived that you are different. You share the same feeling, are convinced that you understand him, and will be able to help.” At this point my teacher raised his voice as he said, “Do not even think of taking on this patient! Throw him out of the office! He is most likely a psychopath and you will not be able to help him.”

Many years later I learned that the teacher had warned us against psychopathic charm, and the leading authority in the study of psychopathy confirmed that the teacher’s advice was sound. The analogy to the Müller-Lyer illusion is close. What we were being taught was not how to feel about that patient. Our teacher took it for granted that the sympathy we would feel for the patient would not be under our control; it would arise from System 1. Furthermore, we were not being taught to be generally suspicious of our feelings about patients. We were told that a strong attraction to a patient with a repeated history of failed treatment is a danger sign—like the fins on the parallel lines. It is an illusion—a cognitive illusion—and I (System 2) was taught how to recognize it and advised not to believe it or act on it.

The question that is most often asked about cognitive illusions is whether they can be overcome. The message of these examples is not encouraging. Because System 1 operates automatically and cannot be turned off at will, errors of intuitive thought are often difficult to prevent. Biases cannot always be avoided, because System 2 may have no clue to the error. Even when cues to likely errors are available, errors can be prevented only by the enhanced monitoring and effortful activity of System 2. As a way to live your life, however, continuous vigilance is not necessarily good, and it is certainly impractical. Constantly questioning our own thinking would be impossibly tedious, and System 2 is much too slow and inefficient to serve as a substitute for System 1 in making routine decisions. The best we can do is a compromise: learn to recognize situations in which mistakes are likely and try harder to avoid significant mistakes when the stakes are high. The premise of this book is that it is easier to recognize other people’s mistakes than our own.  


You have been invited to think of the two systems as agents within the mind, with their individual personalities, abilities, and limitations. I will often use sentences in which the systems are the subjects, such as, “System 2 calculates products.”

The use of such language is considered a sin in the professional circles in which I travel, because it seems to explain the thoughts and actions of a person by the thoughts and actions of little people inside the person’s head. Grammatically the sentence about System 2 is similar to “The butler steals the petty cash.” My colleagues would point out that the butler’s action actually explains the disappearance of the cash, and they rightly question whether the sentence about System 2 explains how products are calculated. My answer is that the brief active sentence that attributes calculation to System 2 is intended as a description, not an explanation. It is meaningful only because of what you already know about System 2. It is shorthand for the following: “Mental arithmetic is a voluntary activity that requires effort, should not be performed while making a left turn, and is associated with dilated pupils and an accelerated heart rate.”

Similarly, the statement that “highway driving under routine conditions is left to System 1” means that steering the car around a bend is automatic and almost effortless. It also implies that an experienced driver can drive on an empty highway while conducting a conversation. Finally, “System 2 prevented James from reacting foolishly to the insult” means that James would have been more aggressive in his response if his capacity for effortful control had been disrupted (for example, if he had been drunk).

System 1 and System 2 are so central to the story I tell in this book that I must make it absolutely clear that they are fictitious characters. Systems 1 and 2 are not systems in the standard sense of entities with interacting aspects or parts. And there is no one part of the brain that either of the systems would call home. You may well ask: What is the point of introducing fictitious characters with ugly names into a serious book? The answer is that the characters are useful because of some quirks of our minds, yours and mine. A sentence is understood more easily if it describes what an agent (System 2) does than if it describes what something is, what properties it has. In other words, “System 2” is a better subject for a sentence than “mental arithmetic.” The mind—especially System 1—appears to have a special aptitude for the construction and interpretation of stories about active agents, who have personalities, habits, and abilities. You quickly formed a bad opinion of the thieving butler, you expect more bad behavior from him, and you will remember him for a while. This is also my hope for the language of systems.

Why call them System 1 and System 2 rather than the more descriptive “automatic system” and “effortful system”? The reason is simple: “Automatic system” takes longer to say than “System 1” and therefore takes more space in your working memory. This matters, because anything that occupies your working memory reduces your ability to think. You should treat “System 1” and “System 2” as nicknames, like Bob and Joe, identifying characters that you will get to know over the course of this book. The fictitious systems make it easier for me to think about judgment and choice, and will make it easier for you to understand what I say.  


“He had an impression, but some of his impressions are illusions.”

“This was a pure System 1 response. She reacted to the threat before she recognized it.”

“This is your System 1 talking. Slow down and let your System 2 take control.”

Reprinted from Thinking, Fast and Slow by Daniel Kahneman, published by Farrar, Straus and Giroux, LLC. Copyright © 2011 by Daniel Kahneman. All rights reserved.

Matt Grawitch Ph.D.

The False Dilemma: System 1 vs. System 2

We often rely on both when it comes to decision-making.

Updated March 30, 2024 | Reviewed by Ekua Hagan

  • Dual-process perspectives pit an unconscious System 1 against the more conscious System 2.
  • While intuitively appealing, it has significant flaws.
  • There is actually no clear distinction between the two systems.
  • Most decisions actually rely on a combination of intuition and conscious, effortful decision making.


A lot of scientific and popular decision-making literature is couched in a dual-process model.[1] The dual-process model pits the relatively automatic, heuristic-driven, and unconscious System 1 against the more effortful, controlled, and conscious System 2.[2] The implication of the System 1/System 2 dichotomy is that fast and automatic thinking is fraught with errors, and the way to correct those errors is to be more conscious and effortful about decision-making. It is quite intuitively appealing, but unfortunately, it is, itself, fraught with errors.[3]

Melnikoff and Bargh (2018) detailed many of the problems with “the mythical number 2,” most notably that many of the properties attributed to System 1 and System 2 don’t actually line up with the evidence, that dual-process theories are largely unfalsifiable, and that most of the claimed support for them is “confirmation bias at work” (p. 283). System 1 isn’t some error-prone decision-making process, and System 2 isn’t devoid of errors. Biases, motivated reasoning, and fallacious reasoning affect all decision-making, whether it is unconscious or conscious, intuition-driven or highly analytical.

Couchman et al. (2016)[4] provided some evidence that highlighted the flaws in the dual-process perspective. They studied the performance of college students on multiple-choice exams and asked students to indicate whether they were confident in their responses.[5] The more confident students were in their initial response — which researchers operationalized as the intuitive response — the more likely those responses were to be correct. When students were confident in their intuitive response, they were correct over 85% of the time, but their success rate for guessing was only a little over 50%.

Therefore, it wasn’t that students’ intuitive responses tended to be flawed — unless they lacked confidence in those responses. When the intuitive response was one they were confident in, it was substantially more likely to be accurate. When the intuitive response wasn’t one they were confident in, then their likelihood of being accurate was little better than a coin toss.

There is obviously a difference between confidence that is grounded in relevant expertise and experience and a false sense of confidence, as I mentioned when discussing the expertise bubble and as has been demonstrated in applied research in naturalistic decision making (NDM). However, the Couchman et al. (2016) results call into question the veracity of the claims made by dual-process advocates.

The False Dilemma

Much of the evidence supporting dual-process decision-making models comes from tasks designed to take a complex phenomenon and transform it into a false dilemma. These tasks allow researchers to make inferences about whether the decision was based in System 1 or System 2,[6] but the sole source of evidence for these inferences is the (in)correctness of the response. Consider the following:

A bat and a ball cost $1.10 in total. The bat costs $1.00 more than the ball. How much does the ball cost?

According to dual-process researchers, if your answer to the above was 10 cents, it was because you gave the intuitive response and that was a result of System 1 thinking (because that answer is incorrect[7]). If your response was 5 cents (the correct answer), then you obviously derived that by engaging your conscious, effortful decision-making capabilities and overrode your System 1 initial response. Thankfully, System 2 swooped in to save the day.
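The algebra behind the correct answer is short. A sketch (my own worked example, using cents to avoid floating-point issues):

```python
# Bat-and-ball: ball costs x cents, bat costs x + 100 cents,
# and together they cost 110 cents -> 2x + 100 = 110 -> x = 5.
total, difference = 110, 100          # all amounts in cents
ball = (total - difference) // 2      # 5 cents
bat = ball + difference               # 105 cents
assert ball + bat == total            # the two items sum to $1.10
assert bat - ball == difference       # the bat costs $1.00 more
print(ball)  # 5
```

The intuitive 10-cent answer fails the second check: a 10-cent ball forces a $1.10 bat, for a total of $1.20.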

But how do we know that if you said 10 cents, it was because you went with an intuitive response? Might it be possible that you thought consciously about it and decided that 10 cents was correct, and if so, wouldn’t that mean System 2 produced the incorrect answer? Isn’t it possible that you intuitively knew the correct answer was 5 cents, and, if so, wouldn’t that mean System 1 actually produced the correct answer and not the error?

The only response dual-process theories can offer to these questions is that we just know. In other words, if you got it correct, it must have been the result of System 2, and if you got it wrong, it must have been the result of System 1. Unfortunately, when we draw conclusions about the quality of decision-making based strictly on the outcome, that would be considered outcome bias.[8]


Thus, the most popular model of decision making in contemporary theory and practice is (1) a heuristic that (2) transforms decision making into a false dilemma and (3) is largely based on inferences grounded in outcome bias (or fallacy) so that (4) the results confirm pre-existing beliefs. Unfortunately, even with all of these flaws, the dual-process perspective has become entrenched in most discussions of decision-making, especially when it comes to making better decisions.

No Clear Distinction Between Intuitive and Effortful Decision-Making

The dual-process perspective has some serious flaws, so perhaps there is a better way of conceptualizing how we go about making decisions. To that end, I mentioned previously that Gigerenzer and Gaissmaier (2011) defined a heuristic as “a strategy that ignores part of the information, with the goal of making decisions more quickly, frugally, and/or accurately than more complex methods.” Because people generally have limits regarding how much information they can process, also known as bounded rationality, decision-making necessarily involves ignoring part of the information. Gigerenzer and Gaissmaier (2011) thus concluded that “there is no strict dichotomy between heuristic and nonheuristic, as strategies can ignore more or less information.”

Instead, when people make decisions, whether conscious or unconscious, the accuracy of those decisions generally comes down to the specific strategy used to derive a decision (e.g., pattern recognition, recall, the weighting of different information) and the amount of information considered (i.e., what information is ignored and what information is considered). The complexity of the strategy and the amount of information considered will affect the amount of time and cognitive effort required to make the decision. Effective decision-making occurs when we rely on a strategy and consider an amount of information (i.e., expend time and effort) that allows us to reach an accurate enough decision for that situation.[9]

Almost every decision situation produces one or more relatively quick and cognitively frugal heuristic responses, which are influenced by both stored information from past experience and novel information available in the present situation (which I discussed in a prior post). We rely on some of the information available in the present situation as the basis for initiating possible responses.[10] If one response (we’ll call it Intuitive Response 1, or IR1) is elicited more quickly or if we have more confidence in that response, we may simply go with that response. But if we lack sufficient confidence in IR1, we may expand our information search — that is, we may collect more information before actually making a decision. This expansion of our search could cause us to become more confident in IR1, or it may cause us to select a different response, possibly IR2, IR3, or some other option we didn’t think about originally.
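The process just described — go with the leading intuitive response if confidence is high enough, otherwise expand the information search — can be sketched as a loop. The threshold, response names, and confidence numbers below are illustrative assumptions, not part of the author's model:

```python
# Sketch of confidence-gated decision-making: accept the current best
# response once confidence clears a threshold; otherwise gather more
# evidence. All names and numbers are hypothetical placeholders.
def decide(initial_responses, more_evidence, threshold=0.8):
    candidates = dict(initial_responses)   # response -> confidence
    evidence = iter(more_evidence)
    while True:
        best, conf = max(candidates.items(), key=lambda kv: kv[1])
        if conf >= threshold:
            return best                    # confident enough: go with it
        try:
            response, delta = next(evidence)   # expand the search
        except StopIteration:
            return best                    # no more information: settle
        candidates[response] = candidates.get(response, 0.0) + delta

# IR1 starts below threshold; new evidence elicits and favors IR2.
choice = decide({"IR1": 0.4}, [("IR1", 0.1), ("IR2", 0.9)])
print(choice)  # IR2
```

Note that the same loop can return IR1 instantly (high initial confidence) or only after several rounds of search, matching the range of cases described in the text.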

To put this into context, let’s consider the following situation. One day, Tom is leaving his house and notices his lawn is looking a little bare and decides to buy some grass seed.[11] He heads to the nearest lawn and garden store. He isn’t all that familiar with different types of grass seed, but as he’s perusing the various types, he notices bags of fescue, which is a type of grass seed he’s heard of. This stimulates the recognition heuristic, which leads to the following intuitive response: Purchase a bag of fescue. He’s not all that confident that fescue is what he needs or wants, though, so he expands his information search. He notices the different varieties of seed are similarly priced, so that information isn’t helpful. He begins to read some of the labels, notices that fescue requires above-average watering, and decides he doesn’t want to water his grass that often. He then comes across some Kentucky Bluegrass, which requires only average watering. He doesn’t see anything else on the bag to suggest it would be an unsound purchase, so he employs the take-the-best heuristic[12] and opts to buy that bag of grass seed instead.[13]

In Tom’s situation, only fescue came to mind, so there was only one intuitive response, but it could be that Tom had heard of fescue and zoysia (i.e., two intuitive responses), meaning that the recognition heuristic alone would have been useful only for discriminating between those types of grass seed that met the criterion and those that did not. Given that Tom ended up deciding based on how often it needed to be watered, zoysia requires less water than fescue, so he might have opted for the zoysia. However, zoysia grows slowly, so if that piece of information became salient to Tom, he might have ignored both intuitive responses in favor of a grass seed that required average watering but grew quickly (which might have been a simplified fast-and-frugal tree) — which would again have led him to Kentucky Bluegrass.
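Take-the-best, as Gigerenzer and colleagues describe it, examines cues in order of importance and chooses the option favored by the first cue that discriminates. A sketch using hypothetical grass-seed attributes like Tom's (the data and cue ordering are my own illustrative assumptions):

```python
# Take-the-best heuristic: walk the cues in order and keep only the
# options favored by the first cue that discriminates.
# The seed data and cue priorities are illustrative assumptions.
def take_the_best(options, cues):
    """options: {name: {cue: value}}; cues: [(cue, preferred_value)]."""
    candidates = list(options)
    for cue, preferred in cues:
        favored = [o for o in candidates if options[o].get(cue) == preferred]
        if 0 < len(favored) < len(candidates):
            candidates = favored          # this cue discriminates
        if len(candidates) == 1:
            return candidates[0]
    return candidates[0]                  # no cue settled it: take the first

seeds = {
    "fescue":             {"recognized": True,  "watering": "above average"},
    "kentucky bluegrass": {"recognized": False, "watering": "average"},
}
# Once Tom reads the labels, watering need outranks mere recognition.
print(take_the_best(seeds, [("watering", "average"), ("recognized", True)]))
# kentucky bluegrass
```

Reordering the cues so that recognition comes first reproduces Tom's initial intuitive response (fescue), which is the point: the "strategy used" and the "information considered" determine the outcome.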

Let’s dissect Tom’s decision-making using the dual-process perspective. Since Tom started with an intuitive decision but ultimately made a different decision, it implies that System 1 thinking led Tom to generate his initial intuitive response(s), but that System 2 swooped in to save the day by causing Tom to make a different choice. But in Tom’s case, all he did was alter the search criteria a little bit (replacing types he’d heard of with watering frequency and then growth rate). Did he really engage in conscious, effortful decision-making? If so, how do we know? Even if he did consciously change criteria, he certainly didn’t engage in as much conscious decision-making as he could have, and he certainly didn’t peruse every bag of grass seed in order to determine the best choice. Instead, he stumbled onto what he decided were relevant enough criteria and opted for a type of seed that fit those criteria.

Scenarios like this show the flaws in the claims of the dual-process perspective. The dual-process perspective assumes a very rote, mechanized approach to decision-making that doesn’t actually apply to most decisions people make. Instead, decision situations generally elicit one or more intuitive responses, and if we are confident enough in one of those responses, we tend to opt for it. If we’re not, we tend to expand our information search until we do obtain some acceptable level of confidence. The entire process can occur nearly instantly, or it can take a lot of time; it can be a one-and-done decision, or it may unfold incrementally, and the decision-making may occur mostly unconsciously or mostly consciously.
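The decide-or-expand-search process described above can be sketched as a simple loop: commit to the most confident intuitive response once confidence clears some threshold; otherwise gather more information, update, and re-evaluate. This is a sketch of the idea, not a model from the article; the threshold, confidence numbers, and names are all illustrative.

```python
# Sketch of the confidence-gated decision loop: go with the best intuitive
# response if confidence is high enough; otherwise expand the information
# search, which can strengthen or undercut existing responses or introduce
# new ones (IR2, IR3, ...).

def decide(initial_responses, more_info, threshold=0.7):
    """initial_responses: dict mapping response -> confidence in [0, 1].
    more_info: iterator yielding (response, confidence_delta) updates."""
    confidence = dict(initial_responses)
    while True:
        best = max(confidence, key=confidence.get)
        if confidence[best] >= threshold:
            return best                        # confident enough: commit
        update = next(more_info, None)
        if update is None:
            return best                        # search exhausted: settle
        response, delta = update
        confidence[response] = confidence.get(response, 0.0) + delta

# A Tom-style run: IR1 (fescue) starts ahead but below threshold; expanding
# the search demotes it and surfaces an alternative that clears the bar.
search = iter([("fescue", -0.2), ("kentucky_bluegrass", 0.8)])
print(decide({"fescue": 0.5}, search))         # kentucky_bluegrass
```

Nothing in the loop says whether an iteration is “System 1” or “System 2”; it only says when the search stops, which is the article’s point.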

The Implications

So, what does this all mean? Perhaps most importantly, it means that we should stop assuming that intuitive decision-making is necessarily error-prone. As Gary Klein’s work in NDM has repeatedly demonstrated, expertise and experience can yield quite valid intuitive responses. Although terms like “gut instincts” are often used to describe intuitive responses, that description implies such responses aren’t the result of analysis. Yet, as evidence related to the Recognition-Primed Decision (RPD) model highlights, intuitive responses — and how those responses evolve as a function of additional information — are often the result of very sophisticated cognitive capabilities. The key to effective intuitive decision-making, though, is to learn to better calibrate one’s confidence in the intuitive response (i.e., to develop more refined meta-thinking skills) and to be willing to expand search strategies in lower-confidence situations or based on novel information.

Relatedly, it also means we should stop assuming that more conscious and effortful decision-making is necessarily better than more heuristically-driven intuitive decision-making. There is a time and a place to engage in very planful, deliberative decision-making processes, but the success of these processes still hinges on the effectiveness of the heuristic rules we employ. When we decide to be more deliberative, slow, or cognitively intensive, we still rely extensively on heuristics, whether we recognize it or not.

[1] This intimation of a dual-process perspective goes back at least to William James (1890), who postulated two different types of thinking: associative thinking and true reasoning. Wason and Evans (1974) later proposed the existence of heuristic processes and analytic processes. Kahneman later popularized the System 1/System 2 conceptualization, culminating in his book, Thinking, Fast and Slow (2011).

[2] The laundry list of descriptors that have been used to describe these two systems is extensive, as demonstrated in a table on Wikipedia .

[3] The irony of all this is that the dual-process approach is actually a heuristic, and heuristics are what the dual-process theorists argue are fraught with error.

[4] If you don’t have access to the full article, a summary of some of the results can be found as a part of a Conversation article from 2015, written by the lead author of the study.

[5] While this study employs a task like many traditional dual-process tasks (i.e., multiple-choice questions with a defined correct answer), the fact that participants should have expected some knowledge of the material in question differentiates it from more traditional laboratory-based decision-making tasks, such as the one mentioned later.

[6] They do not generally leave open the possibility it could be based on both.

[7] System 1 would also be the alleged culprit if you gave a different erroneous response.

[8] I would actually call it an outcome fallacy, as it reflects a logical reasoning problem, not necessarily a propensity to commit it.

[9] Also known as ecological rationality.

[10] How much information we rely on will vary. Some aspects of the situation are likely to catch our attention, while others may be less salient. The information that catches our attention is what influences those heuristic responses.

[11] At this point, I will tell you I know next to nothing about grass seed. I just needed a good example (don’t ask me how this example came to mind because I have no idea). For anyone who does know grass seed and would say my descriptions are wrong, I will refer you to Scotts , which is where my information came from.

[12] Note that what is considered best was a value judgment on Tom’s part.

[13] In the alternative, he might check out a few other bags, notice that none of them requires less watering than Kentucky Bluegrass, and then decide to stick with that choice.

Matt Grawitch Ph.D.

Matt Grawitch, Ph.D. , is a professor at Saint Louis University (SLU), serving within the School for Professional Studies (SPS).


Understanding How We Think: System 1 and System 2

  • First Online: 29 February 2024


  • Randolph H. Pherson,
  • Ole Donner &
  • Oliver Gnad

Part of the book series: Professional Practice in Governance and Public Organizations ((PPGPO))


System 1 thinking enables us to reach a judgment quickly and effortlessly based on incomplete and even contradictory information. This ability developed during evolution and contributed to the survival of our species, especially early in human development. System 2 “kicks in” when we encounter a complex calculation or a complex analysis problem and must think deliberately about what to do. System 1 thinking is more susceptible to cognitive biases, which are unconscious errors of reasoning caused by our simplistic information-processing strategies. Heuristics can be helpful and profitable but can also lead to misperceptions and incorrect judgments and conclusions. Intuitive traps generally lead to perceptual errors and make an accurate perception of reality more difficult. Intuitive stumbling blocks are practical manifestations of both cognitive biases and misapplied heuristics and can affect the analyst’s day-to-day work.



Author information

Authors and Affiliations

Pherson Associates, LLC, Reston, VA, USA

Randolph H. Pherson

Strukturierte Analyse Deutschland, Bardowick, Germany

Ole Donner

Bureau für Zeitgeschehen (BfZ) GmbH, Frankfurt am Main, Germany

Oliver Gnad



Copyright information

© 2024 The Author(s), under exclusive license to Springer Nature Switzerland AG

About this chapter

Pherson, R.H., Donner, O., Gnad, O. (2024). Understanding How We Think: System 1 and System 2. In: Clear Thinking. Professional Practice in Governance and Public Organizations . Springer, Cham. https://doi.org/10.1007/978-3-031-48766-8_2

DOI : https://doi.org/10.1007/978-3-031-48766-8_2

Published : 29 February 2024

Publisher Name : Springer, Cham

Print ISBN : 978-3-031-48765-1

Online ISBN : 978-3-031-48766-8

eBook Packages : Behavioral Science and Psychology (R0)



Insights Industry News

August 1, 2020

Lessons from Thinking, Fast & Slow: System 1 and System 2

System 1, System 2, AND System 3? How systemic thinking can improve your insights.

by Molly Purcell

Digital Marketing Specialist at GreenBook

The concept of two thinking systems, System 1 Thinking and System 2 Thinking, was popularized by the Nobel Prize winner and intellectual godfather of behavioural economics, Daniel Kahneman, in the book Thinking, Fast & Slow . He and his great collaborator Amos Tversky framed human thinking in two forms that they call System 1 and System 2 . According to Kahneman and Tversky, human judgment and decision-making, with all of its biases and heuristics, could be explained within the two-system view.

What is the difference between System 1 and System 2?

Which is better: System 1 or System 2?

First, let’s consider the biggest challenge with System 2 thinking. “System 2 is a lazy controller and doesn’t like to expend much effort,” writes Rich Raquet in Thinking Fast and Slow: How Market Researchers Can Implement System 1 and System 2 Thinking. “One of its main functions is to monitor and control thoughts and actions suggested by System 1.” Kahneman and Shane Frederick (professor of marketing at Yale) have also shown that System 2 is indeed lazy and doesn’t always do that job.

In some industries this might not have relevance. But when it comes to market research, the implications are significant. When people are asked questions that require thinking, and researchers haven’t engaged System 2, we limit ourselves to surface information rather than actionable insights. On the contrary, getting System 2 thinking involved brings forth deeper insights and ideas (rather than “superfluous answers with little depth”). “If we can use some other external mechanism to activate System 2 then we are in the game. Smart Incentives is an effective idea generation and gamification approach we have used at TRC for this purpose.”

Raquet continues, “Kahneman says that people are remarkably adept at coming up with answers to all kinds of questions without knowing how or why (survey researchers beware!).” His explanation for a lot of this is the idea of substitution. That is, people answer an easier but wrong question rather than the difficult one they were asked. So when asked whether they approve of the President’s performance, they simply answer by assuming that the question is really whether they like the President. They may not even know that they have done that.

“This has implications for market research, where respondents are asked all kinds of questions, some of which are quite difficult. If they are simply answering an alternate, easier question, then it is no wonder that the data don’t make sense! Much of conventional market research assumes that decision-making is done by System 2 alone, with little input from System 1. Kahneman shows that this is not the case (as demonstrated by a variety of experiments in behavioural economics), and researchers would be well advised to take note and think about how to account for the influence of System 1 in consumer decision-making.”

How researchers can use System 1 and System 2 for insights

Olivier Tilleuil, Founder & CEO of EyeSee, pointed out that neuro-marketing techniques can measure System 1 mechanisms via biological aspects of behaviour. Techniques that can measure System 1 thinking include:

  • Eye-tracking : measuring eye positions and movement to determine where a person is looking. An increase in package visibility of 10% can lead to an increase in sales of 2%.
  • Facial coding : measuring emotions through naturalistic and spontaneous facial expressions. This method can predict the viral potential of videos 2x better than surveys alone.
  • Brain-imaging techniques : functional magnetic resonance imaging (fMRI), magnetoencephalography (MEG), electroencephalography (EEG), positron emission tomography (PET), etc.
  • Biometric reactions monitoring (heart rate, skin conductance), which depends on the level of emotional excitement.
  • Implicit Association Test (IAT) : exposing the subconscious associations that respondents are not able, or not willing, to self-report.
  • Virtual shopping simulation : used to re-create actual shopping experiences. Correlates with real market shares at 0.8–0.95.

While measuring System 1 responses is good practice for a brand, best practice is to measure System 1 and System 2 together for accurate insights. “If we made every decision based upon what our System 1 or “fast” brain suggested, half of us would be in jail, and the other half would find themselves indulging in too much dessert at the dinner table…Evaluating the speed of attribute associations alone is not enough to understand what drives behaviour. It is imperative that System 1 and System 2 research techniques be layered together holistically.” – Garrett Meccariello, Protobrand

What is the future of systems thinking in market research?

In an interview with GreenBook, Orlando Wood, Chief Innovation Officer of System1 Group spoke about how traditional research tactics are rooted in System 2 thinking, “The way we ask questions, by and large, requires System 2 processing…We assume that people are entirely rational agents with a perfect grasp of how they will behave in a different context when actually the people and environments around them in real-life settings have an enormous bearing on their behaviour and decisions.

So I believe that there is actually an enormous opportunity for researchers to create experiences that mirror more closely real-life environments and to create conditions that promote System 1 thinking. These approaches will get us closer to real-life behaviour, and help us to understand and predict it better.”

Rethinking the Way We Think

System 1 is in the driving seat, so effective marketing is marketing that appeals to it directly.

In a case study published on the GreenBook Blog, System1 Group found that their client’s high impact ads performed exceptionally well in emotional testing – they outscored standard ads by over 60% for both the level of positive emotion and the intensity people felt it.

Neuroscience is advancing the dual systems paradigm

Breaking down the brain into two distinct ‘systems’ has been extremely helpful for simplifying the complexities of cognition, but it has some limitations. Framing thinking as an either/or process:

  • Does not account for context (e.g., decisions change relative to a context)
  • Does not account for time (e.g., decisions are anchored in time and space)
  • Treats the systems as mutually exclusive when they are not (e.g., habits are automatic, yet can be conscious)
  • Does not fully account for social influences or emotions

A recently published series of articles in the Journal of Consumer Research concluded that dual-process (“System 1/System 2”) conceptualizations “may be inherently misleading,” arguing that it is better to view behaviour as the result of deep interactions among conscious and unconscious processing. There seems to be a consensus that dual-process accounts of behaviour, although popular and generative, may be approaching the end of their lifecycle ( Poehlman & Williams, 2017 ).

Neuroscience has repeatedly demonstrated our brain does not passively wait for information, but rather is “always active”, automatically and continuously, predicting the incoming streams of input before they arrive to prepare us for action ( Clark, 2013 ).

Behavioural science is moving beyond a consciousness-centric perspective and focusing on a broader range of causal drivers that are highly relevant for marketers and insights professionals seeking to understand, predict, and change behaviour. This major shift is making behavioural science more applied by providing a broader, holistic picture of customer behaviour that is grounded in context and time, two underappreciated drivers of decision-making.

Insights professionals are looking for System 3 methods

Coined by The Irrational Agency, System 3 refers to the imaginative capability of customers. According to their blog, “Customers imagine their possible futures: the outcomes they would experience after a choice, and how those outcomes will make them feel. The future that makes them feel happiest will be the one they choose. These choices use different parts of the brain than System 1 and 2. They are called System 3 choices.

“Think about how you might buy a car. System 1 would suggest that you see a colour, or shape, or brand of car, immediately fall in love with it and buy without thinking. System 2 implies that you calculate the price, financing options, fuel efficiency, resale value – and pick the model that makes the most financial sense.

“A System 3 decision would look like this: imagine yourself driving that car. Feel, in your mind, the sensations of the seats and how it drives. Imagine how your partner or your friends would view you in it. Consider, too, the impact on your bank account and what else you would be missing out on to pay for it… How do you feel? Is it good?… The car you feel best in – within this mental simulation – is probably the one you’ll choose.”

The original version of this article appeared here.

Systems thinking resources: blog articles

  • Introducing System 3: How We Use Our Imagination to Make Choices
  • Undertone & System1 Prove Emotion Drives Digital Profit
  • Is “System 1” & “System 2” Still Relevant for Consumer Insights?
  • Reflecting On The Reality Of Research: A Pre-BAQMaR Conference Interview With Orlando Wood of BrainJuicer
  • Exploring System 1 A Little More Deeply
  • Some Help in Evaluating Subconscious, Implicit, System 1 Measures
  • Do We Need a Copernican Revolution in Market Research?
  • 5 Lies You’ve Been Told About Behavioral Science Market Research
  • Busting The 3 Myths About Behavioral Economics That Are Holding MR Back
  • [Webinar Recording] COVID-19: Using System 1 to Connect with Consumers in a Crisis

Additional resources

  • Neuroscience, psychology and economics: the evidence for System 3 (long)
  • Market Research Companies that Specialize in Neuroscience and Neuromarketing

Molly Purcell


Shortform Books

The World's Best Book Summaries

System 1 Thinking: How It Works (And When You Shouldn’t Trust It)


This article is an excerpt from the Shortform summary of "Thinking, Fast and Slow" by Daniel Kahneman. Shortform has the world's best summaries of books you should be reading.


What is “System 1 Thinking,” from Daniel Kahneman’s Thinking, Fast and Slow ? When should I use it, and when shouldn’t I?

System 1 thinking is thinking that operates automatically and quickly. It takes little or no effort, and no sense of voluntary control.

We’ll cover how Kahneman’s System 1 thinking is involved in making judgments and what biases System 1 thinking leaves you susceptible to.

Two Systems of Thinking

We believe we’re being rational most of the time, but really much of our thinking is automatic , done subconsciously by instinct. Most impressions arise without your knowing how they got there. Can you pinpoint exactly how you knew a man was angry from his facial expression, or how you could tell that one object was farther away than another, or why you laughed at a funny joke?

This becomes more practically important for the decisions we make. Often, we’ve decided what we’re going to do before we even realize it . Only after this subconscious decision does our rational mind try to justify it.

The brain does this to save on effort, substituting easier questions for harder questions. Instead of thinking, “should I invest in Tesla stock? Is it priced correctly?” you might instead think, “do I like Tesla cars?” The insidious part is, you often don’t notice the substitution . This type of substitution produces systematic errors, also called biases. We are blind to our blindness.

System 1 and System 2 Thinking

In Thinking, Fast and Slow , Kahneman defines two systems of the mind:

System 1 thinking: operates automatically and quickly, with little or no effort, and no sense of voluntary control

  • System 1 Thinking Examples: Detect that one object is farther than another; detect sadness in a voice; read words on billboards; understand simple sentences; drive a car on an empty road.

System 2 thinking: allocates attention to the effortful mental activities that demand it, including complex computations. Often associated with the subjective experience of agency, choice and concentration

  • System 2 Thinking Examples: Focus attention on a particular person in a crowd; walk faster than is natural for you; monitor your behavior in a social situation; park in a narrow space; multiply 17 × 24.

Properties of System 1 Thinking

Kahneman’s System 1 thinking can be completely involuntary. You can’t stop your brain from completing 2 + 2 = ?, or from finding a cheesecake delicious. You can’t unsee optical illusions, even if you rationally know what’s going on.

System 1 thinking can arise from expert intuition, trained over many hours of learning. In this way a chess master can recognize a strong move within a second, where it would take a novice several minutes of System 2 thinking.

System 1 thinking automatically generates suggestions, feelings, and intuitions for System 2. If endorsed by System 2, intuitions turn into beliefs, and impulses turn into voluntary actions.  

System 1 thinking can detect errors and recruits System 2 for additional firepower. 

  • Kahneman tells a story of a veteran firefighter who entered a burning house with his crew, felt something was wrong, and called for them to get out. The house collapsed shortly after. He only later realized that his ears were unusually hot but the fire was unusually quiet, indicating the fire was in the basement.

Because System 1 thinking operates automatically and can’t be turned off, biases are difficult to prevent. Yet it’s also not wise (or energetically possible) to constantly question System 1, and System 2 is too slow to substitute in routine decisions. “The best we can do is a compromise: learn to recognize situations in which mistakes are likely and try harder to avoid significant mistakes when the stakes are high.”

In summary, most of what you consciously think and do originates in Kahneman’s System 1 thinking, but System 2 takes over when the situation gets difficult, and it normally has the last word.

System 1 Thinking: How We Make Judgments

System 1 thinking continuously monitors what’s going on outside and inside the mind and generates assessments with little effort and without intention. The basic assessments include language, facial recognition, social hierarchy, similarity, causality, associations, and exemplars. 

  • In this way, you can look at a male face and consider him competent (for instance, if he has a strong chin and a slight confident smile).
  • The survival purpose is to monitor surroundings for threats.

However, not every attribute of the situation is measured. System 1 thinking is much better at making comparisons between things and judging the average of a set than at computing the sum of a set. Here’s an example:

[Figure omitted: a set of lines of varying lengths.] Shown such a picture, you can quickly and intuitively estimate the average length of the lines. Determining the sum of their lengths, however, is less intuitive and requires System 2.

Unlike System 2 thinking, these basic assessments of System 1 thinking are not impaired when the observer is cognitively busy.

In addition to basic assessments, Kahneman’s System 1 thinking has two other characteristics:

1) Translating Values Across Dimensions, or Intensity Matching

System 1 thinking is good at comparing values on two entirely different scales. Here’s an example.

Consider a minor league baseball player. Compared to the rest of the population, how athletic is this player? 

Now compare your judgment to a different scale: If you had to convert how athletic the player is into a year-round weather temperature, what temperature would you choose?

Just as a minor league player is above average but not top tier, the temperature you chose was probably warm but not scorching, something like 80 degrees Fahrenheit.

As another example, consider comparing crimes and punishments, each expressed as musical volume. If a soft-sounding crime is followed by a piercingly loud punishment, then this means a large mismatch that might indicate injustice.

2) Mental Shotgun

Kahneman’s System 1 thinking often carries out more computations than are needed. Kahneman calls this “mental shotgun.”

For example, consider whether each of the following three statements is literally true:

  • Some roads are snakes.
  • Some jobs are snakes.
  • Some jobs are jails.

All three statements are literally false. The second statement likely registered as false more quickly, while the other two took more time to evaluate because they are metaphorically true. But even though finding metaphors was irrelevant to the task, you couldn’t help noticing them, and so the mental shotgun slowed you down. Your System 1 brain made more calculations than it had to.

Biases of System 1 Thinking

Putting it all together, we are most vulnerable to biases when:

  • System 1 thinking forms a narrative that conveniently connects the dots and doesn’t express surprise.
  • Because of the cognitive ease produced by System 1 thinking, System 2 is not invoked to question the data; it merely accepts the conclusions of System 1.

In day-to-day life, this is acceptable if the conclusions are likely to be correct, the costs of a mistake are acceptable, and if the jump saves time and effort. You don’t question whether to brush your teeth each day, for example.

In contrast, this shortcut in thinking is risky when the stakes are high and there’s no time to collect more information, like when serving on a jury, deciding which job applicant to hire, or reacting to a weather emergency.

Here’s a collection of Kahneman’s System 1 thinking biases.

System 1 Thinking Bias #1: Ordering Effect

First impressions matter. They form the “trunk of the tree” to which later impressions are attached like branches. It takes a lot of work to reorder the impressions to form a new trunk.

Consider two people who are described as follows:

  • Amos: intelligent, hard-working, strategic, suspicious, selfish
  • Barry: selfish, suspicious, strategic, hard-working, intelligent

Most likely you viewed Amos as the more likable person, even though the five words used are identical, just differently ordered. The initial traits change your interpretation of the traits that appear later.

This explains a number of effects:

  • In an experiment, students were randomly ordered in a report of academic performance. This report was then given to teachers. Students who were randomly rated as more competent ended the year with better academic scores, even though they started the school year with no average difference.
  • Kahneman previously graded exams by reading one student’s entire test before moving to the next student’s. He found that the student’s first essay dramatically influenced his interpretation of the later ones: an excellent first essay would earn the student the benefit of the doubt on a poor second essay, while a poor first essay would cast doubt on later effective essays. He countered this by grading in batches, scoring the same essay question for every student before moving on to the next question.
  • Work meetings often polarize around the first and most vocal people to speak. Meetings would better yield the best ideas if people could write down opinions beforehand.
  • Witnesses are not allowed to discuss events in a trial before testimony. 

The antidote to the ordering effect:

  • Before having a public discussion on a topic, elicit opinions from the group confidentially first. This avoids bias in favor of the first speakers.

System 1 Thinking Bias #2: Mere Exposure Effect

Exposing someone to an input repeatedly makes them like it more. Having a memory of a word, phrase, or idea makes it easier to see again.

System 1 Thinking Bias #3: Narrative Fallacy

This is explained more in Part 2, but it deals with System 1 thinking. 

People want to believe a story and will seek cause-and-effect explanations in times of uncertainty. This helps explain the following:

  • Stock market movements are explained like horoscopes, where the same explanation can be used to justify both rises and drops (for instance, the capture of Saddam Hussein was used to explain both the rise and subsequent fall of bond prices).
  • Most religions explain the creation of earth, of humans, and of the afterlife.
  • Famous people are given origin stories – Steve Jobs reached his success because of his abandonment by his birth parents. Sports stars who lose a championship have the loss attributed to a host of reasons. 

Once a story is established, it becomes difficult to overwrite. (Shortform note: this helps explain why frauds like Theranos and Enron were able to persist for so long – observers believed the story they wanted to hear.)

System 1 Thinking Bias #4: Affect Heuristic

How you like or dislike something determines your beliefs about the world.

For example, say you’re making a decision with two options. If you like one particular option, you’ll believe the benefits are better and the costs/risks more manageable than those of alternatives. The inverse is true of options you dislike.

Interestingly, if you get a new piece of information about an option’s benefits, you will also decrease your assessment of the risks, even though you haven’t gotten any new information about the risks. You just feel better about the option, which makes you downplay the risks.

Vulnerability to Bias

We’re more vulnerable to biases when System 2 is taxed.

To explain this, psychologist Daniel Gilbert has a model of how we come to believe ideas:

  • System 1 thinking constructs the best possible interpretation of the belief – if the idea were true, what does it mean?
  • System 2 evaluates whether to believe the idea – “unbelieving” false ideas.

When System 2 is taxed, then it does not attack System 1 thinking’s belief with as much scrutiny. Thus, we’re more likely to accept what it says.

Experiments show that when System 2 is taxed (like when forced to hold digits in memory), you become more susceptible to false sentences. You’ll believe almost anything.

This might explain why infomercials are effective late at night. It may also explain why societies in turmoil might apply less logical thinking to persuasive arguments, such as Germany during Hitler’s rise.

———End of Preview———

Like what you just read? Read the rest of the world's best summary of Thinking, Fast and Slow at Shortform, and learn the book's critical concepts in 20 minutes or less.

Here's what you'll find in our full Thinking, Fast and Slow summary :

  • Why we get easily fooled when we're stressed and preoccupied
  • Why we tend to overestimate the likelihood of good things happening (like the lottery)
  • How to protect yourself from making bad decisions and from scam artists

Amanda Penn

Amanda Penn is a writer and reading specialist. She’s published dozens of articles and book reviews spanning a wide range of topics, including health, relationships, psychology, science, and much more. Amanda was a Fulbright Scholar and has taught in schools in the US and South Africa. Amanda received her Master's Degree in Education from the University of Pennsylvania.



System 1 and System 2 Thinking

Behavioural science has given us concepts, theories, and frameworks that help make sense of the complexities of human behaviour and how people make decisions. Some of these focus on understanding the critical role of context, or explore concepts such as anchoring or the framing of information. Others look at how everyone is wired with inherent predispositions, otherwise known as cognitive biases, that cause us to act in certain ways; examples include availability bias and confirmation bias.

Arguably the most famous theory in the behavioural science world was popularised by Nobel Laureate Daniel Kahneman and describes the process of 'thinking fast and slow', otherwise known as System 1 and System 2 thinking. This two-system model has been widely adopted thanks to its simplicity and intuitive nature. Nowadays, even if you know nothing about behavioural science, you've probably heard of Kahneman and would recognise the phrase 'System 1 and 2'.

This article, the fifth in our series exploring new frontiers in behavioural science, is all about System 1 and System 2 thinking. At this point in the series we want to pause from looking forward to re-establish the foundations that underpin our behavioural science knowledge. We cannot explore new frontiers on unstable footing, so we need to investigate how misperceptions about this pivotal theory have arisen over the years.

Firstly, we summarise System 1 and 2 as described by Kahneman and other behavioural scientists, before examining the claim that the theory is an oversimplification of the human mind. We then identify three key misconceptions that have developed following the extensive discussion of System 1 and 2 in the popular media, outlining evidence which 'debunks' these myths, improves our understanding, and strengthens our behavioural foundations.

System 1 & 2 - A Refresh

For centuries, philosophers, psychologists, and scientists alike have distinguished between intuitive and conscious reasoning, from Descartes' mind-body dualism in the 17th century to Posner and Snyder's formal depiction of the (first) dual-process model of the mind in 1975. However, it was not until Daniel Kahneman included the terms System 1 and System 2 in his 2011 bestselling book "Thinking, Fast and Slow" that the distinction between automatic and deliberate thought processes became popularised. (It is worth noting that he was not the first to use these terms; that honour goes to Stanovich & West in 2000 [1].)

Kahneman's model divides the mind's processes into two distinct systems:

  • System 1 “is the brain’s fast, automatic, intuitive approach” [2] . System 1 activity includes the innate mental activities that we are born with, such as a preparedness to perceive the world around us, recognise objects, orient attention, avoid losses - and fear spiders! Other mental activities become fast and automatic through prolonged practice.   
  • System 2 is “the mind’s slower, analytical mode, where reason dominates” [3] . Usually, system 2 activity is activated when we do something that does not come naturally and requires some sort of conscious mental exertion. 

A common example used to demonstrate the two systems is the following puzzle:

A bat and a ball together cost $1.10. The bat costs $1 more than the ball. How much does the ball cost?

Faced with this puzzle, the majority of people instantly guess 10 cents. The correct answer, however, is 5 cents, which most people can work out after spending more time thinking about the question. For years, this has been used as a perfect example of how the way we think is ruled by two types of mental process: fast and intuitive versus slow and analytical.

The intention of the theory was to provide a helpful analogy that can guide our understanding of how our minds process information, and it does an admirable job of this. Analysing behaviours through a System 1 and 2 lens has been invaluable for furthering our understanding of human decision-making and behaviour, as well as for exploring ways we can influence or 'nudge' behaviour in different directions.
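The deliberate, System 2 route to the answer is simple algebra. As a minimal sketch, it can be checked in a few lines of Python (the variable names are ours; we work in integer cents to avoid floating-point rounding):

```python
# Let ball be the ball's price in cents.
# Constraints: ball + bat == 110 and bat == ball + 100.
# Substituting: ball + (ball + 100) = 110  ->  2*ball = 10  ->  ball = 5.
total = 110        # $1.10 expressed in cents
difference = 100   # the bat costs $1.00 more than the ball
ball = (total - difference) // 2
bat = ball + difference
print(ball, bat)               # 5 105 -- the ball costs 5 cents
# The intuitive guess of 10 cents violates the total:
print(10 + (10 + difference))  # 120, not 110
```

The intuitive 10-cent answer feels right because a bat at $1.00 plus a ball at 10 cents does sum to $1.10, but then the bat costs only 90 cents more than the ball; catching that violated constraint is System 2's job.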

This theory quickly moved from academia to the non-academic world

The distinction between System 1 and 2 is appealing, and the dual-system theory has travelled from the world of academia into popular language and mainstream thinking, in part due to its accessible nature. Although it is not the easiest read, the myriad examples Kahneman uses to illustrate concepts and ideas throughout the chapters result in a book that is entertaining and can be appreciated across the board.

On the one hand, this is a welcome bridge across the often-criticised gap between academia and the 'real world'. On the other, however, in its transition from academia to pop culture, the original theory of System 1 and 2 seems to have lost some of its depth, nuance, and detail, which has been replaced with over-generalisations and (often false) simplifications of how the human mind operates. Central to this misunderstanding seems to be the idea that System 1 and System 2 are literal representations of our brain structure. Additionally, many of the more popular 'sound bites' from the book have been reproduced and disseminated without the context and constraints provided when they are read in situ.

As the dual-system theory sticks in everyday language and is used liberally by a variety of non-academic sources, the original descriptions of System 1 and 2 are somewhat glossed over, resulting in a variety of oversimplified assertions (myths) about how our minds operate. As behavioural science progresses, it is important to be wary of these myths and to remain vigilant in our understanding.

This article aims to debunk three key myths that have emerged in popular media by focussing on the following facts, as per our current understanding, of System 1 and 2:

  • The brain is not literally divided into two!
  • System 1 and 2 work in tandem, not as separate entities 
  • Both systems can be biased and can make mistakes - Neither one is categorically “good” or “bad”

In this way we 're-establish' an important section of the foundations of behavioural science, which will ultimately allow us to continue moving forward towards new frontiers.

FACT 1: The brain is not literally divided into two

Just as the common myth that people are either right- or left-brained has been proved false, we also know there aren't actually sections of the brain with 'System 1' or 'System 2' stamped on them. A misconception many people have is that our brain is physically divided into two parts, but this is not the case. Indeed, Kahneman clearly states that "there is no one part of the brain that either of the systems would call home" [4].

The idea of left-brain and right-brain thinking is persistent, and many people continue to believe that the left side of the brain is responsible for analytical thinking, whilst the right side is more creative. It is easy to understand why System 1 and System 2 type thinking have been mistakenly associated with this idea: System 2's rational, logical thinking is analogous with the 'left brain', and System 1 thinking seems easily associated with the idea of an intuitive, artistic right brain. These ideas are fundamentally incorrect, however. The brain is not physically divided in this way, and as such System 1 and System 2 type thinking cannot be physically divided either. This debunks our first myth.

All this being said, neuroscientists have found that some regions of the brain are somewhat more associated with one of the two systems [5]. For example, this body of evidence indicates that affective cognition (System 1-type thinking for emotional responses) is associated with the mesolimbic dopamine reward system, the pathway responsible for the release of dopamine. Given that human beings tend to seek instant gratification, dopamine plays a key role in 'thinking fast' [6]. On the other hand, the frontal and parietal cortices have been linked to the analytic system of decision-making (System 2), and these regions are therefore more associated with our complex reasoning and higher-order 'slow' thinking. This separation of brain functions for decision-making, and the specialisation it suggests, has given rise to the multiple systems hypothesis.

However, it is important to note that it is the combination of information gathered from these multiple systems (the mesolimbic pathway and the frontal and parietal cortices) that produces our decisions. In other words, whilst different regions may be more or less relevant to either system, no region is dedicated solely to System 1-type decision-making or solely to System 2.

FACT 2: System 1 and 2 work in tandem, not as separate entities

Another common misconception is that System 1 and 2 are hierarchical processes, with one occurring before the other: people often think System 1 thinking occurs first, with System 2 thinking following later if necessary. The dual-system approach actually imagines the two forms of reasoning as integrated and mutually supportive. Indeed, Kahneman points out that almost all processes are a mix of both systems, and it is important to emphasise that the systems are complementary.

Importantly, unconscious processes such as emotion (System 1) play a vital role in our more logical reasoning (System 2), and it is this integrative approach that makes our decision-making meaningful, and often more effective and purposeful [7]. The philosopher David Hume, for example, recognised the importance of the heart (System 1) for the head (System 2) in decision-making, as reason alone rarely provides any clear motivation or drive. Without emotion or feeling, reason is merely a cold, mechanical method of calculation, informing us of what the consequences of our actions may be, but not whether they are desirable.

Ellen Peters and her colleagues provide further evidence of the mutually supportive nature of System 1 and 2, demonstrating that decisions are most effective when they draw on both systems. They conducted an experiment in which participants were given tasks that required processing numbers. Unsurprisingly, participants with high levels of numeracy outperformed those who were less numerate; numeracy has previously been linked to an improved ability to use System 2 reasoning effectively.

However, they also found that the more numerate participants were able to use System 1 reasoning more frequently and reliably: their unconscious responses guided their initial decisions, which then triggered the conscious thought needed to complete the task. Importantly, they also found that over time, the consistent and effective use of System 2 reasoning calibrates System 1 processing, making it more effective, which in turn promotes better systematic (System 2) reasoning, essentially creating a feedback loop. These findings suggest that the two systems do not work in isolation but are in fact integrated and mutually influential.

Outside of experimental settings, everyday tasks provide further evidence of the teamwork between Systems 1 and 2. Language is one example: we communicate deliberately, but during the flow of conversation we don't rehearse grammatical rules; they are applied without conscious thought. Physical activity is another: recent research suggests that exercise is partly habit-driven, yet also requires conscious oversight to be completed successfully [8]. We can also see the integrated nature of Systems 1 and 2 in tasks such as driving a familiar route, typing, or playing a well-rehearsed tune on an instrument [9], all of which require a combination of deliberate and automatic action.

FACT 3: Both systems can be biased and can make mistakes - neither one is categorically “good” or “bad”

A particularly interesting myth that has developed around Systems 1 and 2 is the idea that System 1 is the source of bias, and that System 2 is called up as the 'voice of reason' to correct such biases in our thinking. This may have developed off the back of the common mistake of using the terms 'emotional' and 'irrational' interchangeably, particularly when describing a person's disposition.

Whatever the reason for this misconception, it is in fact a myth. It is not the case that System 1 is biased and System 2 is not; both are susceptible to bias and both can make mistakes. For example, System 1 may have gathered accurate information, yet System 2 may process it poorly and make a mistake. Conversely, System 1 may have gathered biased information, so even if System 2 processes it accurately, the conclusion may be incorrect because of a biased starting point. Confirmation bias is a good example of how both systems can be affected: we notice and more easily remember information that supports our existing beliefs (a System 1 activity), while also being motivated to analyse new information in a way that supports those beliefs (a System 2 activity).

That is to say, System 2 is just as prone to error as System 1: we ignore evidence we dislike, overthink seemingly simple or irrelevant decisions, rationalise our biases, and produce questionable justifications for bad decisions ('I only had a small breakfast, so it is fine to have a big slice of cake').

In the medical field, for example, it was long thought that diagnostic errors were caused by System 1-type reasoning, and clinicians were consequently advised to think more slowly and gather as much information as possible. However, later reviews found that experts were just as likely to make errors when attempting to be systematic and analytical.
Research by behavioural scientists such as Gerd Gigerenzer has shown that more information and slower processing do not always lead to the most accurate answer. Diagnosing patients and making treatment decisions using mental shortcuts and evidence-based rules of thumb can perform just as well, if not better. This discovery led to the creation of 'fast and frugal' decision trees for patient diagnosis, in which doctors need to ask only three crucial diagnostic questions. When tested, this method improved the accuracy of heart disease diagnosis by 15-25% [10].

Indeed, more information, more computation, and more time do not always result in better performance; System 1-type strategies can often be just as effective in certain circumstances as more complex decision-making strategies, if not more so. Researchers tested the accuracy of various heuristics (a System 1 activity) across a number of real-world situations, using more complex decision-making strategies as the benchmark [11]. One of the tasks in their experiment was to predict which of two cities (Los Angeles or Chicago) had the higher rate of homelessness, based on some basic initial data points. They compared the accuracy of three common heuristics (take-the-best, tallying, and minimalist [12]) against two baseline complex predictive strategies (linear regression and naive Bayes [13]) and found that, when faced with limited initial data, heuristic strategies actually outperform complex ones. These results suggest that increased effort does not always yield increased accuracy [14].

Kahneman himself finds the misconception that System 1 is error-prone while System 2 is analytic and therefore correct "ridiculous". He explains that "system 1 is not a machine for making errors, it usually functions beautifully" [15].
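The structure of a fast-and-frugal tree is easy to see in code. The sketch below is illustrative only: the question labels q1-q3 are placeholders, not the actual diagnostic cues from the Green & Mehr study. The defining property is that cues are checked in order of importance, and each answer either yields an immediate decision or falls through to the next question, so at most three questions are ever asked:

```python
# Generic three-question fast-and-frugal decision tree (illustrative;
# q1-q3 are placeholder cues, not real clinical criteria).
def fast_frugal_tree(answers):
    """answers: dict mapping cue id -> bool. Returns a triage pathway."""
    if answers["q1"]:      # most diagnostic cue checked first
        return "high-risk pathway"
    if not answers["q2"]:  # absence of the key symptom exits early
        return "low-risk pathway"
    if answers["q3"]:      # any remaining risk cue present?
        return "high-risk pathway"
    return "low-risk pathway"

print(fast_frugal_tree({"q1": True, "q2": False, "q3": False}))
# -> high-risk pathway, decided after a single question
```

Because most cases exit after one or two questions, such a tree is fast to apply at the bedside and ignores information that a regression model would demand.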
Indeed, critics and proponents of System 1 and 2 alike agree on the pressing need to dispel the 'good/bad' (biased/unbiased) fallacy.
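The take-the-best and tallying heuristics described in note [12] are simple enough to state directly in code. This is a minimal sketch under our own toy cue profiles (1 = cue present, 0 = absent), with cues assumed to be pre-sorted by validity for take-the-best:

```python
# Two fast-and-frugal heuristics over binary cue profiles.
def take_the_best(cues_a, cues_b):
    """Compare alternatives one cue at a time, in validity order;
    decide on the first cue that discriminates between them."""
    for a, b in zip(cues_a, cues_b):
        if a != b:
            return "A" if a > b else "B"
    return "tie"  # no cue discriminates

def tallying(cues_a, cues_b):
    """Weight every cue equally and pick the higher tally."""
    ta, tb = sum(cues_a), sum(cues_b)
    if ta == tb:
        return "tie"
    return "A" if ta > tb else "B"

# Hypothetical profiles, most valid cue first:
a = [1, 0, 1, 1]
b = [0, 1, 1, 0]
print(take_the_best(a, b))  # A -- the very first cue already decides
print(tallying(a, b))       # A -- 3 positive cues vs 2
```

Note how little computation take-the-best performs: it can stop after a single comparison, which is exactly why such heuristics are cheap yet, when the cues are informative, surprisingly accurate.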

Since Kahneman first published 'Thinking, Fast and Slow', the theory of System 1 and System 2 thinking has quickly spread through both the academic and non-academic worlds, referenced not only in the behavioural sciences but across a variety of disciplines and in popular media. Its popularity rests, in part, on its intuitive simplicity, but this has led to misunderstandings and misconceptions about how this dual system of decision-making actually works.

In this article we have debunked three pervasive myths by demonstrating the true facts behind the fictions.

  • Fact 1 - The two systems are not physically tied to any specific area of the brain
  • Fact 2 - Systems 1 & 2 are complementary systems that work in tandem to produce more effective and efficient decision-making
  • Fact 3 - Neither system is accurate 100% of the time; both can make mistakes!

While these myths possess considerable intuitive appeal, it would be a shame, and more importantly damaging to the field, if their simplistic descriptions drowned out the more fascinating story of how our brains really work. The theory of System 1 and System 2 is incredibly useful as a way to understand the complexities of human decision-making, and clearing up these pervasive myths can only help us apply it even more effectively in the future.

New Frontiers in Behavioural Science Series:

  • Article 1 - The Past, The Present and The Future
  • Article 2 - Default Settings - The most powerful tool in the behavioural scientist's toolbox
  • Article 3 - Social norms and conformity part 1
  • Article 4 - Social norms and conformity part 2

About the authors:

Crawford Hollingworth  is co-Founder of The Behavioural Architects, which he launched in 2011 with co-Founders Sian Davies and Sarah Davies. He was also founder of HeadlightVision in London and New York, a behavioural trends research consultancy. HeadlightVision was acquired by WPP in 2003. He has written and spoken widely on the subject of behavioural economics for various institutions and publications, including the Market Research Society, Marketing Society, Market Leader, Aura, AQR, London Business School and Impact magazine. Crawford is a Fellow of The Marketing Society and Royal Society of Arts.

Liz Barker  is Global Head of BE Intelligence & Networks at The Behavioural Architects, advancing the application of behavioural science by bridging the worlds of academia and business. Her background is in Economics, particularly the application of behavioural economics across a wide range of fields, from global business and finance to international development. Liz has a BA and MSc in Economics from Cambridge and Oxford.

[1] Stanovich, K.E. & West, R.F. (2000). Individual Differences in Reasoning: Implications for the Rationality Debate. Behavioural and Brain Sciences, 23, 645-665.
[2] The Harvard Gazette (2014). Layers of choice. Retrieved from https://news.harvard.edu/gazette/story/2014/02/layers-of-choice/
[3] The Harvard Gazette (2014). Layers of choice. Retrieved from https://news.harvard.edu/gazette/story/2014/02/layers-of-choice/
[4] Kahneman, D. (2011). Thinking, Fast and Slow. New York: Farrar, Straus and Giroux. p. 29.
[5] Camerer, C., Loewenstein, G., & Prelec, D. (2005). Neuroeconomics: How Neuroscience Can Inform Economics. Journal of Economic Literature, 43(1), 9-64.
[6] TheEconReview (2017). What Neuroscience Has to Say about Decision-Making. Retrieved from https://theeconreview.com/2017/01/13/what-neuroscience-has-to-say-about-decision-making/
[7] Peters, E., Västfjäll, D., Slovic, P., Mertz, C.K., Mazzocco, K., & Dickert, S. (2006). Numeracy and decision making. Psychological Science, 17(5), 407-413.
[8] Gardner, B., & Rebar, A.L. (2019). Habit Formation and Behaviour Change. In Oxford Research Encyclopaedia of Psychology; Rhodes, R.E., & Rebar, A.L. (2018). Physical activity habit: Complexities and controversies. In The Psychology of Habit, 91-109. New York: Springer.
[9] New Scientist (2018). We've got thinking all wrong. This is how your mind really works. Retrieved from https://www.newscientist.com/article/mg24032040-300-weve-got-thinking-all-wrong-this-is-how-your-mind-really-works/
[10] Green, L., & Mehr, D.R. (1997). What alters physicians' decisions to admit to the coronary care unit? The Journal of Family Practice, 45(3), 219-226.
[11] Katsikopoulos, K.V., Schooler, L.J., & Hertwig, R. (2010). The robust beauty of ordinary information. Psychological Review, 117, 1259-1266.
[12] The take-the-best heuristic assumes that cues are processed in order of validity; it compares the two alternatives on one cue at a time until a cue is found that distinguishes between them. Tallying simply counts the cues for or against each alternative. The minimalist heuristic assesses options against cues in random order and stops when one alternative has a positive cue and the other does not. (All cues receive the same weight under tallying and minimalist.)
[13] A linear regression fits a linear functional form relating the cues to the criterion; naive Bayes selects the alternative with the higher probability of having the higher criterion value, given the alternative's entire cue profile.
[14] Hertwig, R., & Pachur, T. (2015). Heuristics, history of. In International Encyclopaedia of the Social & Behavioural Sciences. Elsevier.
[15] New Scientist (2018). We've got thinking all wrong. This is how your mind really works. Retrieved from https://www.newscientist.com/article/mg24032040-300-weve-got-thinking-all-wrong-this-is-how-your-mind-really-works/


While the majority of cognitive psychologists now embrace the dual-processing theory of the mind (Systems 1 and 2), some still disagree. Most evolutionary psychologists, in contrast, dispute the existence of System 2 as a domain-general mind, although there is dissent within that camp as well. Overall, a consensus is growing in favor of System 2, though the concerns raised by evolutionary psychologists still need to be addressed.

1. Introduction
2. Attack of Evolutionary Psychologists
3. Cognitive Psychologists Fight Back
4. The Reasserted Point of View of Cognitive Psychologists
5. The View of Dissident Cognitive Psychologists




Can Med Educ J, v.7(2); 2016 Oct

Systems 1 and 2 thinking processes and cognitive reflection testing in medical students

Shu Wen Tay

1 Department of Neonatology, Cork University Maternity Hospital, Ireland

2 Department of Paediatrics and Child Health, University College Cork, Ireland

3 Teagasc, Moorepark, Fermoy, Co. Cork, Ireland

C Anthony Ryan

Diagnostic decision-making is made through a combination of Systems 1 (intuition or pattern-recognition) and Systems 2 (analytic) thinking. The purpose of this study was to use the Cognitive Reflection Test (CRT) to evaluate and compare the level of Systems 1 and 2 thinking among medical students in pre-clinical and clinical programs.

The CRT is a three-question test designed to measure the ability of respondents to activate metacognitive processes and switch to System 2 (analytic) thinking where System 1 (intuitive) thinking would lead them astray. Each CRT question has a correct analytical (System 2) answer and an incorrect intuitive (System 1) answer. A group of medical students in Years 2 & 3 (pre-clinical) and Year 4 (in clinical practice) of a 5-year medical degree was studied.

Ten percent (13/128) of students gave the intuitive answers to all three questions (suggesting they generally relied on System 1 thinking), while almost half (44%) answered all three correctly (indicating full analytical, System 2 thinking). Only 3–13% gave incorrect answers (i.e. answers that were neither the analytical nor the intuitive responses). Non-native English-speaking students (n = 11) had a lower mean number of correct answers than native English speakers (n = 117): 1.0 vs 2.12, respectively (p < 0.01). As students progressed through questions 1 to 3, the percentage of correct System 2 answers increased and the percentage of intuitive answers decreased in both the pre-clinical and clinical students.


Up to half of the medical students demonstrated full or partial reliance on System 1 (intuitive) thinking in response to these analytical questions. While their CRT performance says nothing about their future expertise as clinicians, the test may be used to help students understand the importance of awareness and regulation of their thinking processes in clinical practice.


Making a diagnosis is central to medical practice. A correct diagnosis sets off a chain of events, investigations, and therapeutic treatments, that lead to appropriate management. This is done through clinical reasoning, the “cognitive process that is necessary to evaluate and manage a patient’s medical problem.” 1 Some experts estimate that 75% of diagnostic failures can be attributed to clinician diagnostic thinking failure from multiple causes including inadequate knowledge, faulty data gathering, and/or faulty verification. 2 Thus, the clinician’s ability to provide safe, high-quality care is dependent upon their ability to reason, think, and judge.

Despite the importance placed on patient safety in the modern curriculum 3 , medical education at present has built an environment that does not always actively promote development of clinical reasoning. Educators recognize its importance in developing expertise, but it is often not an explicit educational objective. 4 Part of this is due to the belief that clinical reasoning will be acquired on its own over time with practice and an accumulation of knowledge. 5 Norman and Eva 6 in a systematic review of the literature, concluded that strategies directed at encouraging both analytical and non-analytical reasoning could lead to some gains in diagnostic accuracy. Thus, knowing how doctors think, make decisions, and make errors in thinking is important for novice and expert clinical decision makers, but also for educators who will need to have multiple strategies to teach both analytical and non-analytical reasoning. 7

Decision-making is complex. It is partly based on the dual-process theory of Epstein and Hammond, 8 recently popularized in Daniel Kahneman’s book “Thinking, Fast and Slow.” 10 Two families of cognitive operations, called System 1 (intuitive) and System 2 (analytical) thinking, are used in decision-making. System 1 thinking is often described as a reflex system, “intuitive,” “experiential,” or “pattern recognition,” which triggers an automated mode of thinking. It operates without much conscious effort, channelling the available information through subconscious pattern recognition based on similar past situations; 11 , 12 this is often described as the “gut feeling.” System 1 kicks in when problems are routine and when time is constrained. When an individual is more dependent on System 1 thinking (for example, under HALT conditions: “hungry, angry, tired or late,” or under conditions of illness, substance abuse, or emotional distress), the accuracy of decision-making can be adversely affected. 14 Nevertheless, there is evidence that System 1 thinking is an indispensable element of clinical decision-making in primary care. 15 , 16 Although System 2 (analytical) thinking is more deliberate than System 1, System 1 is not necessarily less capable. On the contrary, complex cognitive operations eventually migrate from System 2 to System 1 (i.e. become more automatic) as proficiency and skill are acquired and pattern matching replaces effortful serial processing.

System 2 is the more “analytical,” “deliberate” and “rational” side to the thinking process. It is pieced together by logical judgment and a mental search for additional information acquired through past learning and experience. 17 , 18 The data are then processed carefully, through a conscious application of rules, making it a much slower and cognitively demanding process but more likely to lead to better decisions. The analytical system is engaged usually when there is uncertainty, complexity, or the outcomes give little room for error but there is time to think. 19 , 20 System 2 thinking is slow, requiring significant cognitive effort, and, though it is less prone to error, is not foolproof.

Experts, drawing upon greater quantities of information within their field, are occasionally subject to cognitive errors and biases, by picking up the wrong information or “distracting cues”, resulting in diagnostic errors. When used alone, System 2 thinking can lead to poorer performance by slowing action processes down. Experience, despite being a yardstick of the expert, does not necessarily translate into better performance. Indeed, experience, without feedback or reflection, can often be the fertile ground for the development of faulty thinking. 21 , 22

Thus, Systems 1 and 2 thinking are useful in the right place and the right time; indeed, they complement each other. Taken together, they promote greater efficiency in thinking, decision-making and action, and help bring order to chaos and uncertainty. 23 Whether to use Systems 1 or 2 thinking in a given clinical situation depends on the complexity of the situation in relation to the individual’s capabilities, past experiences, and self-confidence.

The Cognitive Reflection Test (CRT) is a three-question test designed to measure respondents’ ability to activate metacognitive processes that allow them to switch to System 2 thinking. In other words, it measures the disposition to resist reporting the response that first comes to mind. 24 As explained by the CRT’s inventor, Shane Frederick: “The three items on the CRT are “easy” in the sense that their solution is easily understood when explained, yet reaching the correct answer often requires the suppression of an erroneous answer that springs “impulsively” to mind.” In his study, Frederick showed a reduction in intuitive answers as the questions proceed from question 1 to 3, as well as a gender bias, with better performance among male respondents. 24 The CRT has also been administered to a group of judges in the United States. 25 Judges are thought to be predominantly intuitive thinkers, picking up on intuitive cues that lead them to reach conclusions they later rationalize. This study showed that only two thirds of the judges gave the right (deliberative) answer to one or more of the three CRT questions, confirming a significant reliance on System 1 (intuitive) thinking in the remaining responses. Of course, these results do not generalize to how judges think in courtroom practice. Nevertheless, the CRT may have been a useful exercise in encouraging the judges to be aware of and to regulate their thinking, in particular metacognition, the executive function that engages System 2 thinking and can, among other things, expose cognitive biases.

The CRT has not, to our knowledge, been tested in the medical profession. The aim of this study was to evaluate the level of Systems 1 and 2 thinking in medical students using the CRT. Since the development of clinical expertise is often associated with more automatic System 1 thinking, we also wanted to compare the CRT responses of students in clinical practice (“experts”) and those in pre-clinical years (“novices”). We also wished to test the question progression improvement phenomenon and the possible gender bias seen above. 21 Although not a specific aim of this study, we believe the CRT test, when used with students, can help them understand the differences between intuitive and analytical thinking in decision making in clinical practice.

The assessment tool used in this study was the internationally validated CRT, originally developed as a measure of a type of cognitive ability. 24 The questions, along with the intuitive (incorrect) and the analytical (correct) answers, are as follows:

  • CRT question 1: This question required respondents to evaluate the cost of a ball, given that a bat and a ball together cost $1.10 and the bat costs $1.00 more than the ball. The intuitive (impulsive) answer that the ball costs $0.10 springs to mind from subtracting $1.00 from $1.10. However, if that were the case, the bat would cost $1.10 and the total cost of the bat and ball would be $1.20, which is incorrect. Hence, the right answer is $0.05.
  • CRT question 2: This question asks respondents to evaluate, if 5 machines take 5 minutes to make 5 widgets, how long it would take 100 machines to make 100 widgets. Again, the impulsive answer that springs to mind is 100 minutes, but taking a step back, it takes 1 machine 5 minutes to make 1 widget. Therefore, it would take 100 machines 5 minutes to make 100 widgets.
  • CRT question 3: This question described a lily patch in a pond that doubled in size each day. It takes 48 days for the patch to cover the entire pond, and respondents are asked how long it would take to cover half the pond. The intuitive answer is half of 48 (day 24), but logically, if the patch doubles in size every day, then the day before it covers the entire pond it must cover half the pond (day 47).

Thus the correct answers are, in summary, $0.05, 5 minutes, and day 47, while the intuitive answers are $0.10, 100 minutes, and day 24, respectively.
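As a cross-check, the arithmetic behind all three answers can be verified mechanically. A minimal Python sketch (not part of the original study, just a verification of the reasoning above):

```python
# CRT question 1: bat + ball = $1.10 and bat = ball + $1.00.
# Solving b + (b + 1.00) = 1.10 gives b = 0.05, not the intuitive 0.10.
ball = (1.10 - 1.00) / 2
bat = ball + 1.00
assert abs(ball - 0.05) < 1e-9
assert abs((bat + ball) - 1.10) < 1e-9
# The intuitive answer fails: a $0.10 ball implies a $1.10 bat, totalling $1.20.
assert abs((0.10 + 1.10) - 1.20) < 1e-9

# CRT question 2: 5 machines make 5 widgets in 5 minutes, so producing one
# widget costs 5 machine-minutes; 100 widgets on 100 machines take 5 minutes.
machine_minutes_per_widget = (5 * 5) / 5
minutes_for_100 = machine_minutes_per_widget * 100 / 100
assert minutes_for_100 == 5

# CRT question 3: the patch doubles daily and covers the pond on day 48,
# so one doubling earlier (day 47) it covered half the pond.
half_pond_day = 48 - 1
assert 2 ** half_pond_day * 2 == 2 ** 48
```

Each assertion simply restates the analytical (System 2) solution; the intuitive answers (10 cents, 100 minutes, day 24) fail the corresponding constraints.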

After obtaining ethical approval from the Research Ethics Committee of the Cork Teaching Hospitals, medical students in the School of Medicine, University College Cork, from Years 2 & 3 (pre-clinical) and Year 4 (in clinical practice) of a 5-year medical degree course were approached to participate in this study. The CRT was distributed to students at the end of a lecture, allowing them 10–15 minutes to complete it, in addition to a few demographic questions. A voluntarily completed response was taken to indicate consent for participation. Respondents were assured that participation was anonymous and would have no bearing on their future medical education. Statistical analysis was by Student’s t-test for continuous data and Spearman rank correlation to test the association between ranked variables.

There were approximately 90 students in the pre-clinical and 90 in the clinical classes, i.e. 180 students in total, of whom 130 (72%) completed the survey. Two students had previously been exposed to the CRT and were excluded. Of the remaining 128 students, 49 were male and 79 were female, while 61 were pre-clinical and 67 were clinical students. Ten percent of students (13/128) answered none of the CRT questions correctly, 21% (27/128) answered one correctly, 25% (32/128) answered two correctly, and 44% (56/128) answered all three correctly. Over half of respondents (56%) gave the correct (analytical) answer to the first question, with 40% giving the intuitive answer. More than two thirds (70%) got the second question right, with 22% giving the intuitive answer. For question 3, 77% of respondents got the right answer, with 14% giving the intuitive answer. The mean number of questions answered correctly was 2.02. The students returned a total of 388 answers: 67% (259) were correct, 25% (97) were intuitive and 8% (32) were incorrect.
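The reported score distribution can be checked for internal consistency against the reported mean. A quick sketch using only the figures quoted in the paragraph above:

```python
# Number of students scoring 0, 1, 2, and 3 correct CRT answers,
# as reported in the text.
score_counts = {0: 13, 1: 27, 2: 32, 3: 56}

n_students = sum(score_counts.values())
assert n_students == 128  # matches the 128 included respondents

total_correct = sum(score * count for score, count in score_counts.items())
assert total_correct == 259  # matches the 259 correct answers reported

mean_correct = total_correct / n_students
assert round(mean_correct, 2) == 2.02  # matches the reported mean of 2.02
```

The per-student score counts, the total number of correct answers, and the mean of 2.02 are mutually consistent.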

The outcomes from the individual three questions, by pre-clinical and clinical students, are presented in Table 1 and Figure 1 . The percentage of correct answers increased as the questions progressed from question 1 to 3. At the same time, the percentage of intuitive answers decreased, while incorrect responses ranged between 3 to 13%.

[Figure 1. Correct answers increased and intuitive answers decreased in both pre-clinical and clinical students as they progressed from question 1 to 3]

[Table 1. Responses to the CRT questions by pre-clinical students and students in clinical practice]

Pre-clinical respondents gave 5–10% more correct and 2–10% fewer intuitive answers than their clinical counterparts for each question ( Figure 1 ). However, pairwise analysis of the means showed no significant differences between pre-clinical and clinical respondents.

Approximately 9% (11/128) of respondents were international students who did not have English as a childhood language. There was a significant difference between the mean number of correct answers of students who had English as a childhood language (2.12, n = 117) and those who did not (1.0, n = 11: t test; p < 0.01). Conversely, respondents who were non-native English speakers gave significantly more intuitive answers (1.6, n = 11) than English speakers (0.7, n = 117: t test; p < 0.01). There was no relationship between age of respondents, or gender of respondents (male n = 49 and female n = 79) and correct (72% vs 65%), intuitive (23% vs 27%) or incorrect (5% vs 8%) answers, respectively.

The CRT is designed to measure the ability of respondents to activate thinking processes that switch to System 2 thinking where System 1 (more intuitive) thinking might lead them astray. To our knowledge, this is the first study to test the CRT on medical students. Our study found that fewer than half (44%) of the medical students answered all three questions correctly (i.e. were fully metacognitive in engaging System 2 thinking), while one in 10 students answered none of the questions correctly (suggesting they did not think metacognitively to engage System 2 thinking and generally relied on intuitive thinking). Thus, more than half of the students demonstrated full or partial reliance on intuitive thinking in responding to these analytical questions.

A minority of students (< 13%) had incorrect answers that were neither the analytical nor the intuitive responses. It is possible that these students may have recognized they should switch from System 1 thinking (i.e. they activated some metacognitive processes), but were unsuccessful in their System 2 thinking. Using focus groups to ask participants to explain the manner in which they went about solving each question would have answered this conundrum. However, this was outside the scope of the current study but a useful direction for further research. Finally, a small number of students in this study did not have English as a childhood language and had lower correct responses. A lower level of functioning in the English language may have affected their score due to a less accurate sense of the situations being described in the problems.

As was seen in Frederick’s studies and also in the present study, System 2 responses increased and System 1 answers decreased with progression through the CRT questions. 24 According to Frederick, the first question is commonly regarded as the easiest and the third the hardest. Thus, when confronted with harder problems, respondents use their System 2 processes to override their intuitive System 1 processes to obtain the correct answer. Frederick found that men scored higher than women on the CRT. He postulated that men, supposedly having stronger learned skills in mathematics, were less likely to go with the intuitive responses. 24 We found no gender differences in the CRT scores in our study; however, the study was not sufficiently powered to detect a difference if one existed.

The CRT is a test of cognition and care must be taken not to interpret these results as an index of the medical students’ current or future clinical reasoning. Performance on a math problem has relatively low stakes compared with health care decision-making. Low scoring students may have faulty mathematical intuition based on the CRT, but there is no evidence as yet to say they have faulty intuition in general, particularly medical intuition. It may be of interest in a future study to link CRT responses to subsequent clinical decision-making. However, it is most unlikely that a single mathematical examination such as the CRT could predict future performance in clinical reasoning and judgment. Instead, as it stands, we believe the test can be used to help students understand the differences between analytic and intuitive thinking, the importance of both systems thinking, and especially the need to develop their metacognitive skills. In addition, using the CRT and answering the questions correctly, has been shown to activate System 2 processes and may help prepare students for metacognitive thinking. 26

The objective of the present study was not to make correlations or reach conclusions that mathematical reasoning predicts or facilitates diagnostic decision-making. However, we observed that students in pre-clinical years demonstrated some evidence of more cognitive override (metacognition) than students in clinical practice although this was not statistically significant. The intuitive answers for the CRT mathematical problems were intrinsically incorrect. In medical practice, intuitive responses are not always wrong. In their work on intuition Tracy et al 15 stated that: “There was overwhelming agreement that intuition plays a vital role in the practice of family medicine” and that “intuition has its origins in personal clinical experience.” Intuition may also be adaptive in complex situations where decisions are required in a timely fashion; for instance, intuitive responses are essential in emergency situations. Nevertheless, where possible, intuition should be guided and formed by System 2 thinking to reduce the possibility of error or cognitive biases.

The CRT has previously been used at an educational session at the Florida Conference of Circuit Judges in 2006. 25 The medical students in the present study scored higher than the judges, correctly answering a mean of 2.02 of the CRT questions compared to 1.23 for the judges. The judges were also more likely to give intuitive responses, in that only two thirds of the judges (compared to 90% of the medical students) gave the right (deliberative) answer to one or more of the CRT questions. There were a number of differences between the studies, however, that caution against direct comparisons. The CRT questions in the judges’ study were embedded within a larger questionnaire that was administered over 45 minutes. In contrast, the medical student questionnaire consisted of the 3 CRT questions and a number of demographic questions and was administered over 10–15 minutes between lecture slots.

There were a number of limitations in this study. It was only possible to distribute it to a relatively small number of students between lectures on a couple of occasions as multiple attempts would have affected the reliability of the results, through spill-over of the content and answers of CRT questions to other students. We did not include final year medical students because they were dispersed throughout the various teaching hospitals and were not accessible in a large group. There may be alternative reasons, other than better analytical skills, why some students scored higher System 2 responses than others. For instance, although we specifically asked students if they were previously aware of the CRT problems and excluded them if they answered in the affirmative, it is possible that some may have prior experience in similar kinds of mathematical problems and may therefore have found the CRT problems to be quite straightforward. This CRT test had only three questions; Frederick 27 has used up to eight CRT problems in some studies, which may result in greater reliability. Ideally, we believe that the CRT test should have been followed up by a debrief where students could have explored the purpose of the test, the differences between Systems 1 and 2 thinking, the role of metacognition, and the importance of knowing how our minds think as novices, as experts, and in times of distress. Finally there is a need for ongoing research, including non-mathematical critical thinking tests, to assess the development of analytic and logical reasoning skills of medical students and emerging doctors over time.

The CRT mathematical test has shown that intuition is a dominant force in the minds of medical students. It has also shown that it is possible for this intuitive force to be put aside and for logic to prevail even as the CRT questions progress. Awareness and understanding of how experts think, in addition to intuition and metacognitive training, should be promoted amongst medical students as a way to aid their thinking processes and avoid cognitive errors in subsequent clinical practice. Finally, students need to understand how faulty or lazy thinking can lead to cognitive errors that can impact upon patient care and patient safety.

Conflicts of interest: There are no conflicts of interest for any of the authors.


Critical Thinking

Critical thinking is a widely accepted educational goal. Its definition is contested, but the competing definitions can be understood as differing conceptions of the same basic concept: careful thinking directed to a goal. Conceptions differ with respect to the scope of such thinking, the type of goal, the criteria and norms for thinking carefully, and the thinking components on which they focus. Its adoption as an educational goal has been recommended on the basis of respect for students’ autonomy and preparing students for success in life and for democratic citizenship. “Critical thinkers” have the dispositions and abilities that lead them to think critically when appropriate. The abilities can be identified directly; the dispositions indirectly, by considering what factors contribute to or impede exercise of the abilities. Standardized tests have been developed to assess the degree to which a person possesses such dispositions and abilities. Educational intervention has been shown experimentally to improve them, particularly when it includes dialogue, anchored instruction, and mentoring. Controversies have arisen over the generalizability of critical thinking across domains, over alleged bias in critical thinking theories and instruction, and over the relationship of critical thinking to other types of thinking.

  • 2.1 Dewey’s Three Main Examples
  • 2.2 Dewey’s Other Examples
  • 2.3 Further Examples
  • 2.4 Non-Examples
  • 3. The Definition of Critical Thinking
  • 4. Its Value
  • 5. The Process of Thinking Critically
  • 6. Components of the Process
  • 7. Contributory Dispositions and Abilities
  • 8.1 Initiating Dispositions
  • 8.2 Internal Dispositions
  • 9. Critical Thinking Abilities
  • 10. Required Knowledge
  • 11. Educational Methods
  • 12.1 The Generalizability of Critical Thinking
  • 12.2 Bias in Critical Thinking Theory and Pedagogy
  • 12.3 Relationship of Critical Thinking to Other Types of Thinking
  • Other Internet Resources
  • Related Entries

Use of the term ‘critical thinking’ to describe an educational goal goes back to the American philosopher John Dewey (1910), who more commonly called it ‘reflective thinking’. He defined it as

active, persistent and careful consideration of any belief or supposed form of knowledge in the light of the grounds that support it, and the further conclusions to which it tends. (Dewey 1910: 6; 1933: 9)

and identified a habit of such consideration with a scientific attitude of mind. His lengthy quotations of Francis Bacon, John Locke, and John Stuart Mill indicate that he was not the first person to propose development of a scientific attitude of mind as an educational goal.

In the 1930s, many of the schools that participated in the Eight-Year Study of the Progressive Education Association (Aikin 1942) adopted critical thinking as an educational goal, for whose achievement the study’s Evaluation Staff developed tests (Smith, Tyler, & Evaluation Staff 1942). Glaser (1941) showed experimentally that it was possible to improve the critical thinking of high school students. Bloom’s influential taxonomy of cognitive educational objectives (Bloom et al. 1956) incorporated critical thinking abilities. Ennis (1962) proposed 12 aspects of critical thinking as a basis for research on the teaching and evaluation of critical thinking ability.

Since 1980, an annual international conference in California on critical thinking and educational reform has attracted tens of thousands of educators from all levels of education and from many parts of the world. Also since 1980, the state university system in California has required all undergraduate students to take a critical thinking course. Since 1983, the Association for Informal Logic and Critical Thinking has sponsored sessions in conjunction with the divisional meetings of the American Philosophical Association (APA). In 1987, the APA’s Committee on Pre-College Philosophy commissioned a consensus statement on critical thinking for purposes of educational assessment and instruction (Facione 1990a). Researchers have developed standardized tests of critical thinking abilities and dispositions; for details, see the Supplement on Assessment. Educational jurisdictions around the world now include critical thinking in guidelines for curriculum and assessment.

For details on this history, see the Supplement on History.

2. Examples and Non-Examples

Before considering the definition of critical thinking, it will be helpful to have in mind some examples of critical thinking, as well as some examples of kinds of thinking that would apparently not count as critical thinking.

Dewey (1910: 68–71; 1933: 91–94) takes as paradigms of reflective thinking three class papers of students in which they describe their thinking. The examples range from the everyday to the scientific.

Transit: “The other day, when I was down town on 16th Street, a clock caught my eye. I saw that the hands pointed to 12:20. This suggested that I had an engagement at 124th Street, at one o’clock. I reasoned that as it had taken me an hour to come down on a surface car, I should probably be twenty minutes late if I returned the same way. I might save twenty minutes by a subway express. But was there a station near? If not, I might lose more than twenty minutes in looking for one. Then I thought of the elevated, and I saw there was such a line within two blocks. But where was the station? If it were several blocks above or below the street I was on, I should lose time instead of gaining it. My mind went back to the subway express as quicker than the elevated; furthermore, I remembered that it went nearer than the elevated to the part of 124th Street I wished to reach, so that time would be saved at the end of the journey. I concluded in favor of the subway, and reached my destination by one o’clock.” (Dewey 1910: 68–69; 1933: 91–92)

Ferryboat: “Projecting nearly horizontally from the upper deck of the ferryboat on which I daily cross the river is a long white pole, having a gilded ball at its tip. It suggested a flagpole when I first saw it; its color, shape, and gilded ball agreed with this idea, and these reasons seemed to justify me in this belief. But soon difficulties presented themselves. The pole was nearly horizontal, an unusual position for a flagpole; in the next place, there was no pulley, ring, or cord by which to attach a flag; finally, there were elsewhere on the boat two vertical staffs from which flags were occasionally flown. It seemed probable that the pole was not there for flag-flying.

“I then tried to imagine all possible purposes of the pole, and to consider for which of these it was best suited: (a) Possibly it was an ornament. But as all the ferryboats and even the tugboats carried poles, this hypothesis was rejected. (b) Possibly it was the terminal of a wireless telegraph. But the same considerations made this improbable. Besides, the more natural place for such a terminal would be the highest part of the boat, on top of the pilot house. (c) Its purpose might be to point out the direction in which the boat is moving.

“In support of this conclusion, I discovered that the pole was lower than the pilot house, so that the steersman could easily see it. Moreover, the tip was enough higher than the base, so that, from the pilot’s position, it must appear to project far out in front of the boat. Moreover, the pilot being near the front of the boat, he would need some such guide as to its direction. Tugboats would also need poles for such a purpose. This hypothesis was so much more probable than the others that I accepted it. I formed the conclusion that the pole was set up for the purpose of showing the pilot the direction in which the boat pointed, to enable him to steer correctly.” (Dewey 1910: 69–70; 1933: 92–93)

Bubbles: “In washing tumblers in hot soapsuds and placing them mouth downward on a plate, bubbles appeared on the outside of the mouth of the tumblers and then went inside. Why? The presence of bubbles suggests air, which I note must come from inside the tumbler. I see that the soapy water on the plate prevents escape of the air save as it may be caught in bubbles. But why should air leave the tumbler? There was no substance entering to force it out. It must have expanded. It expands by increase of heat, or by decrease of pressure, or both. Could the air have become heated after the tumbler was taken from the hot suds? Clearly not the air that was already entangled in the water. If heated air was the cause, cold air must have entered in transferring the tumblers from the suds to the plate. I test to see if this supposition is true by taking several more tumblers out. Some I shake so as to make sure of entrapping cold air in them. Some I take out holding mouth downward in order to prevent cold air from entering. Bubbles appear on the outside of every one of the former and on none of the latter. I must be right in my inference. Air from the outside must have been expanded by the heat of the tumbler, which explains the appearance of the bubbles on the outside. But why do they then go inside? Cold contracts. The tumbler cooled and also the air inside it. Tension was removed, and hence bubbles appeared inside. To be sure of this, I test by placing a cup of ice on the tumbler while the bubbles are still forming outside. They soon reverse” (Dewey 1910: 70–71; 1933: 93–94).
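The inference in Bubbles rests on the thermal expansion of a gas, which can be checked with a rough calculation. Assuming (the temperatures are illustrative) that cold room air at about 293 K enters the tumbler and is warmed by the hot glass to around 330 K, Charles’s law at constant pressure gives

```latex
% Charles's law: at constant pressure, gas volume scales with absolute temperature.
\frac{V_2}{V_1} = \frac{T_2}{T_1} \approx \frac{330\,\mathrm{K}}{293\,\mathrm{K}} \approx 1.13
```

an expansion of roughly 13%, enough to push some air past the soapy seal at the tumbler’s mouth as bubbles; on cooling, the contraction reverses the flow, as Dewey observed.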

Dewey (1910, 1933) sprinkles his book with other examples of critical thinking. We will refer to the following.

Weather: A man on a walk notices that it has suddenly become cool, thinks that it is probably going to rain, looks up and sees a dark cloud obscuring the sun, and quickens his steps (1910: 6–10; 1933: 9–13).

Disorder: A man finds his rooms on his return to them in disorder with his belongings thrown about, thinks at first of burglary as an explanation, then thinks of mischievous children as being an alternative explanation, then looks to see whether valuables are missing, and discovers that they are (1910: 82–83; 1933: 166–168).

Typhoid: A physician diagnosing a patient whose conspicuous symptoms suggest typhoid avoids drawing a conclusion until more data are gathered by questioning the patient and by making tests (1910: 85–86; 1933: 170).

Blur: A moving blur catches our eye in the distance, we ask ourselves whether it is a cloud of whirling dust or a tree moving its branches or a man signaling to us, we think of other traits that should be found on each of those possibilities, and we look and see if those traits are found (1910: 102, 108; 1933: 121, 133).

Suction pump: In thinking about the suction pump, the scientist first notes that it will draw water only to a maximum height of 33 feet at sea level and to a lesser maximum height at higher elevations, selects for attention the differing atmospheric pressure at these elevations, sets up experiments in which the air is removed from a vessel containing water (when suction no longer works) and in which the weight of air at various levels is calculated, compares the results of reasoning about the height to which a given weight of air will allow a suction pump to raise water with the observed maximum height at different elevations, and finally assimilates the suction pump to such apparently different phenomena as the siphon and the rising of a balloon (1910: 150–153; 1933: 195–198).
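The 33-foot ceiling in Suction pump follows directly from the atmospheric-pressure hypothesis: a suction pump does not pull water up but removes air so that atmospheric pressure can push it up, and the lift stops once the weight of the water column balances that pressure. Using standard sea-level values,

```latex
% Maximum lift height: pressure of the water column equals atmospheric pressure.
h_{\max} = \frac{P_{\mathrm{atm}}}{\rho g}
         = \frac{101{,}325\ \mathrm{Pa}}{(1000\ \mathrm{kg/m^3})(9.81\ \mathrm{m/s^2})}
         \approx 10.3\ \mathrm{m} \approx 34\ \mathrm{ft}
```

which matches the observed limit of about 33 feet; at higher elevations atmospheric pressure is lower, so the maximum height falls, exactly the variation the scientist noted.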

Diamond: A passenger in a car driving in a diamond lane reserved for vehicles with at least one passenger notices that the diamond marks on the pavement are far apart in some places and close together in others. Why? The driver suggests that the reason may be that the diamond marks are not needed where there is a solid double line separating the diamond lane from the adjoining lane, but are needed when there is a dotted single line permitting crossing into the diamond lane. Further observation confirms that the diamonds are close together when a dotted line separates the diamond lane from its neighbour, but otherwise far apart.

Rash: A woman suddenly develops a very itchy red rash on her throat and upper chest. She recently noticed a mark on the back of her right hand, but was not sure whether the mark was a rash or a scrape. She lies down in bed and thinks about what might be causing the rash and what to do about it. About two weeks before, she began taking blood pressure medication that contained a sulfa drug, and the pharmacist had warned her, in view of a previous allergic reaction to a medication containing a sulfa drug, to be on the alert for an allergic reaction; however, she had been taking the medication for two weeks with no such effect. The day before, she began using a new cream on her neck and upper chest; against the new cream as the cause was the mark on the back of her hand, which had not been exposed to the cream. She began taking probiotics about a month before. She also recently started new eye drops, but she supposed that manufacturers of eye drops would be careful not to include allergy-causing components in the medication. The rash might be a heat rash, since she recently was sweating profusely from her upper body. Since she is about to go away on a short vacation, where she would not have access to her usual physician, she decides to keep taking the probiotics and using the new eye drops but to discontinue the blood pressure medication and to switch back to the old cream for her neck and upper chest. She forms a plan to consult her regular physician on her return about the blood pressure medication.

Candidate: Although Dewey included no examples of thinking directed at appraising the arguments of others, such thinking has come to be considered a kind of critical thinking. We find an example of such thinking in the performance task on the Collegiate Learning Assessment (CLA+), which its sponsoring organization describes as

a performance-based assessment that provides a measure of an institution’s contribution to the development of critical-thinking and written communication skills of its students. (Council for Aid to Education 2017)

A sample task posted on its website requires the test-taker to write a report for public distribution evaluating a fictional candidate’s policy proposals and their supporting arguments, using supplied background documents, with a recommendation on whether to endorse the candidate.

Immediate acceptance of an idea that suggests itself as a solution to a problem (e.g., a possible explanation of an event or phenomenon, an action that seems likely to produce a desired result) is “uncritical thinking, the minimum of reflection” (Dewey 1910: 13). On-going suspension of judgment in the light of doubt about a possible solution is not critical thinking (Dewey 1910: 108). Critique driven by a dogmatically held political or religious ideology is not critical thinking; thus Paulo Freire (1968 [1970]) is using the term (e.g., at 1970: 71, 81, 100, 146) in a more politically freighted sense that includes not only reflection but also revolutionary action against oppression. Derivation of a conclusion from given data using an algorithm is not critical thinking.

What is critical thinking? There are many definitions. Ennis (2016) lists 14 philosophically oriented scholarly definitions and three dictionary definitions. Following Rawls (1971), who distinguished his conception of justice from a utilitarian conception but regarded them as rival conceptions of the same concept, Ennis maintains that the 17 definitions are different conceptions of the same concept. Rawls articulated the shared concept of justice as

a characteristic set of principles for assigning basic rights and duties and for determining… the proper distribution of the benefits and burdens of social cooperation. (Rawls 1971: 5)

Bailin et al. (1999b) claim that, if one considers what sorts of thinking an educator would take not to be critical thinking and what sorts to be critical thinking, one can conclude that educators typically understand critical thinking to have at least three features.

  • It is done for the purpose of making up one’s mind about what to believe or do.
  • The person engaging in the thinking is trying to fulfill standards of adequacy and accuracy appropriate to the thinking.
  • The thinking fulfills the relevant standards to some threshold level.

One could sum up the core concept that involves these three features by saying that critical thinking is careful goal-directed thinking. This core concept seems to apply to all the examples of critical thinking described in the previous section. As for the non-examples, their exclusion depends on construing careful thinking as excluding jumping immediately to conclusions, suspending judgment no matter how strong the evidence, reasoning from an unquestioned ideological or religious perspective, and routinely using an algorithm to answer a question.

If the core of critical thinking is careful goal-directed thinking, conceptions of it can vary according to its presumed scope, its presumed goal, one’s criteria and threshold for being careful, and the thinking component on which one focuses. As to its scope, some conceptions (e.g., Dewey 1910, 1933) restrict it to constructive thinking on the basis of one’s own observations and experiments, others (e.g., Ennis 1962; Fisher & Scriven 1997; Johnson 1992) to appraisal of the products of such thinking. Ennis (1991) and Bailin et al. (1999b) take it to cover both construction and appraisal. As to its goal, some conceptions restrict it to forming a judgment (Dewey 1910, 1933; Lipman 1987; Facione 1990a). Others allow for actions as well as beliefs as the end point of a process of critical thinking (Ennis 1991; Bailin et al. 1999b). As to the criteria and threshold for being careful, definitions vary in the term used to indicate that critical thinking satisfies certain norms: “intellectually disciplined” (Scriven & Paul 1987), “reasonable” (Ennis 1991), “skillful” (Lipman 1987), “skilled” (Fisher & Scriven 1997), “careful” (Bailin & Battersby 2009). Some definitions specify these norms, referring variously to “consideration of any belief or supposed form of knowledge in the light of the grounds that support it and the further conclusions to which it tends” (Dewey 1910, 1933); “the methods of logical inquiry and reasoning” (Glaser 1941); “conceptualizing, applying, analyzing, synthesizing, and/or evaluating information gathered from, or generated by, observation, experience, reflection, reasoning, or communication” (Scriven & Paul 1987); the requirement that “it is sensitive to context, relies on criteria, and is self-correcting” (Lipman 1987); “evidential, conceptual, methodological, criteriological, or contextual considerations” (Facione 1990a); and “plus-minus considerations of the product in terms of appropriate standards (or criteria)” (Johnson 1992). 
Stanovich and Stanovich (2010) propose to ground the concept of critical thinking in the concept of rationality, which they understand as combining epistemic rationality (fitting one’s beliefs to the world) and instrumental rationality (optimizing goal fulfillment); a critical thinker, in their view, is someone with “a propensity to override suboptimal responses from the autonomous mind” (2010: 227). These variant specifications of norms for critical thinking are not necessarily incompatible with one another, and in any case presuppose the core notion of thinking carefully. As to the thinking component singled out, some definitions focus on suspension of judgment during the thinking (Dewey 1910; McPeck 1981), others on inquiry while judgment is suspended (Bailin & Battersby 2009, 2021), others on the resulting judgment (Facione 1990a), and still others on responsiveness to reasons (Siegel 1988). Kuhn (2019) takes critical thinking to be more a dialogic practice of advancing and responding to arguments than an individual ability.

In educational contexts, a definition of critical thinking is a “programmatic definition” (Scheffler 1960: 19). It expresses a practical program for achieving an educational goal. For this purpose, a one-sentence formulaic definition is much less useful than articulation of a critical thinking process, with criteria and standards for the kinds of thinking that the process may involve. The real educational goal is recognition, adoption and implementation by students of those criteria and standards. That adoption and implementation in turn consists in acquiring the knowledge, abilities and dispositions of a critical thinker.

Conceptions of critical thinking generally do not include moral integrity as part of the concept. Dewey, for example, took critical thinking to be the ultimate intellectual goal of education, but distinguished it from the development of social cooperation among school children, which he took to be the central moral goal. Ennis (1996, 2011) added to his previous list of critical thinking dispositions a group of dispositions to care about the dignity and worth of every person, which he described as a “correlative” (1996) disposition without which critical thinking would be less valuable and perhaps harmful. An educational program that aimed at developing critical thinking but not the correlative disposition to care about the dignity and worth of every person, he asserted, “would be deficient and perhaps dangerous” (Ennis 1996: 172).

Dewey thought that education for reflective thinking would be of value to both the individual and society; recognition in educational practice of the kinship to the scientific attitude of children’s native curiosity, fertile imagination and love of experimental inquiry “would make for individual happiness and the reduction of social waste” (Dewey 1910: iii). Schools participating in the Eight-Year Study took development of the habit of reflective thinking and skill in solving problems as a means to leading young people to understand, appreciate and live the democratic way of life characteristic of the United States (Aikin 1942: 17–18, 81). Harvey Siegel (1988: 55–61) has offered four considerations in support of adopting critical thinking as an educational ideal. (1) Respect for persons requires that schools and teachers honour students’ demands for reasons and explanations, deal with students honestly, and recognize the need to confront students’ independent judgment; these requirements concern the manner in which teachers treat students. (2) Education has the task of preparing children to be successful adults, a task that requires development of their self-sufficiency. (3) Education should initiate children into the rational traditions in such fields as history, science and mathematics. (4) Education should prepare children to become democratic citizens, which requires reasoned procedures and critical talents and attitudes. To supplement these considerations, Siegel (1988: 62–90) responds to two objections: the ideology objection that adoption of any educational ideal requires a prior ideological commitment and the indoctrination objection that cultivation of critical thinking cannot escape being a form of indoctrination.

Despite the diversity of our 11 examples, one can recognize a common pattern. Dewey analyzed it as consisting of five phases:

  • suggestions, in which the mind leaps forward to a possible solution;
  • an intellectualization of the difficulty or perplexity into a problem to be solved, a question for which the answer must be sought;
  • the use of one suggestion after another as a leading idea, or hypothesis, to initiate and guide observation and other operations in collection of factual material;
  • the mental elaboration of the idea or supposition as an idea or supposition (reasoning, in the sense in which reasoning is a part, not the whole, of inference); and
  • testing the hypothesis by overt or imaginative action. (Dewey 1933: 106–107; italics in original)

The process of reflective thinking consisting of these phases would be preceded by a perplexed, troubled or confused situation and followed by a cleared-up, unified, resolved situation (Dewey 1933: 106). The term ‘phases’ replaced the term ‘steps’ (Dewey 1910: 72), thus removing the earlier suggestion of an invariant sequence. Variants of the above analysis appeared in (Dewey 1916: 177) and (Dewey 1938: 101–119).

The variant formulations indicate the difficulty of giving a single logical analysis of such a varied process. The process of critical thinking may have a spiral pattern, with the problem being redefined in the light of obstacles to solving it as originally formulated. For example, the person in Transit might have concluded that getting to the appointment at the scheduled time was impossible and have reformulated the problem as that of rescheduling the appointment for a mutually convenient time. Further, defining a problem does not always follow after or lead immediately to an idea of a suggested solution. Nor should it do so, as Dewey himself recognized in describing the physician in Typhoid as avoiding any strong preference for this or that conclusion before getting further information (Dewey 1910: 85; 1933: 170). People with a hypothesis in mind, even one to which they have a very weak commitment, have a so-called “confirmation bias” (Nickerson 1998): they are likely to pay attention to evidence that confirms the hypothesis and to ignore evidence that counts against it or for some competing hypothesis. Detectives, intelligence agencies, and investigators of airplane accidents are well advised to gather relevant evidence systematically and to postpone even tentative adoption of an explanatory hypothesis until the collected evidence rules out with the appropriate degree of certainty all but one explanation. Dewey’s analysis of the critical thinking process can be faulted as well for requiring acceptance or rejection of a possible solution to a defined problem, with no allowance for deciding in the light of the available evidence to suspend judgment. Further, given the great variety of kinds of problems for which reflection is appropriate, there is likely to be variation in its component events. 
Perhaps the best way to conceptualize the critical thinking process is as a checklist whose component events can occur in a variety of orders, selectively, and more than once. These component events might include (1) noticing a difficulty, (2) defining the problem, (3) dividing the problem into manageable sub-problems, (4) formulating a variety of possible solutions to the problem or sub-problem, (5) determining what evidence is relevant to deciding among possible solutions to the problem or sub-problem, (6) devising a plan of systematic observation or experiment that will uncover the relevant evidence, (7) carrying out the plan of systematic observation or experimentation, (8) noting the results of the systematic observation or experiment, (9) gathering relevant testimony and information from others, (10) judging the credibility of testimony and information gathered from others, (11) drawing conclusions from gathered evidence and accepted testimony, and (12) accepting a solution that the evidence adequately supports (cf. Hitchcock 2017: 485).

Checklist conceptions of the process of critical thinking are open to the objection that they are too mechanical and procedural to fit the multi-dimensional and emotionally charged issues for which critical thinking is urgently needed (Paul 1984). For such issues, a more dialectical process is advocated, in which competing relevant world views are identified, their implications explored, and some sort of creative synthesis attempted.

If one considers the critical thinking process illustrated by the 11 examples, one can identify distinct kinds of mental acts and mental states that form part of it. To distinguish, label and briefly characterize these components is a useful preliminary to identifying abilities, skills, dispositions, attitudes, habits and the like that contribute causally to thinking critically. Identifying such abilities and habits is in turn a useful preliminary to setting educational goals. Setting the goals is in its turn a useful preliminary to designing strategies for helping learners to achieve the goals and to designing ways of measuring the extent to which learners have done so. Such measures provide both feedback to learners on their achievement and a basis for experimental research on the effectiveness of various strategies for educating people to think critically. Let us begin, then, by distinguishing the kinds of mental acts and mental events that can occur in a critical thinking process.

  • Observing : One notices something in one’s immediate environment (sudden cooling of temperature in Weather , bubbles forming outside a glass and then going inside in Bubbles , a moving blur in the distance in Blur , a rash in Rash ). Or one notes the results of an experiment or systematic observation (valuables missing in Disorder , no suction without air pressure in Suction pump )
  • Feeling: One feels puzzled or uncertain about something (how to get to an appointment on time in Transit, why the diamonds vary in spacing in Diamond). One wants to resolve this perplexity. One feels satisfaction once one has worked out an answer (to take the subway express in Transit, diamonds closer when needed as a warning in Diamond).
  • Wondering: One formulates a question to be addressed (why bubbles form outside a tumbler taken from hot water in Bubbles, how suction pumps work in Suction pump, what caused the rash in Rash).
  • Imagining: One thinks of possible answers (bus or subway or elevated in Transit, flagpole or ornament or wireless communication aid or direction indicator in Ferryboat, allergic reaction or heat rash in Rash).
  • Inferring: One works out what would be the case if a possible answer were assumed (valuables missing if there has been a burglary in Disorder, earlier start to the rash if it is an allergic reaction to a sulfa drug in Rash). Or one draws a conclusion once sufficient relevant evidence is gathered (take the subway in Transit, burglary in Disorder, discontinue blood pressure medication and new cream in Rash).
  • Knowledge: One uses stored knowledge of the subject-matter to generate possible answers or to infer what would be expected on the assumption of a particular answer (knowledge of a city’s public transit system in Transit, of the requirements for a flagpole in Ferryboat, of Boyle’s law in Bubbles, of allergic reactions in Rash).
  • Experimenting: One designs and carries out an experiment or a systematic observation to find out whether the results deduced from a possible answer will occur (looking at the location of the flagpole in relation to the pilot’s position in Ferryboat, putting an ice cube on top of a tumbler taken from hot water in Bubbles, measuring the height to which a suction pump will draw water at different elevations in Suction pump, noticing the spacing of diamonds when movement to or from a diamond lane is allowed in Diamond).
  • Consulting: One finds a source of information, gets the information from the source, and makes a judgment on whether to accept it. None of our 11 examples include searching for sources of information. In this respect they are unrepresentative, since most people nowadays have almost instant access to information relevant to answering any question, including many of those illustrated by the examples. However, Candidate includes the activities of extracting information from sources and evaluating its credibility.
  • Identifying and analyzing arguments: One notices an argument and works out its structure and content as a preliminary to evaluating its strength. This activity is central to Candidate. It is an important part of a critical thinking process in which one surveys arguments for various positions on an issue.
  • Judging: One makes a judgment on the basis of accumulated evidence and reasoning, such as the judgment in Ferryboat that the purpose of the pole is to provide direction to the pilot.
  • Deciding: One makes a decision on what to do or on what policy to adopt, as in the decision in Transit to take the subway.

7. Contributory Dispositions and Abilities

By definition, a person who does something voluntarily is both willing and able to do that thing at that time. Both the willingness and the ability contribute causally to the person’s action, in the sense that the voluntary action would not occur if either (or both) of these were lacking. For example, suppose that one is standing with one’s arms at one’s sides and one voluntarily lifts one’s right arm to an extended horizontal position. One would not do so if one were unable to lift one’s arm, if for example one’s right side was paralyzed as the result of a stroke. Nor would one do so if one were unwilling to lift one’s arm, if for example one were participating in a street demonstration at which a white supremacist was urging the crowd to lift their right arm in a Nazi salute and one were unwilling to express support in this way for the racist Nazi ideology. The same analysis applies to a voluntary mental process of thinking critically. It requires both willingness and ability to think critically, including willingness and ability to perform each of the mental acts that compose the process and to coordinate those acts in a sequence that is directed at resolving the initiating perplexity.

Consider willingness first. We can identify causal contributors to willingness to think critically by considering factors that would cause a person who was able to think critically about an issue nevertheless not to do so (Hamby 2014). For each factor, the opposite condition thus contributes causally to willingness to think critically on a particular occasion. For example, people who habitually jump to conclusions without considering alternatives will not think critically about issues that arise, even if they have the required abilities. The contrary condition of willingness to suspend judgment is thus a causal contributor to thinking critically.

Now consider ability. In contrast to the ability to move one’s arm, which can be completely absent because a stroke has left the arm paralyzed, the ability to think critically is a developed ability, whose absence is not a complete absence of ability to think but absence of ability to think well. We can identify the ability to think well directly, in terms of the norms and standards for good thinking. In general, to be able to do well the thinking activities that can be components of a critical thinking process, one needs to know the concepts and principles that characterize their good performance, to recognize in particular cases that the concepts and principles apply, and to apply them. The knowledge, recognition and application may be procedural rather than declarative. It may be domain-specific rather than widely applicable, and in either case may need subject-matter knowledge, sometimes of a deep kind.

Reflections of the sort illustrated by the previous two paragraphs have led scholars to identify the knowledge, abilities and dispositions of a “critical thinker”, i.e., someone who thinks critically whenever it is appropriate to do so. We turn now to these three types of causal contributors to thinking critically. We start with dispositions, since arguably these are the most powerful contributors to being a critical thinker, can be fostered at an early stage of a child’s development, and are susceptible to general improvement (Glaser 1941: 175).

8. Critical Thinking Dispositions

Educational researchers use the term ‘dispositions’ broadly for the habits of mind and attitudes that contribute causally to being a critical thinker. Some writers (e.g., Paul & Elder 2006; Hamby 2014; Bailin & Battersby 2016a) propose to use the term ‘virtues’ for this dimension of a critical thinker. The virtues in question, although they are virtues of character, concern the person’s ways of thinking rather than the person’s ways of behaving towards others. They are not moral virtues but intellectual virtues, of the sort articulated by Zagzebski (1996) and discussed by Turri, Alfano, and Greco (2017).

On a realistic conception, thinking dispositions or intellectual virtues are real properties of thinkers. They are general tendencies, propensities, or inclinations to think in particular ways in particular circumstances, and can be genuinely explanatory (Siegel 1999). Sceptics argue that there is no evidence for a specific mental basis for the habits of mind that contribute to thinking critically, and that it is pedagogically misleading to posit such a basis (Bailin et al. 1999a). Whatever their status, critical thinking dispositions need motivation for their initial formation in a child—motivation that may be external or internal. As children develop, the force of habit will gradually become important in sustaining the disposition (Nieto & Valenzuela 2012). Mere force of habit, however, is unlikely to sustain critical thinking dispositions. Critical thinkers must value and enjoy using their knowledge and abilities to think things through for themselves. They must be committed to, and lovers of, inquiry.

A person may have a critical thinking disposition with respect to only some kinds of issues. For example, one could be open-minded about scientific issues but not about religious issues. Similarly, one could be confident in one’s ability to reason about the theological implications of the existence of evil in the world but not in one’s ability to reason about the best design for a guided ballistic missile.

Facione (1990a: 25) divides “affective dispositions” of critical thinking into approaches to life and living in general and approaches to specific issues, questions or problems. Adapting this distinction, one can usefully divide critical thinking dispositions into initiating dispositions (those that contribute causally to starting to think critically about an issue) and internal dispositions (those that contribute causally to doing a good job of thinking critically once one has started). The two categories are not mutually exclusive. For example, open-mindedness, in the sense of willingness to consider alternative points of view to one’s own, is both an initiating and an internal disposition.

Using the strategy of considering factors that would block people with the ability to think critically from doing so, we can identify as initiating dispositions for thinking critically attentiveness, a habit of inquiry, self-confidence, courage, open-mindedness, willingness to suspend judgment, trust in reason, wanting evidence for one’s beliefs, and seeking the truth. We consider briefly what each of these dispositions amounts to, in each case citing sources that acknowledge them.

  • Attentiveness: One will not think critically if one fails to recognize an issue that needs to be thought through. For example, the pedestrian in Weather would not have looked up if he had not noticed that the air was suddenly cooler. To be a critical thinker, then, one needs to be habitually attentive to one’s surroundings, noticing not only what one senses but also sources of perplexity in messages received and in one’s own beliefs and attitudes (Facione 1990a: 25; Facione, Facione, & Giancarlo 2001).
  • Habit of inquiry: Inquiry is effortful, and one needs an internal push to engage in it. For example, the student in Bubbles could easily have stopped at idle wondering about the cause of the bubbles rather than reasoning to a hypothesis, then designing and executing an experiment to test it. Thus willingness to think critically needs mental energy and initiative. What can supply that energy? Love of inquiry, or perhaps just a habit of inquiry. Hamby (2015) has argued that willingness to inquire is the central critical thinking virtue, one that encompasses all the others. It is recognized as a critical thinking disposition by Dewey (1910: 29; 1933: 35), Glaser (1941: 5), Ennis (1987: 12; 1991: 8), Facione (1990a: 25), Bailin et al. (1999b: 294), Halpern (1998: 452), and Facione, Facione, & Giancarlo (2001).
  • Self-confidence: Lack of confidence in one’s abilities can block critical thinking. For example, if the woman in Rash lacked confidence in her ability to figure things out for herself, she might just have assumed that the rash on her chest was the allergic reaction to her medication against which the pharmacist had warned her. Thus willingness to think critically requires confidence in one’s ability to inquire (Facione 1990a: 25; Facione, Facione, & Giancarlo 2001).
  • Courage: Fear of thinking for oneself can stop one from doing it. Thus willingness to think critically requires intellectual courage (Paul & Elder 2006: 16).
  • Open-mindedness: A dogmatic attitude will impede thinking critically. For example, a person who adheres rigidly to a “pro-choice” position on the issue of the legal status of induced abortion is likely to be unwilling to consider seriously the issue of when in its development an unborn child acquires a moral right to life. Thus willingness to think critically requires open-mindedness, in the sense of a willingness to examine questions to which one already accepts an answer but which further evidence or reasoning might cause one to answer differently (Dewey 1933; Facione 1990a; Ennis 1991; Bailin et al. 1999b; Halpern 1998; Facione, Facione, & Giancarlo 2001). Paul (1981) emphasizes open-mindedness about alternative world-views, and recommends a dialectical approach to integrating such views as central to what he calls “strong sense” critical thinking. In three studies, Haran, Ritov, & Mellers (2013) found that actively open-minded thinking, including “the tendency to weigh new evidence against a favored belief, to spend sufficient time on a problem before giving up, and to consider carefully the opinions of others in forming one’s own”, led study participants to acquire information and thus to make accurate estimations.
  • Willingness to suspend judgment: Premature closure on an initial solution will block critical thinking. Thus willingness to think critically requires a willingness to suspend judgment while alternatives are explored (Facione 1990a; Ennis 1991; Halpern 1998).
  • Trust in reason: Since distrust in the processes of reasoned inquiry will dissuade one from engaging in it, trust in them is an initiating critical thinking disposition (Facione 1990a: 25; Bailin et al. 1999b: 294; Facione, Facione, & Giancarlo 2001; Paul & Elder 2006). In reaction to an allegedly exclusive emphasis on reason in critical thinking theory and pedagogy, Thayer-Bacon (2000) argues that intuition, imagination, and emotion have important roles to play in an adequate conception of critical thinking that she calls “constructive thinking”. From her point of view, critical thinking requires trust not only in reason but also in intuition, imagination, and emotion.
  • Seeking the truth: If one does not care about the truth but is content to stick with one’s initial bias on an issue, then one will not think critically about it. Seeking the truth is thus an initiating critical thinking disposition (Bailin et al. 1999b: 294; Facione, Facione, & Giancarlo 2001). A disposition to seek the truth is implicit in more specific critical thinking dispositions, such as trying to be well-informed, considering seriously points of view other than one’s own, looking for alternatives, suspending judgment when the evidence is insufficient, and adopting a position when the evidence supporting it is sufficient.

Some of the initiating dispositions, such as open-mindedness and willingness to suspend judgment, are also internal critical thinking dispositions, in the sense of mental habits or attitudes that contribute causally to doing a good job of critical thinking once one starts the process. But there are many other internal critical thinking dispositions. Some of them are parasitic on one’s conception of good thinking. For example, it is constitutive of good thinking about an issue to formulate the issue clearly and to maintain focus on it. For this purpose, one needs not only the corresponding ability but also the corresponding disposition. Ennis (1991: 8) describes it as the disposition “to determine and maintain focus on the conclusion or question”, Facione (1990a: 25) as “clarity in stating the question or concern”. Other internal dispositions are motivators to continue or adjust the critical thinking process, such as willingness to persist in a complex task and willingness to abandon nonproductive strategies in an attempt to self-correct (Halpern 1998: 452). For a list of identified internal critical thinking dispositions, see the Supplement on Internal Critical Thinking Dispositions.

9. Critical Thinking Abilities

Some theorists postulate skills, i.e., acquired abilities, as operative in critical thinking. It is not obvious, however, that a good mental act is the exercise of a generic acquired skill. Inferring an expected time of arrival, as in Transit, has some generic components but also uses non-generic subject-matter knowledge. Bailin et al. (1999a) argue against viewing critical thinking skills as generic and discrete, on the ground that skilled performance at a critical thinking task cannot be separated from knowledge of concepts and from domain-specific principles of good thinking. Talk of skills, they concede, is unproblematic if it means merely that a person with critical thinking skills is capable of intelligent performance.

Despite such scepticism, theorists of critical thinking have listed as general contributors to critical thinking what they variously call abilities (Glaser 1941; Ennis 1962, 1991), skills (Facione 1990a; Halpern 1998) or competencies (Fisher & Scriven 1997). Amalgamating these lists would produce a confusing and chaotic cornucopia of more than 50 possible educational objectives, with only partial overlap among them. It makes sense instead to try to understand the reasons for the multiplicity and diversity, and to make a selection according to one’s own reasons for singling out abilities to be developed in a critical thinking curriculum. Two reasons for diversity among lists of critical thinking abilities are the underlying conception of critical thinking and the envisaged educational level. Appraisal-only conceptions, for example, involve a different suite of abilities than constructive-only conceptions. Some lists, such as those in Glaser (1941), are put forward as educational objectives for secondary school students, whereas others are proposed as objectives for college students (e.g., Facione 1990a).

The abilities described in the remaining paragraphs of this section emerge from reflection on the general abilities needed to do well the thinking activities identified in section 6 as components of the critical thinking process described in section 5. The derivation of each collection of abilities is accompanied by citation of sources that list such abilities and of standardized tests that claim to test them.

Observational abilities: Careful and accurate observation sometimes requires specialist expertise and practice, as in the case of observing birds and observing accident scenes. However, there are general abilities of noticing what one’s senses are picking up from one’s environment and of being able to articulate clearly and accurately to oneself and others what one has observed. It helps in exercising them to be able to recognize and take into account factors that make one’s observation less trustworthy, such as prior framing of the situation, inadequate time, deficient senses, poor observation conditions, and the like. It helps as well to be skilled at taking steps to make one’s observation more trustworthy, such as moving closer to get a better look, measuring something three times and taking the average, and checking what one thinks one is observing with someone else who is in a good position to observe it. It also helps to be skilled at recognizing respects in which one’s report of one’s observation involves inference rather than direct observation, so that one can then consider whether the inference is justified. These abilities come into play as well when one thinks about whether and with what degree of confidence to accept an observation report, for example in the study of history or in a criminal investigation or in assessing news reports. Observational abilities show up in some lists of critical thinking abilities (Ennis 1962: 90; Facione 1990a: 16; Ennis 1991: 9). There are items testing a person’s ability to judge the credibility of observation reports in the Cornell Critical Thinking Tests, Levels X and Z (Ennis & Millman 1971; Ennis, Millman, & Tomko 1985, 2005). Norris and King (1983, 1985, 1990a, 1990b) is a test of ability to appraise observation reports.

Emotional abilities: The emotions that drive a critical thinking process are perplexity or puzzlement, a wish to resolve it, and satisfaction at achieving the desired resolution. Children experience these emotions at an early age, without being trained to do so. Education that takes critical thinking as a goal needs only to channel these emotions and to make sure not to stifle them. Collaborative critical thinking benefits from ability to recognize one’s own and others’ emotional commitments and reactions.

Questioning abilities: A critical thinking process needs transformation of an inchoate sense of perplexity into a clear question. Formulating a question well requires not building in questionable assumptions, not prejudging the issue, and using language that in context is unambiguous and precise enough (Ennis 1962: 97; 1991: 9).

Imaginative abilities: Thinking directed at finding the correct causal explanation of a general phenomenon or particular event requires an ability to imagine possible explanations. Thinking about what policy or plan of action to adopt requires generation of options and consideration of possible consequences of each option. Domain knowledge is required for such creative activity, but a general ability to imagine alternatives is helpful and can be nurtured so as to become easier, quicker, more extensive, and deeper (Dewey 1910: 34–39; 1933: 40–47). Facione (1990a) and Halpern (1998) include the ability to imagine alternatives as a critical thinking ability.

Inferential abilities: The ability to draw conclusions from given information, and to recognize with what degree of certainty one’s own or others’ conclusions follow, is universally recognized as a general critical thinking ability. All 11 examples in section 2 of this article include inferences, some from hypotheses or options (as in Transit, Ferryboat and Disorder), others from something observed (as in Weather and Rash). None of these inferences is formally valid. Rather, they are licensed by general, sometimes qualified substantive rules of inference (Toulmin 1958) that rest on domain knowledge—that a bus trip takes about the same time in each direction, that the terminal of a wireless telegraph would be located on the highest possible place, that sudden cooling is often followed by rain, that an allergic reaction to a sulfa drug generally shows up soon after one starts taking it. It is a matter of controversy to what extent the specialized ability to deduce conclusions from premisses using formal rules of inference is needed for critical thinking. Dewey (1933) locates logical forms in setting out the products of reflection rather than in the process of reflection. Ennis (1981a), on the other hand, maintains that a liberally-educated person should have the following abilities: to translate natural-language statements into statements using the standard logical operators, to use appropriately the language of necessary and sufficient conditions, to deal with argument forms and arguments containing symbols, to determine whether in virtue of an argument’s form its conclusion follows necessarily from its premisses, to reason with logically complex propositions, and to apply the rules and procedures of deductive logic. Inferential abilities are recognized as critical thinking abilities by Glaser (1941: 6), Facione (1990a: 9), Ennis (1991: 9), Fisher & Scriven (1997: 99, 111), and Halpern (1998: 452). Items testing inferential abilities constitute two of the five subtests of the Watson Glaser Critical Thinking Appraisal (Watson & Glaser 1980a, 1980b, 1994), two of the four sections in the Cornell Critical Thinking Test Level X (Ennis & Millman 1971; Ennis, Millman, & Tomko 1985, 2005), three of the seven sections in the Cornell Critical Thinking Test Level Z (Ennis & Millman 1971; Ennis, Millman, & Tomko 1985, 2005), 11 of the 34 items on Forms A and B of the California Critical Thinking Skills Test (Facione 1990b, 1992), and a high but variable proportion of the 25 selected-response questions in the Collegiate Learning Assessment (Council for Aid to Education 2017).
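The contrast drawn above, between formally valid inference and inference licensed by a substantive warrant, can be sketched schematically. The symbolization below is added purely for illustration and is not drawn from the cited sources:

```latex
% Formally valid inference (modus ponens): if the premisses are true,
% the conclusion cannot be false.
\[
  p \rightarrow q,\; p \;\therefore\; q
\]
% Defeasible inference licensed by a substantive warrant (Toulmin 1958),
% using the bus example from the paragraph above:
%   Datum:   the outbound bus trip took about thirty minutes.
%   Warrant: a bus trip takes about the same time in each direction.
%   Claim:   the return trip will take about thirty minutes --
%            a conclusion that further evidence (an accident, rush hour)
%            could overturn without falsifying the premisses.
```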

Experimenting abilities: Knowing how to design and execute an experiment is important not just in scientific research but also in everyday life, as in Rash. Dewey devoted a whole chapter of his How We Think (1910: 145–156; 1933: 190–202) to the superiority of experimentation over observation in advancing knowledge. Experimenting abilities come into play at one remove in appraising reports of scientific studies. Skill in designing and executing experiments includes the acknowledged abilities to appraise evidence (Glaser 1941: 6), to carry out experiments and to apply appropriate statistical inference techniques (Facione 1990a: 9), to judge inductions to an explanatory hypothesis (Ennis 1991: 9), and to recognize the need for an adequately large sample size (Halpern 1998). The Cornell Critical Thinking Test Level Z (Ennis & Millman 1971; Ennis, Millman, & Tomko 1985, 2005) includes four items (out of 52) on experimental design. The Collegiate Learning Assessment (Council for Aid to Education 2017) makes room for appraisal of study design in both its performance task and its selected-response questions.

Consulting abilities: Skill at consulting sources of information comes into play when one seeks information to help resolve a problem, as in Candidate. Ability to find and appraise information includes ability to gather and marshal pertinent information (Glaser 1941: 6), to judge whether a statement made by an alleged authority is acceptable (Ennis 1962: 84), to plan a search for desired information (Facione 1990a: 9), and to judge the credibility of a source (Ennis 1991: 9). Ability to judge the credibility of statements is tested by 24 items (out of 76) in the Cornell Critical Thinking Test Level X (Ennis & Millman 1971; Ennis, Millman, & Tomko 1985, 2005) and by four items (out of 52) in the Cornell Critical Thinking Test Level Z (Ennis & Millman 1971; Ennis, Millman, & Tomko 1985, 2005). The Collegiate Learning Assessment’s performance task requires evaluation of whether information in documents is credible or unreliable (Council for Aid to Education 2017).

Argument analysis abilities: The ability to identify and analyze arguments contributes to the process of surveying arguments on an issue in order to form one’s own reasoned judgment, as in Candidate. The ability to detect and analyze arguments is recognized as a critical thinking skill by Facione (1990a: 7–8), Ennis (1991: 9) and Halpern (1998). Five items (out of 34) on the California Critical Thinking Skills Test (Facione 1990b, 1992) test skill at argument analysis. The Collegiate Learning Assessment (Council for Aid to Education 2017) incorporates argument analysis in its selected-response tests of critical reading and evaluation and of critiquing an argument.

Judging skills and deciding skills: Skill at judging and deciding is skill at recognizing what judgment or decision the available evidence and argument supports, and with what degree of confidence. It is thus a component of the inferential skills already discussed.

Lists and tests of critical thinking abilities often include two more abilities: identifying assumptions and constructing and evaluating definitions.

10. Required Knowledge

In addition to dispositions and abilities, critical thinking needs knowledge: of critical thinking concepts, of critical thinking principles, and of the subject-matter of the thinking.

We can derive a short list of concepts whose understanding contributes to critical thinking from the critical thinking abilities described in the preceding section. Observational abilities require an understanding of the difference between observation and inference. Questioning abilities require an understanding of the concepts of ambiguity and vagueness. Inferential abilities require an understanding of the difference between conclusive and defeasible inference (traditionally, between deduction and induction), as well as of the difference between necessary and sufficient conditions. Experimenting abilities require an understanding of the concepts of hypothesis, null hypothesis, assumption and prediction, as well as of the concept of statistical significance and of its difference from importance. They also require an understanding of the difference between an experiment and an observational study, and in particular of the difference between a randomized controlled trial, a prospective correlational study and a retrospective (case-control) study. Argument analysis abilities require an understanding of the concepts of argument, premiss, assumption, conclusion and counter-consideration. Additional critical thinking concepts are proposed by Bailin et al. (1999b: 293), Fisher & Scriven (1997: 105–106), Black (2012), and Blair (2021).
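The distinction between necessary and sufficient conditions mentioned here can be stated compactly. The following is the standard logical formalization, added for illustration:

```latex
% B is a sufficient condition for A: whenever B obtains, A obtains.
\[ B \rightarrow A \]
% B is a necessary condition for A: A cannot obtain without B.
\[ A \rightarrow B \]
% Confusing the two licenses the fallacy of affirming the consequent:
% from B -> A together with A, it does not follow that B.
```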

According to Glaser (1941: 25), ability to think critically requires knowledge of the methods of logical inquiry and reasoning. If we review the list of abilities in the preceding section, however, we can see that some of them can be acquired and exercised merely through practice, possibly guided in an educational setting, followed by feedback. Searching intelligently for a causal explanation of some phenomenon or event requires that one consider a full range of possible causal contributors, but it seems more important that one implements this principle in one’s practice than that one is able to articulate it. What is important is “operational knowledge” of the standards and principles of good thinking (Bailin et al. 1999b: 291–293). But the development of such critical thinking abilities as designing an experiment or constructing an operational definition can benefit from learning their underlying theory. Further, explicit knowledge of quirks of human thinking seems useful as a cautionary guide. Human memory is not just fallible about details, as people learn from their own experiences of misremembering, but is so malleable that a detailed, clear and vivid recollection of an event can be a total fabrication (Loftus 2017). People seek or interpret evidence in ways that are partial to their existing beliefs and expectations, often unconscious of their “confirmation bias” (Nickerson 1998). Not only are people subject to this and other cognitive biases (Kahneman 2011), of which they are typically unaware, but it may be counter-productive for one to make oneself aware of them and try consciously to counteract them or to counteract social biases such as racial or sexual stereotypes (Kenyon & Beaulac 2014). 
It is helpful to be aware of these facts and of the superior effectiveness of blocking the operation of biases—for example, by making an immediate record of one’s observations, refraining from forming a preliminary explanatory hypothesis, blind refereeing, double-blind randomized trials, and blind grading of students’ work. It is also helpful to be aware of the prevalence of “noise” (unwanted unsystematic variability of judgments), of how to detect noise (through a noise audit), and of how to reduce noise: make accuracy the goal, think statistically, break a process of arriving at a judgment into independent tasks, resist premature intuitions, in a group get independent judgments first, favour comparative judgments and scales (Kahneman, Sibony, & Sunstein 2021). It is helpful as well to be aware of the concept of “bounded rationality” in decision-making and of the related distinction between “satisficing” and optimizing (Simon 1956; Gigerenzer 2001).
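One standard statistical reason why aggregating independent judgments reduces noise, in the sense used by Kahneman, Sibony, & Sunstein, is that the spread of an average of k independent judgments shrinks roughly in proportion to 1/√k. A minimal simulation sketch (the function name and parameters are illustrative, not from any cited source):

```python
import random
import statistics

def spread_of_averaged_judgments(k, n_trials=20_000, sigma=1.0, seed=0):
    """Standard deviation of the mean of k independent noisy judgments
    of the same true value (here 0), estimated by simulation."""
    rng = random.Random(seed)
    means = [
        statistics.fmean(rng.gauss(0.0, sigma) for _ in range(k))
        for _ in range(n_trials)
    ]
    return statistics.stdev(means)

# Averaging nine independent judgments cuts the noise to about a third:
single = spread_of_averaged_judgments(1)   # close to sigma
of_nine = spread_of_averaged_judgments(9)  # close to sigma / 3
```

This is the same fact that underwrites the everyday advice, cited earlier, to measure something three times and take the average.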

Critical thinking about an issue requires substantive knowledge of the domain to which the issue belongs. Critical thinking abilities are not a magic elixir that can be applied to any issue whatever by somebody who has no knowledge of the facts relevant to exploring that issue. For example, the student in Bubbles needed to know that gases do not penetrate solid objects like a glass, that air expands when heated, that the volume of an enclosed gas varies directly with its temperature and inversely with its pressure, and that hot objects will spontaneously cool down to the ambient temperature of their surroundings unless kept hot by insulation or a source of heat. Critical thinkers thus need a rich fund of subject-matter knowledge relevant to the variety of situations they encounter. This fact is recognized in the inclusion among critical thinking dispositions of a concern to become and remain generally well informed.
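The gas-law knowledge attributed to the student in Bubbles, that the volume of an enclosed gas varies directly with its temperature and inversely with its pressure, can be stated explicitly; this is a textbook formulation of the combined gas law, not part of the original article:

```latex
\[
  V \propto \frac{T}{P},
  \qquad \text{equivalently} \qquad
  \frac{P_1 V_1}{T_1} = \frac{P_2 V_2}{T_2}
\]
% Since air expands when heated, the air under a tumbler fresh from hot
% water expands and pushes outward through the soapy film, producing the
% observed bubbles; as the air cools, it contracts again.
```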

11. Educational Methods

Experimental educational interventions, with control groups, have shown that education can improve critical thinking skills and dispositions, as measured by standardized tests. For information about these tests, see the Supplement on Assessment.

What educational methods are most effective at developing the dispositions, abilities and knowledge of a critical thinker? In a comprehensive meta-analysis of experimental and quasi-experimental studies of strategies for teaching students to think critically, Abrami et al. (2015) found that dialogue, anchored instruction, and mentoring each increased the effectiveness of the educational intervention, and that they were most effective when combined. They also found that in these studies a combination of separate instruction in critical thinking with subject-matter instruction in which students are encouraged to think critically was more effective than either by itself. However, the difference was not statistically significant; that is, it might have arisen by chance.
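What "not statistically significant" means here can be illustrated with a minimal permutation test. The effect-size figures below are hypothetical, chosen for illustration, and are not data from Abrami et al. (2015):

```python
import random

def permutation_p_value(group_a, group_b, trials=10_000, seed=0):
    """Estimate how often a difference in group means at least as large
    as the observed one would arise by chance under random relabeling
    of the observations."""
    rng = random.Random(seed)
    observed = abs(sum(group_a) / len(group_a) - sum(group_b) / len(group_b))
    pooled = group_a + group_b
    n_a = len(group_a)
    extreme = 0
    for _ in range(trials):
        rng.shuffle(pooled)
        diff = abs(sum(pooled[:n_a]) / n_a
                   - sum(pooled[n_a:]) / (len(pooled) - n_a))
        if diff >= observed:
            extreme += 1
    return extreme / trials

# Hypothetical effect sizes for two teaching strategies:
combined = [0.42, 0.58, 0.35, 0.61, 0.47]
separate = [0.38, 0.52, 0.33, 0.55, 0.44]
p = permutation_p_value(combined, separate)
# A large p (conventionally p > 0.05) means the observed difference
# could easily have arisen by chance, i.e., it is not statistically
# significant.
```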

Most of these studies lack the longitudinal follow-up required to determine whether the observed differential improvements in critical thinking abilities or dispositions continue over time, for example until high school or college graduation. For details on studies of methods of developing critical thinking skills and dispositions, see the Supplement on Educational Methods.

12. Controversies

Scholars have denied the generalizability of critical thinking abilities across subject domains, have alleged bias in critical thinking theory and pedagogy, and have investigated the relationship of critical thinking to other kinds of thinking.

McPeck (1981) attacked the thinking skills movement of the 1970s, including the critical thinking movement. He argued that there are no general thinking skills, since thinking is always thinking about some subject-matter. It is futile, he claimed, for schools and colleges to teach thinking as if it were a separate subject. Rather, teachers should lead their pupils to become autonomous thinkers by teaching school subjects in a way that brings out their cognitive structure and that encourages and rewards discussion and argument. As some of his critics (e.g., Paul 1985; Siegel 1985) pointed out, McPeck’s central argument needs elaboration, since it has obvious counter-examples in writing and speaking, for which (up to a certain level of complexity) there are teachable general abilities even though they are always about some subject-matter. To make his argument convincing, McPeck needs to explain how thinking differs from writing and speaking in a way that does not permit useful abstraction of its components from the subject-matters with which it deals. He has not done so. Nevertheless, his position that the dispositions and abilities of a critical thinker are best developed in the context of subject-matter instruction is shared by many theorists of critical thinking, including Dewey (1910, 1933), Glaser (1941), Passmore (1980), Weinstein (1990), Bailin et al. (1999b), and Willingham (2019).

McPeck’s challenge prompted reflection on the extent to which critical thinking is subject-specific. McPeck argued for a strong subject-specificity thesis, according to which it is a conceptual truth that all critical thinking abilities are specific to a subject. (He did not however extend his subject-specificity thesis to critical thinking dispositions. In particular, he took the disposition to suspend judgment in situations of cognitive dissonance to be a general disposition.) Conceptual subject-specificity is subject to obvious counter-examples, such as the general ability to recognize confusion of necessary and sufficient conditions. A more modest thesis, also endorsed by McPeck, is epistemological subject-specificity, according to which the norms of good thinking vary from one field to another. Epistemological subject-specificity clearly holds to a certain extent; for example, the principles in accordance with which one solves a differential equation are quite different from the principles in accordance with which one determines whether a painting is a genuine Picasso. But the thesis suffers, as Ennis (1989) points out, from vagueness of the concept of a field or subject and from the obvious existence of inter-field principles, however broadly the concept of a field is construed. For example, the principles of hypothetico-deductive reasoning hold for all the varied fields in which such reasoning occurs. A third kind of subject-specificity is empirical subject-specificity, according to which as a matter of empirically observable fact a person with the abilities and dispositions of a critical thinker in one area of investigation will not necessarily have them in another area of investigation.

The thesis of empirical subject-specificity raises the general problem of transfer. If critical thinking abilities and dispositions have to be developed independently in each school subject, how are they of any use in dealing with the problems of everyday life and the political and social issues of contemporary society, most of which do not fit into the framework of a traditional school subject? Proponents of empirical subject-specificity tend to argue that transfer is more likely to occur if there is critical thinking instruction in a variety of domains, with explicit attention to dispositions and abilities that cut across domains. But evidence for this claim is scanty. There is a need for well-designed empirical studies that investigate the conditions that make transfer more likely.

It is common ground in debates about the generality or subject-specificity of critical thinking dispositions and abilities that critical thinking about any topic requires background knowledge about the topic. For example, the most sophisticated understanding of the principles of hypothetico-deductive reasoning is of no help unless accompanied by some knowledge of what might be plausible explanations of some phenomenon under investigation.

Critics have objected to bias in the theory, pedagogy and practice of critical thinking. Commentators (e.g., Alston 1995; Ennis 1998) have noted that anyone who takes a position has a bias in the neutral sense of being inclined in one direction rather than others. The critics, however, are objecting to bias in the pejorative sense of an unjustified favoring of certain ways of knowing over others, frequently alleging that the unjustly favoured ways are those of a dominant sex or culture (Bailin 1995). These ways favour:

  • reinforcement of egocentric and sociocentric biases over dialectical engagement with opposing world-views (Paul 1981, 1984; Warren 1998)
  • distancing from the object of inquiry over closeness to it (Martin 1992; Thayer-Bacon 1992)
  • indifference to the situation of others over care for them (Martin 1992)
  • orientation to thought over orientation to action (Martin 1992)
  • being reasonable over caring to understand people’s ideas (Thayer-Bacon 1993)
  • being neutral and objective over being embodied and situated (Thayer-Bacon 1995a)
  • doubting over believing (Thayer-Bacon 1995b)
  • reason over emotion, imagination and intuition (Thayer-Bacon 2000)
  • solitary thinking over collaborative thinking (Thayer-Bacon 2000)
  • written and spoken assignments over other forms of expression (Alston 2001)
  • attention to written and spoken communications over attention to human problems (Alston 2001)
  • winning debates in the public sphere over making and understanding meaning (Alston 2001)

A common thread in this smorgasbord of accusations is dissatisfaction with focusing on the logical analysis and evaluation of reasoning and arguments. While these authors acknowledge that such analysis and evaluation is part of critical thinking and should be part of its conceptualization and pedagogy, they insist that it is only a part. Paul (1981), for example, bemoans the tendency of atomistic teaching of methods of analyzing and evaluating arguments to turn students into more able sophists, adept at finding fault with positions and arguments with which they disagree but even more entrenched in the egocentric and sociocentric biases with which they began. Martin (1992) and Thayer-Bacon (1992) cite with approval the self-reported intimacy with their subject-matter of leading researchers in biology and medicine, an intimacy that conflicts with the distancing allegedly recommended in standard conceptions and pedagogy of critical thinking. Thayer-Bacon (2000) contrasts the embodied and socially embedded learning of her elementary school students in a Montessori school, who used their imagination, intuition and emotions as well as their reason, with conceptions of critical thinking as

thinking that is used to critique arguments, offer justifications, and make judgments about what are the good reasons, or the right answers. (Thayer-Bacon 2000: 127–128)

Alston (2001) reports that her students in a women’s studies class were able to see the flaws in the Cinderella myth that pervades much romantic fiction but in their own romantic relationships still acted as if all failures were the woman’s fault and still accepted the notions of love at first sight and living happily ever after. Students, she writes, should

be able to connect their intellectual critique to a more affective, somatic, and ethical account of making risky choices that have sexist, racist, classist, familial, sexual, or other consequences for themselves and those both near and far… critical thinking that reads arguments, texts, or practices merely on the surface without connections to feeling/desiring/doing or action lacks an ethical depth that should infuse the difference between mere cognitive activity and something we want to call critical thinking. (Alston 2001: 34)

Some critics portray such biases as unfair to women. Thayer-Bacon (1992), for example, has charged modern critical thinking theory with being sexist, on the ground that it separates the self from the object and causes one to lose touch with one’s inner voice, and thus stigmatizes women, who (she asserts) link self to object and listen to their inner voice. Her charge does not imply that women as a group are on average less able than men to analyze and evaluate arguments. Facione (1990c) found no difference by sex in performance on his California Critical Thinking Skills Test. Kuhn (1991: 280–281) found no difference by sex in either the disposition or the competence to engage in argumentative thinking.

The critics propose a variety of remedies for the biases that they allege. In general, they do not propose to eliminate or downplay critical thinking as an educational goal. Rather, they propose to conceptualize critical thinking differently and to change its pedagogy accordingly. Their pedagogical proposals arise logically from their objections. They can be summarized as follows:

  • Focus on argument networks with dialectical exchanges reflecting contesting points of view rather than on atomic arguments, so as to develop “strong sense” critical thinking that transcends egocentric and sociocentric biases (Paul 1981, 1984).
  • Foster closeness to the subject-matter and feeling connected to others in order to inform a humane democracy (Martin 1992).
  • Develop “constructive thinking” as a social activity in a community of physically embodied and socially embedded inquirers with personal voices who value not only reason but also imagination, intuition and emotion (Thayer-Bacon 2000).
  • In developing critical thinking in school subjects, treat as important neither skills nor dispositions but opening worlds of meaning (Alston 2001).
  • Attend to the development of critical thinking dispositions as well as skills, and adopt the “critical pedagogy” practised and advocated by Freire (1968 [1970]) and hooks (1994) (Dalgleish, Girard, & Davies 2017).

A common thread in these proposals is treatment of critical thinking as a social, interactive, personally engaged activity like that of a quilting bee or a barn-raising (Thayer-Bacon 2000) rather than as an individual, solitary, distanced activity symbolized by Rodin’s The Thinker. One can get a vivid description of education with the former type of goal from the writings of bell hooks (1994, 2010). Critical thinking for her is open-minded dialectical exchange across opposing standpoints and from multiple perspectives, a conception similar to Paul’s “strong sense” critical thinking (Paul 1981). She abandons the structure of domination in the traditional classroom. In an introductory course on black women writers, for example, she assigns students to write an autobiographical paragraph about an early racial memory, then to read it aloud as the others listen, thus affirming the uniqueness and value of each voice and creating a communal awareness of the diversity of the group’s experiences (hooks 1994: 84). Her “engaged pedagogy” is thus similar to the “freedom under guidance” implemented in John Dewey’s Laboratory School of Chicago in the late 1890s and early 1900s. It incorporates the dialogue, anchored instruction, and mentoring that Abrami et al. (2015) found to be most effective in improving critical thinking skills and dispositions.

What is the relationship of critical thinking to problem solving, decision-making, higher-order thinking, creative thinking, and other recognized types of thinking? One’s answer to this question obviously depends on how one defines the terms used in the question. If critical thinking is conceived broadly to cover any careful thinking about any topic for any purpose, then problem solving and decision making will be kinds of critical thinking, if they are done carefully. Historically, ‘critical thinking’ and ‘problem solving’ were two names for the same thing. If critical thinking is conceived more narrowly as consisting solely of appraisal of intellectual products, then it will be disjoint with problem solving and decision making, which are constructive.

Bloom’s taxonomy of educational objectives used the phrase “intellectual abilities and skills” for what had been labeled “critical thinking” by some, “reflective thinking” by Dewey and others, and “problem solving” by still others (Bloom et al. 1956: 38). Thus, the so-called “higher-order thinking skills” at the taxonomy’s top levels of analysis, synthesis and evaluation are just critical thinking skills, although they do not come with general criteria for their assessment (Ennis 1981b). The revised version of Bloom’s taxonomy (Anderson et al. 2001) likewise treats critical thinking as cutting across those types of cognitive process that involve more than remembering (Anderson et al. 2001: 269–270). For details, see the Supplement on History.

As to creative thinking, it overlaps with critical thinking (Bailin 1987, 1988). Thinking about the explanation of some phenomenon or event, as in Ferryboat, requires creative imagination in constructing plausible explanatory hypotheses. Likewise, thinking about a policy question, as in Candidate, requires creativity in coming up with options. Conversely, creativity in any field needs to be balanced by critical appraisal of the draft painting or novel or mathematical theory.

  • Abrami, Philip C., Robert M. Bernard, Eugene Borokhovski, David I. Waddington, C. Anne Wade, and Tonje Person, 2015, “Strategies for Teaching Students to Think Critically: A Meta-analysis”, Review of Educational Research , 85(2): 275–314. doi:10.3102/0034654314551063
  • Aikin, Wilford M., 1942, The Story of the Eight-year Study, with Conclusions and Recommendations , Volume I of Adventure in American Education , New York and London: Harper & Brothers. [ Aikin 1942 available online ]
  • Alston, Kal, 1995, “Begging the Question: Is Critical Thinking Biased?”, Educational Theory , 45(2): 225–233. doi:10.1111/j.1741-5446.1995.00225.x
  • –––, 2001, “Re/Thinking Critical Thinking: The Seductions of Everyday Life”, Studies in Philosophy and Education , 20(1): 27–40. doi:10.1023/A:1005247128053
  • American Educational Research Association, 2014, Standards for Educational and Psychological Testing / American Educational Research Association, American Psychological Association, National Council on Measurement in Education , Washington, DC: American Educational Research Association.
  • Anderson, Lorin W., David R. Krathwohl, Peter W. Airiasian, Kathleen A. Cruikshank, Richard E. Mayer, Paul R. Pintrich, James Raths, and Merlin C. Wittrock, 2001, A Taxonomy for Learning, Teaching and Assessing: A Revision of Bloom’s Taxonomy of Educational Objectives , New York: Longman, complete edition.
  • Bailin, Sharon, 1987, “Critical and Creative Thinking”, Informal Logic , 9(1): 23–30. [ Bailin 1987 available online ]
  • –––, 1988, Achieving Extraordinary Ends: An Essay on Creativity , Dordrecht: Kluwer. doi:10.1007/978-94-009-2780-3
  • –––, 1995, “Is Critical Thinking Biased? Clarifications and Implications”, Educational Theory , 45(2): 191–197. doi:10.1111/j.1741-5446.1995.00191.x
  • Bailin, Sharon and Mark Battersby, 2009, “Inquiry: A Dialectical Approach to Teaching Critical Thinking”, in Juho Ritola (ed.), Argument Cultures: Proceedings of OSSA 09 , CD-ROM (pp. 1–10), Windsor, ON: OSSA. [ Bailin & Battersby 2009 available online ]
  • –––, 2016a, “Fostering the Virtues of Inquiry”, Topoi , 35(2): 367–374. doi:10.1007/s11245-015-9307-6
  • –––, 2016b, Reason in the Balance: An Inquiry Approach to Critical Thinking , Indianapolis: Hackett, 2nd edition.
  • –––, 2021, “Inquiry: Teaching for Reasoned Judgment”, in Daniel Fasko, Jr. and Frank Fair (eds.), Critical Thinking and Reasoning: Theory, Development, Instruction, and Assessment , Leiden: Brill, pp. 31–46. doi: 10.1163/9789004444591_003
  • Bailin, Sharon, Roland Case, Jerrold R. Coombs, and Leroi B. Daniels, 1999a, “Common Misconceptions of Critical Thinking”, Journal of Curriculum Studies , 31(3): 269–283. doi:10.1080/002202799183124
  • –––, 1999b, “Conceptualizing Critical Thinking”, Journal of Curriculum Studies , 31(3): 285–302. doi:10.1080/002202799183133
  • Berman, Alan M., Seth J. Schwartz, William M. Kurtines, and Steven L. Berman, 2001, “The Process of Exploration in Identity Formation: The Role of Style and Competence”, Journal of Adolescence , 24(4): 513–528. doi:10.1006/jado.2001.0386
  • Blair, J. Anthony, 2021, Studies in Critical Thinking , Windsor, ON: Windsor Studies in Argumentation, 2nd edition. [Available online at https://windsor.scholarsportal.info/omp/index.php/wsia/catalog/book/106]
  • Black, Beth (ed.), 2012, An A to Z of Critical Thinking , London: Continuum International Publishing Group.
  • Bloom, Benjamin Samuel, Max D. Engelhart, Edward J. Furst, Walter H. Hill, and David R. Krathwohl, 1956, Taxonomy of Educational Objectives. Handbook I: Cognitive Domain , New York: David McKay.
  • Boardman, Frank, Nancy M. Cavender, and Howard Kahane, 2018, Logic and Contemporary Rhetoric: The Use of Reason in Everyday Life , Boston: Cengage, 13th edition.
  • Browne, M. Neil and Stuart M. Keeley, 2018, Asking the Right Questions: A Guide to Critical Thinking , Hoboken, NJ: Pearson, 12th edition.
  • Center for Assessment & Improvement of Learning, 2017, Critical Thinking Assessment Test , Cookeville, TN: Tennessee Technological University.
  • Cleghorn, Paul, 2021, “Critical Thinking in the Elementary School: Practical Guidance for Building a Culture of Thinking”, in Daniel Fasko, Jr. and Frank Fair (eds.), Critical Thinking and Reasoning: Theory, Development, Instruction, and Assessment , Leiden: Brill, pp. 150–167. doi: 10.1163/9789004444591_010
  • Cohen, Jacob, 1988, Statistical Power Analysis for the Behavioral Sciences , Hillsdale, NJ: Lawrence Erlbaum Associates, 2nd edition.
  • College Board, 1983, Academic Preparation for College. What Students Need to Know and Be Able to Do , New York: College Entrance Examination Board, ERIC document ED232517.
  • Commission on the Relation of School and College of the Progressive Education Association, 1943, Thirty Schools Tell Their Story , Volume V of Adventure in American Education , New York and London: Harper & Brothers.
  • Council for Aid to Education, 2017, CLA+ Student Guide . Available at http://cae.org/images/uploads/pdf/CLA_Student_Guide_Institution.pdf ; last accessed 2022 07 16.
  • Dalgleish, Adam, Patrick Girard, and Maree Davies, 2017, “Critical Thinking, Bias and Feminist Philosophy: Building a Better Framework through Collaboration”, Informal Logic , 37(4): 351–369. [ Dalgleish et al. available online ]
  • Dewey, John, 1910, How We Think , Boston: D.C. Heath. [ Dewey 1910 available online ]
  • –––, 1916, Democracy and Education: An Introduction to the Philosophy of Education , New York: Macmillan.
  • –––, 1933, How We Think: A Restatement of the Relation of Reflective Thinking to the Educative Process , Lexington, MA: D.C. Heath.
  • –––, 1936, “The Theory of the Chicago Experiment”, Appendix II of Mayhew & Edwards 1936: 463–477.
  • –––, 1938, Logic: The Theory of Inquiry , New York: Henry Holt and Company.
  • Dominguez, Caroline (coord.), 2018a, A European Collection of the Critical Thinking Skills and Dispositions Needed in Different Professional Fields for the 21st Century , Vila Real, Portugal: UTAD. Available at http://bit.ly/CRITHINKEDUO1 ; last accessed 2022 07 16.
  • ––– (coord.), 2018b, A European Review on Critical Thinking Educational Practices in Higher Education Institutions , Vila Real: UTAD. Available at http://bit.ly/CRITHINKEDUO2 ; last accessed 2022 07 16.
  • ––– (coord.), 2018c, The CRITHINKEDU European Course on Critical Thinking Education for University Teachers: From Conception to Delivery , Vila Real: UTAD. Available at http://bit.ly/CRITHINKEDU03; last accessed 2022 07 16.
  • Dominguez, Caroline and Rita Payan-Carreira (eds.), 2019, Promoting Critical Thinking in European Higher Education Institutions: Towards an Educational Protocol , Vila Real: UTAD. Available at http://bit.ly/CRITHINKEDU04; last accessed 2022 07 16.
  • Ennis, Robert H., 1958, “An Appraisal of the Watson-Glaser Critical Thinking Appraisal”, The Journal of Educational Research , 52(4): 155–158. doi:10.1080/00220671.1958.10882558
  • –––, 1962, “A Concept of Critical Thinking: A Proposed Basis for Research on the Teaching and Evaluation of Critical Thinking Ability”, Harvard Educational Review , 32(1): 81–111.
  • –––, 1981a, “A Conception of Deductive Logical Competence”, Teaching Philosophy , 4(3/4): 337–385. doi:10.5840/teachphil198143/429
  • –––, 1981b, “Eight Fallacies in Bloom’s Taxonomy”, in C. J. B. Macmillan (ed.), Philosophy of Education 1980: Proceedings of the Thirty-seventh Annual Meeting of the Philosophy of Education Society , Bloomington, IL: Philosophy of Education Society, pp. 269–273.
  • –––, 1984, “Problems in Testing Informal Logic, Critical Thinking, Reasoning Ability”, Informal Logic , 6(1): 3–9. [ Ennis 1984 available online ]
  • –––, 1987, “A Taxonomy of Critical Thinking Dispositions and Abilities”, in Joan Boykoff Baron and Robert J. Sternberg (eds.), Teaching Thinking Skills: Theory and Practice , New York: W. H. Freeman, pp. 9–26.
  • –––, 1989, “Critical Thinking and Subject Specificity: Clarification and Needed Research”, Educational Researcher , 18(3): 4–10. doi:10.3102/0013189X018003004
  • –––, 1991, “Critical Thinking: A Streamlined Conception”, Teaching Philosophy , 14(1): 5–24. doi:10.5840/teachphil19911412
  • –––, 1996, “Critical Thinking Dispositions: Their Nature and Assessability”, Informal Logic , 18(2–3): 165–182. [ Ennis 1996 available online ]
  • –––, 1998, “Is Critical Thinking Culturally Biased?”, Teaching Philosophy , 21(1): 15–33. doi:10.5840/teachphil19982113
  • –––, 2011, “Critical Thinking: Reflection and Perspective Part I”, Inquiry: Critical Thinking across the Disciplines , 26(1): 4–18. doi:10.5840/inquiryctnews20112613
  • –––, 2013, “Critical Thinking across the Curriculum: The Wisdom CTAC Program”, Inquiry: Critical Thinking across the Disciplines , 28(2): 25–45. doi:10.5840/inquiryct20132828
  • –––, 2016, “Definition: A Three-Dimensional Analysis with Bearing on Key Concepts”, in Patrick Bondy and Laura Benacquista (eds.), Argumentation, Objectivity, and Bias: Proceedings of the 11th International Conference of the Ontario Society for the Study of Argumentation (OSSA), 18–21 May 2016 , Windsor, ON: OSSA, pp. 1–19. Available at http://scholar.uwindsor.ca/ossaarchive/OSSA11/papersandcommentaries/105 ; last accessed 2022 07 16.
  • –––, 2018, “Critical Thinking Across the Curriculum: A Vision”, Topoi , 37(1): 165–184. doi:10.1007/s11245-016-9401-4
  • Ennis, Robert H., and Jason Millman, 1971, Manual for Cornell Critical Thinking Test, Level X, and Cornell Critical Thinking Test, Level Z , Urbana, IL: Critical Thinking Project, University of Illinois.
  • Ennis, Robert H., Jason Millman, and Thomas Norbert Tomko, 1985, Cornell Critical Thinking Tests Level X & Level Z: Manual , Pacific Grove, CA: Midwest Publication, 3rd edition.
  • –––, 2005, Cornell Critical Thinking Tests Level X & Level Z: Manual , Seaside, CA: Critical Thinking Company, 5th edition.
  • Ennis, Robert H. and Eric Weir, 1985, The Ennis-Weir Critical Thinking Essay Test: Test, Manual, Criteria, Scoring Sheet: An Instrument for Teaching and Testing , Pacific Grove, CA: Midwest Publications.
  • Facione, Peter A., 1990a, Critical Thinking: A Statement of Expert Consensus for Purposes of Educational Assessment and Instruction , Research Findings and Recommendations Prepared for the Committee on Pre-College Philosophy of the American Philosophical Association, ERIC Document ED315423.
  • –––, 1990b, California Critical Thinking Skills Test, CCTST – Form A , Millbrae, CA: The California Academic Press.
  • –––, 1990c, The California Critical Thinking Skills Test--College Level. Technical Report #3. Gender, Ethnicity, Major, CT Self-Esteem, and the CCTST , ERIC Document ED326584.
  • –––, 1992, California Critical Thinking Skills Test: CCTST – Form B, Millbrae, CA: The California Academic Press.
  • –––, 2000, “The Disposition Toward Critical Thinking: Its Character, Measurement, and Relationship to Critical Thinking Skill”, Informal Logic , 20(1): 61–84. [ Facione 2000 available online ]
  • Facione, Peter A. and Noreen C. Facione, 1992, CCTDI: A Disposition Inventory , Millbrae, CA: The California Academic Press.
  • Facione, Peter A., Noreen C. Facione, and Carol Ann F. Giancarlo, 2001, California Critical Thinking Disposition Inventory: CCTDI: Inventory Manual , Millbrae, CA: The California Academic Press.
  • Facione, Peter A., Carol A. Sánchez, and Noreen C. Facione, 1994, Are College Students Disposed to Think? , Millbrae, CA: The California Academic Press. ERIC Document ED368311.
  • Fisher, Alec, and Michael Scriven, 1997, Critical Thinking: Its Definition and Assessment , Norwich: Centre for Research in Critical Thinking, University of East Anglia.
  • Freire, Paulo, 1968 [1970], Pedagogia do Oprimido . Translated as Pedagogy of the Oppressed , Myra Bergman Ramos (trans.), New York: Continuum, 1970.
  • Gigerenzer, Gerd, 2001, “The Adaptive Toolbox”, in Gerd Gigerenzer and Reinhard Selten (eds.), Bounded Rationality: The Adaptive Toolbox , Cambridge, MA: MIT Press, pp. 37–50.
  • Glaser, Edward Maynard, 1941, An Experiment in the Development of Critical Thinking , New York: Bureau of Publications, Teachers College, Columbia University.
  • Groarke, Leo A. and Christopher W. Tindale, 2012, Good Reasoning Matters! A Constructive Approach to Critical Thinking , Don Mills, ON: Oxford University Press, 5th edition.
  • Halpern, Diane F., 1998, “Teaching Critical Thinking for Transfer Across Domains: Disposition, Skills, Structure Training, and Metacognitive Monitoring”, American Psychologist , 53(4): 449–455. doi:10.1037/0003-066X.53.4.449
  • –––, 2016, Manual: Halpern Critical Thinking Assessment , Mödling, Austria: Schuhfried. Available at https://pdfcoffee.com/hcta-test-manual-pdf-free.html; last accessed 2022 07 16.
  • Hamby, Benjamin, 2014, The Virtues of Critical Thinkers , Doctoral dissertation, Philosophy, McMaster University. [ Hamby 2014 available online ]
  • –––, 2015, “Willingness to Inquire: The Cardinal Critical Thinking Virtue”, in Martin Davies and Ronald Barnett (eds.), The Palgrave Handbook of Critical Thinking in Higher Education , New York: Palgrave Macmillan, pp. 77–87.
  • Haran, Uriel, Ilana Ritov, and Barbara A. Mellers, 2013, “The Role of Actively Open-minded Thinking in Information Acquisition, Accuracy, and Calibration”, Judgment and Decision Making , 8(3): 188–201.
  • Hatcher, Donald and Kevin Possin, 2021, “Commentary: Thinking Critically about Critical Thinking Assessment”, in Daniel Fasko, Jr. and Frank Fair (eds.), Critical Thinking and Reasoning: Theory, Development, Instruction, and Assessment , Leiden: Brill, pp. 298–322. doi: 10.1163/9789004444591_017
  • Haynes, Ada, Elizabeth Lisic, Kevin Harris, Katie Leming, Kyle Shanks, and Barry Stein, 2015, “Using the Critical Thinking Assessment Test (CAT) as a Model for Designing Within-Course Assessments: Changing How Faculty Assess Student Learning”, Inquiry: Critical Thinking Across the Disciplines , 30(3): 38–48. doi:10.5840/inquiryct201530316
  • Haynes, Ada and Barry Stein, 2021, “Observations from a Long-Term Effort to Assess and Improve Critical Thinking”, in Daniel Fasko, Jr. and Frank Fair (eds.), Critical Thinking and Reasoning: Theory, Development, Instruction, and Assessment , Leiden: Brill, pp. 231–254. doi: 10.1163/9789004444591_014
  • Hiner, Amanda L., 2021, “Equipping Students for Success in College and Beyond: Placing Critical Thinking Instruction at the Heart of a General Education Program”, in Daniel Fasko, Jr. and Frank Fair (eds.), Critical Thinking and Reasoning: Theory, Development, Instruction, and Assessment , Leiden: Brill, pp. 188–208. doi: 10.1163/9789004444591_012
  • Hitchcock, David, 2017, “Critical Thinking as an Educational Ideal”, in his On Reasoning and Argument: Essays in Informal Logic and on Critical Thinking , Dordrecht: Springer, pp. 477–497. doi:10.1007/978-3-319-53562-3_30
  • –––, 2021, “Seven Philosophical Implications of Critical Thinking: Themes, Variations, Implications”, in Daniel Fasko, Jr. and Frank Fair (eds.), Critical Thinking and Reasoning: Theory, Development, Instruction, and Assessment , Leiden: Brill, pp. 9–30. doi: 10.1163/9789004444591_002
  • hooks, bell, 1994, Teaching to Transgress: Education as the Practice of Freedom , New York and London: Routledge.
  • –––, 2010, Teaching Critical Thinking: Practical Wisdom , New York and London: Routledge.
  • Johnson, Ralph H., 1992, “The Problem of Defining Critical Thinking”, in Stephen P. Norris (ed.), The Generalizability of Critical Thinking , New York: Teachers College Press, pp. 38–53.
  • Kahane, Howard, 1971, Logic and Contemporary Rhetoric: The Use of Reason in Everyday Life , Belmont, CA: Wadsworth.
  • Kahneman, Daniel, 2011, Thinking, Fast and Slow , New York: Farrar, Straus and Giroux.
  • Kahneman, Daniel, Olivier Sibony, & Cass R. Sunstein, 2021, Noise: A Flaw in Human Judgment , New York: Little, Brown Spark.
  • Kenyon, Tim, and Guillaume Beaulac, 2014, “Critical Thinking Education and Debasing”, Informal Logic , 34(4): 341–363. [ Kenyon & Beaulac 2014 available online ]
  • Krathwohl, David R., Benjamin S. Bloom, and Bertram B. Masia, 1964, Taxonomy of Educational Objectives, Handbook II: Affective Domain , New York: David McKay.

Copyright © 2022 by David Hitchcock < hitchckd @ mcmaster . ca >


The Stanford Encyclopedia of Philosophy is copyright © 2024 by The Metaphysics Research Lab , Department of Philosophy, Stanford University

Library of Congress Catalog Data: ISSN 1095-5054

Critical Thinking -- System 1 or 2 Thinking

by Ben Hebebrand

This paper provides a brief overview of how emphasizing System 2 thinking can help develop critical thinkers.


Teach Critical, Creative, and Independent Thinking

Article · 26 May 2024

In an ever-evolving world, the ability to think critically, creatively, and independently is more crucial than ever. These skills empower individuals to navigate complex problems, innovate solutions, and make informed decisions. This article explores various methods and techniques to teach these essential thinking skills, providing educators, parents, and learners with practical strategies to foster a thinking mindset.

Definitions and Importance

Critical Thinking

Critical thinking involves analyzing information objectively, evaluating evidence, and reasoning logically. It’s essential for problem-solving, decision-making, and understanding complex issues. In today’s information-rich age, critical thinking helps discern credible sources from misinformation, making it a vital skill for both personal and professional success.

Creative Thinking

Creative thinking is the ability to generate new ideas, see connections between seemingly unrelated concepts, and think outside the box. This skill drives innovation and is crucial in fields ranging from the arts to science and business. Creative thinkers can adapt to new situations and find unique solutions to challenges.

Independent Thinking

Independent thinking refers to the ability to form one’s own opinions and make decisions without undue influence from others. It involves self-reflection, confidence, and the courage to stand by one’s beliefs. Independent thinkers are better equipped to lead, innovate, and contribute meaningfully to society.

Educational Approaches

Socratic Questioning

Socratic questioning is a method of teaching that encourages critical thinking through dialogue. By asking probing questions, educators can help students explore complex ideas and develop their reasoning skills. This approach fosters deep understanding and helps learners become more thoughtful and reflective.

Problem-Based Learning (PBL)

Problem-Based Learning (PBL) is an educational strategy where students learn by solving real-world problems. This method promotes critical thinking, creativity, and independent learning as students must research, collaborate, and apply their knowledge to find solutions.

Brainstorming Sessions

Brainstorming sessions are a popular technique for fostering creative thinking. By encouraging free-flowing ideas in a judgment-free environment, educators can help students think expansively and explore multiple possibilities. This method is especially effective in group settings, where diverse perspectives can spark innovative solutions.

Practical Tips for Educators and Parents

Use Open-Ended Questions

Open-ended questions stimulate critical and creative thinking by requiring more than a yes-or-no answer. Questions like “What do you think will happen if...?” or “How might we solve this problem?” encourage students to think deeply and explore different angles.

Encourage Curiosity

Fostering a sense of curiosity is fundamental to developing thinking skills. Encourage students to ask questions, explore new topics, and seek out information. Create a learning environment that celebrates inquiry and values the process of discovery.

Promote Self-Reflection

Self-reflection helps students develop independent thinking by encouraging them to consider their own thoughts and feelings. Activities like journaling or discussing personal experiences can help learners understand their thought processes and build confidence in their ideas.

Cognitive Strategies

Mind Mapping

Mind mapping is a visual tool that helps organize thoughts and ideas. By creating a diagram that connects related concepts, students can see the relationships between different pieces of information and think more holistically. This technique is particularly useful for brainstorming and organizing complex subjects.

Analogical Reasoning

Analogical reasoning involves drawing comparisons between similar situations to understand new concepts. By relating unfamiliar ideas to known experiences, students can grasp complex ideas more easily and develop creative solutions.

Thinking Routines

Thinking routines are structured approaches that guide students through the thinking process. Routines like “See-Think-Wonder” or “Claim-Support-Question” provide a framework for exploring ideas deeply and systematically. These routines can be used across various subjects to promote critical and creative thinking.

Real-Life Applications

Problem-Solving Skills

Critical and creative thinking skills are essential for effective problem-solving. Whether in personal life or professional settings, the ability to analyze situations, generate solutions, and make decisions is invaluable. Teaching these skills equips learners to handle challenges confidently and competently.

Decision-Making Abilities

Independent thinking enhances decision-making by fostering self-reliance and confidence. By teaching students to evaluate options, consider consequences, and trust their judgment, we prepare them to make informed choices that align with their values and goals.

Innovation and Creativity

Creative thinking drives innovation, which is crucial in today’s fast-paced, competitive world. By encouraging students to think creatively, we cultivate a mindset that embraces change, seeks out new opportunities, and continuously strives for improvement.

Resources and Tools

Books

  • “Thinking, Fast and Slow” by Daniel Kahneman : This book explores the two systems of thought and how they shape our judgments and decisions.
  • “The Art of Thinking Clearly” by Rolf Dobelli : A practical guide to avoiding cognitive errors and thinking more effectively.
  • “Creative Confidence” by Tom Kelley and David Kelley : A book that encourages readers to unleash their creativity and apply it in their daily lives.

Online Courses

  • Coursera’s “Learning How to Learn” : This course offers techniques to help learners master complex subjects and develop effective thinking strategies.
  • edX’s “Critical Thinking & Problem-Solving” : A course designed to enhance critical thinking skills through practical exercises and real-world applications.
  • Udemy’s “Creative Thinking Techniques and Tools for Success” : A course that provides tools and methods to boost creative thinking and innovation.
Digital Tools

  • MindMeister : An online mind mapping tool that helps visualize and organize ideas.
  • Evernote : A note-taking app that supports organizing thoughts and tracking creative ideas.
  • Coggle : A collaborative mind mapping tool that is ideal for group brainstorming sessions.

Teaching someone how to think critically, creatively, and independently is one of the most valuable gifts we can offer. These skills not only enhance academic and professional success but also enrich personal growth and lifelong learning. By employing the educational approaches, cognitive strategies, and practical tips discussed in this blog, educators and parents can cultivate a thinking mindset in learners, empowering them to navigate an increasingly complex world with confidence and creativity.

Whether through Socratic questioning, problem-based learning, or encouraging self-reflection, the journey to developing these essential skills is both challenging and rewarding. Let us commit to fostering environments that celebrate curiosity, promote deep thinking, and inspire the next generation of critical, creative, and independent thinkers.



Social Sci LibreTexts

5.3: Using Critical Thinking Skills - Decision Making and Problem Solving




In previous lessons, you learned about characteristics of critical thinkers and information literacy. In this module, you will learn how to put those skills into action through the important processes of decision making and problem solving.

As with the process of developing information literacy, asking questions is an important part of decision making and problem solving. Thinking is born of questions. Questions wake us up. Questions alert us to hidden assumptions. Questions promote curiosity and create new distinctions. Questions open up options that otherwise go unexplored. Besides, teachers love questions.

We make decisions all the time, whether we realize it or not. Even avoiding decisions is a form of decision making. The student who puts off studying for a test until the last minute, for example, might really be saying, “I’ve decided this course is not important” or “I’ve decided not to give this course much time.”

Decisions are specific and lead to focused action. When we decide, we narrow down. We give up actions that are inconsistent with our decision.

In addition to decision making, critical thinking skills are important to solving problems. We encounter problems every single day, and having a solid process in place is important to solving them.

At the end of the lesson, you will learn how to put your critical thinking skills to use by reviewing an example of how critical thinking skills can help with making those everyday decisions.

Using Critical Thinking Skills: Asking Questions

Questions have practical power. Asking for directions can shave hours off a trip. Asking a librarian for help can save hours of research time. Asking how to address an instructor—by first name or formal title—can change your relationship with that person. Asking your academic advisor a question can alter your entire education. Asking people about their career plans can alter your career plans.

You can use the following strategies to develop questions for problem solving and decision making:

Ask questions that create possibilities. At any moment, you can ask a question that opens up a new possibility for someone.

  • Suppose a friend walks up to you and says, “People just never listen to me.” You listen carefully. Then you say, “Let me make sure I understand. Who, specifically, doesn’t listen to you? And how do you know they’re not listening?”
  • Another friend tells you, “I just lost my job to someone who has less experience. That should never happen.” You respond, “Wow, that’s hard. I’m sorry you lost your job. Who can help you find another job?”
  • A relative seeks your advice. “My mother-in-law makes me mad,” she says. “You’re having a hard time with this person,” you say. “What does she say and do when you feel mad at her? And are there times when you don’t get mad at her?”

These kinds of questions—asked with compassion and a sense of timing—can help people move from complaining about problems to solving them.

Discover new questions. Students sometimes say, “I don’t know what questions to ask.” Consider the following ways to create questions about any subject you want to study or about any area of your life that you want to change:

  • Let your pen start moving. Sometimes you can access a deeper level of knowledge by taking out your pen, putting it on a piece of paper, and writing down questions—even before you know what to write. Don’t think. Just watch the pen move across the paper. Notice what appears. The results might be surprising.
  • Ask about what’s missing . Another way to invent useful questions is to notice what’s missing from your life and then ask how to supply it. For example, if you want to take better notes, you can write, “What’s missing is skill in note taking. How can I gain more skill in taking notes?” If you always feel rushed, you can write, “What’s missing is time. How do I create enough time in my day to actually do the things that I say I want to do?”
  • Pretend to be someone else. Another way to invent questions is first to think of someone you greatly respect. Then pretend you’re that person. Ask the questions you think she would ask.
  • What can I do when ... an instructor calls on me in class and I have no idea what to say? When a teacher doesn’t show up for class on time? When I feel overwhelmed with assignments?
  • How can I ... take the kind of courses that I want? Expand my career options? Become much more effective as a student, starting today?
  • When do I ... decide on a major? Transfer to another school? Meet with an instructor to discuss an upcoming term paper?
  • What else do I want to know about ... my academic plan? My career plan? My options for job hunting? My friends? My relatives? My spouse?
  • Who can I ask about ... my career options? My major? My love life? My values and purpose in life?

Many times you can quickly generate questions by simply asking yourself, “What else do I want to know?” Ask this question immediately after you read a paragraph in a book or listen to someone speak.

Start from the assumption that you are brilliant. Then ask questions to unlock your brilliance.

Using Critical Thinking Skills in Decision Making

As you develop your critical thinking skills, you can apply them as you make decisions. The following suggestions can help in your decision-making process:

Recognize decisions. Decisions are more than wishes or desires. There’s a world of difference between “I wish I could be a better student” and “I will take more powerful notes, read with greater retention, and review my class notes daily.” Deciding to eat fruit for dessert instead of ice cream rules out the next trip to the ice cream store.

Establish priorities. Some decisions are trivial. No matter what the outcome, your life is not affected much. Other decisions can shape your circumstances for years. Devote more time and energy to the decisions with big outcomes.

Base decisions on a life plan. The benefit of having long-term goals for our lives is that they provide a basis for many of our daily decisions. Being certain about what we want to accomplish this year and this month makes today’s choices more clear.

Balance learning styles in decision making. To make decisions more effectively, use all four modes of learning explained in a previous lesson. The key is to balance reflection with action, and thinking with experience. First, take the time to think creatively, and generate many options. Then think critically about the possible consequences of each option before choosing one. Remember, however, that thinking is no substitute for experience. Act on your chosen option, and notice what happens. If you’re not getting the results you want, then quickly return to creative thinking to invent new options.

Choose an overall strategy. Every time you make a decision, you choose a strategy—even when you’re not aware of it. Effective decision makers can articulate and choose from among several strategies. For example:

  • Find all of the available options, and choose one deliberately. Save this strategy for times when you have a relatively small number of options, each of which leads to noticeably different results.
  • Find all of the available options, and choose one randomly. This strategy can be risky. Save it for times when your options are basically similar and fairness is the main issue.
  • Limit the options, and then choose. When deciding which search engine to use, visit many search sites and then narrow the list down to two or three from which to choose.

Use time as an ally. Sometimes we face dilemmas—situations in which any course of action leads to undesirable consequences. In such cases, consider putting a decision on hold. Wait it out. Do nothing until the circumstances change, making one alternative clearly preferable to another.

Use intuition. Some decisions seem to make themselves. A solution pops into your mind, and you gain newfound clarity. Using intuition is not the same as forgetting about the decision or refusing to make it. Intuitive decisions usually arrive after we’ve gathered the relevant facts and faced a problem for some time.

Evaluate your decision. Hindsight is a source of insight. After you act on a decision, observe the consequences over time. Reflect on how well your decision worked and what you might have done differently.

Think of choices. This final suggestion involves some creative thinking. Consider that the word decide derives from the same roots as suicide and homicide. In the spirit of those words, a decision forever “kills” all other options. That’s kind of heavy. Instead, use the word choice, and see whether it frees up your thinking. When you choose, you express a preference for one option over others. However, those options remain live possibilities for the future. Choose for today, knowing that as you gain more wisdom and experience, you can choose again.

Using Critical Thinking Skills in Problem Solving

Think of problem solving as a process with four Ps: Define the problem, generate possibilities, create a plan, and perform your plan.

Step 1: Define the problem. To define a problem effectively, understand what a problem is—a mismatch between what you want and what you have. Problem solving is all about reducing the gap between these two factors.

Tell the truth about what’s present in your life right now, without shame or blame. For example: “I often get sleepy while reading my physics assignments, and after closing the book I cannot remember what I just read.”

Next, describe in detail what you want. Go for specifics: “I want to remain alert as I read about physics. I also want to accurately summarize each chapter I read.”

Remember that when we define a problem in limiting ways, our solutions merely generate new problems. As Albert Einstein said, “The world we have made is a result of the level of thinking we have done thus far. We cannot solve problems at the same level at which we created them” (Calaprice 2000).

This idea has many applications for success in school. An example is the student who struggles with note taking. The problem, she thinks, is that her notes are too sketchy. The logical solution, she decides, is to take more notes; her new goal is to write down almost everything her instructors say. No matter how fast and furiously she writes, she cannot capture all of the instructors’ comments.

Consider what happens when this student defines the problem in a new way. After more thought, she decides that her dilemma is not the quantity of her notes but their quality. She adopts a new format for taking notes, dividing her notepaper into two columns. In the right-hand column, she writes down only the main points of each lecture. In the left-hand column, she notes two or three supporting details for each point.

Over time, this student makes the joyous discovery that there are usually just three or four core ideas to remember from each lecture. She originally thought the solution was to take more notes. What really worked was taking notes in a new way.

Step 2: Generate possibilities. Now put on your creative thinking hat. Open up. Brainstorm as many possible solutions to the problem as you can. At this stage, quantity counts. As you generate possibilities, gather relevant facts. For example, when you’re faced with a dilemma about what courses to take next semester, get information on class times, locations, and instructors. If you haven’t decided which summer job offer to accept, gather information on salary, benefits, and working conditions.

Step 3: Create a plan. After rereading your problem definition and list of possible solutions, choose the solution that seems most workable. Think about specific actions that will reduce the gap between what you have and what you want. Visualize the steps you will take to make this solution a reality, and arrange them in chronological order. To make your plan even more powerful, put it in writing.

Step 4: Perform your plan. This step gets you off your chair and out into the world. Now you actually do what you have planned.

Ultimately, your skill in solving problems lies in how well you perform your plan. Through the quality of your actions, you become the architect of your own success.

When facing problems, experiment with these four Ps, and remember that the order of steps is not absolute. Also remember that any solution has the potential to create new problems. If that happens, cycle through the four Ps of problem solving again.

Critical Thinking Skills in Action: Thinking About Your Major, Part 1

One decision that troubles many students in higher education is the choice of a major. Weighing the benefits, costs, and outcomes of a possible major is an intellectual challenge. This choice is an opportunity to apply your critical thinking, decision-making, and problem-solving skills. The following suggestions will guide you through this seemingly overwhelming process.

The first step is to discover options. You can use the following suggestions to discover options for choosing your major:

Follow the fun. Perhaps you look forward to attending one of your classes and even like completing the assignments. This is a clue to your choice of major.

See whether you can find lasting patterns in the subjects and extracurricular activities that you’ve enjoyed over the years. Look for a major that allows you to continue and expand on these experiences.

Also, sit down with a stack of 3 × 5 cards and brainstorm answers to the following questions:

  • What do you enjoy doing most with your unscheduled time?
  • Imagine that you’re at a party and having a fascinating conversation. What is this conversation about?
  • What kind of problems do you enjoy solving—those that involve people? Products? Ideas?
  • What interests are revealed by your choices of reading material, television shows, and other entertainment?
  • What would an ideal day look like for you? Describe where you would live, who would be with you, and what you would do throughout the day. Do any of these visions suggest a possible major?

Questions like these can uncover a “fun factor” that energizes you to finish the work of completing a major.

Consider your abilities. In choosing a major, ability counts as much as interest. In addition to considering what you enjoy, think about times and places when you excelled. List the courses that you aced, the work assignments that you mastered, and the hobbies that led to rewards or recognition. Let your choice of a major reflect a discovery of your passions and potentials.

Use formal techniques for self-discovery. Explore questionnaires and inventories that are designed to correlate your interests with specific majors. Examples include the Strong Interest Inventory and the Self-Directed Search. Your academic advisor or someone in your school’s career planning office can give you more details about these and related assessments. For some fun, take several of them and meet with an advisor to interpret the results. Remember, inventories can help you gain self-knowledge, and other people can offer valuable perspectives. However, what you do with all this input is entirely up to you.

Critical Thinking Skills in Action: Thinking About Your Major, Part 2

As you review the following additional suggestions for discovering options, think about what strategies you already use in your own decision-making process. Also think about what new strategies you might try in the future.

Link to long-term goals. Your choice of a major can fall into place once you determine what you want in life. Before you choose a major, back up to a bigger picture. List your core values, such as contributing to society, achieving financial security and professional recognition, enjoying good health, or making time for fun. Also write down specific goals that you want to accomplish 5 years, 10 years, or even 50 years from today.

Many students find that the prospect of getting what they want in life justifies all of the time, money, and day-to-day effort invested in going to school. Having a major gives you a powerful incentive for attending classes, taking part in discussions, reading textbooks, writing papers, and completing other assignments. When you see a clear connection between finishing school and creating the life of your dreams, the daily tasks of higher education become charged with meaning.

Ask other people. Key people in your life might have valuable suggestions about your choice of major. Ask for their ideas, and listen with an open mind. At the same time, distance yourself from any pressure to choose a major or career that fails to interest you. If you make a choice solely on the basis of the expectations of other people, you could end up with a major or even a career you don’t enjoy.

Gather information. Check your school’s catalog or website for a list of available majors. Here is a gold mine of information. Take a quick glance, and highlight all the majors that interest you. Then talk to students who have declared these majors. Also read the descriptions of courses required for these majors. Do you get excited about the chance to enroll in them? Pay attention to your gut feelings.

Also chat with instructors who teach courses in a specific major. Ask for copies of their class syllabi. Go to the bookstore and browse the required texts. Based on all of this information, write a list of prospective majors. Discuss them with an academic advisor and someone at your school’s career-planning center.

Invent a major. When choosing a major, you might not need to limit yourself to those listed in your school catalog. Many schools now have flexible programs that allow for independent study. Through such programs, you might be able to combine two existing majors or invent an entirely new one of your own.

Consider a complementary minor. You can add flexibility to your academic program by choosing a minor to complement or contrast with your major. The student who wants to be a minister could opt for a minor in English; all of those courses in composition can help in writing sermons. Or the student with a major in psychology might choose a minor in business administration, with the idea of managing a counseling service some day. An effective choice of a minor can expand your skills and career options.

Think critically about the link between your major and your career. Your career goals might have a significant impact on your choice of major.

You could pursue a rewarding career by choosing among several different majors. Even students planning to apply for law school or medical school have flexibility in their choice of majors. In addition, after graduation, many people tend to be employed in jobs that have little relationship to their major. And you might choose a career in the future that is unrelated to any currently available major.

Critical Thinking Skills in Action: Thinking About Your Major, Part 3

Once you have discovered all of your options, you can move on to the next step in the process—making a trial choice.

Make a Trial Choice

Pretend that you have to choose a major today. Based on the options for a major that you’ve already discovered, write down the first three ideas that come to mind. Review the list for a few minutes, and then choose one.

Evaluate Your Trial Choice

When you’ve made a trial choice of major, take on the role of a scientist. Treat your choice as a hypothesis, and then design a series of experiments to evaluate and test it. For example:

  • Schedule office meetings with instructors who teach courses in the major. Ask about required course work and career options in the field.
  • Discuss your trial choice with an academic advisor or career counselor.
  • Enroll in a course related to your possible major. Remember that introductory courses might not give you a realistic picture of the workload involved in advanced courses. Also, you might not be able to register for certain courses until you’ve actually declared a related major.
  • Find a volunteer experience, internship, part-time job, or service-learning experience related to the major.
  • Interview students who have declared the same major. Ask them in detail about their experiences and suggestions for success.
  • Interview people who work in a field related to the major and “shadow” them—that is, spend time with those people during their workday.
  • Think about whether you can complete your major given the amount of time and money that you plan to invest in higher education.
  • Consider whether declaring this major would require a transfer to another program or even another school.

If your “experiments” confirm your choice of major, celebrate that fact. If they result in choosing a new major, celebrate that outcome as well.

Also remember that higher education represents a safe place to test your choice of major—and to change your mind. As you sort through your options, help is always available from administrators, instructors, advisors, and peers.

Choose Again

Keep your choice of a major in perspective. There is probably no single “correct” choice. Your unique collection of skills is likely to provide the basis for majoring in several fields.

Odds are that you’ll change your major at least once—and that you’ll change careers several times during your life. One benefit of higher education is mobility. You gain the general skills and knowledge that can help you move into a new major or career field at any time.

Viewing a major as a one-time choice that determines your entire future can raise your stress levels. Instead, look at choosing a major as the start of a continuing path that involves discovery, choice, and passionate action.

As you review this example of how you can use critical thinking to make a decision about choosing your major, think about how you will use your critical thinking to make decisions and solve problems in the future.





