
What is Critical Thinking in Academics – Guide With Examples

Published by Grace Graffin on October 17, 2023; revised on October 17, 2023

In an era dominated by vast amounts of information, the ability to discern, evaluate, and form independent conclusions is more crucial than ever. Enter the realm of “critical thinking.” But what does this term truly mean? 

What is Critical Thinking?

Critical thinking is the disciplined art of analysing and evaluating information or situations by applying a range of intellectual skills. It goes beyond mere memorisation or blind acceptance of information, demanding a deeper understanding and assessment of evidence, context, and implications.

Moreover, paraphrasing sources is an essential skill in critical thinking, as it allows you to represent another’s ideas in your own words, ensuring comprehension.

Critical thinking is not just an academic buzzword but an essential tool. In academic settings, it serves as the backbone of genuine understanding and the springboard for innovation. When students embrace critical thinking, they move from being passive recipients of information to active participants in their own learning journey.

They question, evaluate, and synthesise information from various sources, fostering an intellectual curiosity that extends beyond the classroom. Part of this involves understanding how to integrate sources into their work, which means not only including information from various places, but also doing so in a cohesive and logical way.

The importance of critical thinking in academics cannot be overstated. It equips students with the skills to discern credible sources from unreliable ones, develop well-informed arguments, and approach problems with a solution-oriented mindset.

The Origins and Evolution of Critical Thinking

The idea of critical thinking isn’t a new-age concept. Its roots reach back into ancient civilisations, moulding the foundations of philosophy, science, and education. To appreciate its evolution, it’s vital to delve into its historical context and the influential thinkers who have championed it.

Historical Perspective on the Concept of Critical Thinking

The seeds of critical thinking can be traced back to Ancient Greece, particularly in the city-state of Athens. Here, the practice of debate, dialogue, and philosophical inquiry was valued and was seen as a route to knowledge and wisdom. This era prized the art of questioning, investigating, and exploring diverse viewpoints to reach enlightened conclusions.

In medieval Islamic civilisation, scholars in centres of learning, such as the House of Wisdom in Baghdad, played a pivotal role in advancing critical thought. Their works encompassed vast areas, including philosophy, mathematics, and medicine, often intertwining rigorous empirical observations with analytical reasoning.

The Renaissance period further nurtured critical thinking as it was a time of revival in art, culture, and intellect. This era championed humanistic values, focusing on human potential and achievements. It saw the rebirth of scientific inquiry, scepticism about religious dogma, and an emphasis on empirical evidence.

Philosophers and Educators Who Championed Critical Thinking

Several philosophers and educators stand out for their remarkable contributions to the sphere of critical thinking:

Socrates

Known for the Socratic method, a form of cooperative argumentative dialogue, Socrates would ask probing questions, forcing his pupils to think deeply about their beliefs and assumptions. His methodology still influences modern education, emphasising not only the answer but also the path of reasoning that leads to it.

Plato

A student of Socrates, Plato believed in the importance of reason and inquiry. His allegory of the cave highlights the difference between blindly accepting information and seeking true knowledge.

Aristotle

Aristotle placed great emphasis on empirical evidence and logic. His works on syllogism and deductive reasoning laid the foundation for systematic critical thought.

Al-Farabi and Ibn Rushd (Averroes)

These Islamic philosophers harmonised Greek philosophy with Islamic thought, emphasising the importance of rationality and critical inquiry.

Sir Francis Bacon

An advocate for the scientific method, Bacon believed that knowledge should be based on empirical evidence, observation, and experimentation rather than mere reliance on accepted truths.

John Dewey

A modern proponent of critical thinking, Dewey viewed it as an active, persistent, and careful consideration of a belief or supposed form of knowledge. He emphasised that students should be taught to think for themselves rather than just memorise facts.

Paulo Freire

Recognised for his ideas on “problem-posing education,” Freire believed that students should be encouraged to question, reflect upon, and respond to societal issues, fostering critical consciousness.

Characteristics of Critical Thinkers

Critical thinkers are not defined merely by the knowledge they possess, but by the manner in which they process, analyse, and use that knowledge. While the profile of a critical thinker can be multifaceted, certain core traits distinguish them. Let’s delve into these characteristics:

1. Open-mindedness

Open-mindedness refers to the willingness to consider different ideas, opinions, and perspectives, even if they challenge one’s existing beliefs. It allows critical thinkers to avoid being trapped in their own biases or preconceived notions. By being open to diverse viewpoints, they can make more informed and holistic decisions.

  • Listening to a debate without immediately taking sides.
  • Reading literature from different cultures to understand various world views.

2. Analytical Nature

An analytical nature entails the ability to break down complex problems or information into smaller, manageable parts to understand the whole better. Being analytical enables individuals to see patterns, relationships, and inconsistencies, allowing for deeper comprehension and better problem-solving.

  • Evaluating a research paper by examining its methodology, results, and conclusions separately.
  • Breaking down the components of a business strategy to assess its viability.

3. Scepticism

Scepticism is the tendency to question and doubt claims or assertions until sufficient evidence is presented. It ensures that critical thinkers do not accept information at face value. They seek evidence and are cautious about jumping to conclusions without verification.

  • Questioning the results of a study that lacks a control group.
  • Doubting a sensational news headline and researching further before believing or sharing it.

4. Intellectual Humility

Intellectual humility involves recognising and accepting the limitations of one’s knowledge and understanding. It is about being aware that one does not have all the answers. This trait prevents arrogance and overconfidence. Critical thinkers with intellectual humility are open to learning and receptive to constructive criticism.

  • Admitting when one is wrong in a discussion.
  • Actively seeking feedback on a project or idea to enhance it.

5. Logical Reasoning

Logical reasoning is the ability to think sequentially and make connections between concepts in a coherent manner. It involves drawing conclusions that logically follow from the available information. Logical reasoning ensures that decisions and conclusions are sound and based on valid premises. It helps avoid fallacies and cognitive biases.

  • Using deductive reasoning to derive a specific conclusion from a general statement (see the sketch after these examples).
  • Evaluating an argument for potential logical fallacies like “slippery slope” or “ad hominem.”
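
To make the deductive pattern concrete, here is a minimal sketch in Python. It is purely illustrative: the premises, the names, and the `deduce` helper are invented for this example, not part of any standard library.

```python
# A minimal sketch of deductive reasoning: if every member of a category
# has a property, and an individual belongs to that category, then the
# individual has the property. All names here are illustrative.

premises = {
    "mortal": {"human"},    # general statement: all humans are mortal
}
memberships = {
    "Socrates": {"human"},  # specific fact: Socrates is a human
}

def deduce(individual: str, prop: str) -> bool:
    """Return True if `individual` provably has `prop` under the premises."""
    categories_with_prop = premises.get(prop, set())
    return bool(memberships.get(individual, set()) & categories_with_prop)

print(deduce("Socrates", "mortal"))  # True: the conclusion follows
print(deduce("Socrates", "winged"))  # False: not derivable from the premises
```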

The Difference Between Critical Thinking and Memorisation

In today’s rapidly changing educational landscape, there is an ongoing debate about the importance of rote memorisation versus the significance of cultivating critical thinking skills. Both have their place in learning, but they serve very different purposes.

Nature Of Learning

  • Rote Learning: Involves memorising information exactly as it is, without necessarily understanding its context or underlying meaning. It’s akin to storing data as-is, without processing.
  • Analytical Processing (Critical Thinking): Involves understanding, questioning, and connecting new information with existing knowledge. It’s less about storage and more about comprehension and application.

Depth of Engagement

  • Rote Learning: Often remains at the surface level. Students might remember facts for a test, but might forget them shortly after.
  • Analytical Processing: Engages deeper cognitive skills. When students think critically, they are more likely to retain information because they have processed it more deeply.

Application in New Situations

  • Rote Learning: Information memorised through rote often does not easily apply to new or unfamiliar situations, since it is detached from understanding.
  • Analytical Processing: Promotes adaptability. Critical thinkers can transfer knowledge and skills to different contexts because they understand underlying concepts and principles.

Why Critical Thinking Produces Long-Term Academic Benefits

Here are the key long-term benefits of critical thinking in academics.

Enhanced Retention

Critical thinking often involves active learning—discussions, problem-solving, and debates—which promotes better retention than passive memorisation.

Skill Development

Beyond content knowledge, critical thinking develops skills like analysis, synthesis, source evaluation, and problem-solving. These are invaluable in higher education and professional settings.

Adaptability

In an ever-evolving world, the ability to adapt is crucial. Critical thinkers are better equipped to learn and adapt because they don’t just know facts; they understand concepts.

Lifelong Learning

Critical thinkers are naturally curious. They seek to understand, question, and explore, turning them into lifelong learners who continually seek knowledge and personal growth.

Improved Decision-Making

Analytical processing allows students to evaluate various perspectives, weigh evidence, and make well-informed decisions, a skill valuable far beyond academics.

Preparation for Real-World Challenges

The real world does not come with a textbook. Critical thinkers can navigate unexpected challenges, connect disparate pieces of information, and innovate solutions.

Steps in the Critical Thinking Process

Critical thinking is more than just a skill—it is a structured process. By following a systematic approach, critical thinkers can navigate complex issues and ensure their conclusions are well-informed and reasoned. Here’s a breakdown of the steps involved:

Step 1. Identification and Clarification of the Problem or Question

Recognising that a problem or question exists and understanding its nature. It is about defining the issue clearly, without ambiguity. A well-defined problem serves as the foundation for the subsequent steps; without a clear understanding of what is being addressed, the entire process may become misguided.

Example: Instead of a vague problem like “improving the environment,” a more specific question could be “How can urban areas reduce air pollution?”

Step 2. Gathering Information and Evidence

Actively seeking relevant data, facts, and evidence. This might involve research, observations, experiments, or discussions. Reliable decisions are based on solid evidence. The quality and relevance of the information gathered can heavily influence the final conclusion.

Example: To address urban air pollution, one might gather data on current pollution levels, sources of pollutants, existing policies, and strategies employed by other cities.

Step 3. Analysing the Information

Breaking down the gathered information, scrutinising its validity, and identifying patterns, contradictions, and relationships. This step ensures that the information is not just accepted at face value. Critical thinkers can differentiate between relevant and irrelevant information and detect biases or inaccuracies by analysing data.

Example: When examining data on pollution, one might notice that certain industries are major contributors or that pollution levels rise significantly at specific times of the year.

Step 4. Drawing Conclusions and Making Decisions

After thorough analysis, formulating an informed perspective, solution, or decision based on the evidence. This is the culmination of the previous steps. Here, the critical thinker synthesises the information and applies logic to arrive at a reasoned conclusion.

Example: Based on the analysis, one might conclude that regulating specific industries and promoting public transportation during peak pollution periods can help reduce urban air pollution.

Step 5. Reflecting on the Process And The Conclusions Reached

Taking a step back to assess the entire process, considering any potential biases, errors, or alternative perspectives. It is also about evaluating the feasibility and implications of the conclusions. Reflection ensures continuous learning and improvement. Individuals can refine their approach to future problems by evaluating their thinking process.

Example: Reflecting on the proposed solution to reduce pollution, one might consider its economic implications, potential industry resistance, and the need for public awareness campaigns.


The Role of Critical Thinking in Different Academic Subjects

Critical thinking is a universal skill applicable across disciplines. Its methodologies might differ based on the subject, but its core principles remain consistent. Let us explore how critical thinking manifests in various academic domains:

1. Sciences

  • Hypothesis Testing: Science often begins with a hypothesis—a proposed explanation for a phenomenon. Critical thinking is essential in formulating a testable hypothesis and determining its validity based on experimental results.
  • Experimental Design: Designing experiments requires careful planning to ensure valid and reliable results. Critical thinking aids in identifying variables, ensuring controls, and determining the best methodologies to obtain accurate data.
  • Example: In a biology experiment to test the effect of light on plant growth, critical thinking helps ensure variables like water and soil quality are consistent, allowing for a fair assessment of the light’s impact.

2. Humanities

  • Analysing Texts: Humanities often involve studying texts—literature, historical documents, or philosophical treatises. Critical thinking lets students decode themes, discern authorial intent, and recognise underlying assumptions or biases.
  • Understanding Contexts: Recognising the cultural, historical, or social context of a text or artwork is pivotal. Critical thinking allows for a deeper appreciation of these contexts, providing a holistic understanding of the subject.
  • Example: When studying Shakespeare’s “Othello,” critical thinking aids in understanding the play’s exploration of jealousy, race, and betrayal, while also appreciating its historical context in Elizabethan England.

3. Social Sciences

  • Evaluating Arguments: Social sciences, such as sociology or political science, often present various theories or arguments about societal structures and behaviours. Critical thinking aids in assessing the merits of these arguments and recognising their implications.
  • Understanding Biases: Since social sciences study human societies, they’re susceptible to biases. Critical thinking helps identify potential biases in research or theories, ensuring a more objective understanding.
  • Example: In studying economic policies, critical thinking helps weigh the benefits and drawbacks of different economic models, considering both empirical data and theoretical arguments.

4. Mathematics

  • Problem-Solving: Mathematics is more than just numbers; it is about solving problems. Critical thinking enables students to identify the best strategies to tackle problems, ensuring efficient and accurate solutions.
  • Logical Deduction: Mathematical proofs and theorems rely on logical steps. Critical thinking ensures that each step is valid and the conclusions sound.
  • Example: In geometry, when proving that two triangles are congruent, critical thinking helps ensure that each criterion (like side lengths or angles) is met and the logic of the proof is coherent.

Examples of Critical Thinking in Academics

Some examples of critical thinking in academics are discussed below.

Case Study 1: Evaluating A Scientific Research Paper

Scenario: A research paper claims that a new herbal supplement significantly improves memory in elderly individuals.

Critical Thinking Application:

Scrutinising Methodology:

  • Was the study double-blind and placebo-controlled?
  • How large was the sample size?
  • Were the groups randomised?
  • Were there any potential confounding variables?

Assessing Conclusions:

  • Do the results conclusively support the claim, or are there other potential explanations?
  • Are the statistical analyses robust, and do they show a significant difference?
  • Is the effect size clinically relevant or just statistically significant?

Considering Broader Context:

  • How does this study compare with existing literature on the subject?
  • Were there any conflicts of interest, such as funding from the supplement company?

Critical analysis determined that while the study showed statistical significance, the effect size was minimal. Additionally, the sample size was small, and there was potential bias as the supplement manufacturer funded the study.
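
The distinction between statistical and clinical significance can be made concrete with a short sketch. This is a hypothetical simulation (invented numbers, standard NumPy/SciPy calls), not the actual study data: with a large enough sample, even a negligible half-point difference produces a small p-value, while the effect size (Cohen's d) reveals how trivial the improvement is.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)

# Hypothetical memory scores (0-100 scale): the supplement group is better
# by only half a point on average, but the sample is very large.
control = rng.normal(loc=60.0, scale=10.0, size=5000)
supplement = rng.normal(loc=60.5, scale=10.0, size=5000)

t_stat, p_value = stats.ttest_ind(supplement, control)

# Cohen's d: mean difference divided by the pooled standard deviation.
pooled_sd = np.sqrt((control.var(ddof=1) + supplement.var(ddof=1)) / 2)
cohens_d = (supplement.mean() - control.mean()) / pooled_sd

print(f"p-value:   {p_value:.4f}")   # small enough to be "significant"
print(f"Cohen's d: {cohens_d:.2f}")  # ~0.05, far below the 0.2 "small" benchmark
```

By Cohen's conventional benchmarks, d of about 0.2 already counts as small, so a significant p-value alone says little about whether the supplement is worth taking.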

Case Study 2: Analysing a Literary Text

Scenario: A reading of F. Scott Fitzgerald’s “The Great Gatsby.”

Understanding Symbolism:

  • What does the green light represent for Gatsby and in the broader context of the American Dream?
  • How does the Valley of Ashes symbolise societal decay?

Recognising Authorial Intent:

  • Why might Fitzgerald depict the characters’ lavish lifestyles amid underlying dissatisfaction?
  • What critiques of American society is Fitzgerald potentially making?

Contextual Analysis:

  • How does the era in which the novel was written (Roaring Twenties) influence its themes and characters?

Through critical analysis, the reader recognises that while “The Great Gatsby” is a tale of love and ambition, it’s also a poignant critique of the hollowness of the American Dream and the societal excesses of the 1920s.

Case Study 3: Decoding Historical Events

Scenario: The events leading up to the American Revolution.

Considering Multiple Perspectives:

  • How did the British government view the colonies and their demands?
  • What were the diverse perspectives within the American colonies, considering loyalists and patriots?

Assessing Validity of Sources:

  • Which accounts are primary sources, and which are secondary?
  • Are there potential biases in these accounts, based on their origins?

Analysing Causation and Correlation:

  • Were taxes and representation the sole reasons for the revolution, or were there deeper economic and philosophical reasons?

Through critical analysis, the student understands that while taxation without representation was a significant catalyst, the American Revolution was also influenced by Enlightenment ideas, economic interests, and long-standing grievances against colonial policies.

Challenges to Developing Critical Thinking Skills

In our complex and rapidly changing world, the importance of critical thinking cannot be overstated. However, various challenges can impede the cultivation of these vital skills. 

1. Common Misconceptions and Cognitive Biases

Human brains often take shortcuts in processing information, leading to cognitive biases. Additionally, certain misconceptions about what constitutes critical thinking can hinder its development.

  • Confirmation Bias: The tendency to search for, interpret, and recall information that confirms one’s pre-existing beliefs.
  • Anchoring Bias: Relying too heavily on the first piece of information encountered when making decisions.
  • Misconception: Believing that critical thinking merely means being critical or negative about ideas, rather than evaluating them objectively.

These biases can skew perception and decision-making, making it challenging to objectively approach issues.

2. The Influence of Technology and Social Media

While providing unprecedented access to information, the digital age also presents unique challenges. The barrage of information, the immediacy of social media reactions, and algorithms that cater to user preferences can hinder critical thought.

  • Information Overload: The sheer volume of online data can make it difficult to discern credible sources from unreliable ones.
  • Clickbait and Misinformation: Articles with sensational titles designed to generate clicks might lack depth or accuracy.
  • Algorithmic Bias: Platforms showing users content based on past preferences can limit exposure to diverse viewpoints.

Relying too heavily on technology and social media can lead to superficial understanding, reduced attention spans, and a narrow worldview.
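
A toy simulation can illustrate this feedback loop. The sketch below is a deliberately simplified model (invented topics, scoring, and update rule, written in Python): a feed that always recommends the user's current favourites, combined with engagement that reinforces them, quickly crowds out every other topic.

```python
import random

random.seed(1)

TOPICS = ["politics", "science", "sports", "arts", "travel"]

# Mild starting preferences with one slight lean.
prefs = {topic: 1.0 for topic in TOPICS}
prefs["politics"] = 1.2

def recommend(prefs, k=3):
    """Show only the k topics the user already favours most."""
    return sorted(prefs, key=prefs.get, reverse=True)[:k]

for _ in range(50):
    shown = recommend(prefs)
    clicked = random.choice(shown)  # the user can only engage with what is shown
    prefs[clicked] += 0.5           # engagement further reinforces the preference

print({topic: round(score, 1) for topic, score in prefs.items()})
# Topics outside the initial top three are never shown again, so their
# scores never grow: a simple model of an algorithmic echo chamber.
```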

3. The Danger of Echo Chambers and Confirmation Bias

An echo chamber is a situation in which beliefs are amplified or reinforced by communication and repetition inside a closed system, cutting off differing viewpoints.

  • Social Media Groups: Joining groups or following pages that only align with one’s beliefs can create a feedback loop, reinforcing existing opinions without challenge.
  • Selective Media Consumption: Only watching news channels or reading websites that align with one’s political or social views.

Echo chambers reinforce confirmation bias, limit exposure to diverse perspectives, and can polarise opinions, making objective, critical evaluation of issues challenging.

Benefits of Promoting Critical Thinking in Education

When cultivated and promoted in educational settings, critical thinking can have transformative effects on students, equipping them with vital skills to navigate their academic journey and beyond. Here’s an exploration of the manifold benefits of emphasising critical thinking in education:

Improved Problem-Solving Skills

Critical thinking enables students to approach problems methodically, breaking them down into manageable parts, analysing each aspect, and synthesising solutions.

  • Academic: Enhances students’ ability to tackle complex assignments, research projects, and unfamiliar topics.
  • Beyond School: Prepares students for real-world challenges where they might encounter problems without predefined solutions.

Enhanced Creativity and Innovation

Critical thinking is not just analytical but also involves lateral thinking, helping students see connections between disparate ideas and encouraging imaginative solutions.

  • Academic: Promotes richer discussions, more creative projects, and the ability to view topics from multiple angles.
  • Beyond School: Equips students for careers and situations where innovative solutions can lead to advancements in fields like technology, arts, or social entrepreneurship.

Better Decision-Making Abilities

Critical thinkers evaluate information thoroughly, weigh potential outcomes, and make decisions based on evidence and reason rather than impulse or peer pressure.

  • Academic: Helps students make informed choices about their studies, research directions, or group projects.
  • Beyond School: Prepares students to make sound decisions in personal and professional spheres, from financial choices to ethical dilemmas.

Greater Resilience in the Face of Complex Challenges

Critical thinking nurtures a growth mindset. When students think critically, they are more likely to view challenges as opportunities for learning rather than insurmountable obstacles.

  • Academic: Increases perseverance in difficult subjects, promoting a deeper understanding rather than superficial learning. Students become more resilient in handling academic pressures and setbacks.
  • Beyond School: Cultivates individuals who can navigate the complexities of modern life, from career challenges to societal changes, with resilience and adaptability.

Frequently Asked Questions

What is critical thinking?

Critical thinking is the objective analysis and evaluation of an issue to form a judgment. It involves gathering relevant information, discerning potential biases, logically connecting ideas, and questioning assumptions. Essential for informed decision-making, it promotes scepticism and requires the ability to think independently and rationally.

What makes up critical thinking?

Critical thinking arises from questioning assumptions, evaluating evidence, discerning fact from opinion, recognising biases, and logically connecting ideas. It demands curiosity, scepticism, and an open mind. By continuously challenging one’s beliefs and considering alternative viewpoints, one cultivates the ability to think clearly, rationally, and independently.

What is the purpose of critical thinking?

The purpose of critical thinking is to enable informed decisions by analysing and evaluating information objectively. It fosters understanding, problem-solving, and clarity, reducing the influence of biases and misconceptions. Through critical thinking, individuals discern truth, make reasoned judgments, and engage more effectively in discussions and debates.

How to improve critical thinking?

  • Cultivate curiosity by asking questions.
  • Practice active listening.
  • Read widely and diversely.
  • Engage in discussions and debates.
  • Reflect on your thought processes.
  • Identify biases and challenge assumptions.
  • Solve problems systematically.

What are some critical thinking skills?

  • Analysis: breaking concepts into parts.
  • Evaluation: judging information’s validity.
  • Inference: drawing logical conclusions.
  • Explanation: articulating reasons.
  • Interpretation: understanding meaning.
  • Problem-solving: devising effective solutions.
  • Decision-making: choosing the best options.

What is information literacy?

Information literacy is the ability to find, evaluate, and use information effectively. It encompasses understanding where to locate information, determining its credibility, distinguishing between facts and opinions, and using it responsibly. Essential in the digital age, it equips individuals to navigate the vast sea of data and make informed decisions.

What makes a credible source?

  • Authorship by experts or professionals.
  • Reliable publisher or institution backing.
  • Transparent sourcing and references.
  • Absence of bias or clear disclosure of it.
  • Recent publications or timely updates.
  • Peer review or editorial oversight.
  • Clear, logical arguments.
  • Reputability in its field or domain.

How do I analyse information critically?

  • Determine the source’s credibility.
  • Identify the main arguments or points.
  • Examine the evidence provided.
  • Spot inconsistencies or fallacies.
  • Detect biases or unspoken assumptions.
  • Cross-check facts with other sources.
  • Evaluate the relevance to your context.
  • Reflect on your own biases or beliefs.



Critical Thinking in Academic Research - Second Edition

(5 reviews)


Cindy Gruwell, University of West Florida

Robin Ewing, St. Cloud State University

Copyright Year: 2022

Last Update: 2023

Publisher: Minnesota State Colleges and Universities

Language: English

Conditions of Use: Attribution-ShareAlike

Reviewed by Erin Weldon, Director of Instructional Design and Development, Trine University on 8/28/24

Comprehensiveness rating: 5

The textbook offers a comprehensive overview of academic research with a focus on essential tools and strategies for researchers. It includes a valuable section on barriers to critical thinking. To enhance its utility, consider adding a section highlighting the advantages or benefits of critical thinking, which can help mitigate these barriers. While discussing challenges is crucial, positioning this section later in the textbook would allow readers to build a stronger foundation before addressing potential obstacles. A suggestion for readers: save the barriers chapter for the end, once you have a strong foundation in critical thinking in academic research.

Content Accuracy rating: 5

The textbook presents unbiased information and offers descriptive evidence to support its arguments on critical thinking topics. Examples illustrate the application of critical thinking in research. Throughout the text, valuable tips encourage readers to cultivate critical thinking skills and provide practical suggestions that they can put into action. For instance, in Chapter 5, learners are provided with tips on how to keep track of information. I appreciate the citation management software options. This is a great resource to get readers started on their research journey.

Relevance/Longevity rating: 5

The textbook effectively aligns with current research methods and strategies. The text includes the most recent academic trends. Its strategies provide learners with practical guidance on conducting thorough research, ensuring proper attribution and compliance with copyright regulations, which are increasingly important in today's world.

Clarity rating: 4

The textbook provides clear chapters that coincide with the purpose. I appreciate the examples to clarify the strategies for critical thinking in research. A few interactive quizzes could benefit from clearer instructions to enhance their effectiveness. For instance, in Chapter 8, providing specific guidance within the Main Concepts section would clarify the quiz's purpose. While the questions may appear factual, they actually require learners to identify the main concept or topic. This added clarity would improve the overall learning experience.

Consistency rating: 5

The textbook's topics are interconnected and progressively build upon one another, forming a comprehensive and reliable resource. The text begins by introducing key concepts of critical thinking, then progresses into effective research strategies, skills, and methodologies, concluding with a discussion of ethical considerations and standards for responsible research practices.

Modularity rating: 5

The text displays excellent readability. I appreciate the awareness of accessibility with the use of alt-text on the images. There should be little to no barriers for readers. To further enhance inclusivity, adding audio narration to the text would provide an alternative format that is easily accessible within the book.

Organization/Structure/Flow rating: 4

The overall organization of the textbook is clear and concise. See the comment above about moving the barriers chapter later in the book, such as after a section on the benefits of critical thinking.

Interface rating: 5

There does not appear to be any navigational issues throughout the book. From an accessibility standpoint, everything appears to be accessible in a way that would not distract or cause barriers to the reader.

Grammatical Errors rating: 5

I did not see any noticeable grammatical errors.

Cultural Relevance rating: 5

This comprehensive textbook fosters an inclusive learning environment by presenting a diverse range of examples that illustrate various perspectives. Readers are encouraged to develop a deep understanding of different cultures and experiences and appreciate the value of diverse viewpoints.

This text is an excellent resource for anyone who needs a textbook for a research class. It serves as a valuable resource for students and researchers, providing a solid foundation for coursework in research methods.

Reviewed by Julie Jaszkowiak, Community Faculty, Metropolitan State University on 12/22/23

Organized in 11 parts, this textbook includes introductory information about critical thinking and details about the academic research process. Parts I and II cover the basics of critical thinking related to doing academic research. Parts III–XI provide specifics on various steps in doing academic research, including details on finding and citing source material. There is a linked table of contents so the reader is able to jump to a specific section as needed. There is also a works cited page with information and links to works used for this textbook.

The content of this textbook is accurate and error free. It contains examples that demonstrate concepts from a variety of disciplines such as “hard science” or “popular culture” that assist in eliminating bias. The authors are librarians so it is clear that their experience as such leads to clear and unbiased content.

General concepts about critical thinking and academic research methodology are well defined and should not become obsolete. Specific content regarding the use of citation tools and attribution structure may change, but the links to various research sites allow for simple updates.

Clarity rating: 5

This textbook is written in a conversational manner that allows for a more personal interaction with the textbook. It is like the reader is having a conversation with a librarian. Each part has an introduction section that fully defines concepts and terms used for that part.

In addition to the written content, this textbook contains links to short quizzes at the end of each section. This is consistent throughout each part. Embedded links to additional information are included as necessary.

Modularity rating: 4

This textbook is arranged in 11 modular parts with each part having multiple sections. All of these are linked so a reader can go to a distinct part or section to find specific information. There are some links that refer back to previous sections in the document. It can be challenging to return to where you were once you have jumped to a different section.

Organization/Structure/Flow rating: 5

There is clear definition as to what information is contained within each of the parts and subsequent sections. The textbook follows the logical flow of the process of researching and writing a research paper.

Interface rating: 4

The pictures have alternative text that appears when you hover over them. There is one picture on page 102 that links to the source from which it was downloaded. The pictures are clear and supportive of the text for a visual learner. All the links work and go to either the correct area of the textbook or to a valid website. If you are going to use the embedded links to go to other sections of the textbook, you need to keep track of where you are, as it can sometimes be confusing where a link has taken you.

Grammatical Errors rating: 4

This is not really a grammatical error, but I did notice that on some of the quizzes, if you misspelled a word in a fill-in-the-blank answer, it was marked incorrect. It was also sometimes challenging to come up with the correct word for the fill-in-the-blank questions.

There are no examples or text that are culturally insensitive or offensive. The examples are general and would be applicable to a variety of students studying many different academic subjects. There are references to, and information about, many research tools, from traditional ones such as checking out books and articles from the library to more current ones such as blogs and other electronic sources. This information appeals to a wide expanse of student populations.

I really enjoyed the quizzes at the end of each section. It is very beneficial to test your knowledge and comprehension of what you just read. Often I had to return and reread the content more critically based on my quiz results! They are just the right length to not disrupt the overall reading of the textbook and cover the important content and learning objectives.

Reviewed by Sara Stigberg, Adjunct Reference Librarian, Truman College, City Colleges of Chicago on 3/15/23

Critical Thinking in Academic Research thoroughly covers the basics of academic research for undergraduates, including well-guided deeper dives into relevant areas. The authors root their introduction to academic research principles and practices in the Western philosophical tradition, focused on developing students' critical thinking skills and habits around inquiry, rationales, and frameworks for research.

This text conforms to the principles and frames of the Framework for Information Literacy for Higher Education, published by the Association of College and Research Libraries. It includes excellent, clear, step-by-step guides to help students understand rationales and techniques for academic research.

Essential for our current information climate, the authors present relevant information for students who may be new to academic research, in ways and with content that is not too broad or too narrow, or likely to change drastically in the near future.

The authors use clear and well-considered language and explanations of ideas and terms, contextualizing the scholarly research process and tools in a relatable manner. As mentioned earlier, this text includes excellent step-by-step guides, as well as illustrations, visualizations, and videos to instruct students in conducting academic research.

Consistency rating: 4.75

The terminology and framework of this text are consistent. Early discussions of critical thinking skills are tied in to content in later chapters, with regard to selecting different types of sources and search tools, as well as rationales for choosing various formats of source references. Consciously making the theme of critical thinking as applied to the stages of academic research more explicit and frequent within the text would further strengthen it, however.

Chapters are divided in a logical, progressive manner throughout the text. The use of embedded links to further readings and some other relevant sections of the text are an excellent way of providing references and further online information, without overwhelming or side-tracking the reader.

Topics in the text are organized in logical, progressive order, transitioning cleanly from one focus to the next. Each chapter begins with a helpful outline of topics that will be covered within it.

There are no technical issues with the interface for this text. Interactive learning tools such as the many self-checks and short quizzes that are included throughout the text are a great bonus for reinforcing student learning, and the easily-accessible table of contents was very helpful. There are some slight inconsistencies across chapters, however, relative to formatting images and text and spacing, and an image was missing in the section on Narrowing a Topic. Justifying copy rather than aligning-left would prevent hyphenation, making the text more streamlined.

Grammatical Errors rating: 4.75

A few minor punctuation errors are present.

The authors of this text use culturally-relevant examples and inclusive language. The chapter on Barriers to Critical Thinking works directly to break down bias and preconceived notions.

Overall, Critical Thinking in Academic Research is an excellent general textbook for teaching the whys and hows of academic research to undergraduates. A discussion of annotated bibliographies would be a great addition for future editions of the text. (As an aside for the authors: I am curious whether the anonymous data from the self-checks and quizzes is being collected and analyzed for assessment purposes. I'm sure it would be interesting!)

Reviewed by Ann Bell-Pfeifer, Program Director/ Instructor, Minnesota State Community and Technical College on 2/15/23

Comprehensiveness rating: 4

The book has in-depth coverage of academic research. A formal glossary and index were not included.

The book appears error free and factual.

The content is current and would support students who are pursuing writing academic research papers.

Excellent explanations for specific terms were included throughout the text.

The text is easy to follow with a standardized format and structure.

The text contains headings and topics in each section.

It is easy to follow the format and review each section.

The associated links were useful and not distracting.

No evidence of grammatical errors was found in the book.

The book is inclusive.

The book was informative, easy to follow, and sequential allowing the reader to digest each section before moving into another.

Reviewed by Jenny Inker, Assistant Professor, Virginia Commonwealth University on 8/23/22

This book provides a comprehensive yet easily comprehensible introduction to critical thinking in academic research. The author lays a foundation with an introduction to the concepts of critical thinking and analyzing and making arguments, and then moves into the details of developing research questions and identifying and appropriately using research sources. There are many wonderful links to other open access publications for those who wish to read more or go deeper.

The content of the book appears to be accurate and free of bias.

The examples used throughout the book are relevant and up-to-date, making it easy to see how to apply the concepts in real life.

The text is very accessibly written and the content is presented in a simple, yet powerful way that helps the reader grasp the concepts easily. There are many short, interactive exercises scattered throughout each chapter of the book so that the reader can test their own knowledge as they go along. It would be even better if the author had provided some simple feedback explaining why quiz answers are correct or incorrect in order to bolster learning, but this is a very minor point and the interactive exercises still work well without this.

The book appears consistent throughout with regard to use of terminology and tone of writing. The basic concepts introduced in the early chapters are used consistently throughout the later chapters.

This book has been wonderfully designed into bite sized chunks that do not overwhelm the reader. This is perhaps its best feature, as this encourages the reader to take in a bit of information, digest it, check their understanding of it, and then move on to the next concept. I loved this!

The book is organized in a manner that introduces the basic architecture of critical thinking first, and then moves on to apply it to the subject of academic research. While the entire book would be helpful for college students (undergraduates particularly), the earlier chapters on critical thinking and argumentation also stand well on their own and would be of great utility to students in general.

This book was extremely easy to navigate with a clear, drop down list of chapters and subheadings on the left side of the screen. When the reader clicks on links to additional material, these open up in a new tab which keeps things clear and organized. Images and charts were clear and the overall organization is very easy to follow.

I came across no grammatical errors in the text.

Cultural Relevance rating: 4

This is perhaps an area where the book could do a little more. I did not come across anything that seemed culturally insensitive or offensive but on the other hand, the book might have taken more opportunities to represent a greater diversity of races, ethnicities, and backgrounds.

This book seems tailor-made for undergraduate college students and I would highly recommend it. I think it has some use for graduate students as well, although some of the examples are perhaps a little basic for this purpose. As well as using this book to guide students in doing academic research, I think it could also be used as a very helpful introduction to the concept of critical thinking by focusing solely on chapters 1-4.

Table of Contents

  • Introduction
  • Part I. What is Critical Thinking?
  • Part II. Barriers to Critical Thinking
  • Part III. Analyzing Arguments
  • Part IV. Making an Argument
  • Part V. Research Questions
  • Part VI. Sources and Information Needs
  • Part VII. Types of Sources
  • Part VIII. Precision Searching
  • Part IX. Evaluating Sources
  • Part X. Ethical Use and Citing Sources
  • Part XI. Copyright Basics
  • Works Cited
  • About the Authors

About the Book

Critical Thinking in Academic Research - 2nd Edition provides examples and easy-to-understand explanations to equip students with the skills to develop research questions, evaluate and choose the right sources, search for information, and understand arguments. This 2nd Edition includes new content based on student feedback as well as additional interactive elements throughout the text.

About the Contributors

Cindy Gruwell is an Assistant Librarian/Coordinator of Scholarly Communication at the University of West Florida. She is the library liaison to the department of biology and the College of Health, which has extensive programs in nursing, public health, health administration, movement, and medical laboratory sciences. In addition to supporting health sciences faculty, she oversees the Argo IRCommons (Institutional Repository) and provides scholarly communication services to faculty across campus. Cindy graduated with her BA (history) and MLS from the University of California, Los Angeles and has a Master's in Education from Bemidji State University. Cindy's research interests include academic research support, publishing, and teaching.

Robin Ewing is a Professor/Collections Librarian at St. Cloud State University. Robin is the liaison to the College of Education and Learning Design. She oversees content selection for the Library's collections. Robin graduated with her BBA (Management) and MLIS from the University of Oklahoma. She also has a Master of Arts in Teaching from Bemidji State University. Robin's research interests include collection analysis, assessment, and online teaching.


What Is Critical Thinking? | Definition & Examples

Published on May 30, 2022 by Eoghan Ryan. Revised on May 31, 2023.

Critical thinking is the ability to effectively analyze information and form a judgment.

To think critically, you must be aware of your own biases and assumptions when encountering information, and apply consistent standards when evaluating sources.

Critical thinking skills help you to:

  • Identify credible sources
  • Evaluate and respond to arguments
  • Assess alternative viewpoints
  • Test hypotheses against relevant criteria

Table of contents

  • Why is critical thinking important?
  • Critical thinking examples
  • How to think critically
  • Frequently asked questions about critical thinking

Critical thinking is important for making judgments about sources of information and forming your own arguments. It emphasizes a rational, objective, and self-aware approach that can help you to identify credible sources and strengthen your conclusions.

Critical thinking is important in all disciplines and throughout all stages of the research process. The types of evidence used in the sciences and in the humanities may differ, but critical thinking skills are relevant to both.

In academic writing, critical thinking can help you to determine whether a source:

  • Is free from research bias
  • Provides evidence to support its research findings
  • Considers alternative viewpoints

Outside of academia, critical thinking goes hand in hand with information literacy to help you form opinions rationally and engage independently and critically with popular media.


Critical thinking can help you to identify reliable sources of information that you can cite in your research paper. It can also guide your own research methods and inform your own arguments.

Outside of academia, critical thinking can help you to be aware of both your own and others’ biases and assumptions.

Academic examples

However, when you compare the findings of the study with other current research, you determine that the results seem improbable. You analyze the paper again, consulting the sources it cites.

You notice that the research was funded by the pharmaceutical company that created the treatment. Because of this, you view its results skeptically and determine that more independent research is necessary to confirm or refute them.

Example: Poor critical thinking in an academic context

You're researching a paper on the impact wireless technology has had on developing countries that previously did not have large-scale communications infrastructure. You read an article that seems to confirm your hypothesis: the impact is mainly positive. Rather than evaluating the research methodology, you accept the findings uncritically.

Nonacademic examples

However, you decide to compare this review article with consumer reviews on a different site. You find that these reviews are not as positive. Some customers have had problems installing the alarm, and some have noted that it activates for no apparent reason.

You revisit the original review article. You notice that the words “sponsored content” appear in small print under the article title. Based on this, you conclude that the review is advertising and is therefore not an unbiased source.

Example: Poor critical thinking in a nonacademic context

You support a candidate in an upcoming election. You visit an online news site affiliated with their political party and read an article that criticizes their opponent. The article claims that the opponent is inexperienced in politics. You accept this without evidence, because it fits your preconceptions about the opponent.

There is no single way to think critically. How you engage with information will depend on the type of source you’re using and the information you need.

However, you can engage with sources in a systematic and critical way by asking certain questions when you encounter information. Like the CRAAP test, these questions focus on the currency, relevance, authority, accuracy, and purpose of a source of information; a small checklist sketch follows the questions below.

When encountering information, ask:

  • Who is the author? Are they an expert in their field?
  • What do they say? Is their argument clear? Can you summarize it?
  • When did they say this? Is the source current?
  • Where is the information published? Is it an academic article? Is it peer-reviewed?
  • Why did the author publish it? What is their motivation?
  • How do they make their argument? Is it backed up by evidence? Does it rely on opinion, speculation, or appeals to emotion? Do they address alternative arguments?
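
These questions can also be treated as a reusable checklist. The sketch below is a hypothetical Python structure; the class name, criteria wording, and scoring are invented for illustration, not a standard API.

```python
from dataclasses import dataclass, field

@dataclass
class SourceEvaluation:
    """Record yes/no answers to CRAAP-style questions for one source."""
    title: str
    answers: dict = field(default_factory=dict)  # criterion -> True/False

    CRITERIA = {
        "currency":  "Is the source current?",
        "relevance": "Does it address your information need?",
        "authority": "Is the author an expert in the field?",
        "accuracy":  "Is the argument backed up by evidence?",
        "purpose":   "Is the motivation for publishing transparent?",
    }

    def score(self) -> float:
        """Fraction of criteria the source satisfies (0.0 to 1.0)."""
        met = sum(bool(self.answers.get(c)) for c in self.CRITERIA)
        return met / len(self.CRITERIA)

review = SourceEvaluation(
    "Sponsored home-alarm review",
    answers={"currency": True, "relevance": True, "authority": False,
             "accuracy": False, "purpose": False},  # 'sponsored content'
)
print(f"{review.title}: {review.score():.0%} of criteria met")  # 40%
```

Applied to the sponsored alarm review above, low marks on authority, accuracy, and purpose flag the source as weak even though it is current and relevant.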

Critical thinking also involves being aware of your own biases, not only those of others. When you make an argument or draw your own conclusions, you can ask similar questions about your own writing:

  • Am I only considering evidence that supports my preconceptions?
  • Is my argument expressed clearly and backed up with credible sources?
  • Would I be convinced by this argument coming from someone else?


Critical thinking refers to the ability to evaluate information and to be aware of biases or assumptions, including your own.

Like information literacy, it involves evaluating arguments, identifying and solving problems in an objective and systematic way, and clearly communicating your ideas.

Critical thinking skills include the ability to:

  • Identify credible sources
  • Evaluate and respond to arguments
  • Assess alternative viewpoints
  • Test hypotheses against relevant criteria

You can assess information and arguments critically by asking certain questions about the source. You can use the CRAAP test, focusing on the currency, relevance, authority, accuracy, and purpose of a source of information.

Ask questions such as:

  • Who is the author? Are they an expert?
  • How do they make their argument? Is it backed up by evidence?

A credible source should pass the CRAAP test and follow these guidelines:

  • The information should be up to date and current.
  • The author and publication should be a trusted authority on the subject you are researching.
  • The sources the author cited should be easy to find, clear, and unbiased.
  • For a web source, the URL and layout should signify that it is trustworthy.

Information literacy refers to a broad range of skills, including the ability to find, evaluate, and use sources of information effectively.

Being information literate means that you:

  • Know how to find credible sources
  • Use relevant sources to inform your research
  • Understand what constitutes plagiarism
  • Know how to cite your sources correctly

Confirmation bias is the tendency to search, interpret, and recall information in a way that aligns with our pre-existing values, opinions, or beliefs. It refers to the ability to recollect information best when it amplifies what we already believe. Relatedly, we tend to forget information that contradicts our opinions.

Although selective recall is a component of confirmation bias, it should not be confused with recall bias.

On the other hand, recall bias refers to the differences in the ability between study participants to recall past events when self-reporting is used. This difference in accuracy or completeness of recollection is not related to beliefs or opinions. Rather, recall bias relates to other factors, such as the length of the recall period, age, and the characteristics of the disease under investigation.

Cite this Scribbr article

If you want to cite this source, you can copy and paste the citation or click the “Cite this Scribbr article” button to automatically add the citation to our free Citation Generator.

Ryan, E. (2023, May 31). What Is Critical Thinking? | Definition & Examples. Scribbr. Retrieved September 10, 2024, from https://www.scribbr.com/working-with-sources/critical-thinking/



Critical Thinking: A Model of Intelligence for Solving Real-World Problems

Diane F. Halpern

Department of Psychology, Claremont McKenna College, Emerita, Altadena, CA 91001, USA

Dana S. Dunn

Department of Psychology, Moravian College, Bethlehem, PA 18018, USA; dunn@moravian.edu

Abstract

Most theories of intelligence do not directly address the question of whether people with high intelligence can successfully solve real-world problems. A high IQ is correlated with many important outcomes (e.g., academic prominence, reduced crime), but it does not protect against cognitive biases, partisan thinking, reactance, or confirmation bias, among others. There are several newer theories that directly address the question of solving real-world problems. Prominent among them is Sternberg’s adaptive intelligence, with “adaptation to the environment” as the central premise, a construct that does not exist on standardized IQ tests. Similarly, some scholars argue that standardized tests of intelligence are not measures of rational thought—the sort of skill/ability that would be needed to address complex real-world problems. Other investigators advocate for critical thinking as a model of intelligence specifically designed for addressing real-world problems. Yes, intelligence (i.e., critical thinking) can be enhanced and used for solving a real-world problem such as COVID-19, which we use as an example of contemporary problems that need a new approach.

1. Introduction

The editors of this Special Issue asked authors to respond to a deceptively simple statement: “How Intelligence Can Be a Solution to Consequential World Problems.” This statement holds many complexities, including how intelligence is defined and which theories are designed to address real-world problems.

2. The Problem with Using Standardized IQ Measures for Real-World Problems

For the most part, we identify high intelligence as having a high score on a standardized test of intelligence. Like any test score, IQ can only reflect what is on the given test. Most contemporary standardized measures of intelligence include vocabulary, working memory, spatial skills, analogies, processing speed, and puzzle-like elements (e.g., Wechsler Adult Intelligence Scale Fourth Edition; see Drozdick et al. 2012). Measures of IQ correlate with many important outcomes, including academic performance (Kretzschmar et al. 2016), job-related skills (Hunter and Schmidt 1996), reduced likelihood of criminal behavior (Burhan et al. 2014), and, for those with exceptionally high IQs, obtaining a doctorate and publishing scholarly articles (McCabe et al. 2020). Gottfredson (1997, p. 81) summarized these effects when she said the “predictive validity of g is ubiquitous.” More recent research using longitudinal data found that general mental abilities and specific abilities are good predictors of several work variables, including job prestige and income (Lang and Kell 2020). Although assessments of IQ are useful in many contexts, having a high IQ does not protect against falling for common cognitive fallacies (e.g., blind spot bias, reactance, anecdotal reasoning), relying on biased and blatantly one-sided information sources, failing to consider information that does not conform to one’s preferred view of reality (confirmation bias), or failing to resist pressure to think and act in a certain way, among others. This point was clearly articulated by Stanovich (2009, p. 3) when he stated that “IQ tests measure only a small set of the thinking abilities that people need.”

3. Which Theories of Intelligence Are Relevant to the Question?

Most theories of intelligence do not directly address the question of whether people with high intelligence can successfully solve real-world problems. For example, Grossmann et al. (2013) cite many studies in which IQ scores have not predicted well-being, including life satisfaction and longevity. Using a stratified random sample of Americans, these investigators found that wise reasoning is associated with life satisfaction, and that “there was no association between intelligence and well-being” (p. 944). (Critical thinking [CT] is often referred to as “wise reasoning” or “rational thinking.”) Similar results were reported by Wirthwein and Rost (2011), who compared life satisfaction in several domains for gifted adults and adults of average intelligence. There were no differences in any of the measures of subjective well-being, except for leisure, which was significantly lower for the gifted adults. Additional research in a series of experiments by Stanovich and West (2008) found that participants with high cognitive ability were as likely as others to endorse positions consistent with their biases, and they were equally likely to prefer one-sided arguments over those that provided a balanced argument. There are several newer theories that directly address the question of solving real-world problems. Prominent among them is Sternberg’s adaptive intelligence, with “adaptation to the environment” as the central premise, a construct that does not exist on standardized IQ tests (e.g., Sternberg 2019). Similarly, Stanovich and West (2014) argue that standardized tests of intelligence are not measures of rational thought—the sort of skill/ability that would be needed to address complex real-world problems. Halpern and Butler (2020) advocate for CT as a useful model of intelligence for addressing real-world problems because it was designed for this purpose. Although there is much overlap among these more recent theories, often using different terms for similar concepts, we use Halpern and Butler’s conceptualization to make our point: Yes, intelligence (i.e., CT) can be enhanced and used for solving a real-world problem like COVID-19.

4. Critical Thinking as an Applied Model for Intelligence

One definition of intelligence that directly addresses the question about intelligence and real-world problem solving comes from Nickerson ( 2020, p. 205 ): “the ability to learn, to reason well, to solve novel problems, and to deal effectively with novel problems—often unpredictable—that confront one in daily life.” Using this definition, the question of whether intelligent thinking can solve a world problem like the novel coronavirus is a resounding “yes” because solutions to real-world novel problems are part of his definition. This is a popular idea in the general public. For example, over 1000 business managers and hiring executives said that they want employees who can think critically based on the belief that CT skills will help them solve work-related problems ( Hart Research Associates 2018 ).

We define CT as the use of those cognitive skills or strategies that increase the probability of a desirable outcome. The term describes thinking that is purposeful, reasoned, and goal-directed: the kind of thinking involved in solving problems, formulating inferences, calculating likelihoods, and making decisions, when the thinker is using skills that are thoughtful and effective for the particular context and type of thinking task. International surveys conducted by the OECD (2019, p. 16) established “key information-processing competencies” that are “highly transferable, in that they are relevant to many social contexts and work situations; and ‘learnable’ and therefore subject to the influence of policy.” One of these skills is problem solving, which is one subset of CT skills.

The CT model of intelligence is comprised of two components: (1) understanding information at a deep, meaningful level and (2) appropriate use of CT skills. The underlying idea is that CT skills can be identified, taught, and learned, and when they are recognized and applied in novel settings, the individual is demonstrating intelligent thought. CT skills include judging the credibility of an information source, making cost–benefit calculations, recognizing regression to the mean, understanding the limits of extrapolation, muting reactance responses, using analogical reasoning, rating the strength of reasons that support and fail to support a conclusion, and recognizing hindsight bias or confirmation bias, among others. Critical thinkers use these skills appropriately, without prompting, and usually with conscious intent in a variety of settings.

One of the key concepts in this model is that CT skills transfer in appropriate situations. Thus, assessments using situational judgments are needed to assess whether particular skills have transferred to a novel situation where it is appropriate. In an assessment created by the first author ( Halpern 2018 ), short paragraphs provide information about 20 different everyday scenarios (e.g., A speaker at the meeting of your local school board reported that when drug use rises, grades decline; so schools need to enforce a “war on drugs” to improve student grades); participants provide two response formats for every scenario: (a) constructed responses where they respond with short written responses, followed by (b) forced choice responses (e.g., multiple choice, rating or ranking of alternatives) for the same situations.

There is a large and growing empirical literature to support the assertion that CT skills can be learned and will transfer (when taught for transfer). See, for example, Holmes et al. (2015), who wrote in the prestigious Proceedings of the National Academy of Sciences that there was “significant and sustained improvement in students’ critical thinking behavior” (p. 11199) for students who received CT instruction. Abrami et al. (2015, para. 1) concluded from a meta-analysis that “there are effective strategies for teaching CT skills, both generic and content specific, and CT dispositions, at all educational levels and across all disciplinary areas.” An earlier meta-analysis by Abrami et al. (2008, para. 1) included 341 effect sizes; the authors wrote: “findings make it clear that improvement in students’ CT skills and dispositions cannot be a matter of implicit expectation.” A strong test of whether CT skills can be used for real-world problems comes from research by Butler et al. (2017). Community adults and college students (N = 244) completed several scales, including an assessment of CT, an intelligence test, and an inventory of real-life events. Both CT scores and intelligence scores predicted individual outcomes on the inventory of real-life events, but CT was a stronger predictor.

Heijltjes et al. ( 2015, p. 487 ) randomly assigned participants to either a CT instruction group or one of six other control conditions. They found that “only participants assigned to CT instruction improved their reasoning skills.” Similarly, when Halpern et al. ( 2012 ) used random assignment of participants to either a learning group where they were taught scientific reasoning skills using a game format or a control condition (which also used computerized learning and was similar in length), participants in the scientific skills learning group showed higher proportional learning gains than students who did not play the game. As the body of additional supportive research is too large to report here, interested readers can find additional lists of CT skills and support for the assertion that these skills can be learned and will transfer in Halpern and Dunn ( Forthcoming ). There is a clear need for more high-quality research on the application and transfer of CT and its relationship to IQ.

5. Pandemics: COVID-19 as a Consequential Real-World Problem

A pandemic occurs when a disease runs rampant over an entire country or even the world. Pandemics have occurred throughout history. At the time of writing this article, COVID-19 is a worldwide pandemic whose actual death toll is unknown, with projections of several million deaths over the course of 2021 and beyond (Mega 2020). Although vaccines are available, it will take some time to inoculate much of the world’s population. Since March 2020, national and international health agencies have created a list of actions that can slow and hopefully stop the spread of COVID (e.g., wearing face masks, practicing social distancing, avoiding group gatherings), yet many people in the United States and other countries have resisted their advice.

Could instruction in CT encourage more people to accept and comply with simple life-saving measures? There is reason to believe that increasing citizens’ CT abilities could reverse this problematic trend for at least some percentage of the population. We recognize the long history of social and cognitive research showing that changing attitudes and behaviors is difficult, and it would be unrealistic to expect change from individuals whose extreme beliefs are supported by their social group and consistent with their political ideologies. For example, an Iranian cleric and an orthodox rabbi both claimed (separately) that the COVID-19 vaccine can make people gay (Marr 2021). These unfounded opinions are based on deeply held prejudicial beliefs that we expect to be resistant to CT. We are targeting those individuals whose beliefs are less extreme and may be based on reasonable reservations, such as concern about the hasty development of the vaccine and the lack of long-term data on its effects. There should be some unknown proportion of individuals who can change their COVID-19-related beliefs and actions with appropriate instruction in CT. CT can be a (partial) antidote for the chaos of the modern world, with armies of bots creating content on social media, political and other forces deliberately attempting to confuse issues, and almost all media labeled “fake news” by social influencers (i.e., people with followers that sometimes run to millions on various social media). Here are some CT skills that could be helpful in getting more people to think more critically about pandemic-related issues.

Reasoning by Analogy and Judging the Credibility of the Source of Information

Early communications from national health agencies about the ability of masks to prevent the spread of COVID were not consistent. In many regions of the world, the benefits of wearing masks incited prolonged and acrimonious debates (Tang 2020). However, after the initial confusion, virtually all of the global and national health organizations (e.g., the WHO, the National Health Service in the U.K., the U.S. Centers for Disease Control and Prevention) endorsed masks as a way to slow the spread of COVID (Cheng et al. 2020; Chu et al. 2020). However, as we know, some people do not trust governmental agencies and often cite the conflicting information that was originally given as a reason for not wearing a mask. There are varied reasons for refusing to wear a mask, but the one most often cited is that it is against civil liberties (Smith 2020). Reasoning by analogy is an appropriate CT skill for evaluating this belief (and a key skill in legal thinking). It might be useful to cite some of the many laws that already regulate our behavior, such as requiring health inspections for restaurants, setting speed limits, mandating seat belts when riding in a car, and establishing the age at which someone can consume alcohol. Individuals would be asked to consider how the mandate to wear a mask compares to these and other regulatory laws.

Another reason why some people resist the measures suggested by virtually every health agency concerns questions about whom to believe. Could training in CT change the beliefs and actions of even a small percentage of those opposed to wearing masks? Such training would include considering the following questions, with practice across a wide domain of knowledge: (a) Does the source have sufficient expertise? (b) Is the expertise recent and relevant? (c) Is there a potential for gain by the information source, such as financial gain? (d) What would the ideal information source be, and how close is the current source to the ideal? (e) Does the information source offer evidence that what they are recommending is likely to be correct? (f) Have you traced URLs to determine whether the information in front of you really came from the alleged source? Of course, not everyone will respond in the same way to each question, so there is little likelihood that we would all think alike, but these questions provide a framework for evaluating credibility. Donovan et al. (2015) were successful using a similar approach to improve dynamic decision-making by asking participants to reflect on questions that relate to the decision. Imagine the effect of rigorous large-scale education in CT from elementary through secondary schools, as well as at the university level. As stated above, empirical evidence has shown that people can become better thinkers with appropriate instruction in CT. With training, could we encourage some portion of the population to become more astute at judging the credibility of a source of information? It is an experiment worth trying.

6. Making Cost–Benefit Assessments for Actions That Would Slow the Spread of COVID-19

Historical records show that refusal to wear a mask during a pandemic is not a new reaction. The influenza pandemic of 1918 also brought mandates to wear masks, which drew public backlash. Then, as now, many people refused, even when they were told that it was a symbol of “wartime patriotism” because the 1918 pandemic occurred during World War I (Lovelace 2020). CT instruction would include instruction in why and how to compute cost–benefit analyses. Estimates of “lives saved” by wearing a mask can be made meaningful with graphical displays that allow more people to understand large numbers. Gigerenzer (2020) found that people can understand risk ratios in medicine when the numbers are presented as frequencies instead of probabilities. If this information were used when presenting the likelihood of illness and death from COVID-19, could we increase the number of people who understand the severity of this disease? Small-scale studies by Gigerenzer have shown that it is possible.
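
Gigerenzer’s frequency framing can be illustrated with a few lines of arithmetic. The risk figure below is a made-up placeholder, not a real COVID-19 statistic; the sketch only shows the conversion from an abstract probability to a natural frequency.

```python
def as_natural_frequency(probability: float, group_size: int = 10_000) -> str:
    """Restate a probability as 'about N out of M people' (Gigerenzer-style framing)."""
    count = round(probability * group_size)
    return f"about {count:,} out of {group_size:,} people"

risk = 0.004  # hypothetical risk: '0.4%' reads as abstract to many people
print(f"As a probability:       {risk:.1%}")
print(f"As a natural frequency: {as_natural_frequency(risk)}")
# As a probability:       0.4%
# As a natural frequency: about 40 out of 10,000 people
```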

Analyzing Arguments to Determine Degree of Support for a Conclusion

The process of analyzing arguments requires that individuals rate the strength of support for and against a conclusion. By engaging in this practice, they must consider evidence and reasoning that may run counter to a preferred outcome. Kozyreva et al. (2020) call the deliberate failure to consider both supporting and conflicting data “deliberate ignorance”—avoiding or failing to consider information that could be useful in decision-making because it may collide with an existing belief. Applied to COVID-19, people would have to weigh the evidence for and against face masks and decide whether masking is a reasonable way to stop the spread of this disease; if they conclude that it is not, they must then ask what the costs and benefits of not wearing masks are at a time when governmental health organizations are making them mandatory in public spaces. Again, we wonder if rigorous and systematic instruction in argument analysis would result in more positive attitudes and behaviors that relate to wearing a mask or other real-world problems. We believe that it is an experiment worth doing.
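
As a rough illustration of rating the strength of support for and against a conclusion, the sketch below sums analyst-assigned weights on each side of the ledger. The reasons and their 1–5 weights are hypothetical conventions of ours, not a validated instrument; the point is only that the exercise forces conflicting evidence to be written down and compared.

```python
# Hypothetical reasons with strength ratings (1 = weak, 5 = strong); the
# weighting scheme is an illustrative convention, not a validated instrument.
support = {
    "Global and national health agencies endorse masks": 5,
    "Peer-reviewed studies report reduced transmission": 4,
}
against = {
    "Early official guidance was inconsistent": 2,
    "Masks are uncomfortable to wear": 1,
}

def net_support(pro: dict, con: dict) -> int:
    """Positive totals favor the conclusion; negative totals count against it."""
    return sum(pro.values()) - sum(con.values())

print(f"Net support for 'masks slow the spread': {net_support(support, against):+d}")
# -> Net support for 'masks slow the spread': +6
```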

7. Conclusions

We believe that teaching CT is a worthwhile approach for educating the general public in order to improve reasoning and motivate actions to address, avert, or ameliorate real-world problems like the COVID-19 pandemic. Evidence suggests that CT can guide intelligent responses to societal and global problems. We are NOT claiming that CT skills will be a universal solution for the many real-world problems that we confront in contemporary society, or that everyone will substitute CT for other decision-making practices, but we do believe that systematic education in CT can help many people become better thinkers, and we believe that this is an important step toward creating a society that values and practices routine CT. The challenges are great, but the tools to tackle them are available, if we are willing to use them.

Author Contributions

Conceptualization, D.F.H. and D.S.D.; resources, D.F.H.; data curation, writing—original draft preparation, D.F.H.; writing—review and editing, D.F.H. and D.S.D. All authors have read and agreed to the published version of the manuscript.

This research received no external funding.

Institutional Review Board Statement

No IRB Review.

Informed Consent Statement

No Informed Consent.

Conflicts of Interest

The authors declare no conflict of interest.

Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.

  • Abrami Philip C., Bernard Robert M., Borokhovski Evgueni, Wade C. Anne, Surkes Michael A., Tamim Rana, Zhang Dai. Instructional interventions affecting critical thinking skills and dispositions: A Stage 1 meta-analysis. Review of Educational Research. 2008; 78 :1102–34. doi: 10.3102/0034654308326084. [ CrossRef ] [ Google Scholar ]
  • Abrami Philip C., Bernard Robert M., Borokhovski Evgueni, Waddington David I., Wade C. Anne. Strategies for teaching students to think critically: A meta-analysis. Review of Educational Research. 2015; 85 :275–341. doi: 10.3102/0034654314551063. [ CrossRef ] [ Google Scholar ]
  • Burhan Nik Ahmad Sufian, Kurniawan Yohan, Sidek Abdul Halim, Mohamad Mohd Rosli. Crimes and the Bell curve: The role of people with high, average, and low intelligence. Intelligence. 2014; 47 :12–22. doi: 10.1016/j.intell.2014.08.005. [ CrossRef ] [ Google Scholar ]
  • Butler Heather A., Pentoney Christopher, Bong Maebelle P. Predicting real-world outcomes: Critical thinking ability is a better predictor of life decisions than intelligence. Thinking Skills and Creativity. 2017; 25 :38–46. doi: 10.1016/j.tsc.2017.06.005. [ CrossRef ] [ Google Scholar ]
  • Cheng Vincent Chi-Chung, Wong Shuk-Ching, Chuang Vivien Wai-Man, So Simon Yung-Chun, Chen Jonathan Hon-Kwan, Sridhar Sidharth, To Kelvin Kai-Wang, Chan Jasper Fuk-Wu, Hung Ivan Fan-Ngai, Ho Pak-Leung, et al. The role of community-wide wearing of face mask for control of coronavirus disease 2019 (COVID-19) epidemic due to SARS-CoV-2. Journal of Infection. 2020; 81 :107–14. doi: 10.1016/j.jinf.2020.04.024. [ PMC free article ] [ PubMed ] [ CrossRef ] [ Google Scholar ]
  • Chu Derek K., Akl Elie A., Duda Stephanie, Solo Karla, Yaacoub Sally, Schunemann Holger J. Physical distancing, face masks, and eye protection to prevent person-to-person transmission of SARS-CoV-2 and COVID-19: A systematic review and meta-analysis. Lancet. 2020; 395 :1973–87. doi: 10.1016/S0140-6736(20)31142-9. [ PMC free article ] [ PubMed ] [ CrossRef ] [ Google Scholar ]
  • Donovan Sarah J., Guss C. Dominick, Naslund Dag. Improving dynamic decision-making through training and self-reflection. Judgment and Decision Making. 2015; 10 :284–95. [ Google Scholar ]
  • Drozdick Lisa Whipple, Wahlstrom Dustin, Zhu Jianjun, Weiss Lawrence G. The Wechsler Adult Intelligence Scale—Fourth Edition and the Wechsler Memory Scale—Fourth Edition. In: Flanagan Dawn P., Harrison Patti L., editors. Contemporary Intellectual Assessment: Theories, Tests, and Issues. The Guilford Press; New York: 2012. pp. 197–223. [ Google Scholar ]
  • Gigerenzer Gerd. When all is just a click away: Is critical thinking obsolete in the digital age? In: Sternberg Robert J., Halpern Diane F., editors. Critical Thinking in Psychology. 2nd ed. Cambridge University Press; Cambridge: 2020. pp. 197–223. [ Google Scholar ]
  • Gottfredson Linda S. Why g matters: The complexity of everyday life. Intelligence. 1997; 24 :79–132. doi: 10.1016/S0160-2896(97)90014-3. [ CrossRef ] [ Google Scholar ]
  • Grossmann Igor, Varnum Michael E. W., Na Jinkyung, Kitayama Shinobu, Nisbett Richard E. A route to well-being: Intelligence versus wise reasoning. Journal of Experimental Psychology: General. 2013; 142 :944–53. doi: 10.1037/a0029560. [ PMC free article ] [ PubMed ] [ CrossRef ] [ Google Scholar ]
  • Halpern Diane F. Halpern Critical Thinking Assessment. Schuhfried Test Publishers; Mödling: 2018. [(accessed on 30 March 2021)]. Available online: www.schuhfried.com [ Google Scholar ]
  • Halpern Diane F., Butler Heather A. Is critical thinking a better model of intelligence? In: Sternberg Robert J., editor. The nature of Intelligence. 2nd ed. Cambridge University Press; Cambridge: 2020. pp. 183–96. [ Google Scholar ]
  • Halpern Diane F., Dunn Dana S. Thought and Knowledge: An Introduction to Critical Thinking. 6th ed. Taylor & Francis; New York: Forthcoming. in press. [ Google Scholar ]
  • Halpern Diane F., Millis Keith, Graesser Arthur, Butler Heather, Forsyth Carol, Cai Zhiqiang. Operation ARA: A computerized learning game that teaches critical thinking and scientific reasoning. Thinking Skills and Creativity. 2012; 7 :93–100. doi: 10.1016/j.tsc.2012.03.006. [ CrossRef ] [ Google Scholar ]
  • Hart Research Associates [(accessed on 30 March 2021)]; Employers Express Confidence in Colleges and Universities: See College as Worth the Investment, New Research Finds. 2018 Aug 29; Available online: https://hartresearch.com/employers-express-confidence-in-colleges-and-universities-see-college-as-worth-the-investment-new-research-finds/
  • Heijltjes Anita, van Gog Tamara, Leppink Jimmie, Paas Fred. Unraveling the effects of critical thinking instructions, practice, and self-explanation on students’ reasoning performance. Instructional Science. 2015; 43 :487–506. doi: 10.1007/s11251-015-9347-8. [ CrossRef ] [ Google Scholar ]
  • Holmes Natasha G., Wieman Carl E., Bonn Doug A. Teaching critical thinking. Proceedings of the National Academy of Sciences. 2015; 112 :11199–204. doi: 10.1073/pnas.1505329112. [ PMC free article ] [ PubMed ] [ CrossRef ] [ Google Scholar ]
  • Hunter John E., Schmidt Frank L. Intelligence and job performance: Economic and social implications. Psychology, Public Policy, and Law. 1996; 2 :447–72. doi: 10.1037/1076-8971.2.3-4.447. [ CrossRef ] [ Google Scholar ]
  • Kozyreva Anastasia, Lewandowsky Stephan, Hertwig Ralph. Citizens versus the internet: Confronting digital challenges with cognitive tools. [(accessed on 30 March 2021)]; Psychological Science in the Public Interest. 2020 21 doi: 10.1177/1529100620946707. Available online: https://www.psychologicalscience.org/publications/confronting-digital-challenges-with-cognitive-tools.html [ PMC free article ] [ PubMed ] [ CrossRef ] [ Google Scholar ]
  • Kretzschmar Andre, Neubert Jonas C., Wüstenberg Sascha, Greiff Samuel. Construct validity of complex problem-solving: A comprehensive view on different facets of intelligence and school grades. Intelligence. 2016; 54 :55–69. doi: 10.1016/j.intell.2015.11.004. [ CrossRef ] [ Google Scholar ]
  • Lang Jonas W.B., Kell Harrison J. General mental ability and specific abilities: Their relative importance for extrinsic career success. Journal of Applied Psychology. 2020; 105 :1047–61. doi: 10.1037/apl0000472. [ PubMed ] [ CrossRef ] [ Google Scholar ]
  • Lovelace Berkeley Jr. Medical Historians Compare the Coronavirus to the 1918 Flu Pandemic: Both Were Highly Political. [(accessed on 30 March 2021)]; CNBC. 2020 Available online: https://www.cnbc.com/2020/09/28/comparing-1918-flu-vs-coronavirus.html?fbclid=IwAR1RAVRUOIdN9qqvNnMPimf5Q4XfV-pn_qdC3DwcfnPu9kavwumDI2zq9Xs
  • Marr Rhuaridh. Iranian Cleric Claims COVID-19 Vaccine Can Make People Gay. [(accessed on 30 March 2021)]; Metro Weekly. 2021 Available online: https://www.metroweekly.com/2021/02/iranian-cleric-claims-covid-19-vaccine-can-make-people-gay/
  • McCabe Kira O., Lubinski David, Benbow Camilla P. Who shines most among the brightest?: A 25-year longitudinal study of elite STEM graduate students. Journal of Personality and Social Psychology. 2020; 119 :390–416. doi: 10.1037/pspp0000239. [ PMC free article ] [ PubMed ] [ CrossRef ] [ Google Scholar ]
  • Mega Emiliano R. COVID Has Killed more than One Million People. How Many more will Die? [(accessed on 30 March 2021)]; Nature. 2020 Available online: https://www.nature.com/articles/d41586-020-02762-y [ PubMed ]
  • Nickerson Raymond S. Developing intelligence through instruction. In: Sternberg Robert J., editor. The Cambridge Handbook of Intelligence. 2nd ed. Cambridge University Press; Cambridge: 2020. pp. 205–37. [ Google Scholar ]
  • OECD . The Survey of Adult Skills: Reader’s Companion. 3rd ed. OECD Publishing; Paris: 2019. OECD Skills Studies. [ CrossRef ] [ Google Scholar ]
  • Smith Matthew. Why won’t Britons Wear Face Masks? [(accessed on 30 March 2021)]; YouGov. 2020 Available online: https://yougov.co.uk/topics/health/articles-reports/2020/07/15/why-wont-britons-wear-face-masks
  • Stanovich Keith E. What Intelligence Tests Miss: The Psychology of Rational Thought. Yale University Press; New Haven: 2009. [ Google Scholar ]
  • Stanovich Keith E., West Richard F. On the failure of cognitive ability to predict my-side bias and one-sided thinking biases. Thinking & Reasoning. 2008; 14 :129–67. doi: 10.1080/13546780701679764. [ CrossRef ] [ Google Scholar ]
  • Stanovich Keith E., West Richard F. What intelligence tests miss. The Psychologist. 2014; 27 :80–83. doi: 10.5840/inquiryctnews201126216. [ CrossRef ] [ Google Scholar ]
  • Sternberg Robert J. A theory of adaptive intelligence and its relation to general intelligence. Journal of Intelligence. 2019; 7 :23. doi: 10.3390/jintelligence7040023. [ PMC free article ] [ PubMed ] [ CrossRef ] [ Google Scholar ]
  • Tang Julian W. COVID-19: Interpreting scientific evidence—Uncertainty, confusion, and delays. BMC Infectious Diseases. 2020; 20 :653. doi: 10.1186/s12879-020-05387-8. [ PMC free article ] [ PubMed ] [ CrossRef ] [ Google Scholar ]
  • Wirthwein Linda, Rost Detlef H. Giftedness and subjective well-being: A study with adults. Learning and Individual Differences. 2011; 21 :182–86. doi: 10.1016/j.lindif.2011.01.001. [ CrossRef ] [ Google Scholar ]

Module 5: Thinking and Analysis

Putting It Together: Thinking and Analysis

As we’ve just learned, one of the key skills every student needs is the ability to think critically. Once you’ve learned to critically examine the content you come into contact with, you can then think creatively to come up with new—and potentially better—solutions to the problems we find in the classroom and, ultimately, in the world.

The following text is an essay by Dr. Andrew Robert Baker, “Thinking Critically and Creatively.” In the first few paragraphs, Dr. Baker underscores how essential critical thinking is to improving as students, teachers, and researchers. Dr. Baker continues by illuminating some of the many ways that college students will be exposed to creative thinking and how it can enrich their learning experiences.

Thinking Critically and Creatively

Critical thinking skills are perhaps the most fundamental skills involved in making judgments and solving problems. You use them every day, and you can continue improving them.

The ability to think critically about a matter—to analyze a question, situation, or problem down to its most basic parts—is what helps us evaluate the accuracy and truthfulness of statements, claims, and information we read and hear. It is the sharp knife that, when honed, separates fact from fiction, honesty from lies, and the accurate from the misleading. We all use this skill to one degree or another almost every day. For example, we use critical thinking every day as we consider the latest consumer products and why one particular product is the best among its peers. Is it a quality product because a celebrity endorses it? Because a lot of other people may have used it? Because it is made by one company versus another? Or perhaps because it is made in one country or another? These are questions representative of critical thinking.

The academic setting demands more of us in terms of critical thinking than everyday life. It demands that we evaluate information and analyze myriad issues. It is the environment where our critical thinking skills can be the difference between success and failure. In this environment we must consider information in an analytical, critical manner. We must ask questions—What is the source of this information? Is this source an expert one and what makes it so? Are there multiple perspectives to consider on an issue? Do multiple sources agree or disagree on an issue? Does quality research substantiate information or opinion? Do I have any personal biases that may affect my consideration of this information?

It is only through purposeful, frequent, intentional questioning such as this that we can sharpen our critical thinking skills and improve as students, learners and researchers.

While critical thinking analyzes information and roots out the true nature and facets of problems, it is creative thinking that drives progress forward when it comes to solving these problems. Exceptional creative thinkers are people who invent new solutions to existing problems that do not rely on past or current solutions. They are the ones who invent solution C when everyone else is still arguing between A and B. Creative thinking skills involve using strategies to clear the mind so that our thoughts and ideas can transcend the current limitations of a problem and allow us to see beyond barriers that prevent new solutions from being found.

Brainstorming is the simplest example of intentional creative thinking that most people have tried at least once. With the quick generation of many ideas at once, we can block out our brain’s natural tendency to limit our solution-generating abilities so we can access and combine many possible solutions/thoughts and invent new ones. It is sort of like sprinting through a race’s finish line only to find there is new track on the other side and we can keep going, if we choose. As with critical thinking, higher education both demands creative thinking from us and is the perfect place to practice and develop the skill. Everything from word problems in a math class, to opinion or persuasive speeches and papers, calls upon our creative thinking skills to generate new solutions and perspectives in response to our professor’s demands. Creative thinking skills ask questions such as—What if? Why not? What else is out there? Can I combine perspectives/solutions? What is something no one else has brought up? What is being forgotten/ignored? What about ______? It is the opening of doors and options that follows problem-identification.

Consider an assignment that required you to compare two different authors on the topic of education and select and defend one as better. Now add to this scenario that your professor clearly prefers one author over the other. While critical thinking can get you as far as identifying the similarities and differences between these authors and evaluating their merits, it is creative thinking that you must use if you wish to challenge your professor’s opinion and invent new perspectives on the authors that have not previously been considered.

So, what can we do to develop our critical and creative thinking skills? Although many students may dislike it, group work is an excellent way to develop our thinking skills. Many times I have heard students express their disdain for working in groups, citing scheduling, varied levels of commitment to the group or project, and, of course, personality conflicts. True—it’s not always easy, but that is why it is so effective. When we work collaboratively on a project or problem, we bring many brains to bear on a subject. These different brains will naturally develop varied ways of solving or explaining problems and examining information. The observant individual will see that this places us in a constant state of back-and-forth critical/creative thinking.

For example, in group work we are simultaneously analyzing information and generating solutions on our own, while challenging others’ analyses/ideas and responding to challenges to our own analyses/ideas. This is part of why students tend to avoid group work—it challenges us as thinkers and forces us to analyze others while defending ourselves, which is not something we are used to or comfortable with, as most of our educational experiences involve solo work. Your professors know this—that’s why we assign it—to help you grow as students, learners, and thinkers!

—Dr. Andrew Robert Baker,  Foundations of Academic Success: Words of Wisdom


  • Putting It Together: Thinking and Analysis. Provided by: Lumen Learning. License: CC BY: Attribution
  • Foundations of Academic Success. Authored by: Thomas C. Priester, editor. Provided by: Open SUNY Textbooks. Located at: http://textbooks.opensuny.org/foundations-of-academic-success/. License: CC BY-NC-SA: Attribution-NonCommercial-ShareAlike


Open access | Published: 09 September 2024

The transfer effect of computational thinking (CT)-STEM: a systematic literature review and meta-analysis

Zuokun Li & Pey Tee Oon (ORCID: orcid.org/0000-0002-1732-7953)

International Journal of STEM Education, volume 11, Article number: 44 (2024)


Background

Integrating computational thinking (CT) into STEM education has recently drawn significant attention, strengthened by the premise that CT and STEM are mutually reinforcing. Previous CT-STEM studies have examined theoretical interpretations, instructional strategies, and assessment targets. However, few have endeavored to delineate the transfer effects of CT-STEM on the development of cognitive and noncognitive benefits. Given this research gap, we conducted a systematic literature review and meta-analysis to provide deeper insights.

Results

We analyzed results from 37 studies involving 7,832 students with 96 effect sizes. Our key findings include: (i) identification of 36 benefits; (ii) a moderate overall transfer effect, with moderate effects also observed for both near and far transfers; (iii) a stronger effect on cognitive benefits compared to noncognitive benefits, regardless of the transfer type; (iv) significant moderation by educational level, sample size, instructional strategies, and intervention duration on overall and near-transfer effects, with only educational level and sample size being significant moderators for far-transfer effects.

Conclusions

This study analyzes the cognitive and noncognitive benefits arising from CT-STEM’s transfer effects, providing new insights to foster more effective STEM classroom teaching.

Introduction

In recent years, computational thinking (CT) has emerged as one of the driving forces behind the resurgence of computer science in school curriculums, spanning from pre-school to higher education (Bers et al., 2014 ; Polat et al., 2021 ; Tikva & Tambouris, 2021a ). CT is complex, with many different definitions (Shute et al., 2017 ). Wing ( 2006 , p. 33) defines CT as a process that involves solving problems, designing systems, and understanding human behavior by drawing on the concepts fundamental to computer science (CS). Contrary to a common perception that CT belongs solely to CS, gradually, it has come to represent a universally applicable attitude and skill set (Tekdal, 2021 ) involving cross-disciplinary literacy (Ye et al., 2022 ), which can be applied to solving a wide range of problems within CS and other disciplines (Lai & Wong, 2022 ). Simply put, CT involves thinking like a computer scientist when solving problems, and it is a universal competence that everyone, not just computer scientists, should acquire (Hsu et al., 2018 ). Developing CT competency not only helps one acquire domain-specific knowledge but enhances one’s general ability to solve problems across various academic fields (Lu et al., 2022 ; Wing, 2008 ; Woo & Falloon, 2022 ; Xu et al., 2022 ), including STEM (science, technology, engineering, and mathematics) (Chen et al., 2023a ; Lee & Malyn-Smith, 2020 ; Wang et al., 2022a ; Waterman et al., 2020 ; Weintrop et al., 2016 ), the social sciences, and liberal arts (Knochel & Patton, 2015 ).

Given the importance of CT competency, integrating it into STEM education (CT-STEM) has emerged as a trend in recent years (Lee et al., 2020 ; Li & Anderson, 2020 ; Merino-Armero et al., 2022 ). CT-STEM represents the integration of CT practices with STEM learning content or context, grounded in the premise that a reciprocal relationship between STEM content learning and CT can enrich student learning (Cheng et al., 2023 ). Existing research supports that CT-STEM enhances student learning in two ways (Li et al., 2020b ). First, CT, viewed as a set of practices for bridging disciplinary teaching, shifts traditional subject forms towards computational-based STEM content learning (Wiebe et al., 2020 ). Engaging students in discipline-specific CT practices like modeling and simulation has been shown to improve their content understanding (Grover & Pea, 2013 ; Hurt et al., 2023 ) and enhance learning (Aksit & Wiebe, 2020 ; Rodríguez-Martínez et al., 2019 ; Yin et al., 2020 ). Another way is to take CT as a transdisciplinary thinking process and practice, providing a structured problem-solving framework that can reduce subject fixation (Ng et al., 2023 ). Aligning with integrated STEM (iSTEM) teaching, this approach equips students with critical skills such as analytical thinking, data manipulation, algorithmic thinking, collaboration, and creative solution development in authentic contexts (Tikva & Tambouris, 2021b ). Such skills are increasingly vital for addressing complex problems in a rapidly evolving digital and artificial intelligence-driven world.

Despite the growing interest in CT-STEM (Li et al., 2020b; Tekdal, 2021), recent reviews indicate a focus on theoretical interpretations (Lee & Malyn-Smith, 2020; Weintrop et al., 2016), instructional strategies (Hutchins et al., 2020a; Ma et al., 2021; Rachmatullah & Wiebe, 2022), and assessment targets (Bortz et al., 2020; Román-González et al., 2017). Although previous meta-analyses have shown CT-STEM’s positive impact on students meeting learning outcomes (Cheng et al., 2023), there is a gap in systematically analyzing its benefits, particularly in differentiating student learning via transfer effects (Popat & Starkey, 2019; Ye et al., 2022). Transfer, a key educational concept categorized as near and far transfer based on the theory of “common elements” (Perkins & Salomon, 1992), is crucial for understanding and evaluating CT-STEM’s utility and developing effective pedagogies. Previous studies have concentrated on cognitive learning outcomes (Cheng et al., 2023; Zhang & Wong, 2023) but offer limited insight into CT-STEM’s transfer effects on noncognitive outcomes like affective and social skills (Lai et al., 2023; Tang et al., 2020; Zhang et al., 2023). Given that CT-STEM effects extend beyond the cognitive domain (Ezeamuzie & Leung, 2021; Lu et al., 2022), it is equally important to recognize and nurture noncognitive benefits like self-efficacy, cooperativity, and communication in CT-STEM practices (Yun & Cho, 2022).

To better understand and evaluate CT-STEM transfer effects on students’ cognitive and noncognitive benefits acquisition, we systematically review published CT-STEM effects using PRISMA (Preferred Reporting Items for Systematic Reviews and Meta-Analyses) guidelines (Moher et al., 2010 ). We employ meta-analysis to quantify these effects and identify moderating variables. The following research questions guide our study:

RQ1: What cognitive and noncognitive benefits are acquired from CT-STEM’s near and far transfer effects?

RQ2: (a) What are the overall transfer effects of CT-STEM on cognitive and noncognitive benefits mentioned in Q1? and (b) What are the moderators of this effect?

RQ3: (a) What are the near and far transfer effects of CT-STEM on cognitive and noncognitive benefits mentioned in Q1? and (b) What are the moderators of these effects?

Literature review

Computational thinking (CT)

The concept of procedural thinking was first introduced by Papert ( 1980 ), who connected programming to procedural thinking and laid a foundation for CT (Merino-Armero et al., 2022 ). Although Papert was the first to describe CT, Wing ( 2006 , 2008 , 2011 ) brought considerable attention back to the term, a focus that continues to date (Brennan & Resnick, 2012 ; Chen et al., 2023a ). Various other definitions have emerged in the literature, and there is no consensus definition of CT (Barr & Stephenson, 2011 ; Grover & Pea, 2013 ; Shute et al., 2017 ). The definitions of CT often incorporate programming and computing concepts (e.g., Israel-Fishelson & Hershkovitz, 2022 ) or consider CT to be a set of elements associated with both computing concepts and problem-solving skills (e.g., Kalelioglu et al., 2016 ; Piatti et al., 2022 ). From the former perspective, many researchers defined CT based on programming and computing concepts. For example, Denner et al. ( 2012 ) defined CT as a united competence composed of three key dimensions of CT: programming, documenting and understanding software, and designing for usability. An alternative defining framework (Brennan & Resnick, 2012 ), originating from a programming context (i.e., Scratch), focuses on CT concepts and practices, including computational terms of sequences, loops, conditionals, debugging, and reusing.

Viewed from the latter perspective, CT deviates from the competencies typically associated with simple computing or programming activities. Instead, it is characterized as a set of competencies encompassing domain-specific knowledge/skills in programming and problem-solving skills for non-programming scenarios (Lai & Ellefson, 2023 ; Li et al., 2020a ; Tsai et al., 2021 , 2022 ). Using this broad viewpoint, CT can be defined as a universally applicable skill set involved in problem-solving processes. For instance, ISTE and CSTA (2011) developed an operational definition of CT, which refers to a problem-solving process covering core skills, such as abstraction , problem reformulation , data practices , algorithmic thinking , automation & modeling & simulation, and generalization . Selby and Woollard ( 2013 ) proposed a process-oriented definition of CT based on its five essential practices: abstraction , decomposition , algorithmic thinking , evaluation , and generalization . Shute et al. ( 2017 ) provided a cross-disciplinary definition centered on solving problems effectively and efficiently, categorizing CT into six practices: decomposition , abstraction , algorithm design , debugging , iteration , and generalization . In all these cases, the essence of CT lies in a computer scientist’s approach to problems, which is a skill applicable to everyone’s daily life and across all learning domains.

The above classification of definitions mainly focuses on the cognitive aspect of CT. Other researchers have suggested that CT contains not only a cognitive component (Román-González et al., 2017 ) but also a noncognitive component, highlighting important dispositions and attitudes, including confidence in dealing with complexity, persistence in working with difficult problems, tolerance for ambiguity, the ability to deal with open-ended problems, and the ability to communicate and work with others to achieve a common goal or solution (Barr & Stephenson, 2011 ; CSTA & ISTE, 2011 ).

In short, while computational thinking (CT) is frequently associated with programming, its scope has significantly expanded over the years (Hurt et al., 2023 ; Kafai & Proctor, 2022 ). Building on these prior efforts, we define CT as a problem-solving/thought process that involves selecting and applying the appropriate tools and practices for solving problems effectively and efficiently. As a multifaceted set of skills and attitudes, CT includes both cognitive aspects, highlighting students’ interdisciplinary practices/skills, and noncognitive aspects like communication and collaboration.

Integrating CT in STEM education (CT-STEM)

There is an urgent need to bring CT into disciplinary classrooms to prepare students for new integrated fields (e.g., computational biology, computational physics, etc.) as practiced in the realistic professional world. To address this, a growing body of research and practice has focused on integrating CT into specific and iSTEM lessons (Jocius et al., 2021). This integration, i.e., CT-STEM, refers to the infusion of CT practices with STEM content/context, with the aim of enhancing students’ CT skills and STEM knowledge (Cheng et al., 2023). Accordingly, CT-STEM serves a dual purpose: it has the potential to foster the development of students’ CT practices and skills, and it simultaneously deepens students’ disciplinary understanding and improves learning performance within and across disciplines (Waterman et al., 2020). Current research reveals two potential ways this integration facilitates students’ STEM learning. First, integrating CT into STEM provides students with an essential, structured framework by characterizing CT as a thought process and general competency, with disciplinary classrooms offering “a meaningful context (and set of problems) within which CT can be applied” (Weintrop et al., 2016, p. 128). Key processes of this problem-solving approach include: formulating problems computationally, data processing for solving problems, automating/simulating/modeling solutions, evaluating solutions, and generalizing solutions (Lyon & Magana, 2021; Wang et al., 2022a). Engaging in these practices aids students in applying STEM content to complex problem-solving and develops their potential as future scientists and innovators, aligning with iSTEM teaching.

In addition, introducing CT within disciplinary classroom instruction transforms traditional STEM subject formats into an integrated computational-based approach. This approach integrates a specific set of CT practices naturally into different STEM disciplines to facilitate students’ content learning (Li et al., 2020b; Weller et al., 2022). Weintrop et al. (2016) identified four categories of CT practices in math and science education: data practices, modeling and simulation practices, computational problem-solving practices, and systems thinking practices. Engaging students in systems thinking practices can simplify the understanding of systems and phenomena within the STEM disciplines (Grover & Pea, 2013). Integrating CT involves students in data practices, modeling, simulation, and/or using computational tools such as programming to generate representations, rules, and reasoning structures (Phillips et al., 2023). This aids in formulating predictions and explanations, visualizing systems, testing hypotheses, and enhancing students’ understanding of scientific phenomena and mechanisms (Eidin et al., 2024). Of the two integration approaches, the first places specific attention on developing discipline-general CT, while the second emphasizes improving students’ learning of disciplinary content and developing discipline-specific CT (Li et al., 2020b).

Practical aspects of CT-STEM have also been explored in the literature, including instructional strategies and assessment targets. Scholars have attempted different instructional strategies for CT-STEM implementation to achieve the designated educational purpose. These strategies can be categorized as instructional models (e.g., problem-driven strategies and project-based strategies), topic contexts (e.g., game-based strategies, and modeling- and simulation-based strategies), scaffolding strategies, and collaborative strategies (Wang et al., 2022a) (see Table 1). Typically, in instructional models, CT is viewed as an essential competency, guiding students to create interdisciplinary artifacts and solve specific real-world problems. Li et al. (2023) integrated CT as a core thought model into a project-based learning process, focusing on student-designed products for practical problems. Compatible with instructional models, a variety of instruction strategies based on topic contexts have been used, such as game design, computational modeling and simulation, and robotics. These, also called plugged-in activities, typically involve computer programming for performing STEM tasks (Adanır et al., 2024). In contrast, unplugged activities operate independently of computers, involving physical movements or using certain objects to illustrate abstract STEM concepts or principles (Barth-Cohen et al., 2019; Chen et al., 2023b). In combination with the above strategies, scaffolding strategies have been designed and utilized in CT-STEM to reduce students’ cognitive load and provide support for their self-regulated learning, such as guidance and adaptive, peer-, and resource-scaffolding. In addition, educators have employed various collaborative strategies (e.g., Think-Pair-Share practice) to enhance students’ cooperative and communicative skills in CT-STEM learning (Tikva & Tambouris, 2021a). In short, the use of different types of instructional strategies serves as a significant factor in influencing the effectiveness of CT-STEM.

Prior research has focused on assessment targets within the cognitive and noncognitive domains (Tang et al., 2020 ; Wang et al., 2022a ). The former includes direct cognitive manifestations such as knowledge and skills related to CT constructs and STEM constructs, as well as domain-general mental abilities such as creativity and critical thinking (Tang et al., 2020 ). Wang et al. ( 2022a ) reported CT-STEM studies targeted cognitive domain assessments, which included assessments of students’ CT concepts and skills, programming knowledge and skills, and STEM achievements. These constructs were mainly measured through tests, including validated and self-developed tests. Other researchers characterize CT as a general thinking skill and employ performance scales for measurement (e.g., Korkmaz et al., 2017 ; Tsai et al., 2019 , 2021 ). The assessment of the noncognitive domain focused on students’ dispositions and attitudes towards CT-STEM (Lai & Wong, 2022 ), including self-efficacy, interest, and cooperativity, mainly measured by surveys/scales.

In summary, CT-STEM has garnered considerable attention from researchers, primarily exploring theoretical interpretations of how a reciprocal relationship between STEM and CT can enrich student learning. CT-STEM is implemented through the development and application of varied instructional strategies, with assessments aimed at understanding its effects on students’ cognitive and noncognitive domains. While these are important contributions, there is a notable lack of systematic and empirical evidence concerning the differentiated benefits of CT-STEM integration. We aim to address this deficit by differentiating benefits via transfer effects and systematically synthesizing pertinent research in this field.

Transfer effect of learning

Transference or transfer effect refers to the ability to apply what one has known or learned in one situation to another (Singley & Anderson, 1989 ), standing at the heart of education as it highlights the flexible application of acquired knowledge (OECD, 2018 ). Perkins and Salomon ( 1992 ) defined transfer as the process of transferring learning and performance from one context to another, possibly even in a dissimilar context. From a cognitivist perspective, knowledge, seen as a stable mental entity, can traditionally be summoned and adapted to new situations under the right circumstances (Day & Goldstone, 2012 ). Nevertheless, this traditional approach has been subject to extensive criticism, particularly from those who hold a constructivist perspective. From their view, the transfer of learning is not a static application of knowledge to a new context but rather the “byproduct of participation in particular situations” (Day & Goldstone, 2012 )—a standpoint widely acknowledged and endorsed by most researchers. Despite the broad consensus on this view (Scherer et al., 2019 ), some questions remain: How can a successful transfer occur? What factors define “other” or “new” contexts?

One prominent explanation for the successful transfer of knowledge is the theory of "common elements" (Singley & Anderson, 1989), which hypothesizes that successful transfer depends upon the elements that two different contexts or problem situations share (Scherer et al., 2019). Based on this theory, the transfer effect can be divided into near transfer and far transfer (Perkins & Salomon, 1992). Near transfer occurs when skills and strategies are successfully transferred between similar contexts, i.e., contexts that are closely related and require similar skills and strategies; conversely, far transfer occurs when skills or strategies are successfully transferred between contexts that are inherently different (Perkins & Salomon, 1992). Essentially, the transfer effect is determined by the similarity or overlap between the contexts and problems in which the skills were acquired and the new problems encountered later (Baldwin & Ford, 1988). Simply put, there is a greater chance of transference between related contexts or problem situations (near transfer) than between divergent situations (far transfer). Since transfer effects are inherently situation-specific, they depend highly on the circumstances under which the skills and knowledge were acquired and on the overlap with the new situation (Lobato, 2006).

Although far-transfer effects are less likely to occur, numerous studies have reported them, albeit to varying extents (Bransford & Schwartz, 1999). Scherer et al. (2019) reported a moderate effect (g = 0.47) indicative of far transfer in learning computer programming, while Sala and Gobet (2016) found relatively limited evidence of far transfer within chess instruction and music education: successful transfer was only observed in situations that required skills similar to those acquired in the interventions. The extent of far transfer can thus fluctuate across contexts, indicating a need for further exploration within different disciplines and learning settings.

The transfer effects of CT-STEM

The transfer effects of learning computer programming have been explored extensively (Bernardo & Morris, 1994; Pirolli & Recker, 1994; Scherer et al., 2019, 2020). For instance, a study of students learning BASIC found that acquiring programming knowledge significantly enhanced their abilities to solve verbal and mathematical problems, although no significant differences were found in mathematical modeling and procedural comprehension (Bernardo & Morris, 1994). Scherer et al. (2019) conducted a meta-analysis exploring the cognitive benefits of transferring computer programming knowledge and identified positive skill transfers from learning programming to areas such as creative thinking, mathematical abilities, and spatial skills. Beyond cognitive benefits, Popat and Starkey (2019) and Melro et al. (2023) indicate that learning programming also contributes to noncognitive benefits like collaboration and communication.

Programming can be a conduit for teaching, learning, and assessing CT and a mechanism for exposing students to CT through the creation of computational artifacts. Although programming skills and CT share a close relationship and overlap in several aspects (e.g., application of algorithms, abstraction, and automation), they are not identical (Ezeamuzie & Leung, 2022): CT also involves computational perspectives and computational participation, that is, students' understanding of themselves and their interactions with others and with technology (Shue et al., 2017). CT can also be taught without programming through so-called unplugged activities. Hence, research on the transfer of programming addresses only a limited aspect of CT transference.

Research on CT transfer effects has recently surged (Liu & Jeong, 2022; Ye et al., 2022). In a meta-analysis, Ye et al. (2022) reported a positive transfer effect beyond computer programming in understanding science, engineering, mathematics, and the humanities. Using in-game CT supports, Liu and Jeong (2022) reported a significant improvement in students' CT skills at the near-transfer level but not at the far-transfer level. Correlation analyses by Román-González et al. (2017) demonstrated a significant relationship between CT and other cognitive abilities, which is corroborated by Xu et al.'s (2022) study showing that CT relates to numerous cognitive and learning abilities in other domains, such as reasoning, creative thinking, and arithmetic fluency. Other studies attribute further cognitive benefits to CT, such as improved executive functions (Arfé et al., 2019). Although correlation analyses cannot provide definitive causal evidence, they offer valuable insights and directions for future investigations, including potential meta-analysis studies.

While several systematic reviews and meta-analyses have examined programming and CT transfer effects, there is a scarcity of meta-analyses investigating the transfer effects of CT-STEM and the variables that moderate these effects. Cheng et al. (2023) explored the overall effect of CT-STEM on students' STEM learning performance within a K-12 education context and reported a large effect size (g = 0.85) between pretest and posttest scores on STEM learning outcomes. They investigated moderating variables in their models, including student grade levels, STEM disciplines, intervention durations, and types of interventions; of these, only intervention duration had a significant moderating effect. While their work offers evidence supporting the effectiveness of CT-STEM on students' learning outcomes, we identified three notable shortcomings. First, their meta-analysis lacked a focus on the potential benefits that can be derived from CT-STEM integration, particularly in terms of differentiating learning outcomes from the perspective of transfer effects. Existing meta-analyses have found that effect sizes vary considerably across types of learning outcomes (Sala & Gobet, 2017; Scherer et al., 2019), indicating that CT-STEM may not benefit different categories of learning outcomes equally. Second, the study focused only on cognitive learning outcomes, omitting noncognitive effects that may be fostered by CT-STEM. As noted earlier, although CT is primarily a cognitive psychological construct associated with cognitive benefits, it also has a complementary noncognitive aspect (Román-González et al., 2018); the synergy between CT and STEM holds promise for delivering both cognitive and noncognitive benefits to students. Third, their inclusion of only studies that employed one-group pretest–posttest designs may contribute to biased outcomes, limiting the representativeness and robustness of the findings (Cuijpers et al., 2017). Morris and DeShon (2002) posited that combining effect sizes from different study designs, both rationally and empirically, would lead to more reliable and comprehensive conclusions.

While various studies have validated the transfer effect of programming and CT, a systematic examination of CT-STEM’s transfer effects remains an area for further exploration. Our review identified key gaps, including a lack of differentiation in learning outcomes, insufficient focus on noncognitive benefits, and limitations in research robustness. Additionally, practical challenges, such as identifying effective activities and methods for CT integration into STEM, as well as determining optimal intervention durations, need to be addressed. We address these issues by investigating the transfer effects of CT-STEM, combining effect sizes from diverse studies, and considering both cognitive and noncognitive domains. We also identify practical factors that could influence these effects through moderator analysis. Our goal is to enhance instructional design in CT-STEM and provide new insights and guidance for both practitioners and researchers in the field.

Conceptual framework for the present study

Drawing from Mayer’s ( 2011 , 2015 ) framework, we synthesized evidence on the CT-STEM transfer effects and the contextual conditions that enhance instructional effectiveness. This framework, widely used to evaluate technology-based interventions like computer programming and educational robotics (Chen et al., 2018 ; Sun & Zhou, 2022 ; Tsai & Tsai, 2018 ), offers a multifaceted perspective on instructional methods. It allows for the exploration of three types of research questions: (a) Learning consequences, by examining the benefits of specific instructional methods; (b) Media comparison, by assessing the effectiveness of instructional methods; and (c) Value-added teaching, by investigating how changes in teaching conditions affect student performance. Chen et al. ( 2018 ) highlights this framework’s aptitude for systematically organizing and mapping domains and study contexts, accommodating diverse research foci.

Transferring this framework to the context of CT-STEM instruction (see Fig. 1), we systematically summarize the learning consequences of CT-STEM's transfer effect. Based on our literature review, we categorized these consequences into four types: (a) Cognitive benefits through the near transfer effect (CNT); (b) Noncognitive benefits through the near transfer effect (NCNT); (c) Cognitive benefits through the far transfer effect (CFT); and (d) Noncognitive benefits through the far transfer effect (NCFT). This study synthesizes evidence on CT-STEM's effectiveness per transfer type and examines various moderators affecting these effects. We considered sample features (e.g., educational level and sample size) and study features (e.g., study design, subject, instructional strategy, and intervention duration) as potential moderators affecting the transferability of CT-STEM, as previous CT-related studies indicated that these moderators contribute to variance in effect sizes (Lai & Wong, 2022; Scherer et al., 2020; Sun & Zhou, 2022; Ye et al., 2022).

Figure 1. Conceptual framework of the present meta-analysis

Methodology

We collected and analyzed literature on the transfer effects of CT-STEM using a rigorous systematic review process (Jesson et al., 2011), adhering to the PRISMA guidelines (Moher et al., 2010).

Database and keywords

We initially searched for key works on CT and STEM in seven databases: Web of Science, Science Direct, Springer, Wiley, IEEE Xplore Digital Library, Sage, and Taylor & Francis. In the search, CT was explicitly confined to "computational thinking." The major intervention approaches were included, such as programming, plugged activities, and unplugged activities. For STEM, we used the following terms: STEM, science, technology, engineering, and mathematics, and further supplemented "science" with discipline-specific terms like "physics," "chemistry," and "biology." Additionally, we added "game design" and "robotics" to complement "technology," as these are significant technical contexts for CT. As a final step, we searched the databases for full peer-reviewed articles using keyword groupings, focusing exclusively on the education and educational research fields: ("Computational thinking" OR "programming" OR "plugged activity" OR "unplugged activity") AND ("STEM" OR "technology" OR "engineering" OR "mathematics" OR "physics" OR "chemistry" OR "biology" OR "game design" OR "robotics"). The initial search covered articles published between January 1, 2011, and March 1, 2023, as professional CT-STEM fields formed and gained popularity after 2011 (Lee & Malyn-Smith, 2020; Malyn-Smith & Ippolito, 2011). This initial search yielded 12,358 publications, which were then subjected to further screening.
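For transparency, the keyword groupings above can be assembled programmatically. The following Python sketch is illustrative only; the term lists simply mirror the groupings quoted in the text, and the output reproduces the boolean search string we submitted to each database:

```python
# Sketch: assemble the boolean search string from the two keyword groupings.
ct_terms = ["Computational thinking", "programming", "plugged activity", "unplugged activity"]
stem_terms = ["STEM", "technology", "engineering", "mathematics", "physics",
              "chemistry", "biology", "game design", "robotics"]

def or_group(terms):
    """Join quoted terms into a parenthesized OR group."""
    return "(" + " OR ".join(f'"{t}"' for t in terms) + ")"

query = f"{or_group(ct_terms)} AND {or_group(stem_terms)}"
print(query)  # matches the search string reported in the text
```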

Inclusion and exclusion criteria

The inclusion and exclusion criteria for articles are detailed in Table 2. This study examined the transfer effects of CT-STEM, exploring both near and far transfer effects on the acquisition of cognitive and noncognitive benefits. Eligible studies included those with experimental or quasi-experimental designs, such as independent-groups pretest–posttest (IGPP), independent-groups posttest (IGP), and single-group pretest–posttest (SGPP) designs, reporting pretest and posttest or solely posttest performance. Articles were excluded where CT was not integrated with STEM content or context, or where the authors did not conceptualize or assert their studies as integrating CT with STEM learning. Studies focusing on programming tools like Scratch or robotics without involving other STEM content or contexts were also excluded. Since STEM education often emphasizes situated learning, with contexts from social studies, culture, language, and the arts (Kelley & Knowles, 2016), articles in other disciplines (e.g., social sciences, literacy, and culture) that involve CT activities, such as designing digital stories and games (Zha et al., 2021), were included. We did not limit the educational context (e.g., K-12 or higher education), since the effects of CT-STEM differ across educational levels and including both enables a more comprehensive understanding. The methods of assessment after the CT-STEM interventions were unrestricted. Inclusion required reporting at least one cognitive (e.g., critical thinking or school achievement) or noncognitive (e.g., communication or collaboration) benefit using performance-based outcome measures; studies reporting only behavioral measures (e.g., response times or the number and sequence of actions) were excluded. Eligibility also depended on providing adequate statistical data for effect size calculation, such as sample sizes, standard deviations, means, t-values, F-values, or z-scores.

Study selection

Figure 2 shows the three selection stages: identification, screening, and eligibility evaluation. After the initial search, automatic and manual searches were used to eliminate duplicates. Two independent researchers used the inclusion and exclusion criteria to screen article titles and abstracts, eliminating those that did not fit the criteria. The full texts of the remaining articles were then scrutinized against the criteria for inclusion in the final sample. Interrater agreement was high (Cohen's kappa = 0.92), and all disagreements were resolved through discussion and review. This selection process yielded 32 studies that met the eligibility criteria. Lastly, a "snowball" search (Petersen & Valdez, 2005) was used to find additional articles that met the criteria; both backward and forward snowballing of the identified papers yielded an additional five papers. Overall, the search and evaluation process yielded 37 articles for analysis (a complete list of references for the included studies can be found in Supplementary Material A1).

Figure 2. Selection flowchart based on the PRISMA approach

Data extraction and analysis

Coding of studies

We adapted a systematic review coding scheme spreadsheet (Scherer et al., 2019; Ye et al., 2022) to document and extract information. It includes basic study details (reference, publication year, and journal), the four types of outcome variables, sample features (educational level and sample size), study characteristics (study design, subject, instructional strategy, and intervention duration), and the statistical data needed for effect size calculation. To ensure coding reliability, each study was coded by two researchers using the scheme. Interrater reliability was 0.93 (Cohen's kappa), and discrepancies were settled in discussion sessions until mutual agreement was reached.
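To illustrate how an interrater reliability figure like the one above is computed, the following Python sketch implements Cohen's kappa for two coders. The coding decisions shown are hypothetical stand-ins, not data from our sample:

```python
from collections import Counter

def cohens_kappa(coder_a, coder_b):
    """Cohen's kappa: chance-corrected agreement between two raters on the same items."""
    assert len(coder_a) == len(coder_b)
    n = len(coder_a)
    observed = sum(a == b for a, b in zip(coder_a, coder_b)) / n
    # Expected agreement under independence: sum over categories of p_a * p_b
    freq_a, freq_b = Counter(coder_a), Counter(coder_b)
    expected = sum((freq_a[c] / n) * (freq_b[c] / n) for c in set(coder_a) | set(coder_b))
    return (observed - expected) / (1 - expected)

# Hypothetical coding decisions for ten studies (categories: CNT, NCNT, CFT, NCFT)
a = ["CNT", "CNT", "CFT", "NCNT", "CNT", "NCFT", "CFT", "CNT", "NCNT", "CNT"]
b = ["CNT", "CNT", "CFT", "NCNT", "CFT", "NCFT", "CFT", "CNT", "NCNT", "CNT"]
print(round(cohens_kappa(a, b), 2))  # 0.86 for this toy sample
```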

Outcome variables

To ascertain which cognitive and noncognitive benefits can be derived through CT-STEM transference, we constructed a hierarchical structure and classified these benefits into four categories: CNT, NCNT, CFT, and NCFT (see Table 3). CNT (i.e., domain-specific cognitive skills/knowledge) occurs when skills or knowledge acquired in CT-STEM are applied to a closely related domain, such as CT knowledge/concepts and CT practices/skills (Scherer et al., 2019; Sun & Zhou, 2022). In the included studies, CNT was measured using (a) validated tests, such as the Computational Thinking test (CTt), and (b) self-developed tests/tasks evaluating students' comprehension of subject-specific concepts and knowledge. NCNT pertains to shifts in students' attitudes, motivations, self-efficacy, or perceptions concerning the related domain (e.g., CT-STEM, iSTEM, STEM, or programming) following their engagement with CT-STEM (Bloom & Krathwohl, 1956). Measures for NCNT in the selected studies primarily utilized standardized scales, with some employing self-developed scales.

CFT (i.e., domain-general cognitive skills) manifests when the skills attained through CT-STEM are applied to different domains (Doleck et al., 2017; Xu et al., 2022). These skills, such as reasoning skills, creativity, and critical thinking, were mostly assessed by standardized scales and various tests, such as the Bebras test, the TOPS test, the Computational Thinking Scale (CTS) (e.g., Korkmaz et al., 2017; Tsai et al., 2019, 2021), and the Cornell Critical Thinking Test (CCTT). NCFT involves the transfer of skills from CT-STEM to higher-order noncognitive learning outcomes such as cooperativity and communication (OECD, 2018). Measurement techniques for this category included validated scales along with specific self-developed tasks. We then calculated the measured frequency of each benefit in the selected papers (N = 37) and used bar charts for visualization to answer RQ1.
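As a minimal illustration of this frequency analysis, the sketch below tallies benefit labels and renders a bar chart with matplotlib. The labels are hypothetical stand-ins for the coded benefits, not our extracted data:

```python
from collections import Counter
import matplotlib.pyplot as plt

# Hypothetical benefit labels extracted from coded studies
coded_benefits = [
    "mathematics achievement", "CT knowledge/concepts", "mathematics achievement",
    "self-efficacy", "physics achievement", "CT knowledge/concepts", "creativity",
]

freq = Counter(coded_benefits)             # measured frequency (f) per benefit
labels, counts = zip(*freq.most_common())  # sort by frequency, descending

plt.bar(labels, counts)
plt.ylabel("Measured frequency (f)")
plt.xticks(rotation=45, ha="right")
plt.tight_layout()
plt.show()
```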

Moderator variables

Based on the framework presented in Fig. 1 and previous meta-analyses in CT-STEM and related fields (e.g., educational robotics, programming, and CT), we examined two types of moderators for their potential role in enhancing transferability within CT-STEM (see Table 4). (1) Sample features comprised the educational level targeted by the intervention (kindergarten, primary school, secondary school, and university/college) and the sample size, with the latter equating to class size in educational contexts and varying across studies. (2) Study features covered study design, subject, instructional strategy, and intervention duration. The design of the primary studies was coded as IGPP, IGP, or SGPP; because multiple designs could occur within one study, we coded them independently (Scherer et al., 2020). For the subject, coding categories were primarily predicated on the intervention transfer area (Ye et al., 2022); when CT was integrated into several subjects, we coded such studies as "Multiple STEM subjects." Based on Wang et al.'s (2022a) review, we treated instructional strategy as an additional possible moderating variable, coded as "instructional models," "topic contexts," "scaffolding strategies," and "collaborative strategies." Table 1 describes these instructional strategies with sample references; Supplementary Material A2 contains more detailed descriptions for each included study. Finally, intervention duration was extracted and coded as < 1 week, 1 week–1 month, 1 month–1 semester, > 1 semester, or not mentioned.

Calculating effect sizes

We computed effect sizes using the Comprehensive Meta-Analysis (CMA) software, version 3.0 (Borenstein et al., 2013). To increase the number of articles in our meta-analysis, we included three types of study designs (Morris & DeShon, 2002). Despite potential time and selection biases, our study used the same metric (i.e., the raw-score metric) for calculating all effect sizes; this metric is insensitive to variations in ρ (the pretest–posttest correlation) and is recommended when homogeneity of ρ cannot be assumed or tested empirically (Morris & DeShon, 2002). Calculations were based on the means and standard deviations of the student learning outcome data. If these values were not reported, we used other statistics to calculate the standardized mean difference, such as t-values, z-scores, F-values, Cohen's d, standard errors, and 95% confidence intervals (Borenstein et al., 2009). All reported p-values are two-tailed unless otherwise noted.
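To make the conversion step concrete, the following sketch shows two standard ways of obtaining a standardized mean difference: from group means and standard deviations (the raw-score metric), and, when those are unavailable, from an independent-groups t statistic. The numbers are hypothetical, and this is an illustration of the standard formulas rather than the CMA implementation:

```python
import math

def d_from_means(m1, sd1, n1, m2, sd2, n2):
    """Standardized mean difference from group means and SDs (raw-score metric)."""
    pooled_sd = math.sqrt(((n1 - 1) * sd1**2 + (n2 - 1) * sd2**2) / (n1 + n2 - 2))
    return (m1 - m2) / pooled_sd

def d_from_t(t, n1, n2):
    """Standardized mean difference recovered from an independent-groups t statistic."""
    return t * math.sqrt(1 / n1 + 1 / n2)

# Hypothetical study: treatment M=78, SD=10, n=30 vs. control M=72, SD=11, n=28
print(round(d_from_means(78, 10, 30, 72, 11, 28), 3))  # 0.572
```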

We calculated effect sizes using the Hedges' g metric, which allows the integration of results from varied research designs with minimal bias and provides a global measure of CT-STEM effectiveness (Sun et al., 2021). Hedges' g was interpreted following Hedges and Olkin (2014): 0.20–0.49 indicates a low effect, 0.50–0.79 a medium effect, and 0.80 and above a high effect. CMA 3.0 empirically supports combining multiple study designs in a single analysis (Borenstein et al., 2013); leveraging this feature, we used experimental design as a moderator to mitigate potential bias (Morris & DeShon, 2002). The statistically nonsignificant Q test (p = 0.343) failed to reject the null hypothesis of no difference between mean effect sizes calculated from alternate designs, so effect sizes from different designs could be meaningfully combined (Delen & Sen, 2023; Morris & DeShon, 2002). Because of substantial variation in outcome measures and environments across studies, we employed a random-effects model to address RQ2(a) and RQ3(a), calculating overall and subgroup effect sizes (Borenstein et al., 2021; Xu et al., 2019).
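A minimal sketch of the Hedges' g computation and the interpretation bands used above follows; it applies the standard small-sample correction J = 1 - 3/(4*df - 1) to Cohen's d (the input values are hypothetical and continue the example above):

```python
import math

def hedges_g(d, n1, n2):
    """Apply the small-sample correction J to Cohen's d to obtain Hedges' g."""
    df = n1 + n2 - 2
    j = 1 - 3 / (4 * df - 1)
    return j * d

def interpret(g):
    """Interpretation bands following Hedges and Olkin (2014), as used in the text."""
    g = abs(g)
    if g >= 0.80:
        return "high"
    if g >= 0.50:
        return "medium"
    if g >= 0.20:
        return "low"
    return "negligible"

g = hedges_g(0.572, 30, 28)
print(round(g, 3), interpret(g))  # 0.564 medium
```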

Non-independence

We calculated one effect size per study to ensure the independence of effect sizes; however, if a study reported multiple non-overlapping benefits, an effect size for each benefit was included in the analysis. When a study reported effect sizes for separate groups of students (e.g., grades 1, 2, and 3) whose participants did not overlap, the effect sizes for each group were treated as independent samples (Lipsey & Wilson, 2001). When a study reported multiple assessments (e.g., midterm and final exams) in one subject area, we selected the most comprehensive assessment (Bai et al., 2020).

Analyses of heterogeneity

Heterogeneity was assessed using the I² statistic, which expresses the proportion of total variation across effect sizes that stems from differences among studies rather than from sampling error (Shamseer et al., 2015). We then conducted a moderator analysis to pinpoint potential sources of variance in the transfer effect sizes, examining the overall, near-transfer, and far-transfer effects to address RQ2(b) and RQ3(b).
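For readers who want to trace the computation, the sketch below implements a standard DerSimonian–Laird random-effects pooling together with the Q statistic and I². It is illustrative only, with hypothetical inputs, and is not the CMA 3.0 implementation:

```python
import math

def random_effects_pool(effects, variances):
    """DerSimonian-Laird random-effects pooling with Q and I^2 (illustrative only)."""
    w = [1 / v for v in variances]                 # fixed-effect (inverse-variance) weights
    fixed = sum(wi * gi for wi, gi in zip(w, effects)) / sum(w)
    q = sum(wi * (gi - fixed) ** 2 for wi, gi in zip(w, effects))
    df = len(effects) - 1
    i2 = max(0.0, (q - df) / q) * 100 if q > 0 else 0.0
    c = sum(w) - sum(wi**2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - df) / c)                  # between-study variance estimate
    w_star = [1 / (v + tau2) for v in variances]   # random-effects weights
    pooled = sum(wi * gi for wi, gi in zip(w_star, effects)) / sum(w_star)
    se = math.sqrt(1 / sum(w_star))
    return pooled, se, q, i2

# Hypothetical Hedges' g values and their variances from five studies
g = [0.45, 0.80, 0.30, 0.95, 0.55]
v = [0.02, 0.05, 0.03, 0.04, 0.02]
pooled, se, q, i2 = random_effects_pool(g, v)
print(f"g = {pooled:.3f} (SE {se:.3f}), Q = {q:.2f}, I^2 = {i2:.1f}%")
```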

Publication bias

We conducted three additional analyses to determine whether publication bias affected the review results: a funnel plot, Egger's test, and the classic fail-safe N. The funnel plot is a graphical tool that compares effect sizes to standard errors to check whether publication bias distorted the treatment effects (Egger et al., 1997). We used Egger's test to examine funnel-plot symmetry and quantify the amount of bias it captures (Bai et al., 2020; Borenstein, 2005). The classic fail-safe N addresses the extent to which publication bias could affect the effect size: when the meta-analysis results are significant, it estimates the number of missing, unpublished studies that would have to be included to render the combined effect nonsignificant (Rosenthal, 1979). According to Rosenberg (2005), the fail-safe N should reach 5k + 10 to ensure that it is large relative to k (the number of independent effect sizes); the greater the fail-safe N, the smaller the publication bias.
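The following sketch illustrates the two quantitative checks: Egger's regression intercept (standardized effect regressed on precision) and Rosenthal's classic fail-safe N. All inputs are hypothetical, and the simplified Egger computation omits the significance test on the intercept:

```python
import math
from statistics import NormalDist

def egger_intercept(effects, ses):
    """Egger's regression: standardized effect on precision; the intercept indexes asymmetry."""
    y = [g / se for g, se in zip(effects, ses)]   # standardized effects
    x = [1 / se for se in ses]                    # precisions
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    slope = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / \
            sum((xi - mx) ** 2 for xi in x)
    return my - slope * mx

def fail_safe_n(p_values, alpha=0.05):
    """Rosenthal's classic fail-safe N from the studies' one-tailed p-values."""
    z = [NormalDist().inv_cdf(1 - p) for p in p_values]
    z_alpha = NormalDist().inv_cdf(1 - alpha)
    k = len(z)
    return max(0, math.ceil((sum(z) ** 2) / z_alpha**2 - k))

# Hypothetical inputs for five studies
g = [0.45, 0.80, 0.30, 0.95, 0.55]
se = [0.14, 0.22, 0.17, 0.20, 0.14]
p = [0.006, 0.001, 0.04, 0.001, 0.002]
print(round(egger_intercept(g, se), 2), fail_safe_n(p))
```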

Cognitive and noncognitive benefits through CT-STEM’s transfer effect (RQ1)

Our investigation of CT-STEM transference revealed 36 benefits, detailed in Fig. 3. These include benefits from both near and far transfer: seventeen cognitive and eight noncognitive benefits were attributed to near transfer (CNT and NCNT, respectively), while nine cognitive and two noncognitive benefits resulted from far transfer (CFT and NCFT, respectively).

Figure 3. The measured frequency of documented benefits in the selected publications

The five benefits most frequently documented in empirical CT-STEM research were mathematics achievement (f = 9), CT knowledge/concepts (f = 7), CT (f = 5), physics achievement (f = 5), and self-efficacy (f = 5). The notable moderate frequency of certain NCNT benefits, such as self-efficacy and motivation, highlights a dual focus in research: enhancing both cognitive skills and noncognitive gains in students involved in CT-STEM. CT has been integrated most extensively into mathematics and science, whereas other disciplines (e.g., biology, chemistry, social science, and culture) have received less attention. The limited observation of NCFT (only two identified) underscores the potential for broader research exploration.

CT-STEM’s overall transfer effects and moderator analysis (RQ2)

Overall transfer effects of CT-STEM (RQ2a)

In total, 37 primary studies involving 7832 students were included in the sample, yielding 96 effect sizes. Among these studies, 62% (23 studies) utilized an IGPP design, 35% (13 studies) adopted an SGPP design, and 3% (1 study) employed an IGP design. We first analyzed the 37 empirical studies using a random-effects model. Our findings show a significant overall effect size favoring the transfer effect of CT-STEM on both cognitive and noncognitive benefits for students (g = 0.601, 95% CI [0.510, 0.691], Z = 12.976, p < 0.001) (see Fig. 4). The heterogeneity test showed a significant Q value (Q = 853.052, I² = 88.864%, p < 0.001), suggesting substantial heterogeneity in the study effect sizes; thus, a moderator analysis of different contextual variables was required in subsequent analyses.

Figure 4. Forest plot of effect sizes (Hedges' g) in the random-effects model

To assess potential publication bias in our meta-analysis, we generated a funnel plot and performed the classic fail-safe N and Egger's tests. As depicted in Fig. 5, the studies were distributed largely evenly on both sides of the funnel plot and located in the middle to upper effective areas (Egger et al., 1997). The classic fail-safe N was 4702, far exceeding the conservative threshold of 5k + 10 (490). Moreover, Egger's intercept was 1.01 (95% CI [−0.03, 2.05]) with a p-value of 0.06, indicating no publication bias in our data set.

Figure 5. Funnel plot (Hedges' g) of overall transfer effects

Moderator analysis of overall transfer effects (RQ2b)

We examined six potential moderators (educational level, sample size, study design, subject, instructional strategy, and intervention duration), using the random-effects model to identify the origins of heterogeneity (see Table 5). The moderator analysis indicated no significant differences in effect size among the study designs (QB = 2.142, df = 2, p = 0.343), suggesting that the different designs estimate a similar treatment effect and allowing effect sizes to be combined across designs (Morris & DeShon, 2002). Further, subject did not significantly moderate the CT-STEM benefits (QB = 13.374, df = 9, p = 0.146), indicating effective CT integration across various STEM disciplines (g = 0.567, p < 0.001). However, we observed a notable exception in social science (g = 0.727, p = 0.185), where the integration effect was not significant, in contrast to significant effects in subjects like engineering (g = 0.883, p < 0.001) and science (g = 0.875, p < 0.001).

Significant moderator effects were found for educational level (QB = 13.679, df = 3, p = 0.003), sample size (QB = 48.032, df = 3, p < 0.001), instructional strategy (QB = 7.387, df = 2, p = 0.025), and intervention duration (QB = 22.950, df = 3, p < 0.001). Specifically, educational levels showed different effects: medium for kindergarten (g = 0.777, p < 0.001), elementary (g = 0.613, p < 0.001), and secondary students (g = 0.690, p < 0.001), but lower for university students (g = 0.366, p < 0.001), indicating a stronger CT-STEM impact in the lower grades. Smaller sample-size groups (fewer than 50 students) exhibited the highest effect size (g = 0.826, p < 0.001), while larger groups (over 150 students) showed the lowest (g = 0.233, p < 0.001), suggesting a decrease in effect with increasing class size. Instructional strategy was a significant moderator, indicating that the type of intervention strategy significantly impacts CT-STEM's transfer effects: strategies involving topic contexts (e.g., modeling, simulation, robotics, and programming) had the largest effect (g = 0.647, p < 0.001), followed by scaffolding methods (e.g., (meta)cognitive scaffolding) (g = 0.492, p < 0.001), with the instructional-model strategy showing the smallest effect (g = 0.394, p < 0.001). In addition, intervention duration was a critical moderator: the largest effect was observed for interventions lasting between one week and one month (g = 0.736, p < 0.001), with longer durations showing diminishing effects.

CT-STEM’s near and far transfer effects and moderator analysis (RQ3)

Near transfer effect by cognitive and noncognitive benefits (RQ3a)

To further analyze the effect size of CT-STEM near transfer, we focused on a subgroup encompassing both cognitive and noncognitive benefits, as detailed in Table 6. The effect size for CT-STEM near transfer was 0.645 (95% CI [0.536, 0.753], Z = 11.609, p < 0.001), indicating a moderate impact on near-transfer benefits, with cognitive benefits demonstrating a larger effect size (g = 0.672, 95% CI [0.540, 0.804], Z = 9.978, p < 0.001) than noncognitive benefits (g = 0.547, 95% CI [0.388, 0.706], Z = 6.735, p < 0.001). This suggests that CT-STEM interventions have more impact on cognitive aspects (e.g., CT skills, programming abilities, and algorithmic thinking) than on noncognitive aspects (e.g., self-efficacy, learning motivation, and attitudes).

We utilized a funnel plot to assess publication bias for this subgroup (see Fig. 6). The majority of the studies cluster in the effective area of the plot, and their symmetric distribution on the plot's left and right sides suggests minimal publication bias. Egger's test yielded t(70) = 0.85 with a p-value of 0.40, reinforcing this indication. The classic fail-safe N was 6539, substantially exceeding the threshold (5k + 10 = 370). Collectively, these results suggest that publication bias has a negligible impact on CT-STEM's near-transfer effects.

Figure 6. Funnel plot (Hedges' g) of near-transfer effect

Far transfer effect by cognitive and noncognitive benefits (RQ3a)

In examining CT-STEM far transfer as a specific subgroup (see Table 6), we found a moderate effect size (g = 0.444, 95% CI [0.312, 0.576], Z = 6.596, p < 0.001), indicating a significant positive impact of CT-STEM on students' generic skills, including creativity, critical thinking, and problem-solving. A comparison of effect sizes revealed that cognitive benefits (g = 0.466, 95% CI [0.321, 0.611], Z = 6.289, p < 0.001) were more pronounced than noncognitive benefits (g = 0.393, 95% CI [0.011, 0.775], Z = 1.833, p = 0.044). These results show that CT-STEM effectively enhances both cognitive and noncognitive skills through far transfer, with the effect more pronounced for cognitive abilities, such as general thinking and problem-solving skills, than for noncognitive skills.

The funnel plot for far-transfer effects (see Fig. 7) shows some asymmetry, which was substantiated by Egger's test, yielding t(24) = 3.90 with a p-value below 0.001. Although the calculated fail-safe N (794) is considerably larger than the threshold of 5k + 10 (130), this asymmetry suggests the possibility of some publication bias in the far-transfer effects of our study.

Figure 7. Funnel plot (Hedges' g) of far-transfer effect

Heterogeneity and moderator analysis of near and far transfer effects (RQ3b)

We conducted heterogeneity assessments for each subgroup, focusing on near-transfer and far-transfer effects. The significant Q statistics indicated high heterogeneity in both groups (Q_near = 671.379, I² = 89.425%, p < 0.001; Q_far = 93.552, I² = 75.415%, p < 0.001). We then explored moderating effects based on educational level, sample size, subject, instructional strategy, and intervention duration. The results showed that the near-transfer effect of CT-STEM is moderated by educational level, sample size, instructional strategy, and intervention duration (see Table 7), whereas the far-transfer effect is moderated only by educational level and sample size (see Table 8). These findings suggest that the near-transfer effect is more susceptible to variations in contextual factors than the far-transfer effect.

Discussion and implications

This study examined the transfer effects of CT-STEM on students' cognitive and noncognitive skills through a systematic literature review and meta-analysis. The main findings and their implications are discussed in the following sections.

Cognitive and noncognitive benefits through CT-STEM transfer effects

RQ1 asked which cognitive and noncognitive benefits derive from the transfer effects of CT-STEM. From 37 empirical studies, we identified 36 benefits, categorized into four types: CNT, CFT, NCNT, and NCFT. These benefits are consistent with findings in prior studies (e.g., Melro et al., 2023; Román-González et al., 2018; Scherer et al., 2019; Tsarava et al., 2022; Ye et al., 2022), indicating that CT-STEM not only provides cognitive and noncognitive benefits but also fosters the development of domain-specific and domain-general skills. Most prior research has focused on CT-STEM's impact on students' mathematics achievement, CT skills/concepts, self-efficacy, and cooperativity. Our results further suggest that CT-STEM enhances cognitive skills while significantly contributing to affective and social learning outcomes. This finding supports the view that while CT is primarily cognitive, akin to problem-solving abilities, it has a significant noncognitive aspect (Román-González et al., 2018). An illustrative example is the study by Wang et al. (2022b), which developed a non-programming, unplugged CT program in mathematics that effectively improved students' CT skills, cooperation tendencies, and perceptions of CT.

Most transfer studies to date have focused primarily on students' mathematics and science achievement, with less emphasis on specific science disciplines such as physics, biology, and chemistry. One reason is the overlap in thinking practices among these disciplines and CT (Rich et al., 2019; Ye et al., 2023). For example, modeling and simulating complex phenomena in these subjects fosters problem decomposition skills crucial in mathematics, science, and computer science. Additionally, CT offers an analytical and systematic framework for problem-solving, a key aspect of tackling complex mathematical and scientific problems (Berland & Wilensky, 2015). Despite this, CT's potential in a wider range of subjects remains underexplored (Ye et al., 2022). Previous studies have identified challenges in integrating CT into diverse STEM disciplines (Kite & Park, 2023; Li et al., 2020a), and finding suitable curriculum topics that effectively leverage CT's benefits can be difficult. Beyond mathematics, CT-STEM transfer studies have examined topics like ecology (Christensen & Lombardi, 2023; Rachmatullah & Wiebe, 2022), force and motion (Aksit & Wiebe, 2020; Hutchins et al., 2020a, 2020b), and chemical reactions (Chongo et al., 2021). This indicates a need to explore a broader range of STEM topics to fully leverage the synergy between CT and STEM.

Our review identified only two noncognitive far-transfer (NCFT) benefits of CT-STEM, suggesting these benefits may be harder to measure. Gutman and Schoon (2013) noted that noncognitive skills like perseverance and persistence have variable measurement robustness and are context-dependent. Mirroring the research methods of Israel-Fishelson and Hershkovitz (2021) and Falloon (2016), we recommend further capturing and analyzing students' behaviors through recordings or log files from learning platforms. Additionally, few studies have focused on these competencies in CT-STEM, highlighting a promising direction for future CT-STEM integration efforts.

CT-STEM’s transfer effects

For RQ2(a) and RQ3(a), our meta-analysis indicates positive impacts on both cognitive (g = 0.628) and noncognitive (g = 0.510) benefits, each showing a moderate effect size. This finding supports the use of CT-STEM to enhance students' cognitive and noncognitive skills, as suggested by Lee et al. (2020), who argue that integrating CT in STEM encourages deeper engagement in authentic STEM practices, thereby developing a broad spectrum of skills spanning cognitive and noncognitive aspects.

Our finding that cognitive benefits exhibit greater effect sizes than noncognitive benefits across both near and far transfer contrasts with previous research by Kautz et al. (2014), which suggested noncognitive skills are more malleable. Two factors might explain this disparity: gender and age. Gender may be significant because CT-STEM requires students to utilize computational concepts, practices, and perspectives to solve complex, real-world problems, which can carry inherent gender biases. For example, Czocher et al. (2019) found that female students often experience more frustration and lower engagement in CT-STEM, and similar studies report that they have lower interest, confidence, and self-efficacy than males (Wang et al., 2022b). Jiang and Wong (2022) found no significant gender differences in cognitive skills like CT, indicating that the differences might lie in the affective domains and suggesting that students' noncognitive skills may be less malleable than their cognitive skills in CT-STEM programs. As such, increasing students' motivation, especially among girls, is a crucial issue for future studies (Tikva & Tambouris, 2021b). Student age may also be a contributing factor. Lechner et al. (2021) demonstrated that age influences skill adaptability, with younger individuals showing greater exploratory behavior and neural plasticity, both pivotal for cognitive development (e.g., reasoning skills and literacy) (Gualtieri & Finn, 2022), making cognitive skills more plastic than noncognitive skills. This aligns with our findings, in which a significant proportion of studies (49%) focused on primary school settings, reinforcing the importance of early CT integration.

In comparing the near- and far-transfer effects, our analysis shows that the effect size for near transfer is higher than that for far transfer in both the cognitive and noncognitive domains, aligning with previous findings that identified a strong effect of programming through near transfer (g = 0.75, 95% CI [0.39, 1.11]) and a moderate effect through far transfer (g = 0.47, 95% CI [0.35, 0.59]) (Scherer et al., 2019). One explanation lies in the theory of "common elements" (Singley & Anderson, 1989), which suggests that skills developed through CT-STEM are more readily transferable to similar contexts because of shared conceptual commonalities and elements (Nouri et al., 2020; Scherer et al., 2019). Essentially, students proficient in a skill often find it easier to apply this proficiency to a related skill that shares foundational principles and strategies (Baldwin & Ford, 1988). Nevertheless, far-transfer effects in CT-STEM do occur and are significant. We stress the importance of developing effective strategies that foster these far-transfer effects within the CT-STEM curriculum; one approach is to identify "common elements" and conceptual similarities between different disciplinary contexts and skills, thus promoting transference.

Contextual variables explaining variation in the CT-STEM’s transfer effects

In our moderator analyses (RQ2(b) and RQ3(b)), we examined the heterogeneity of CT-STEM's overall, near-transfer, and far-transfer effects using six moderators: educational level, sample size, study design, subject, instructional strategy, and intervention duration. For the overall transfer effects, we found significant variations in effect size, with notably higher efficacy observed for grade school students than for university students. This finding further advocates the early integration of CT in STEM education (Nouri et al., 2020). The difference in CT-STEM's impact can be attributed to two factors: (1) it correlates with students' cognitive and noncognitive development, with the early grades being crucial for acquiring these benefits (Jiang & Wong, 2022); and (2) the hands-on, experiential nature of CT-STEM, utilizing tangible materials and interactive simulations, is particularly suited to the developmental and learning needs of young children (Thomas & Larwin, 2023). Class size also emerged as a strong moderator (Li et al., 2022; Sun & Zhou, 2022; Sun et al., 2021), with smaller classes (under 50 students) showing more pronounced transfer effects; as class size increases, the impact of CT-STEM on skill development decreases, possibly because of logistical constraints (e.g., space, equipment, and resources) (Cheng et al., 2023). We also found significant differences across instructional strategies: learning activities involving computational modeling, simulation, and embodied learning yielded larger effect sizes, supporting constructivist educational methods like computational modeling for simulating complex phenomena and facilitating content learning (Basu et al., 2015; Sengupta et al., 2013). For intervention duration, CT-STEM interventions of one week to one month were most effective in enhancing students' learning outcomes, after which the effect size diminished, in agreement with Sun et al. (2021). This time window may reflect the need to balance learning time with students' ongoing interest and motivation, as extended durations can decrease motivation and interest once students adjust to the new learning method (Appleton et al., 2008; Cheng et al., 2023). Importantly, our analysis revealed that subject matter had little impact on CT-STEM benefits, suggesting broad applicability across STEM subjects.

Our analysis of near- and far-transfer effects in CT-STEM shows that educational level, sample size, instructional strategy, and intervention duration significantly moderate near-transfer effects, while far-transfer effects are moderated mainly by educational level and sample size. One explanation is that near-transfer effects are linked to domain-specific skills, which respond to particular instructional elements like strategies and duration (van Graaf et al., 2019). Far-transfer effects for domain-general skills like critical thinking, by contrast, show significant moderation primarily by educational level and sample size rather than by instructional design, perhaps because current instructional designs focus predominantly on domain-specific skills (Geary et al., 2017). One attractive alternative is to treat CT as a transdisciplinary thinking practice and integrate it across various STEM subjects to enhance students' domain-general skill development (Li et al., 2020b).

The far-transfer effects are linked to cognitive development and social contexts and are thus influenced by educational level, which aligns with cognitive maturation and skill readiness (Jiang & Wong, 2021; Zhan et al., 2022). In addition, sample size affects social skills and classroom dynamics (Sung et al., 2017; Yılmaz & Yılmaz, 2023). Therefore, in designing CT-STEM activities, it is crucial to consider age-appropriate objectives and learning content, as well as class size, for optimal development of cognitive and social skills. Future research should continue to explore these factors, particularly in developing social skills.

Theoretical and practical implications

This study provides new knowledge for CT-STEM research and informs CT-STEM instructional design and practice. It extends current understanding of CT-STEM's transfer effects on students' cognitive and noncognitive domains, and our findings support the premise that CT-STEM can significantly enhance the development of students' cognitive and noncognitive skills through near and far transfer. In addition, we provide a simple hierarchical structure that integrates the cognitive and noncognitive domains through a transfer perspective (see Table 3). This structure can guide researchers in systematically classifying and identifying measurable constructs, leading to a more comprehensive understanding of student learning in CT-STEM.

Our analysis of moderators provides actionable guidance for CT-STEM instructional design to capitalize on positive transfer effects. For overall and near-transfer effects, we encourage early integration of CT into individual and integrated STEM (iSTEM) disciplines through well-designed activities. We show that smaller class sizes (under 50 students), interventions lasting one week to one month, and strategic selection of instructional methods like computational modeling promote more effective transference (see Tables 5 and 7). Consequently, we recommend that educators and instructional designers prioritize creating collaborative learning environments using in-person, hybrid, and online collaborative platforms, reducing logistical issues and allowing for closer monitoring of group interactions and timely feedback. Flexible curriculum design, with durations ranging from intensive one-week modules to month-long projects, is key to maximizing transfer effects. Given computational modeling's central role in STEM (NGSS Lead States, 2013), we encourage educators looking to integrate CT into classroom teaching to consider it a primary entry point. To support far transfer, educators need to develop age-appropriate content and activities that align with students' cognitive development (Zhang & Nouri, 2019), alongside fostering a collaborative culture that nurtures social skills. For the instructional models that showed the greatest effect sizes (see Table 8), we strongly encourage teachers, especially those with prior experience in CT integration, to develop instructional models based on engineering design processes (Wiebe et al., 2020) that engage students in problem-solving and the creation of creative artifacts to foster higher-order thinking skills.

This systematic literature review and meta-analysis examined the cognitive and noncognitive benefits of CT-STEM's transfer effects. Analyzing 96 effect sizes from 37 qualifying studies, we found that: (a) CT-STEM yields 36 distinct benefits across four categories, namely CNT, CFT, NCNT, and NCFT; (b) CT-STEM had an overall medium, significant impact across these categories (g = 0.601); (c) the effect size for near transfer (g = 0.645) was greater than that for far transfer (g = 0.444), and cognitive benefits (g = 0.628) consistently showed a larger effect size than noncognitive benefits (g = 0.510); and (d) educational level, sample size, instructional strategy, and intervention duration significantly moderated both overall and near-transfer effects, while far-transfer effects were significantly moderated only by educational level and sample size. These findings provide a roadmap for curriculum designers and teachers to integrate CT into STEM education more effectively and efficiently at all grade levels, enhancing students' development of both cognitive and noncognitive skills.

This study has several limitations. Although it draws on a comprehensive review of the literature across seven databases, some specialized sources might have been overlooked, highlighting the need for future research to include more specialized databases for a fuller understanding of CT-STEM's transfer effects. While the standardization of effect sizes and the moderator analysis helped mitigate potential biases from diverse study designs, further methodological enhancements are warranted in future studies. The findings on noncognitive benefits through far transfer (NCFT), such as social competencies, are limited by the nature of the research dataset and the limited research available (Lai & Wong, 2022; Lai et al., 2023), indicating a need for rigorous development of measurement tools and instructional designs in this area. Finally, we investigated six moderators within CT-STEM but did not examine aspects like curriculum characteristics and teachers' experience; these areas, because of their qualitative nature and infrequent reporting in our sample studies, were not included but are significant avenues for future research. Despite these limitations, the study's contributions are significant: it systematically elucidates the cognitive and noncognitive benefits of CT-STEM transfer effects and provides robust evidence, and the identified moderators can aid educators in facilitating transfer within classroom teaching.

Availability of data and materials

The datasets used and analyzed during the current study are available from the corresponding author on reasonable request.

Adanır, G. A., Delen, I., & Gulbahar, Y. (2024). Research trends in K-5 computational thinking education: A bibliometric analysis and ideas to move forward. Education and Information Technologies, 29 , 3589–3614. https://doi.org/10.1007/s10639-023-11974-4

Aksit, O., & Wiebe, E. N. (2020). Exploring force and motion concepts in middle grades using computational modeling: A classroom intervention study. Journal of Science Education and Technology, 29 , 65–82. https://doi.org/10.1007/s10956-019-09800-z

Angeli, C. (2022). The effects of scaffolded programming scripts on pre-service teachers’ computational thinking: Developing algorithmic thinking through programming robots. International Journal of Child-Computer Interaction, 31 , 100329. https://doi.org/10.1016/j.ijcci.2021.100329

Appleton, J. J., Christenson, S. L., & Furlong, M. J. (2008). Student engagement with school: Critical conceptual and methodological issues of the construct. Psychology in the Schools, 45 (5), 369–386. https://doi.org/10.1002/pits.20303

Arfé, B., Vardanega, T., Montuori, C., & Lavanga, M. (2019). Coding in primary grades boosts children’s executive functions. Frontiers in Psychology, 10 , 2713. https://doi.org/10.3389/fpsyg.2019.02713

Bai, S., Hew, K. F., & Huang, B. (2020). Does gamification improve student learning outcome? Evidence from a meta-analysis and synthesis of qualitative data in educational contexts. Educational Research Review, 30 , 100322. https://doi.org/10.1016/j.edurev.2020.100322

Baldwin, T. T., & Ford, J. K. (1988). Transfer of training: A review and directions for future research. Personnel Psychology, 41(1), 63–105. https://doi.org/10.1111/j.1744-6570.1988.tb00632.x

Barr, V., & Stephenson, C. (2011). Bringing computational thinking to K-12: What is involved and what is the role of the computer science education community? ACM Inroads, 2(1), 48–54. https://doi.org/10.1145/1929887.1929905

Barth-Cohen, L., Montoya, B., & Shen, J. (2019). Walk like a robot: A no-tech coding activity to teach computational thinking. Science Scope , 42 (9), 12–17. https://www.jstor.org/stable/26899024

Basu, S., Sengupta, P., & Biswas, G. (2015). A scaffolding framework to support learning of emergent phenomena using multi-agent-based simulation environments. Research in Science Education, 45 , 293–324. https://doi.org/10.1007/s11165-014-9424-z

Berland, M., & Wilensky, U. (2015). Comparing virtual and physical robotics environments for supporting complex systems and computational thinking. Journal of Science Education and Technology, 24 , 628–647. https://doi.org/10.1007/s10956-015-9552-x

Bernardo, M. A., & Morris, J. D. (1994). Transfer effects of a high school computer programming course on mathematical modeling, procedural comprehension, and verbal problem solution. Journal of Research on Computing in Education, 26 (4), 523–536. https://doi.org/10.1080/08886504.1994.10782108

Bers, M. U., Flannery, L., Kazakoff, E. R., & Sullivan, A. (2014). Computational thinking and tinkering: Exploration of an early childhood robotics curriculum. Computers & Education, 72 , 145–157. https://doi.org/10.1016/j.compedu.2013.10.020

Bloom, B. S., & Krathwohl, D. R. (1956). Taxonomy of educational objectives: The classification of educational goals by a committee of college and university examiners. Handbook I: Cognitive domain. Longmans, Green.

Borenstein, M. (2005). Software for publication bias. In H. R. Rothstein, A. J. Sutton, & M. Borenstein (Eds.), Publication bias in meta-analysis: Prevention, assessment and adjustments (pp. 193–220). John Wiley & Sons. https://doi.org/10.1002/0470870168

Borenstein, M., Hedges, L. V., Higgins, J. P. T., & Rothstein, H. R. (2009). Random-effects model. In Introduction to meta-analysis (pp. 69–75). John Wiley & Sons. https://doi.org/10.1002/9780470743386

Borenstein, M., Hedges, L., Higgins, J., & Rothstein, H. (2013). Comprehensive Meta-Analysis (Version 3) [Computer software]. Biostat.

Borenstein, M., Hedges, L. V., Higgins, J. P. T., & Rothstein, H. R. (2021). Subgroup analyses. In Introduction to meta-analysis (2nd ed., pp. 161–195). John Wiley & Sons.

Bortz, W. W., Gautam, A., Tatar, D., & Lipscomb, K. (2020). Missing in measurement: Why identifying learning in integrated domains is so hard. Journal of Science Education and Technology, 29, 121–136. https://doi.org/10.1007/s10956-019-09805-8

Bransford, J. D., & Schwartz, D. L. (1999). Chapter 3: Rethinking transfer: A simple proposal with multiple implications. Review of Research in Education, 24(1), 61–100. https://doi.org/10.3102/0091732X024001061

Brennan, K., & Resnick, M. (2012). New frameworks for studying and assessing the development of computational thinking. In Proceedings of the 2012 Annual Meeting of the American Educational Research Association (pp. 1–25). Vancouver, BC. http://scratched.gse.harvard.edu/ct/files/AERA2012.pdf

Chen, H. E., Sun, D., Hsu, T. C., Yang, Y., & Sun, J. (2023a). Visualising trends in computational thinking research from 2012 to 2021: A bibliometric analysis. Thinking Skills and Creativity, 47, 101224. https://doi.org/10.1016/j.tsc.2022.101224

Chen, J., Wang, M., Kirschner, P. A., & Tsai, C.-C. (2018). The role of collaboration, computer use, learning environments, and supporting strategies in CSCL: A meta-analysis. Review of Educational Research, 88(6), 799–843. https://doi.org/10.3102/0034654318791584

Chen, P., Yang, D., Metwally, A. H. S., Lavonen, J., & Wang, X. (2023b). Fostering computational thinking through unplugged activities: A systematic literature review and meta-analysis. International Journal of STEM Education, 10, 47. https://doi.org/10.1186/s40594-023-00434-7

Cheng, L., Wang, X., & Ritzhaupt, A. D. (2023). The effects of computational thinking integration in STEM on students’ learning performance in K-12 education: A meta-analysis. Journal of Educational Computing Research, 61(2), 416–443. https://doi.org/10.1177/07356331221114183

Chongo, S., Osman, K., & Nayan, N. A. (2021). Impact of the plugged-in and unplugged chemistry computational thinking modules on achievement in chemistry. EURASIA Journal of Mathematics, Science and Technology Education, 17(4), em1953. https://doi.org/10.29333/ejmste/10789

Christensen, D., & Lombardi, D. (2023). Biological evolution learning and computational thinking: Enhancing understanding through integration of disciplinary core knowledge and scientific practice. International Journal of Science Education, 45(4), 293–313. https://doi.org/10.1080/09500693.2022.2160221

CSTA & ISTE. (2011). Operational definition of computational thinking for K–12 education. Retrieved from http://csta.acm.org/Curriculum/sub/CurrFiles/CompThinkingFlyer.pdf

Cuijpers, P., Weitz, E., Cristea, I. A., & Twisk, J. (2017). Pre-post effect sizes should be avoided in meta-analyses. Epidemiology and Psychiatric Sciences, 26(4), 364–368. https://doi.org/10.1017/S2045796016000809

Czocher, J. A., Melhuish, K., & Kandasamy, S. S. (2019). Building mathematics self-efficacy of STEM undergraduates through mathematical modelling. International Journal of Mathematical Education in Science and Technology, 51(6), 807–834. https://doi.org/10.1080/0020739X.2019.1634223

Day, S. B., & Goldstone, R. L. (2012). The import of knowledge export: Connecting findings and theories of transfer of learning. Educational Psychologist, 47(3), 153–176. https://doi.org/10.1080/00461520.2012.696438

Delen, I., & Sen, S. (2023). Effect of design-based learning on achievement in K-12 education: A meta-analysis. Journal of Research in Science Teaching, 60(2), 330–356. https://doi.org/10.1002/tea.21800

Denner, J., Werner, L., & Ortiz, E. (2012). Computer games created by middle school girls: Can they be used to measure understanding of computer science concepts? Computers & Education, 58(1), 240–249. https://doi.org/10.1016/j.compedu.2011.08.006

Doleck, T., Bazelais, P., Lemay, D. J., Saxena, A., & Basnet, R. B. (2017). Algorithmic thinking, cooperativity, creativity, critical thinking, and problem solving: Exploring the relationship between computational thinking skills and academic performance. Journal of Computers in Education, 4, 355–369. https://doi.org/10.1007/s40692-017-0090-9

Egger, M., Smith, G. D., Schneider, M., & Minder, C. (1997). Bias in meta-analysis detected by a simple, graphical test. British Medical Journal, 315(7109), 629–634. https://doi.org/10.1136/bmj.315.7109.629

Eidin, E., Bielik, T., Touitou, I., Bowers, J., McIntyre, C., Damelin, D., & Krajcik, J. (2024). Thinking in terms of change over time: Opportunities and challenges of using system dynamics models. Journal of Science Education and Technology, 33, 1–28. https://doi.org/10.1007/s10956-023-10047-y

Ezeamuzie, N. O., & Leung, J. S. C. (2022). Computational thinking through an empirical lens: A systematic review of literature. Journal of Educational Computing Research, 60(2), 481–511. https://doi.org/10.1177/07356331211033158

Falloon, G. (2016). An analysis of young students’ thinking when completing basic coding tasks using Scratch Jnr. on the iPad. Journal of Computer Assisted Learning, 32(6), 576–593. https://doi.org/10.1111/jcal.12155

Fanchamps, N. L. J. A., Slangen, L., Hennissen, P., & Specht, M. (2021). The influence of SRA programming on algorithmic thinking and self-efficacy using Lego robotics in two types of instruction. International Journal of Technology and Design Education, 31, 203–222. https://doi.org/10.1007/s10798-019-09559-9

Geary, D. C., Nicholas, A., Li, Y., & Sun, J. (2017). Developmental change in the influence of domain-general abilities and domain-specific knowledge on mathematics achievement: An eight-year longitudinal study. Journal of Educational Psychology, 109(5), 680–693. https://doi.org/10.1037/edu0000159

Grover, S., & Pea, R. (2013). Computational thinking in K–12: A review of the state of the field. Educational Researcher, 42(1), 38–43. https://doi.org/10.3102/0013189X12463051

Gualtieri, S., & Finn, A. S. (2022). The sweet spot: When children’s developing abilities, brains, and knowledge make them better learners than adults. Perspectives on Psychological Science, 17(5), 1322–1338. https://doi.org/10.1177/17456916211045971

Gutman, L. M., & Schoon, I. (2013). The impact of non-cognitive skills on outcomes for young people. University of London, Institute of Education.

Guven, G., Kozcu Cakir, N., Sulun, Y., Cetin, G., & Guven, E. (2022). Arduino-assisted robotics coding applications integrated into the 5E learning model in science teaching. Journal of Research on Technology in Education, 54(1), 108–126. https://doi.org/10.1080/15391523.2020.1812136

Hedges, L. V., & Olkin, I. (2014). Statistical methods for meta-analysis. Academic Press.

Hsu, T.-C., Abelson, H., Lao, N., & Chen, S.-C. (2021). Is it possible for young students to learn the AI-STEAM application with experiential learning? Sustainability, 13(19), 11114. https://doi.org/10.3390/su131911114

Hsu, T.-C., Chang, S.-C., & Hung, Y.-T. (2018). How to learn and how to teach computational thinking: Suggestions based on a review of the literature. Computers & Education, 126, 296–310. https://doi.org/10.1016/j.compedu.2018.07.004

Hurt, T., Greenwald, E., Allan, S., Cannady, M. A., Krakowski, A., Brodsky, L., Collins, M. A., Montgomery, R., & Dorph, R. (2023). The computational thinking for science (CT-S) framework: Operationalizing CT-S for K–12 science education researchers and educators. International Journal of STEM Education, 10, 1. https://doi.org/10.1186/s40594-022-00391-7

Hutchins, N. M., Biswas, G., Maróti, M., Lédeczi, Á., Grover, S., Wolf, R., Blair, K. P., Chin, D., Conlin, L., Basu, S., & McElhaney, K. (2020a). C2STEM: A system for synergistic learning of physics and computational thinking. Journal of Science Education and Technology, 29, 83–100. https://doi.org/10.1007/s10956-019-09804-9

Hutchins, N. M., Biswas, G., Zhang, N., Snyder, C., Lédeczi, Á., & Maróti, M. (2020b). Domain-specific modeling languages in computer-based learning environments: A systematic approach to support science learning through computational modeling. International Journal of Artificial Intelligence in Education, 30, 537–580. https://doi.org/10.1007/s40593-020-00209-z

Israel-Fishelson, R., & Hershkovitz, A. (2021). Micro-persistence and difficulty in a game-based learning environment for computational thinking acquisition. Journal of Computer Assisted Learning, 37(3), 839–850. https://doi.org/10.1111/jcal.12527

Israel-Fishelson, R., & Hershkovitz, A. (2022). Studying interrelations of computational thinking and creativity: A scoping review (2011–2020). Computers & Education, 176, 104353. https://doi.org/10.1016/j.compedu.2021.104353

Jesson, J., Matheson, L., & Lacey, F. M. (2011). Doing your literature review: Traditional and systematic techniques (1st ed.). SAGE Publications.

Jiang, S., & Wong, G. K. W. (2022). Exploring age and gender differences of computational thinkers in primary school: A developmental perspective. Journal of Computer Assisted Learning, 38(1), 60–75. https://doi.org/10.1111/jcal.12591

Jocius, R., O’Byrne, W. I., Albert, J., Joshi, D., Robinson, R., & Andrews, A. (2021). Infusing computational thinking into STEM teaching: From professional development to classroom practice. Educational Technology & Society, 24(4), 166–179.

Kafai, Y. B., & Proctor, C. (2022). A revaluation of computational thinking in K–12 education: Moving toward computational literacies. Educational Researcher, 51(2), 146–151. https://doi.org/10.3102/0013189X211057904

Kalelioglu, F., Gulbahar, Y., & Kukul, V. (2016). A framework for computational thinking based on a systematic research review. Baltic Journal of Modern Computing, 4(3), 583–596.

Kautz, T., Heckman, J. J., Diris, R., ter Weel, B., & Borghans, L. (2014). Fostering and measuring skills: Improving cognitive and non-cognitive skills to promote lifetime success (OECD Education Working Papers No. 110). OECD Publishing. https://doi.org/10.1787/5jxsr7vr78f7-en

Kelley, T. R., & Knowles, J. G. (2016). A conceptual framework for integrated STEM education. International Journal of STEM Education, 3, 11. https://doi.org/10.1186/s40594-016-0046-z

Kite, V., & Park, S. (2023). What’s computational thinking? Secondary science teachers’ conceptualizations of computational thinking (CT) and perceived barriers to CT integration. Journal of Science Teacher Education, 34(4), 391–414. https://doi.org/10.1080/1046560X.2022.2110068

Knochel, A. D., & Patton, R. M. (2015). If art education then critical digital making: Computational thinking and creative code. Studies in Art Education, 57(1), 21–38.

Korkmaz, Ö., Çakir, R., & Özden, M. Y. (2017). A validity and reliability study of the computational thinking scales (CTS). Computers in Human Behavior, 72, 558–569. https://doi.org/10.1016/j.chb.2017.01.005

Lai, R. P., & Ellefson, M. R. (2023). How multidimensional is computational thinking competency? A bi-factor model of the computational thinking challenge. Journal of Educational Computing Research, 61(2), 259–282. https://doi.org/10.1177/07356331221121052

Lai, X., & Wong, G. K. W. (2022). Collaborative versus individual problem solving in computational thinking through programming: A meta-analysis. British Journal of Educational Technology, 53(1), 150–170. https://doi.org/10.1111/bjet.13157

Lai, X., Ye, J., & Wong, G. K. W. (2023). Effectiveness of collaboration in developing computational thinking skills: A systematic review of social cognitive factors. Journal of Computer Assisted Learning, 39(5), 1418–1435. https://doi.org/10.1111/jcal.12845

Lechner, C. M., Gauly, B., Miyamoto, A., & Wicht, A. (2021). Stability and change in adults’ literacy and numeracy skills: Evidence from two large-scale panel studies. Personality and Individual Differences, 180, 110990. https://doi.org/10.1016/j.paid.2021.110990

Lee, I., Grover, S., Martin, F., Pillai, S., & Malyn-Smith, J. (2020). Computational thinking from a disciplinary perspective: Integrating computational thinking in K-12 science, technology, engineering, and mathematics education. Journal of Science Education and Technology, 29, 1–8. https://doi.org/10.1007/s10956-019-09803-w

Lee, I., & Malyn-Smith, J. (2020). Computational thinking integration patterns along the framework defining computational thinking from a disciplinary perspective. Journal of Science Education and Technology, 29, 9–18. https://doi.org/10.1007/s10956-019-09802-x

Leonard, J., Buss, A., Gamboa, R., Mitchell, M., Fashola, O. S., Hubert, T., & Almughyirah, S. (2016). Using robotics and game design to enhance children’s self-efficacy, STEM attitudes, and computational thinking skills. Journal of Science Education and Technology, 25, 860–876. https://doi.org/10.1007/s10956-016-9628-2

Li, F., Wang, X., He, X., Cheng, L., & Wang, Y. (2022). The effectiveness of unplugged activities and programming exercises in computational thinking education: A meta-analysis. Education and Information Technologies, 27, 7993–8013. https://doi.org/10.1007/s10639-022-10915-x

Li, X., Xie, K., Vongkulluksn, V., Stein, D., & Zhang, Y. (2023). Developing and testing a design-based learning approach to enhance elementary students’ self-perceived computational thinking. Journal of Research on Technology in Education, 55(2), 344–368. https://doi.org/10.1080/15391523.2021.1962453

Li, Y., & Anderson, J. (2020). STEM integration: Diverse approaches to meet diverse needs. In J. Anderson & Y. Li (Eds.), Integrated approaches to STEM education: An international perspective (pp. 15–20). Springer. https://doi.org/10.1007/978-3-030-52229-2_2

Li, Y., Schoenfeld, A. H., diSessa, A. A., Graesser, A. C., Benson, L. C., English, L. D., & Duschl, R. A. (2020a). Computational thinking is more about thinking than computing. Journal for STEM Education Research, 3, 1–18. https://doi.org/10.1007/s41979-020-00030-2

Li, Y., Schoenfeld, A. H., diSessa, A. A., Graesser, A. C., Benson, L. C., English, L. D., & Duschl, R. A. (2020b). On computational thinking and STEM education. Journal for STEM Education Research, 3, 147–166. https://doi.org/10.1007/s41979-020-00044-w

Lipsey, M. W., & Wilson, D. B. (2001). Practical meta-analysis. SAGE Publications Inc.

Liu, Z., & Jeong, A. C. (2022). Connecting learning and playing: The effects of in-game cognitive supports on the development and transfer of computational thinking skills. Educational Technology Research and Development, 70, 1867–1891. https://doi.org/10.1007/s11423-022-10145-5

Lobato, J. (2006). Alternative perspectives on the transfer of learning: History, issues, and challenges for future research. The Journal of the Learning Sciences, 15(4), 431–449. https://doi.org/10.1207/s15327809jls1504_1

Lu, C., Macdonald, R., Odell, B., Kokhan, V., Demmans Epp, C., & Cutumisu, M. (2022). A scoping review of computational thinking assessments in higher education. Journal of Computing in Higher Education, 34, 416–461. https://doi.org/10.1007/s12528-021-09305-y

Lyon, J. A., & Magana, A. J. (2021). The use of engineering model-building activities to elicit computational thinking: A design-based research study. Journal of Engineering Education, 110(1), 184–206. https://doi.org/10.1002/jee.20372

Ma, H., Zhao, M., Wang, H., Wan, X., Cavanaugh, T. W., & Liu, J. (2021). Promoting pupils’ computational thinking skills and self-efficacy: A problem-solving instructional approach. Educational Technology Research and Development, 69, 1599–1616. https://doi.org/10.1007/s11423-021-10016-5

Malyn-Smith, J., & Ippolito, J. (2011). Profile of a computational thinking enabled STEM professional in America’s workplaces: Research Scientist (Unpublished manuscript). Education Development Center, Inc.

Mayer, R. E. (2011). Multimedia learning and games. In S. Tobias & J. D. Fletcher (Eds.), Computer games and instruction (pp. 281–305). Information Age Publishing.

Mayer, R. E. (2015). On the need for research evidence to guide the design of computer games for learning. Educational Psychologist, 50(4), 349–353. https://doi.org/10.1080/00461520.2015.1133307

Melro, A., Tarling, G., Fujita, T., & Kleine Staarman, J. (2023). What else can be learned when coding? A configurative literature review of learning opportunities through computational thinking. Journal of Educational Computing Research, 61(4), 901–924. https://doi.org/10.1177/07356331221133822

Merino-Armero, J. M., González-Calero, J. A., & Cozar-Gutierrez, R. (2022). Computational thinking in K-12 education. An insight through meta-analysis. Journal of Research on Technology in Education, 54(3), 410–437. https://doi.org/10.1080/15391523.2020.1870250

Moher, D., Liberati, A., Tetzlaff, J., & Altman, D. G. (2010). Preferred reporting items for systematic reviews and meta-analyses: The PRISMA statement. International Journal of Surgery, 8(5), 336–341. https://doi.org/10.1016/j.ijsu.2010.02.007

Morris, S. B., & DeShon, R. P. (2002). Combining effect size estimates in meta-analysis with repeated measures and independent-groups designs. Psychological Methods, 7(1), 105. https://doi.org/10.1037/1082-989X.7.1.105

Ng, O. L., Leung, A., & Ye, H. (2023). Exploring computational thinking as a boundary object between mathematics and computer programming for STEM teaching and learning. ZDM Mathematics Education, 55, 1315–1329. https://doi.org/10.1007/s11858-023-01509-z

NGSS Lead States. (2013). Next generation science standards: For states, by states. The National Academies Press.

Nouri, J., Zhang, L., Mannila, L., & Norén, E. (2020). Development of computational thinking, digital competence and 21st century skills when learning programming in K-9. Education Inquiry, 11(1), 1–17. https://doi.org/10.1080/20004508.2019.1627844

OECD. (2018). Future of education and skills 2030: Conceptual learning framework. A literature summary for research on the transfer of learning (8th Informal Working Group Meeting, pp. 1–29). OECD Conference Centre, Paris, France.

Papert, S. A. (1980). Mindstorms: Children, computers, and powerful ideas. Basic Books.

Perkins, D. N., & Salomon, G. (1992). Transfer of learning. In T. N. Postlethwaite & T. Husen (Eds.), International Encyclopedia of Education (2nd ed., pp. 6452–6457). Pergamon Press.

Petersen, R. D., & Valdez, A. (2005). Using snowball-based methods in hidden populations to generate a randomized community sample of gang-affiliated adolescents. Youth Violence and Juvenile Justice, 3(2), 151–167. https://doi.org/10.1177/1541204004273316

Phillips, A. M., Gouvea, E. J., Gravel, B. E., Beachemin, P. H., & Atherton, T. J. (2023). Physicality, modeling, and agency in a computational physics class. Physical Review Physics Education Research, 19(1), 010121. https://doi.org/10.1103/PhysRevPhysEducRes.19.010121

Piatti, A., Adorni, G., El-Hamamsy, L., Negrini, L., Assaf, D., Gambardella, L., & Mondada, F. (2022). The CT-cube: A framework for the design and the assessment of computational thinking activities. Computers in Human Behavior Reports, 5, 100166. https://doi.org/10.1016/j.chbr.2021.100166

Pirolli, P., & Recker, M. (1994). Learning strategies and transfer in the domain of programming. Cognition and Instruction, 12(3), 235–275. https://doi.org/10.1207/s1532690xci1203_2

Polat, E., Hopcan, S., Kucuk, S., & Sisman, B. (2021). A comprehensive assessment of secondary school students’ computational thinking skills. British Journal of Educational Technology, 52(5), 1965–1980. https://doi.org/10.1111/bjet.13092

Popat, S., & Starkey, L. (2019). Learning to code or coding to learn? A systematic review. Computers & Education, 128, 365–376. https://doi.org/10.1016/j.compedu.2018.10.005

Rachmatullah, A., & Wiebe, E. N. (2022). Building a computational model of food webs: Impacts on middle school students’ computational and systems thinking skills. Journal of Research in Science Teaching, 59(4), 585–618. https://doi.org/10.1002/tea.21738

Rich, K. M., Spaepen, E., Strickland, C., & Moran, C. (2019). Synergies and differences in mathematical and computational thinking: Implications for integrated instruction. Interactive Learning Environments, 28(3), 272–283. https://doi.org/10.1080/10494820.2019.1612445

Rodríguez-Martínez, J. A., González-Calero, J. A., & Sáez-López, J. M. (2019). Computational thinking and mathematics using Scratch: An experiment with sixth-grade students. Interactive Learning Environments, 28(3), 316–327. https://doi.org/10.1080/10494820.2019.1612448

Román-González, M., Pérez-González, J. C., & Jiménez-Fernández, C. (2017). Which cognitive abilities underlie computational thinking? Criterion validity of the computational thinking test. Computers in Human Behavior, 72, 678–691. https://doi.org/10.1016/j.chb.2016.08.047

Román-González, M., Pérez-González, J. C., Moreno-León, J., & Robles, G. (2018). Extending the nomological network of computational thinking with noncognitive factors. Computers in Human Behavior, 80, 441–459. https://doi.org/10.1016/j.chb.2017.09.030

Rosenberg, M. S. (2005). The file-drawer problem revisited: A general weighted method for calculating fail-safe numbers in meta-analysis. Evolution, 59(2), 464–468. https://doi.org/10.1111/j.0014-3820.2005.tb01004.x

Rosenthal, R. (1979). The file drawer problem and tolerance for null results. Psychological Bulletin, 86, 638–641.

Sala, G., & Gobet, F. (2016). Do the benefits of chess instruction transfer to academic and cognitive skills? A meta-analysis. Educational Research Review, 18, 46–57. https://doi.org/10.1016/j.edurev.2016.02.002

Sala, G., & Gobet, F. (2017). Does far transfer exist? Negative evidence from chess, music, and working memory training. Current Directions in Psychological Science, 26(6), 515–520. https://doi.org/10.1177/0963721417712760

Scherer, R., Siddiq, F., & Sánchez Viveros, B. (2019). The cognitive benefits of learning computer programming: A meta-analysis of transfer effects. Journal of Educational Psychology, 111(5), 764–792. https://doi.org/10.1037/edu0000314

Scherer, R., Siddiq, F., & Sánchez Viveros, B. (2020). A meta-analysis of teaching and learning computer programming: Effective instructional approaches and conditions. Computers in Human Behavior, 109, 106349. https://doi.org/10.1016/j.chb.2020.106349

Selby, C. C., & Woollard, J. (2013). Computational thinking: The developing definition. Paper presented at the 18th Annual Conference on Innovation and Technology in Computer Science Education, Canterbury.

Sengupta, P., Kinnebrew, J. S., Basu, S., Biswas, G., & Clark, D. (2013). Integrating computational thinking with K-12 science education using agent-based computation: A theoretical framework. Education and Information Technologies, 18, 351–380. https://doi.org/10.1007/s10639-012-9240-x

Shamseer, L., Moher, D., Clarke, M., Ghersi, D., Liberati, A., Petticrew, M., Shekelle, P., & Stewart, L. A. (2015). Preferred reporting items for systematic review and meta-analysis protocols (PRISMA-P) 2015: Elaboration and explanation. BMJ, 349, g7647. https://doi.org/10.1136/bmj.g7647

Shute, V. J., Sun, C., & Asbell-Clarke, J. (2017). Demystifying computational thinking. Educational Research Review, 22, 142–158. https://doi.org/10.1016/j.edurev.2017.09.003

Singley, M. K., & Anderson, J. R. (1989). The transfer of cognitive skill. Harvard University Press.

Sun, L., Hu, L., & Zhou, D. (2021). Which way of design programming activities is more effective to promote K-12 students’ computational thinking skills? A meta-analysis. Journal of Computer Assisted Learning, 37(4), 1048–1062. https://doi.org/10.1111/jcal.12545

Sun, L., & Zhou, D. (2022). Effective instruction conditions for educational robotics to develop programming ability of K-12 students: A meta-analysis. Journal of Computer Assisted Learning, 39(2), 380–398. https://doi.org/10.1111/jcal.12750

Sung, W., Ahn, J., & Black, J. B. (2017). Introducing computational thinking to young learners: Practicing computational perspectives through embodiment in mathematics education. Technology, Knowledge and Learning, 22, 443–463. https://doi.org/10.1007/s10758-017-9328-x

Sung, W., & Black, J. B. (2021). Factors to consider when designing effective learning: Infusing computational thinking in mathematics to support thinking-doing. Journal of Research on Technology in Education, 53(4), 404–426. https://doi.org/10.1080/15391523.2020.1784066

Tang, X., Yin, Y., Lin, Q., Hadad, R., & Zhai, X. (2020). Assessing computational thinking: A systematic review of empirical studies. Computers & Education, 148, 103798. https://doi.org/10.1016/j.compedu.2019.103798

Tekdal, M. (2021). Trends and development in research on computational thinking. Education and Information Technologies, 26, 6499–6529. https://doi.org/10.1007/s10639-021-10617-w

Thomas, D. R., & Larwin, K. H. (2023). A meta-analytic investigation of the impact of middle school STEM education: Where are all the students of color? International Journal of STEM Education, 10, 43. https://doi.org/10.1186/s40594-023-00425-8

Tikva, C., & Tambouris, E. (2021a). A systematic mapping study on teaching and learning computational thinking through programming in higher education. Thinking Skills and Creativity, 41, 100849. https://doi.org/10.1016/j.tsc.2021.100849

Tikva, C., & Tambouris, E. (2021b). Mapping computational thinking through programming in K-12 education: A conceptual model based on a systematic literature review. Computers & Education, 162, 104083. https://doi.org/10.1016/j.compedu.2020.104083

Tsai, M.-J., Liang, J.-C., & Hsu, C.-Y. (2021). The computational thinking scale for computer literacy education. Journal of Educational Computing Research, 59(4), 579–602. https://doi.org/10.1177/0735633120972356

Tsai, M.-J., Liang, J.-C., Lee, S.W.-Y., & Hsu, C.-Y. (2022). Structural validation for the developmental model of computational thinking. Journal of Educational Computing Research, 60(1), 56–73. https://doi.org/10.1177/07356331211017794

Tsai, M.-J., Wang, C.-Y., & Hsu, P.-F. (2019). Developing the computer programming self-efficacy scale for computer literacy education. Journal of Educational Computing Research, 56(8), 1345–1360. https://doi.org/10.1177/0735633117746747

Tsai, Y.-L., & Tsai, C.-C. (2018). Digital game-based second-language vocabulary learning and conditions of research designs: A meta-analysis study. Computers & Education, 125, 345–357. https://doi.org/10.1016/j.compedu.2018.06.020

Tsarava, K., Moeller, K., Román-González, M., Golle, J., Leifheit, L., Butz, M. V., & Ninaus, M. (2022). A cognitive definition of computational thinking in primary education. Computers & Education, 179, 104425. https://doi.org/10.1016/j.compedu.2021.104425

van der Graaf, J., van de Sande, E., Gijsel, M., & Segers, E. (2019). A combined approach to strengthen children’s scientific thinking: Direct instruction on scientific reasoning and training of teacher’s verbal support. International Journal of Science Education, 41(9), 1119–1138. https://doi.org/10.1080/09500693.2019.1594442

Wang, C., Shen, J., & Chao, J. (2022a). Integrating computational thinking in STEM education: A literature review. International Journal of Science and Mathematics Education, 20, 1949–1972. https://doi.org/10.1007/s10763-021-10227-5

Wang, J., Zhang, Y., Hung, C. Y., Wang, Q., & Zheng, Y. (2022b). Exploring the characteristics of an optimal design of non-programming plugged learning for developing primary school students’ computational thinking in mathematics. Educational Technology Research and Development, 70, 849–880. https://doi.org/10.1007/s11423-022-10093-0

Waterman, K. P., Goldsmith, L., & Pasquale, M. (2020). Integrating computational thinking into elementary science curriculum: An examination of activities that support students’ computational thinking in the service of disciplinary learning. Journal of Science Education and Technology, 29, 53–64. https://doi.org/10.1007/s10956-019-09801-y

Weintrop, D., Beheshti, E., Horn, M., Orton, K., Jona, K., Trouille, L., & Wilensky, U. (2016). Defining computational thinking for mathematics and science classrooms. Journal of Science Education and Technology, 25, 127–147. https://doi.org/10.1007/s10956-015-9581-5

Weller, D. P., Bott, T. E., Caballero, M. D., & Irving, P. W. (2022). Development and illustration of a framework for computational thinking practices in introductory physics. Physical Review Physics Education Research, 18(2), 020106. https://doi.org/10.1103/PhysRevPhysEducRes.18.020106

Wiebe, E., Kite, V., & Park, S. (2020). Integrating computational thinking in STEM. In C. C. Johnson, M. J. Mohr-Schroeder, T. J. Moore, & L. D. English (Eds.), Handbook of Research on STEM Education (pp. 196–209). Taylor & Francis Group.

Wing, J. M. (2006). Computational thinking. Communications of the ACM, 49(3), 33–35. https://doi.org/10.1145/1118178.1118215

Wing, J. M. (2008). Computational thinking and thinking about computing. Philosophical Transactions of the Royal Society A: Mathematical, Physical and Engineering Sciences, 366(1881), 3717–3725. https://doi.org/10.1098/rsta.2008.0118

Wing, J. M. (2011). Research notebook: Computational thinking—What and why. The Link Magazine, 6, 20–23.

Woo, K., & Falloon, G. (2022). Problem solved, but how? An exploratory study into students’ problem solving processes in creative coding tasks. Thinking Skills and Creativity, 46, 101193. https://doi.org/10.1016/j.tsc.2022.101193

Xia, L., & Zhong, B. (2018). A systematic review on teaching and learning robotics content knowledge in K-12. Computers & Education, 127, 267–282. https://doi.org/10.1016/j.compedu.2018.09.007

Xu, W., Geng, F., & Wang, L. (2022). Relations of computational thinking to reasoning ability and creative thinking in young children: Mediating role of arithmetic fluency. Thinking Skills and Creativity, 44, 101041. https://doi.org/10.1016/j.tsc.2022.101041

Xu, Z., Ritzhaupt, A. D., Tian, F., & Umapathy, K. (2019). Block-based versus text-based programming environments on novice student learning outcomes: A meta-analysis study. Computer Science Education, 29(2–3), 177–204. https://doi.org/10.1080/08993408.2019.1565233

Ye, H., Liang, B., Ng, O.-L., & Chai, C. S. (2023). Integration of computational thinking in K-12 mathematics education: A systematic review on CT-based mathematics instruction and student learning. International Journal of STEM Education, 10, 3. https://doi.org/10.1186/s40594-023-00396-w

Ye, J., Lai, X., & Wong, G. K. W. (2022). The transfer effects of computational thinking: A systematic review with meta-analysis and qualitative synthesis. Journal of Computer Assisted Learning, 38(6), 1620–1638. https://doi.org/10.1111/jcal.12723

Yılmaz, F. G. K., & Yılmaz, R. (2023). Exploring the role of sociability, sense of community and course satisfaction on students’ engagement in flipped classroom supported by Facebook groups. Journal of Computers in Education, 10, 135–162. https://doi.org/10.1007/s40692-022-00226-y

Yin, Y., Hadad, R., Tang, X., & Lin, Q. (2020). Improving and assessing computational thinking in maker activities: The integration with physics and engineering learning. Journal of Science Education and Technology, 29, 189–214. https://doi.org/10.1007/s10956-019-09794-8

Yun, H. J., & Cho, J. (2022). Affective domain studies of K-12 computing education: A systematic review from a perspective on affective objectives. Journal of Computers in Education, 9, 477–514. https://doi.org/10.1007/s40692-021-00211-x

Zha, S., Morrow, D. A., Curtis, J., & Mitchell, S. (2021). Learning culture and computational thinking in a Spanish course: A development model. Journal of Educational Computing Research, 59(5), 844–869. https://doi.org/10.1177/0735633120978530

Zhan, Z., He, W., Yi, X., & Ma, S. (2022). Effect of unplugged programming teaching aids on children’s computational thinking and classroom interaction: With respect to Piaget’s four stages theory. Journal of Educational Computing Research, 60(5), 1277–1300. https://doi.org/10.1177/07356331211057143

Zhang, L., & Nouri, J. (2019). A systematic review of learning computational thinking through Scratch in K-9. Computers & Education, 141, 103607. https://doi.org/10.1016/j.compedu.2019.103607

Zhang, S., & Wong, G. K. W. (2023). Exploring the underlying cognitive process of computational thinking in primary education. Thinking Skills and Creativity, 48, 101314. https://doi.org/10.1016/j.tsc.2023.101314

Zhang, Y., Ng, O.-L., & Leung, S. (2023). Researching computational thinking in early childhood STE(A)M education context: A descriptive review on the state of research and future directions. Journal for STEM Education Research, 6, 427–455. https://doi.org/10.1007/s41979-023-00097-7

Zhao, L., Liu, X., Wang, C., & Su, Y.-S. (2022). Effect of different mind mapping approaches on primary school students’ computational thinking skills during visual programming learning. Computers & Education, 181, 104445. https://doi.org/10.1016/j.compedu.2022.104445

Zhong, H.-X., Lai, C.-F., Chang, J.-H., & Chiu, P.-S. (2023). Developing creative material in STEM courses using integrated engineering design based on APOS theory. International Journal of Technology and Design Education, 33, 1627–1651. https://doi.org/10.1007/s10798-022-09788-5

Acknowledgements

The authors are indebted to the editor and reviewers who greatly helped strengthen this paper.

Funding

This study was not supported by any funding sources.

Author information

Authors and Affiliations

Faculty of Education, University of Macau, Taipa, Macau, China

Zuokun Li & Pey Tee Oon

Contributions

All authors contributed to the writing of this manuscript. Zuokun Li designed the study, collected and analyzed the data, interpreted the results, and wrote the initial draft. Pey Tee Oon contributed to the conceptualization, writing, review, and editing of the manuscript, and supervised the project.

Corresponding author

Correspondence to Pey Tee Oon.

Ethics declarations

Ethics approval and consent to participate

Not applicable.

Consent for publication

All individuals identifiable in this manuscript have given their consent for publication.

Competing interests

The authors declare no potential conflict of interest in the work.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Supplementary Information

Additional file 1.

Additional file 2.

Rights and permissions

Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/.

About this article

Cite this article

Li, Z., & Oon, P. T. (2024). The transfer effect of computational thinking (CT)-STEM: A systematic literature review and meta-analysis. International Journal of STEM Education, 11, 44. https://doi.org/10.1186/s40594-024-00498-z

Received: 12 December 2023

Accepted: 02 August 2024

Published: 09 September 2024

DOI: https://doi.org/10.1186/s40594-024-00498-z

Keywords

  • Transfer effect
  • Systematic literature review
  • Meta-analysis
