Classroom Q&A

With Larry Ferlazzo.

In this EdWeek blog, an experiment in knowledge-gathering, Ferlazzo will address readers’ questions on classroom management, ELL instruction, lesson planning, and other issues facing teachers. Send your questions to [email protected]. Read more from this blog.

Eight Instructional Strategies for Promoting Critical Thinking


(This is the first post in a three-part series.)

The new question-of-the-week is:

What is critical thinking and how can we integrate it into the classroom?

This three-part series will explore what critical thinking is, whether it can be explicitly taught, and, if so, how teachers can do so in their classrooms.

Today’s guests are Dara Laws Savage, Patrick Brown, Meg Riordan, Ph.D., and Dr. PJ Caposey. Dara, Patrick, and Meg were also guests on my 10-minute BAM! Radio Show. You can also find a list of, and links to, previous shows here.

You might also be interested in The Best Resources On Teaching & Learning Critical Thinking In The Classroom.

Current Events

Dara Laws Savage is an English teacher at the Early College High School at Delaware State University, where she also serves as an instructional coach and lead mentor. Dara has been teaching for 25 years (career preparation, English, photography, yearbook, newspaper, and graphic design) and has presented nationally on project-based learning and technology integration:

There is so much going on right now and there is an overload of information for us to process. Did you ever stop to think how our students are processing current events? They see news feeds, hear news reports, and scan photos and posts, but are they truly thinking about what they are hearing and seeing?

I tell my students that my job is not to give them answers but to teach them how to think about what they read and hear. So what is critical thinking, and how can we integrate it into the classroom? There are just as many definitions of critical thinking as there are people trying to define it. However, the Critical Thinking Consortium focuses on the tools to create a thinking-based classroom rather than on a definition: “Shape the climate to support thinking, create opportunities for thinking, build capacity to think, provide guidance to inform thinking.” Using these four criteria and pairing them with current events, teachers can easily create learning spaces that thrive on thinking and keep students engaged.

One successful technique I use is the FIRE Write. Students are given a quote, a paragraph, an excerpt, or a photo from the headlines. Students are asked to Focus and respond to the selection for three minutes. Next, students are asked to Identify a phrase or section of the photo and write for two minutes. Third, students are asked to Reframe their response around a specific word, phrase, or section within their previous selection. Finally, students Exchange their thoughts with a classmate. Within the exchange, students also talk about how the selection connects to what we are covering in class.

There was a controversial Pepsi ad in 2017 involving Kendall Jenner and a protest with a police presence. The imagery in the ad was strikingly similar to a photo that went viral of a young lady standing opposite a police line. Using that image from a current event engaged my students and gave them the opportunity to think critically about events of the time.

Here are the two photos and a student response:

F - Focus on both photos and respond for three minutes

In the first picture, you see a strong and courageous black female, bravely standing in front of two officers in protest. She is risking her life to do so. Iesha Evans is simply proving to the world she does NOT mean less because she is black … and yet officers are there to stop her. She did not step down. In the picture below, you see Kendall Jenner handing a police officer a Pepsi. Maybe this wouldn’t be a big deal, except this was Pepsi’s weak, pathetic, and outrageous excuse of a commercial that belittles the whole movement of people fighting for their lives.

I - Identify a word or phrase, underline it, then write about it for two minutes

A white, privileged female in place of a fighting black woman was asking for trouble. A struggle we are continuously fighting every day, and they make a mockery of it. “I know what will work! Here Mr. Police Officer! Drink some Pepsi!” As if. Pepsi made a fool of themselves, and now their already dwindling fan base continues to ever shrink smaller.

R - Reframe your thoughts by choosing a different word, then write about that for one minute

You don’t know privilege until it’s gone. You don’t know privilege while it’s there—but you can and will be made accountable and aware. Don’t use it for evil. You are not stupid. Use it to do something. Kendall could’ve NOT done the commercial. Kendall could’ve released another commercial standing behind a black woman. Anything!

Exchange - Remember to discuss how this connects to our school song project and our previous discussions.

This connects two ways - 1) We want to convey a strong message. Be powerful. Show who we are. And Pepsi definitely tried. … Which leads to the second connection. 2) Not to mess up and offend anyone, as one alma mater had been linked to black minstrels. We want to be amazing, but we have to be smart and careful and make sure we include everyone who goes to our school and everyone who may go to our school.

As a final step, students read and annotate the full article and compare it to their initial response.

Using current events and critical-thinking strategies like FIRE writing helps create a learning space where thinking is the goal rather than a score on a multiple-choice assessment. Critical-thinking skills can cross over to any of students’ other courses and into life outside the classroom. After all, we as teachers want to help the whole student be successful, and critical thinking is an important part of navigating life after they leave our classrooms.


‘Explore-Before-Explain’

Patrick Brown is the executive director of STEM and CTE for the Fort Zumwalt school district in Missouri and an experienced educator and author:

Planning for critical thinking focuses on teaching the most crucial science concepts, practices, and logical-thinking skills as well as the best use of instructional time. One way to ensure that lessons maintain a focus on critical thinking is to focus on the instructional sequence used to teach.

Explore-before-explain teaching is all about promoting critical thinking for learners to better prepare students for the reality of their world. Having an explore-before-explain mindset means that in our planning, we prioritize giving students firsthand experiences with data, allow students to construct evidence-based claims that focus on conceptual understanding, and challenge students to discuss and think about the why behind phenomena.

Just think of the critical thinking that has to occur for students to construct a scientific claim. 1) They need the opportunity to collect data, analyze them, and determine how to make sense of what the data may mean. 2) With data in hand, students can begin thinking about the validity and reliability of their experience and the information collected. 3) They can consider what differences, if any, they might see if they completed the investigation again. 4) They can scrutinize outlying data points to determine whether they are an artifact of a misstep in the procedure, measuring device, or measurement, or a true difference that merits further exploration. All of these intellectual activities help them form a more robust understanding and are evidence of their critical thinking.

In explore-before-explain teaching, all of these hard critical-thinking tasks come before teacher explanations of content. Whether we use discovery experiences, problem-based learning, or inquiry-based activities, strategies that are geared toward helping students construct understanding promote critical thinking because students learn content by doing the practices valued in the field to generate knowledge.


An Issue of Equity

Meg Riordan, Ph.D., is the chief learning officer at The Possible Project, an out-of-school program that collaborates with youth to build entrepreneurial skills and mindsets and provides pathways to careers and long-term economic prosperity. She has been in the field of education for over 25 years as a middle and high school teacher, school coach, college professor, regional director of N.Y.C. Outward Bound Schools, and director of external research with EL Education:

Although critical thinking often defies straightforward definition, most in the education field agree it consists of several components: reasoning, problem-solving, and decisionmaking, plus analysis and evaluation of information, such that multiple sides of an issue can be explored. It also includes dispositions and “the willingness to apply critical-thinking principles, rather than fall back on existing unexamined beliefs, or simply believe what you’re told by authority figures.”

Despite variation in definitions, critical thinking is nonetheless promoted as an essential outcome of students’ learning—we want to see students and adults demonstrate it across all fields, professions, and in their personal lives. Yet there is simultaneously a rationing of opportunities in schools for students of color, students from under-resourced communities, and other historically marginalized groups to deeply learn and practice critical thinking.

For example, many of our most underserved students often spend class time filling out worksheets that promote high compliance but little engagement, inquiry, critical thinking, or creation of new ideas. At a time in our world when college and careers are critical for participation in society and the global, knowledge-based economy, far too many students struggle within classrooms and schools that reinforce low expectations and inequity.

If educators aim to prepare all students for an ever-evolving marketplace and develop skills that will be valued no matter what tomorrow’s jobs are, then we must move critical thinking to the forefront of classroom experiences. And educators must design learning to cultivate it.

So, what does that really look like?

Unpack and define critical thinking

To understand critical thinking, educators need to first unpack and define its components. What exactly are we looking for when we speak about reasoning or exploring multiple perspectives on an issue? How does problem-solving show up in English, math, science, art, or other disciplines—and how is it assessed? At Two Rivers, an EL Education school, the faculty identified five constructs of critical thinking, defined each, and created rubrics to generate a shared picture of quality for teachers and students. The rubrics were then adapted across grade levels to indicate students’ learning progressions.

At Avenues World School, critical thinking is one of the Avenues World Elements and is an enduring outcome embedded in students’ early experiences through 12th grade. For instance, a kindergarten student may be expected to “identify cause and effect in familiar contexts,” while an 8th grader should demonstrate the ability to “seek out sufficient evidence before accepting a claim as true,” “identify bias in claims and evidence,” and “reconsider strongly held points of view in light of new evidence.”

When faculty and students embrace a common vision of what critical thinking looks and sounds like and how it is assessed, educators can then explicitly design learning experiences that call for students to employ critical-thinking skills. This kind of work must occur across all schools and programs, especially those serving large numbers of students of color. As Linda Darling-Hammond asserts, “Schools that serve large numbers of students of color are least likely to offer the kind of curriculum needed to ... help students attain the [critical-thinking] skills needed in a knowledge work economy.”

So, what can it look like to create those kinds of learning experiences?

Designing experiences for critical thinking

After defining a shared understanding of “what” critical thinking is and “how” it shows up across multiple disciplines and grade levels, it is essential to create learning experiences that impel students to cultivate, practice, and apply these skills. There are several levers that offer pathways for teachers to promote critical thinking in lessons:

1. Choose Compelling Topics: Keep it relevant

A key Common Core State Standard asks students to “write arguments to support claims in an analysis of substantive topics or texts using valid reasoning and relevant and sufficient evidence.” That might not sound exciting or culturally relevant. But a learning experience designed for a 12th grade humanities class engaged learners in a compelling topic—policing in America—to analyze and evaluate multiple texts (including primary sources) and share the reasoning for their perspectives through discussion and writing. Students grappled with ideas and their beliefs and employed deep critical-thinking skills to develop arguments for their claims. Embedding critical-thinking skills in curriculum that students care about and connect with can ignite powerful learning experiences.

2. Make Local Connections: Keep it real

At The Possible Project, an out-of-school-time program designed to promote entrepreneurial skills and mindsets, students in a recent summer online program (modified from in-person due to COVID-19) explored the impact of COVID-19 on their communities and local BIPOC-owned businesses. They learned interviewing skills through a partnership with Everyday Boston, conducted virtual interviews with entrepreneurs, evaluated information from their interviews and local data, and examined their previously held beliefs. They created blog posts and videos to reflect on their learning and consider how their mindsets had changed as a result of the experience. In this way, we can design powerful community-based learning and invite students into productive struggle with multiple perspectives.

3. Create Authentic Projects: Keep it rigorous

At Big Picture Learning schools, students engage in internship-based learning experiences as a central part of their schooling. Their school-based adviser and internship-based mentor support them in developing real-world projects that promote deeper learning and critical-thinking skills. Such authentic experiences teach “young people to be thinkers, to be curious, to get from curiosity to creation … and it helps students design a learning experience that answers their questions, [providing an] opportunity to communicate it to a larger audience—a major indicator of postsecondary success.” Even in a remote environment, we can design projects that ask more of students than rote memorization and that spark critical thinking.

Our call to action is this: As educators, we need to make opportunities for critical thinking available not only to the affluent or those fortunate enough to be placed in advanced courses. The tools are available; let’s use them. Let’s interrogate our current curriculum and design learning experiences that engage all students in real, relevant, and rigorous experiences that require critical thinking and prepare them for promising postsecondary pathways.


Critical Thinking & Student Engagement

Dr. PJ Caposey is an award-winning educator, keynote speaker, consultant, and author of seven books who currently serves as the superintendent of schools for the award-winning Meridian CUSD 223 in northwest Illinois. You can find PJ on most social-media platforms as MCUSDSupe:

When I start my keynote on student engagement, I invite two people up on stage and give them each five paper balls to shoot at a garbage can conveniently placed on stage. Contestant One shoots their shot, and the audience gives approval. Four out of five is a heckuva score. Then, just before Contestant Two shoots, I blindfold them and start moving the garbage can back and forth. I usually try to ensure that they at least make one of their shots. Nobody is successful in this unfair environment.

I thank them and send them back to their seats and then explain that this little activity was akin to student engagement. While we all know we want student engagement, we are shooting at different targets. More importantly, for teachers, it is near impossible for them to hit a target that is moving and that they cannot see.

Within the world of education, and particularly as educational leaders, we have failed to simplify what student engagement looks like. And it is impossible to define or articulate what student engagement looks like if we cannot clearly articulate what critical thinking is and what it looks like in a classroom. Because, simply, without critical thought, there is no engagement.

The good news here is that critical thought has been defined and placed into taxonomies for decades already. This is not something new and not something that needs to be redefined. I am a Bloom’s person, but there is nothing wrong with DOK or some of the other taxonomies, either. To be precise, I am a huge fan of Daggett’s Rigor and Relevance Framework. I have used that as a core element of my practice for years, and it has shaped who I am as an instructional leader.

So, in order to explain critical thought, a teacher or a leader must become familiar with these tried-and-true taxonomies. Easy, right? Yes, sort of. The issue is not understanding what critical thought is; it is the ability to integrate it into the classroom. In order to do so, there are four key steps every educator must take.

PLANNING

  • Integrating critical thought/rigor into a lesson does not happen by chance; it happens by design. Planning for critical thought and engagement is much different from planning for a traditional lesson. In order to plan for kids to think critically, you have to provide a base of knowledge and excellent prompts to allow them to explore their own thinking in order to analyze, evaluate, or synthesize information.
  • SIDE NOTE – Bloom’s verbs are a great way to start when writing objectives, but true planning will take you deeper than this.

QUESTIONING

  • If the questions and prompts given in a classroom have correct answers or if the teacher ends up answering their own questions, the lesson will lack critical thought and rigor.
  • Script five questions forcing higher-order thought prior to every lesson. Experienced teachers may not feel they need this, but it helps to create an effective habit.
ASSESSMENT

  • If lessons are rigorous and assessments are not, students will do well on their assessments, and that may not be an accurate representation of the knowledge and skills they have mastered. If lessons are easy and assessments are rigorous, the exact opposite will happen. When deciding to increase critical thought, it must happen in all three phases of the game: planning, instruction, and assessment.

TALK TIME / CONTROL

  • To increase rigor, the teacher must DO LESS. This feels counterintuitive but is accurate. Rigorous lessons involving tons of critical thought must allow students to work on their own, collaborate with peers, and connect their ideas. This cannot happen in a room that is silent except for the teacher talking. In order to increase rigor, decrease talk time and become comfortable with less control. Asking questions and giving prompts that lead to no single correct answer also means less control. This is a tough ask for some teachers. Explained differently, if you assign one assignment and get 30 very similar products, you have most likely assigned a low-rigor recipe. If you assign one assignment and get multiple varied products, then the students have had a chance to think deeply, and you have successfully integrated critical thought into your classroom.


Thanks to Dara, Patrick, Meg, and PJ for their contributions!

Please feel free to leave a comment with your reactions to the topic or directly to anything that has been said in this post.

Consider contributing a question to be answered in a future post. You can send one to me at [email protected]. When you send it in, let me know if I can use your real name if it’s selected or if you’d prefer to remain anonymous and have a pseudonym in mind.

You can also contact me on Twitter at @Larryferlazzo.

Education Week has published a collection of posts from this blog, along with new material, in an e-book form. It’s titled Classroom Management Q&As: Expert Strategies for Teaching.

Just a reminder: You can subscribe and receive updates from this blog via email. (The RSS feed for this blog, and for all Ed Week articles, has been changed by the new redesign—new ones won’t be available until February.) And if you missed any of the highlights from the first nine years of this blog, you can see a categorized list below.

  • This Year’s Most Popular Q&A Posts
  • Race & Racism in Schools
  • School Closures & the Coronavirus Crisis
  • Classroom-Management Advice
  • Best Ways to Begin the School Year
  • Best Ways to End the School Year
  • Student Motivation & Social-Emotional Learning
  • Implementing the Common Core
  • Facing Gender Challenges in Education
  • Teaching Social Studies
  • Cooperative & Collaborative Learning
  • Using Tech in the Classroom
  • Student Voices
  • Parent Engagement in Schools
  • Teaching English-Language Learners
  • Reading Instruction
  • Writing Instruction
  • Education Policy Issues
  • Differentiating Instruction
  • Math Instruction
  • Science Instruction
  • Advice for New Teachers
  • Author Interviews
  • Entering the Teaching Profession
  • The Inclusive Classroom
  • Learning & the Brain
  • Administrator Leadership
  • Teacher Leadership
  • Relationships in Schools
  • Professional Development
  • Instructional Strategies
  • Best of Classroom Q&A
  • Professional Collaboration
  • Classroom Organization
  • Mistakes in Education
  • Project-Based Learning

I am also creating a Twitter list including all contributors to this column.

The opinions expressed in Classroom Q&A With Larry Ferlazzo are strictly those of the author(s) and do not reflect the opinions or endorsement of Editorial Projects in Education, or any of its publications.


The Instructional Designer’s Guide to Critical Thinking


As instructional designers, we have always exhibited critical-thinking prowess without realizing how important it is in the larger scheme of things. Now, anybody would assume that this makes us natural critical thinkers, who hold the key to bridging the vast knowledge gap among people around the globe. Right? Well, that’s exactly where we need to stop and do a reality check!

As instructional designers, we divide our learning time among three broad areas:

1. Researching: This includes reading, asking questions of experts, and collecting information based on facts.

2. Processing: This comprises evaluating and ranking information according to importance, with the most important at the top and the least important at the bottom. For example, putting the lead (who, what, when, where, why, and how) at the very top; placing the body (facts and further information, revealed in order of importance) next; and keeping the fluffy stuff (little bits of information fading into oblivion) at the end.

3. Output: Writing and sharing our thoughts.

The majority of our instructional decisions are a result of the chosen source content. However, content-heavy sources should not drive instruction, since most of them are not structured to enhance critical thinking in the subject. Our decisions about the structure and instructional strategies of the learning content should result from our most fundamental objectives in designing any solution.

Instructional design involves two deeply interwoven parts: structure and strategy. Structure involves the ‘what’ part of the course and includes: What (content) am I going to teach? What questions/problems/concepts will be central to the course? What amount of information will students need to access? What will be the reference point for the learners? What is my understanding of the course? What overall plan should I follow? etc.

Strategy involves the ‘how’ part of the course and includes: How will I teach so that the outlined structure works? How will I get the students to be actively involved? How will I get them to develop essential insights, understandings, knowledge, and ability? How will I get them to learn to provide logical answers to questions on any particular concept they have learned?

Once we have decided upon the most basic structure and substructures of our learning material, we must focus on the strategies we will use to drive that structure home. We should aim to use strategies that serve two different purposes. The first is to create a learning solution that focuses on the daily or episodic tasks of the learners. The second is to divide the learners’ tasks into simple and complex ones and then, using the Socratic method for the complex tasks, ask continual probing questions to explore the underlying beliefs that shape the learners’ views and opinions. We should ideally focus on giving learners the right questions, not answers. This enables them to read critically and develop self-assessment skills. Such a learning solution would have multiple parts and would therefore often require an extended period of time to be carried out effectively.

All Set to Work on Your Critical Thinking Skills? With all that said, critical thinking is what separates effective instructional designers from ineffective ones. So here are some quick, fun exercises for you to continually develop your critical-thinking skills and become a better critical thinker.

As instructional designers, we should think about instruction in both structural and strategic ways. This will enable us to move away from the didactic method and the ineffective teaching that invariably accompanies it toward active learning through critical thinking. However, our learning solutions will not be transformed simply because we believe in the philosophical value of critical thinking. We must strive to continually find innovative and effective ways to bring it into practical instruction, both structurally and strategically.

To read more about critical thinking in instructional design, check out our article Is Critical Thinking the Key to Instructional Design (and a Better World)? And don’t stop there: read about creativity of thought, including checklists to improve creativity, here.


The Link Between Critical Thinking and Effective Design

Brigg Patten : Oct 18, 2016 2:00:00 PM


In order to make it in the proverbial real world, students need to develop critical thinking skills, and they need a curriculum that will foster these skills. This principle is easily applied to any learning situation, whether in school or business.

It All Starts with Design

The best teacher in the world will struggle if their curriculum is not developed in a way that will help students learn. In addition to the basics and standard procedures, students of all ages need to learn how to develop their critical thinking skills and learn to approach problems in a flexible and creative way.

Clearly, this skill is not going to be honed by doing dry math problems and worksheets. However, it can be hard for a teacher to know what will be effective. This is where curriculum design companies enter the picture.

A New Type of Lesson Plan

Many curriculum design companies are now working on developing these skills through challenging and creative methods. Gone are the textbooks and dusty worksheets of yesteryear; say hello to tablet-based lessons and flexible planning. The advent of the internet has created a huge space for students and trainees of all ages to access all kinds of material and teach themselves in a more self-directed manner.

Self-Determination and Learning

It is vital for learners to have some say in their lessons if they are going to learn anything. As anyone with a child can tell you, nothing will get a kid to do something they don't want to do. Therefore, lesson plans need to be tailored to each person’s individual interests and ways of learning, so they feel empowered and capable of learning at their highest potential. Additionally, students need the resources to help themselves when they run into trouble, so that their sense of self-sufficiency and agency can grow.

There is a lot of discussion of "grit" in education, which is a student's ability to face tough challenges. However, grit is not an intrinsic ability, and the appropriate lesson plan goes a long way toward fostering this toughness. With the wrong plan, no amount of gritted teeth will truly empower students to learn.

This article offers some further insight into what makes an effective lesson plan and curriculum design. It develops one central idea: people need to develop their critical thinking skills instead of memorizing facts and procedures. The rationale is simple, according to the authors. While facts and simple procedures have a fairly straightforward application, that application is too specific and is not useful without broader critical thinking skills. Therefore, the development of more generalized, broad-based abilities to reason and problem-solve is the foundation of being an effective learner.

Problems out in the real world are never as straightforward as a mathematics test, and this disconnect can have a negative impact on learners trying to apply their skills in real life. As a result, instructional design companies need to focus on taking these low-level skills and teaching students to apply them to situations in the real world. This can be messy and difficult, but the payoff is tremendous for both students and teachers. Students find themselves to be more capable and self-sufficient, prepared to tackle the real world, and teachers get the satisfaction of knowing that their work is not going to waste.


13 Towards a Critical Instructional Design Framework

Katrina Wehr

As Costanza-Chock (2020) points out in the concluding chapter of Design Justice, “we urgently need more critical analysis in every design domain.” What critical analysis might look like for instructional design, however, is complicated for practitioners who are rarely afforded final authority or power over the design itself. While existing scholarship in this area offers a variety of approaches and considerations, the literature does not point to a single ideal framework or list of steps for assessing instructional design in terms of inclusivity, power, equity, or justice. In order to get closer to critical practice of instructional design, we in the field must blaze the trail ourselves. In this chapter, I explore the ways power, process, and positionality influence course design from a perspective many instructional design professionals are familiar with: an inherited course, that is, one we did not design ourselves but are now tasked with maintaining and improving.

For practitioner-researchers like myself, this leaves room for exploration into how one might critically analyze an existing instructional design in an effort to develop a framework for designing more just learning experiences from the start of the process. The more we practice these types of design approaches, the more mature the processes will become (Costanza-Chock, 2020). To begin, I decided to analyze a course in my instructional design portfolio based on existing critical scholarship in order to practice interrogating these concepts as they play out in a course design.

The following work is an effort to create salient applications of themes identified through research and practice regarding design justice in instructional design contexts. In a 2020 article, Collier draws on the writings of various educators who implore us to approach critical analysis of designs by working to uncover “what’s wrong” in order to take action against marginalizing designs in education. This paper draws on critical scholarship in the learning sciences to guide the analysis and to help other instructional designers focus their own critical analyses of existing course designs by providing examples of my thinking about these ideas in relation to specific elements of an online course. I also attempt to synthesize my process into a framework as a starting point for improvement in future critical analysis cycles. At the conclusion of this paper, I summarize my discoveries and reflect on the process and my role as an instructional designer.

Framing Concepts

Instructional designers are well positioned to influence the design of learning environments, and we should be “accountable for the social and political consequences” of our work (Barab et al., 2007, p. 296). I view my role in this critical analysis as similar to that of Barab et al. (2007). I aim to “position [myself] in a manner that will attune [me] to extant issues and highlight them among the community” (Barab et al., 2007, p. 281). The community for this work includes not only instructional designers on my team and instructors who teach this course, but anyone interested in critically analyzing existing instructional designs and how their discoveries are connected to, and could be changed by, the processes they follow to design instruction.

To begin the analysis, a review of critical learning sciences research was conducted to derive a framework to evaluate the instructional design of the focal course in this analysis. “Questions of power and ideology come to center stage in the decisions that educational designers make” (Barab et al., 2007, p. 291), and it is important that this framework is a flexible one that could potentially be utilized to conduct a similar analysis of other courses. This effort attempts to critically interrogate an existing design with the goal of uncovering areas for improvement, or, as Collier (2020) writes, to discover “what’s wrong” with a design.

Taylor (2018) outlines some commitments to ethical teaching and research, such as the commitment to “foregrounding issues of historicity, race, power, and privilege in the curriculum I teach and/or design” (Gutiérrez & Vossoughi, 2010, in Taylor, 2018, p. 196). Taylor identifies some questions that guide this commitment, which I find particularly valuable to this work: “What voices are unaccounted for here? Why might that be? How can I/we fix that?” (Taylor, 2018, p. 196). While these questions of representation won’t reveal every opportunity for improvement, beginning a critical analysis with this mindset will uncover details about the course design that can lead to other avenues of critique.

Barab et al. highlight power and its role in design. Indeed, power is conveyed, wielded, and distributed in different ways in a learning design. Esmonde (2016) discusses the relationship between power and artifacts, especially in a learning environment, and cites the example of curriculum standards and how their existence shapes the way K-12 teachers in the U.S. not only teach students, but how their performances are evaluated as well. Similarly, a learning design in an online environment in higher education generally includes predetermined learning objectives and disciplinary practices dictating how learners participate and develop their skills, or in other words, what types of practices and knowledge are considered valuable and worthy. Examining these arrangements of power and authority in a learning design is crucial to critical practice of instructional design.

Questions of identity and values are also essential to a productive critical instructional design effort. When considering identity and how it is relevant in learning, Vakil (2020) implores researchers to consider not only who students are in the present, but also who they have the potential to become as they learn. In practice, this means thinking about the diverse racial and gender identities taken up by learners and how those identities are reflected, or not, in an instructional design. Drawing on Wenger (1998), who stated, “learning is an experience of identity,” Vakil argues that as learners’ identities change, those shifts “give rise to new ways of making meaning of and interacting with the world,” as well as new relationships with the world (p. 92). But throughout this process of shifting identity, learners are also considering what their disciplinary learning means for their “possible futures” (Vakil, 2020, p. 93). As a result, learners may question the narratives surrounding the political and ethical values of their chosen disciplines, such as whose perspectives are considered expert lenses in the field, leading them to “consider the kind of person one has to be, or become, in order to participate in the communities of practice explicitly or implicitly associated with particular forms or domains of knowledge” (Vakil, 2020, p. 93). When disciplinary values align with or conflict with learners’ own, the impacts on their disciplinary learning and their identities can be positive or negative, which makes ethical identity development and ethical values important commitments to this critical analysis.

To summarize, the extant literature points to three commitments, or for the purposes of this analysis, critical focuses, that create a map through this analysis. Taylor’s (2018) questions of representation, Esmonde’s (2016) ideas around power, and Vakil’s (2020) concept of disciplinary values serve as themes for exploring how a course design could be improved for more just outcomes.

A Framework for Critical Instructional Design

Using these critical focuses to guide the analysis, I also turned to Vossoughi and Gutiérrez (2016) to identify specific units of analysis within the design. In their discussion of critical pedagogy, Vossoughi and Gutiérrez (2016) challenge researchers to consider the “how” of teaching rather than “what is to be taught” (p. 143). This switch in focus calls for critical analysis of how learning is organized, how practices situate power and ideology, how social relations are established, and “how tools expand or limit opportunities for development of critical thought” (p. 143). Based on my experience as an instructional designer of online learning experiences, these units of analysis can be neatly aligned to the main component parts of a learning design.

The way learning is organized could include not only the linear progression of a course as laid out in the syllabus, but also the learning management system (LMS) and choices about how learning is presented in a digital space with regard to features enabled or disabled in a given design. When considering power and ideology embedded in practices, I look to examine the activities and projects learners are completing. Social relations between instructor and learners, and between learners, can also be analyzed based on the practices included in a learning design and can be observed by looking at discussion posts and announcements, among other artifacts. Examining the course policies and procedures around topics like late submissions and test taking can also answer questions regarding power and values. Finally, the tools, such as software, LMS integrations, and others, that are utilized both by the instructor and learners play an important role in online learning experiences and are important to examine in a critical analysis. These units of analysis identified by Vossoughi and Gutiérrez serve as focal points for uncovering data related to the critical focuses identified previously.
The table below organizes this framework and includes the types of course data I will analyze. “Units of analysis” has become “sites of analysis” to clarify that those design components are where one should look for evidence related to the critical focuses. The following analysis examines a foundational course that serves as an entry point into a completely asynchronous bachelor’s degree program at a large research institution in the U.S. I used the master course space for the purposes of this work and investigated these areas of focus on my own in an effort to develop this framework, but in the future, this process could and should occur in partnership with other instructional designers and even faculty partners. Next, I will highlight specific examples from the focal course and expand on this framework, followed by suggestions for improvement or practices to consider replicating in other learning designs.

Critical Focuses

  • Disciplinary values & identity: Who do learners become in this discipline? How do identities change?
  • Representation: Who is here? Who isn’t here? Who should be here?
  • Power: Who matters? What matters? Who decides?

Sites of Analysis

  • Organization: content, syllabus, procedures, policies, reference materials, books, example work from previous learners
  • Social relations: learner/learner, instructor/learner
  • Tools: LMS, third-party LMS tools, Studio, software
  • Practices: learning goals, activities such as discussions & peer reviews, projects & assignments

Table 1: Draft Critical Instructional Design Framework
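For practitioners who want a working artifact, the framework in Table 1 could be sketched as a simple data structure for collecting observations. This is a hypothetical illustration of my own, not part of the chapter's method, and all names in it are assumptions:

```python
# Hypothetical sketch (not from the chapter): the draft framework as a grid of
# (critical focus, site of analysis) cells, each holding free-form observations.

CRITICAL_FOCUSES = [
    "representation",       # Taylor (2018): who is here? who isn't? who should be?
    "power",                # Esmonde (2016): who matters? what matters? who decides?
    "disciplinary values",  # Vakil (2020): who do learners become in this discipline?
]

SITES_OF_ANALYSIS = [
    "organization",      # content, syllabus, policies, reference materials
    "social relations",  # learner/learner and instructor/learner interactions
    "tools",             # LMS, third-party integrations, studio, software
    "practices",         # learning goals, activities, projects & assignments
]

def empty_framework():
    """One empty list of observations per (focus, site) cell."""
    return {(focus, site): []
            for focus in CRITICAL_FOCUSES
            for site in SITES_OF_ANALYSIS}

notes = empty_framework()
notes[("representation", "organization")].append(
    "cited scholars skew heavily white and American/European"
)
notes[("power", "social relations")].append(
    "instructor sets critique guidelines and decides what counts as acceptable"
)

# Cells with no recorded observations mark where the analysis still has gaps.
gaps = sorted(cell for cell, observations in notes.items() if not observations)
print(len(gaps))  # 10 of the 12 cells remain unexamined in this toy example
```

Treating the framework as a grid of cells makes the overlap discussed later visible: the same observation often belongs in several cells at once.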

Organization

While many design components of any course would fit the organization category of analysis, for the sake of brevity this chapter will be limited to one or two components from each category. Of all the components that organize learning in the course, I focus here on the content. Since content itself is a broad term, this paper treats reference materials such as books, instructor-provided documentation, and examples of work as the primary components of course content.

The focal course is based in the arts and is made up of five lessons. Each lesson includes reading a selection of chapters from the required text as well as from a second book that serves as a critical reflection on the role of the discipline in everyday life. The lessons also contain other reference materials in formats such as audio podcasts and videos. In addition, each lesson contains some form of practice activity, and the course culminates in the completion of three projects. An outline of the topics covered in the course is included below.

  • Design Thinking – systems thinking, critical thinking, design process
  • Visual and Interaction Design – semiotics, inclusive design, critical design, visual design, identity design
  • Storytelling – structure, development, character
  • Open Design
  • Self Design

At first glance, the organization of learning as it is reflected in these materials answers some of the guiding questions about disciplinary values and learner identity. Learners are introduced to the idea of using discipline-based insights as a frame to understand complex problems and develop comprehensive, critical solutions. As an introductory course that ushers students into the discipline at this university’s art school, the course topics prioritize key concepts alongside critical concepts relevant to the field, positioning these issues as foundational to the discipline. If they were not already, learners in this discipline will become practitioners who are at least made aware of the importance of these topics and will be expected to incorporate them into their practices within the course. A noteworthy observation about this course involves the perspectives from which these critical concepts are presented, a matter that will be addressed later.

Although I was not the instructional designer in charge of this course’s initial development, I am aware of the processes and decision making that went into it as a member of the unit that supported the development, and this is where the questions of power regarding the organization of learning are answered. In my department, the instructor is the ultimate authority on what content is included in a course, and in what order, which has the tradeoff of leaving out the voices of learners and other instructors within the department. This arrangement creates an additional barrier for instructional designers and our ability to make positive changes toward more equitable and inclusive instruction. When control over the course rests so singularly in the hands of one individual, there is a risk of replicating that instructor’s ways of knowing and being within a discipline, for better or worse. In the case of our focal course, the instructor chose to prioritize inclusive, critical discipline-relevant concepts alongside foundational disciplinary concepts, but not every instructor would have made that same choice. An improvement when it comes to power and authority in learning designs across the board would be to consider who the stakeholders are more broadly and to include additional stakeholders in the process, such as other faculty, a department chair, or even students, to ensure that voices other than the instructor’s are present in the course content.

Finally, representation in the organization of learning in the course can be analyzed by examining the authors who are cited in the course to learn who is present and who is missing. As part of an independent project for another class, a past student audited representation in the content of all the courses they had taken so far in their program of study, including this one, and kindly shared their data. I verified these numbers and share them here with permission and gratitude to the student, who chose not to be identified. Of the 59 scholars whose work appears in the course, 58 are white and of American or European descent (seven of them white women), and one is a Hispanic man. To connect this oversight to an earlier point: bringing more voices into the learning design process might have illuminated this lack of representation much earlier.
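As a rough back-of-the-envelope check on the tally above (my own computation, not part of the student's project), the counts can be turned into proportions:

```python
# Back-of-the-envelope check of the representation tally reported above.
total_scholars = 59
white_american_or_european = 58
hispanic_men = 1
white_women = 7  # a subset of the 58 white scholars

# The two mutually exclusive groups should account for every cited scholar.
assert white_american_or_european + hispanic_men == total_scholars

share_white = white_american_or_european / total_scholars
share_women = white_women / total_scholars
print(f"{share_white:.0%} white, {share_women:.0%} women")  # 98% white, 12% women
```

Seeing the counts as percentages makes the skew concrete: nearly the entire cited canon of this course comes from one demographic group.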

Social relations

Next, I will analyze how power and identity development are intertwined in the social relations supported in the learning design. Vossoughi and Gutiérrez don’t offer a specific definition of social relations in their chapter, so I lean on the instructional design definition of social interaction in online courses which centers on interaction among classmates and instructors. By this definition, there are two primary social relations that are specific to this class, which are learner-to-learner interactions and learner-instructor interactions.

There are many opportunities for learner-instructor interaction in the course design, as instructors provide feedback and grades for every activity learners complete, and learners are invited to use the commenting feature in the LMS to communicate with instructors regarding submissions. However, there is only one formalized opportunity for peer-peer interaction where learners can offer each other feedback and interact. Peer feedback in the form of studio critique has its own dedicated section in the course policies and procedures which outlines how feedback should be given and received, thus putting up boundaries around how students interact with each other during these activities. Such guidance cites the “critique sandwich” method of sandwiching suggestions for improvement between one to two positive comments. The critique guidance also offers tips on how to make this process useful: “If someone gives you ambiguous feedback, this means that they can intuitively see a weakness but might not know why something isn’t working. You should follow up with their comments with probing questions to better understand their perspective.” Learners are graded on critique as participation, so it can be assumed to be a disciplinary value in the design discipline, and learners could anticipate becoming comfortable with the idea of critiquing their peers with advice and criticism in the spirit of improvement.

Critique is a commonplace teaching strategy, especially in arts-based disciplines, but the process of giving and receiving feedback in this way is a practice entrenched in a culture that is worth questioning. The “critique sandwich” structure in particular does not allow for learners to express their positionality when providing a response to classmates’ work, but rather encourages students to make a value judgment on others’ work as good or bad, a practice with ties to white supremacy culture (Okun, 1999). While boundaries and structure have a beneficial place in a foundational course, an approach to critique that encourages appreciation for classmates’ efforts and recognizes there are a multitude of ways to provide substantive feedback could be a more productive method.

Critique sessions may be viewed as an opportunity for learners to demonstrate their identity development as an artist, and possibly distribute expertise among the class rather than looking at the instructor as the sole expert. Nowhere in the course is it explicitly stated that an instructor’s feedback during critique carries more weight or should be treated differently from peers, but since the instructor is the one handing out grades, it isn’t unreasonable to assume that instructor feedback may be perceived as non-negotiable. While the critique process may seem like a way for learners to practice being experts and hold some of the power within the course, ultimately the instructor likely maintains the most power in this activity as a result of setting the critique guidelines and also determining what an acceptable critique is. These social relations as designed are more related to identity development and practicing using disciplinary language and ways of interacting than they are designed to disrupt the power balance in the course.

Tools

The next focal point in this analysis is the tools utilized in the course learning design. Tools can both constrain and empower, and in an online learning experience, tools mediate learners’ interactions with learning materials, classmates, and their instructor. Since tools are so ubiquitous in online learning and mediate nearly all aspects of student learning, analyzing the values, power, and opportunities for representation that tools do and do not provide is essential to this work. There are several software tools learners utilize to complete projects, including InVision App, Twine, and Adobe Spark, all of which are available for use without purchase. The use of publicly available tools in a foundations course could convey to learners that specialty software is not necessarily required to do real design work, and dispel any potential notion that using higher-cost tools produces higher-quality work. On the other hand, these tools are very specific in nature and may constrain the projects learners produce in ways that are counterproductive to learning about design.

In addition to the software students use to produce projects, there are tools that students utilize to access course materials, submit assignments, and interact. This course is unique in that the learning design consists of three systems that work together to provide different functions. The LMS (Canvas) serves as the course home base and grading center: it is supported by the university and tied directly to the student information system, which allows for seamless access without requiring students to create multiple accounts, and it provides customizable options for learners to set up notifications that remind them of due dates and alert them to instructor communications. Tied into this system is the virtual studio tool, which sits outside of Canvas specifically for critique of in-progress work. The virtual studio tool was developed to provide a visual-based discussion area for students to share many different formats of work, such as audio/video files, large images, and animations. Third, the course uses GitHub to manage updates to the content and deliver course pages in order to maintain consistency across concurrent offerings. The GitHub site is accessed when learners click a link from the course Canvas site and is positioned as an electronic textbook.

A critical analysis of the Canvas LMS could be its own book, so for the purposes of this analysis I will focus on the virtual studio. The studio primarily supports the disciplinary values of design by prioritizing visuals as the objects of discussion. Learners typically photograph their work from various angles (for physical objects) or upload a selection of in-progress screen shots for digital work, and provide a short artist statement for their classmates to accompany the images. This foundational course emphasizes the design process, so learners are encouraged to document their work at various stages and share it. Learners who may previously have only presented finished work in past educational experiences would ideally become people who are comfortable with the idea of work in progress through this sharing activity, and understand that projects are iterative and open to criticism and improvement.

In terms of “what matters” in the studio, comments and visuals are the primary features of the tool. The only options learners have when they click on a peer’s work are to scroll through the images and add comments. By prioritizing the visuals and the comments submitted by learners, the studio reflects the values of looking at works in progress and considering feedback. As was mentioned earlier, the development of the virtual studio is another project that was primarily driven by one instructor, though more staff members were part of the process due to the complexity of skill sets required to complete this type of undertaking. Programmers, UX designers, and the instructional designer were all part of the team, but student voices were still absent from the development of this tool, even though it was designed specifically to support online learners’ studio activity. When using a specially designed tool like this, learners’ activity could be constrained by the need to fit the submission model.

Practices

Finally, practice is the last unexplored site of analysis. When examining a learning design from an instructional design perspective, practices are the activities learners undertake that are guided by a set of goals. In this class’s learning design, the components I categorized as practices include the learning objectives, projects, and activities that learners participate in. While projects and activities may seem obvious, the inclusion of learning objectives as practices may initially be unclear; I justify this choice because the learning objectives of any course ultimately guide the assignments, exams, projects, and other activities learners engage in. Experienced instructional designers understand the learning objectives of a course to be an explicit statement of expectations for both the instructor and the learners. In an ideal world, the objectives are the guideposts that drive the learning experience, and thus they are part of practices.

In addition to guiding learners’ progression, the learning objectives should also illuminate the disciplinary values of any course. For example, the focal course states the following objective: “Implement new ideas and develop a diverse array of options for problem solving in response to critical review and the iterative design process for improving work.” Learners might interpret from this objective what values are prioritized in the discipline and the course, which align with the values of iterative work, sharing, and openness to critique that I have uncovered in previous sections of this paper across the other sites of analysis.

I would also like to analyze a more traditional example of practice in the form of a project from the course, the Daily Design Journal. This is an introductory project learners begin at the start of the course, and the primary task is to document 14 objects learners encounter in their daily life. Learners are instructed to note the shape, form, materials, dimensions, and functionality of each object, and how those factors relate to the way they interact with the objects they choose. This activity is once again driven by the instructor, who sets the criteria for what matters and should be included in the final product, and who also sets the tone for the reflective practice that follows the sketching portion, where students respond to a series of questions about three sketches of their choice. This activity doesn’t illuminate much with regard to representation within the course, but it does demonstrate another opportunity for learners to evaluate disciplinary values and practice them: learning to think like a designer by examining designed objects and reflecting on how their designs connect with their functionality.

I recognize this analysis as presented here is only the beginning of what will hopefully become a much more in-depth, refined process. I also hope this documentation serves as a starting point for re-evaluating our instructional design processes from the beginning of a course design, rather than retroactively considering issues of power, identity, and inclusion. When reflecting on whether the framework uncovered previously hidden problematic areas, it seems more accurate to say that this version of the framework offered a way to articulate those issues. For instance, a theme that became painfully clear in this course, and likely would for many in my portfolio, is that typically only one subject matter expert is present in the development, and that leads to the issues of power and representation outlined above. While many may have assumed this would be a problem, I did not imagine the breadth and depth of the influence power can have on all aspects of a course. I went into this work assuming the critique activity was an opportunity for learners to have some power in the course, but in reality, my reflections within the other sites of analysis made it abundantly clear that this activity is not at all balanced, due to the instructor-imposed guidelines on interactions.

Another valuable outcome of this effort is disrupting the notion that I could break each piece of the course apart and analyze them individually. I quickly realized that this framework, at least as I interpreted it, leads to a lot of overlap and influences other parts of the framework. For instance, the course procedures and policies dictate a lot of the social relations in the course. Participation grades in particular mean there has to be a designed interaction, and similarly, that need affects the practices that are designed into a course, which in turn influences the way learning is organized. All of these factors are constrained by the tools available. The choices made about course designs early in the process and the beliefs held by instructors and designers have a waterfall effect that touches nearly every experience designed into a course. This leads to the power problem I referred to that continually resurfaces throughout each of my chosen sites of analysis. It is the decision-making of the person or people in power that influences who is represented and what disciplinary values are prioritized, and how learners may develop new identities within these experiences. As a result, this discovery calls for further reflection around the notion of power and whether it should be positioned differently in the framework, or addressed in an entirely different way. In addition, the influence of power in our work leads to the question of what instructional designers can do to shine a light on this issue.

Reflecting on the critical focus of disciplinary values and identity development, I wonder now if these are two different categories. To elaborate further, alignment of disciplinary values is good from an instructional design perspective, but that doesn’t necessarily mean the values themselves are inherently good. Those responsible for the learning design need to determine together if the values represent ethical outcomes for all, or if the values conflict. Additionally, learners also need to decide for themselves if the values they interpret are in alignment with their own personal values, regardless of the ethics in the disciplinary values. Further research into how students interpret disciplinary values, as well as how disciplinary experts interpret disciplinary values, could help this part of the analysis. Perhaps instructional designers can design opportunities for students to examine their personal values against the perceived values of their chosen area of study throughout their college careers.

After this initial trial, I also wonder whether there is a better way to consider questions of representation in this framework. Looking at who is present in the referenced materials was productive, but I struggled to identify other instances of representation in the practices, tools, or social relations within the course. There are undoubtedly ways to think about representation that I have overlooked in this work as a result of my privileged way of experiencing education, and exploring research around heterogeneity in learning might benefit the next iteration of this framework. This gap may also require further consideration of stakeholders and how they think about ideas of representation and identity within the discipline, and whether the course design meets those expectations.

This work is limited in some ways. While I hope the framework will help drive critical analysis of existing learning designs across all disciplines, I am biased by my experience as an instructional designer who has primarily worked in professionalized disciplines. The three themes, or critical focuses, I outlined at the beginning of this paper may not be as functional for learning designs in fields with less direct applications of skills and knowledge. Another bias that may have unintentionally shaped this framework is my experience in online learning design, which is much deeper than my experience in traditional residential learning design.

In addition, this framework was influenced only by the critical scholarship that I have encountered. Undoubtedly there are other authors and papers unknown to me whose work may strengthen or contradict this framework. I hope it sparks conversation and collaboration around future iterations, both to evaluate other courses in my portfolio and to work with instructional designers from other disciplines interested in this line of inquiry. As I emphasized above, when one individual holds all the power, systemic problem areas remain undiscovered. The experiences of other professionals can only strengthen this effort.

Conclusions: The Instructional Designer’s Role in Critical Instructional Design

Near the beginning of this paper, I cited Barab et al., who discuss the role instructional designers have in educational designs and the responsibilities that should be carried with this role. The authors state that “designers should regard their work in terms of its impact not on a situation directly but, rather on how users transact with the work, with each other, and with their contexts” (Barab et al., 2007, p. 296). As an instructional designer, exploring critical scholarship in the learning sciences has changed my perspective on what my role is when collaborating with instructors and other team members to develop learning designs. I have always viewed this role as one of advocacy: for pedagogy, for innovative and memorable learning experiences, and for learners in the margins. While that hasn’t changed, this analytical exercise has shown that my previous ideas of who is in the margins were not sufficient. Based on my analysis of this course alone, I already feel I should advocate for change in the process itself by ensuring that voices other than mine and the instructor’s are involved when we make these design choices. As I discussed previously, power relations are abundant and tangled within the typical instructional design process, and drawing attention to this problem should be a goal of all instructional designers.

To borrow once again from Collier (2020), a “small move” I can make now could be working with my team to remedy the problematic spots highlighted in this analysis, and encouraging my teammates to apply this process to their own portfolios. Sharing our discoveries could help our unit find ways to embed these critical focuses from the beginning of a new project in the future, and help us make a plan for revising our old designs. Additionally, finding ways to bring more people into the design process can be another “small move” to tackle in the present. Specifically, striving to include student voices in the instructional design process in ways that are authentic and meaningful for both parties will be an immediate focus in my professional work going forward. I can advocate for spreading out the power in the instructional design process, ask the right questions, and point to the literature. By adopting this critical stance, I can be one of those instructional designers who takes steps to “build transformative models of what could be” (Barab et al., 2007, p. 264).

Barab, S., Dodge, T., Thomas, M. K., Jackson, C., & Tuzun, H. (2007). Our designs and the social agendas they carry. Journal of the Learning Sciences, 16(2), 263–305.

Collier, A. (2020). Inclusive Design and Design Justice: Strategies to Shape Our Classes and Communities. EDUCAUSE Review. https://er.educause.edu/articles/2020/10/inclusive-design-and-design-justice-strategies-to-shape-our-classes-and-communities.

Costanza-Chock, S. (2020). Design Justice: Community-led practices to build the worlds we need. Cambridge: MIT Press.

Esmonde, I. (2017). Power and sociocultural theories of learning. In I. Esmonde & A. N. Booker (Eds.), Power and privilege in the learning sciences: Critical and sociocultural theories of learning (pp. 6–27). New York, NY: Routledge.

Okun, T. (1999). White supremacy culture. White Supremacy Culture. https://www.whitesupremacyculture.info/

Taylor, K. H. (2018). The role of public education in place-remaking: From a retrospective walk through my hometown to a call to action. Cognition and Instruction, 36(3), 188–198.

Vakil, S. (2020). “I’ve always been scared that someday I’m going to sell out”: Exploring the relationship between political identity and learning in computer science education. Cognition and Instruction, 38(2), 87-115.

Vossoughi, S. & Gutiérrez, K. (2017). Critical pedagogy and sociocultural theory. In I. Esmonde & A. N. Booker (Eds.), Power and privilege in the learning sciences: Critical and sociocultural theories of learning (pp. 139–161). New York, NY: Routledge.

Wenger, E. (1998). Communities of practice: Learning, meaning, and identity. Cambridge, MA: Cambridge University Press.

Winner, L. (1986). Do artifacts have politics? In The whale and the reactor (pp. 19-39). Chicago: University of Chicago Press. https://ebookcentral.proquest.com/lib/pensu/detail.action?docID=557593

Toward a Critical Instructional Design Copyright © by Katrina Wehr is licensed under a Creative Commons Attribution-NonCommercial 4.0 International License , except where otherwise noted.

EDUCAUSE Review - The Voice of the Higher Education Technology Community

Tools That Teach: Lessons for Critical Instructional Design

In mediated teaching and learning, skeuomorphism helps educators map ritualized pedagogies into new learning contexts. Higher education learning designers are best equipped to lead the way in recognizing, leveraging, and resisting skeuomorphism when it perpetuates problematic rituals.

Repeating a skill until it is mastered and becomes a ritual or habit is not entirely dependent on a person's willpower or determination. The design and interfaces of the tools people use can encourage ritual formation—both desired and undesired. Indeed, through their design, some tools can actively teach people how to use them or provide a sense of familiarity so that users are encouraged to try them. Footnote 1 Using physical traits and cues in the design of a tool or object to give users a sense of familiarity is referred to as skeuomorphism (pronounced skew-oh-morph-ism). Skeuomorphism can be seen, for example, in mechanical pencils that are designed to resemble wood pencils. Many electric cars have a front grille so that they look like gasoline/diesel vehicles. These design features are not necessary for the object to function, but they give users clues about what the object does and how it works. The function of a wood-like mechanical pencil might seem obvious today, but imagine when the mechanical pencil was first introduced. Making it look wooden attempted to communicate what the user should expect: a pencil—not a pen or a marker or something else.

Skeuomorphism in Action

In the digital world, skeuomorphism was a key strategy in the design of the first wave of smartphones. Early iPhones were filled with skeuomorphic design touches: the notepad looked like a real legal pad, the calendar looked like a physical calendar, and the contacts app looked and functioned like a physical address book. The 2D screen attempted to mimic many things about the physical objects it was replacing. This was intentional. Apple designers wanted to use familiar objects to teach users how they were meant to treat this new tool. Footnote 2 In 2013, after six years of using a heavily skeuomorphic interface, Apple changed the design of the iPhone to the much flatter, more modern (and now contemporary) user interface that we recognize today. Branding and consumerism aside, this design shift was also intentional. Apple had taught people how to use its iPhones and iPads, so the skeuomorphic callbacks to the physical world were no longer needed. Users know what a calendar app is, how contacts work on a smartphone, and how a digital photo album behaves.

The Apple interfaces from 2007–2013 are excellent examples of what applied skeuomorphic design looks like. During this same time period, many of the online teaching tools that users depend on today were also going through their adolescence. And while most learning management systems (LMS) do not actively employ skeuomorphism in their visual appearance, nearly every aspect of online teaching and learning includes ties to the physical world. Instructors can create assignments and quizzes in the LMS, the Moodle LMS allows instructors to organize materials into "books," and the name of the Blackboard LMS implies how instructors and students are supposed to interact with it at its most basic level. Many naming and design choices like these are meant to familiarize instructors and students with the functionality of the tools and to provide reference points for meaning.

When Rituals Are Useful and When They Are Not

A skeuomorphic approach to new designs has a few unintended consequences. First, while providing a familiar reference point is helpful for teaching people where to start, it can result in overreliance on the rituals that people create when they are learning. Rituals can help people ease into a new paradigm, but they can also hold people back. For example, while a notepad on an iPhone mostly corresponds to the form and function of a real-world notepad, this analogy starts to break down with more complex concepts such as assignments in an LMS. On a very basic level, turning in an LMS assignment is akin to placing a finished essay on an instructor's desk at the end of class: the LMS serves as a location for students to submit their work. This comparison works when thinking about how an essay assignment would be submitted in person, but is an assignment really an assignment if the material being collected is audio, video, or collaborative works that have only ever existed digitally? In addition, other online teaching processes try to mimic the real world too closely. Consider, for example, the process for collecting an assignment from a student, giving feedback to the student, returning the assignment, and providing another opportunity for the student to resubmit the assignment with revisions. Conducting this process online is generally more complicated in most LMS environments than it would be in person. Instead of simply handing back a physical document with some comments, educators have created a system where everyone involved needs to manage sharing permissions, gradebook settings, and other aspects that are consequences of applying physical analogies to digital environments. This can result in instructors adhering to strict rituals if they do not genuinely understand why the process is set up a certain way, and any small change in the operational steps (such as an update to the LMS that moves options around) can break the ritual.

Breaking the Limits of the Design

The second unintended consequence of applying physical structures to digital spaces can occur when the digital tool is limited in some way by the constraints of what is technologically possible at the time of its creation. These limitations can persist across iterations of the technology long after the original limitations have been overcome. An example of this can be seen in the structure of telephone numbers in the United States. In their original form, telephone numbers were tied to physical locations (rather than to people). So, for example, instead of dialing a number, a person would pick up a telephone handset and immediately tell a switchboard operator what number they wanted to reach. Eventually, dialing was introduced and, as demand for phone numbers grew, the numbers were lengthened—from between three and five digits long to seven digits, and then, with the addition of area codes, to ten digits. Today, telephone operators are no longer needed to physically connect calls: a smartphone can be used to reach any number in the world, and the call is relayed digitally across what is more analogous to a computer network. Footnote 3

Today, the same phone number structure is being used even though it is no longer technically necessary. A shorter alphanumeric code could more easily identify another person and would provide many more combinations to work with. The phone number structure made sense at the time of its creation, but phone numbers were bound by the limitations of the technology at the time. While these limitations no longer apply, the rituals that have formed around this way of organizing communication dictate its continued use, and for most people, the ritual is not inconvenient enough to justify changing it.
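The combinatorics behind that claim are easy to sanity-check. The snippet below is a back-of-the-envelope illustration (assuming a 36-symbol, case-insensitive alphanumeric alphabet; the variable names are ours, not part of any telephony standard):

```python
# Rough sanity check: a 7-character alphanumeric code versus a
# 10-digit phone number, assuming 36 case-insensitive symbols.
digits = 10         # symbols available per position in a phone number
alphanumerics = 36  # 26 letters + 10 digits

phone_combinations = digits ** 10        # 10,000,000,000 ten-digit numbers
code_combinations = alphanumerics ** 7   # 78,364,164,096 seven-character codes

# Nearly 8x as many identifiers, using 3 fewer characters.
print(code_combinations / phone_combinations)
```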

When to Leverage Skeuomorphism and When to Resist It

Similarly, the technological limitations that were present when online learning was introduced have continued to influence online learning practices today. One of the most common examples of this is the lecture style of instruction. Lectures have existed for much longer than modern audio-visual aids or interactive materials have. Early incarnations of the LMS would not have allowed for much more than passive interaction with the materials (such as early message boards), and certainly, the high-speed broadband connection needed to facilitate other activities was not as widely available as it is now. These technological limitations were reasonable barriers at the time; however, the online learning landscape is not nearly as limited today. Yet many instructional designers and educational developers continue to see faculty sticking to the model of replicating passive lectures online.

Research examining the barriers that educators face when learning how to teach online reveals that ritualized pedagogies were a type of troublesome knowledge that interfered with changes in practice. Footnote 4 For a forthcoming research article, learning designers, academic technologists, and educational developers submitted case examples of ritual knowledge that were present in their work with faculty learners, including ways that some faculty clung to practices that had become routine, such as utilizing discussion boards but not crafting effective prompts for generative discussions or uploading PowerPoint presentation files and lecture notes to the LMS rather than reimagining learner interaction with content in an online learning context. Footnote 5 These rituals are everywhere, and while they are problematic, they are also enabled by skeuomorphic elements in the virtual learning environment.

Zoom's videoconferencing platform is an example of this dilemma. This technology (and others like it) allowed learners and educators to transfer the physical classroom into a virtual space so that those who were not experienced with or adequately prepared for asynchronous online teaching and learning could maintain some academic continuity during the coronavirus pandemic. Videoconferencing quickly became an integral part of pandemic life. In fact, it became as essential for digital life as email or office productivity software. While Zoom existed before the pandemic, its surge in popularity resulted in the rapid addition of new features and adjustments (such as basic security measures to prevent Zoom-bombing). More recently, however, Zoom updated its interface to include an immersive view, where meeting participants appear on the screen situated in a common physical location. It included "classroom" and "lecture hall" options that positioned students in obedient rows, with a front-of-the-room and a back-of-the-room, and allowed all participants to view the configuration from the instructor's perspective.

Two Zoom immersive view scenes: one with bright blue chairs and colorful walls, the other with gray chairs and wood walls.

While the novelty of this update garnered some initial enthusiasm, the learning-design community lamented this unnecessary skeuomorphic design feature because it failed to imagine that learning could be more than horizontal rows of filed and sorted student faces or a teacher-centered experience—arguments that have already been made by other experts in the field. Footnote 6

There are lessons that can be gleaned from an awareness of skeuomorphism in new teaching and learning contexts. While skeuomorphism enables certain types of behavioral changes by allowing those who are experiencing the changes to cling to some semblance of familiarity or ritual, it also prevents deeper, more fundamental intellectual advances that may be necessary to prevent and repair harm, as well as to truly and radically innovate for equitable, learner-ready experiences. Footnote 7 At minimum, teaching and learning professionals should have an awareness of—and at best be critical and active about—recognizing and resisting skeuomorphic elements that shepherd them into a particular, ritualized way of teaching and learning that is not based on what they know about how people learn. The good news is that this is what learning and instructional designers do: "[they] understand digital space. They understand learning. They understand teaching. And they understand technology." Footnote 8 They are capable of recognizing skeuomorphism and leveraging it when it helps to facilitate faculty development for mediated teaching and learning. But they are also well equipped to push back on skeuomorphism through the lens of critical instructional design when those elements function to perpetuate pedagogical rituals that no longer serve students. Footnote 9

The following four simple recommendations can help to safeguard against troublesome ritualized pedagogical practices:

  • Educators: Be aware of the routines that advance your teaching philosophy and those that impede your development. Beware of the way unexamined rituals can be leveraged by those with power to steer you away from or toward your values.
  • Learning Designers, Technologists, and Educational Developers: Engage in critical instructional design and understand that there is no values-free technology. Footnote 10
  • Higher Education Institutions: Leverage your instructional design assets to achieve your institutional mission by understanding their value and purpose. Footnote 11
  • Academic Technology Companies: Build stakeholder feedback into your roadmap so that it reflects the needs (and not just the familiar rituals) of the users as well as the valuable expertise of those who are experts in teaching and learning. Footnote 12

Educators are tasked with balancing many demands on their time. It is natural, therefore, to develop routines or heuristics to navigate these demands, especially when they may be in an area that is not the educator's primary expertise. Sometimes the tools purposefully cue educators to act or think a certain way through their design. By working as a team, educators, learning designers, institutional leaders, and academic technology providers can critically evaluate the routines that are used in the classroom to ensure that they align with the intended mission and values of the institution and alter tools to encourage equitable, socially just, and effective teaching. Each of these stakeholders brings unique and important contributions to the team. Learning designers, technologists, and educational developers should consider how their strengths are particularly well suited to identifying helpful and harmful routines and then act accordingly in the best interest of students.

  • Donald A. Norman, The Design of Everyday Things (Cambridge: MIT Press, 2013).
  • Abbas Vajihi, "Why Apple Products Feel So Intuitive," Mac O'Clock, Medium (website), July 15, 2020.
  • For an interesting description of the development of area codes, listen to Adrianne Jeffries, "A Conspiracy Theory About Area Codes," March 9, 2021, in Under Understood (podcast), MP3 audio, 41:40.
  • David Perkins, "The Many Faces of Constructivism," Educational Leadership 57, no. 3 (November 1999): 6–11.
  • Lorna Gonzalez and Christopher S. Ozuna, "Troublesome Knowledge: Identifying Barriers to Innovate for Breakthroughs in Learning to Teach Online," Online Learning Journal 25 (forthcoming).
  • Ajaay Srinivasan, "Best Immersive View Scenes for Zoom [Download]," Nerds Chalk (blog), April 27, 2021; Lucy Biederman, "Goodbye, Zoom Fatigue," Inside Higher Ed, March 31, 2021.
  • Amy Collier, "Inclusive Design and Design Justice: Strategies to Shape Our Classes and Communities," EDUCAUSE Review 55, no. 4 (October 2020): 12–23.
  • Sean Michael Morris, "Instructional Designers Are Teachers," Hybrid Pedagogy, April 12, 2018.
  • Jessie Stommel and Martha Burtis, "Counter-friction to Stop the Machine: The Endgame for Instructional Design," Hybrid Pedagogy, April 27, 2021.
  • Ibid.
  • Joshua Kim, "As Many Instructional Designers as Librarians," Inside Higher Ed, March 25, 2019.
  • Jaime Hannans, "Reduce Performance Anxiety for Nursing Students While Improving the Feedback Iteration," PlayPosit: Interactive Video (blog), February 3, 2020.

Christopher S. Ozuna has previously worked as an instructional designer and is currently a doctoral candidate at the University of California, Santa Barbara.

Lorna Gonzalez is the Interim Assistant Director of Innovation and Faculty Development and a Lecturer at California State University Channel Islands.

© 2021 Christopher S. Ozuna and Lorna Gonzalez. The text of this work is licensed under a Creative Commons BY 4.0 International License.

9 Instructional design principles and how to use them

Robert Gagne’s instructional design principles were first proposed in 1965. Though the world has changed a lot in that time, the way we learn is fundamentally the same. Together, these nine principles are a science-backed framework for creating effective learning experiences, whatever your learning format. 

Whether you’re creating elearning courses, training sessions or working in blended environments, this guide will help you understand how to deploy these instructional strategies and engage your learners. We’ll explore each principle in turn and provide practical examples and advice for applying them in your instructional design process.

Good instructional design is an art and a science.

Even if you’re new to learning design, chances are you’ll already be practicing some of the learning principles I’ll outline below. In my own practice, I learned from mentors and teachers I admired and through trial and error long before I came across learning theory.

Creativity and experience got me so far, but then I realized that a deeper understanding of how people learn would help me improve the courses and workshops I designed. Instructional design theories and learning frameworks provide a solid foundation that you can build upon with your signature style.

In this guide, we’ll explore Robert Gagne’s 9 principles of instructional design and how to apply them. You’ll learn what principles need to be considered in designing instruction that resonates with learners and how the framework can inform your design process.

By better understanding these principles and how to practically action them as an instructional designer, you can create more engaging learning experiences that will help participants retain and better utilize what they learn after the course is complete. 

What are Gagne’s instructional design principles and why are they important? 

Robert M. Gagne was an American psychologist whose work centred on educational psychology. He is best known for his 1965 book The Conditions of Learning, in which he outlined a nine-step process for creating effective learning called the events of instruction.

The 9 instructional design principles (also known as Gagne’s nine events of instruction) as outlined by Gagne are:

  • Gain the attention of your learners
  • State the objectives 
  • Stimulate recall of prior learning
  • Present the learning content 
  • Provide learning guidance
  • Elicit performance from learners
  • Provide feedback
  • Assess learner performance 
  • Enhance retention and transfer

You’ll find a detailed explanation of each of Gagne’s instructional design principles below alongside practical tips for implementing them in your eLearning course, classroom or live training session. 

Instructional design principles are important because they provide a proven framework for designing an effective learning experience. They successfully incorporate the key concepts and psychological principles at the heart of learning into a practical, easy to follow process.

In my experience, these principles help concretize all the learning theory out there and make it easy for me to ensure the course I’m designing will engage students and achieve the desired learning outcomes. They provide a systematic process that is easy for any instructional designer to follow, repeat and internalize.

While most of these principles will occur in your course roughly in the order presented above, it’s worth noting that these events can overlap and you’ll revisit them at various points in your learning flow. 

For example, it’s common for an eLearning course to present the content of one training block alongside an interactive game that gives learners an opportunity to demonstrate their knowledge and gives instructors a chance to gauge their progress.

Afterwards, you might then go into another learning block that repeats this process, perhaps even restating objectives at certain points and asking learners to remember previous learning when pertinent. 

In most effective courses, many of these instructional principles are repeated throughout and learners will have multiple opportunities to receive feedback, interact with experiential content and practice their skills.

Try thinking of the list as both a rough skeleton of the points you will want to hit in your learning flow and as practical advice for improving individual sections of your course or training programs.

When designing an eLearning course outline or instructional design storyboard , it’s helpful to detail how each training block aligns with your learning objectives.

In SessionLab, instructional designers can add colour-coding to training blocks to delineate the learning objective, activity type or even which stage of the learning process it touches upon. This can help ensure you have a balanced learning flow that will be engaging for your learners. 

As Gagne writes, “organisation is the hallmark of effective instructional materials.”

Start by creating a simple course outline that meets your learner needs. Turn that outline into a storyboard by adding detailed text, timing and clear instructions. Attach learning materials, images, links and other multimedia to each training block so that your content team can easily find what they need.   

By combining the following instructional design principles for course creation with SessionLab, you can quickly structure your online course and ensure nothing is missed. 
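The tagging idea above can also be sketched in code. The event names, block structure, and `uncovered_events` helper below are hypothetical illustrations (not part of SessionLab or Gagne's own work): each training block is labeled with the events it touches, and a simple check reports which events the outline never covers.

```python
# Hypothetical sketch: tag each training block with the Gagne events it
# touches, then report which events a course outline never covers.
GAGNE_EVENTS = [
    "gain attention",
    "state objectives",
    "stimulate recall",
    "present content",
    "provide guidance",
    "elicit performance",
    "provide feedback",
    "assess performance",
    "enhance retention",
]

def uncovered_events(blocks):
    """Return the events that no training block in the outline covers."""
    covered = {event for block in blocks for event in block["events"]}
    return [event for event in GAGNE_EVENTS if event not in covered]

# Example outline: four blocks, each tagged with the events it serves.
outline = [
    {"title": "Icebreaker video", "events": ["gain attention"]},
    {"title": "Course goals", "events": ["state objectives", "stimulate recall"]},
    {"title": "Module 1 lesson", "events": ["present content", "provide guidance"]},
    {"title": "Practice quiz", "events": ["elicit performance", "provide feedback"]},
]

# Flags that this outline still lacks assessment and retention blocks.
print(uncovered_events(outline))
```

The same check works whether the "blocks" are eLearning modules or agenda items in a live session; the point is simply to make gaps in the learning flow visible before delivery.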

A screenshot of a blended course template printout created in SessionLab.

1) Gain attention

Effective learning can only begin once learners’ attention is in the room or on the course. Gagne’s first event of instruction is all about getting the attention of your learners, sparking their curiosity and drawing them into the session.

In a live setting, gaining attention often means actually starting the session: asking people to settle into the room and leave what’s outside the session for later. While this can be as simple as raising your voice and saying, “Let’s begin,” it can be especially effective to engage learners’ curiosity and help them be present in the moment with a game or activity. For a learning program or online course delivered via an LMS or course platform, gaining the attention of your learners depends instead on a combination of good design and simple but engaging content.

How to gain attention in instructor-led training

  • Call the room to action by using your voice and presence. Often, this looks like standing up at the front of the room, raising the volume of your voice and calling the group to attention. In traditional learning environments, lots of learners expect some kind of formal signal to begin, so don’t be afraid to lean into this.
  • Use an icebreaker game to help people mentally arrive in the room. Bonus points if it relates to the topic of the day and starts getting people engaging and learning with one another.
  • Don’t forget the body! A simple invitation to take a few deep breaths, stretch or do an energizer can help people give their attention to what’s happening in the room, rather than what’s happening on their phone or after the session.
  • Use music or another audio tool to signal the start of the session. One facilitator friend swears by the use of a Tibetan gong!
  • Create engaging visuals. Have a thought-provoking cover slide on your visual presentation or have posters or images relating to your topic around the room.
  • Place question cards, quotes or image cards on tables and chairs ready for when people come in. Pique their interest or invite them to start thinking about the session and their own understanding of the topic at hand while everyone arrives.

How to gain attention in eLearning courses

  • Share a short, engaging intro video. Video content that features the instructor behind the training can be very effective, adding a human touch to forthcoming material while also introducing the key points of the course.
  • Share a story relevant to the topic at hand. Good stories are highly engaging for us as people, whatever the medium. A real world example that engages with the topic of your course is also a great shout – if it’s personal, even better! 
  • Have an interactive moment early on, preferably including some element of user choice. Picking an avatar, a favourite colour or simply answering a question in the first moments of the course can help get the attention of learners in this environment. Bonus points if it’s persistent! 
  • Use compelling visuals. As above, attractive, thought provoking visual material can be a great way to get attention immediately. Good design can go a long way here!  
  • Make a big claim or bold statement to grab learner attention and encourage critical thinking. Something a learner strongly agrees (or disagrees!) with can be a compelling hook to move forward. On other occasions, using evocative language and restating the ideal goal state or benefit of the course can be a great way to gain learner attention. Let them know what real world problems they’ll solve after taking your course!

A photograph of a trainer delivering content.

2) State objectives

In adult learning, it's been shown that people learn better when they know why they're doing a particular activity and what the goals or desired outcomes of a training program are. Gagné's second event of instruction is all about outlining the goals and objectives of the training learners are about to undertake.

Stating course objectives can help learners engage with each step of the learning flow, understanding what the overall goal is and how each step can help them reach that goal. 

This stage is often about building trust too – giving your learners an overview of what they’re going to learn and some sense of how you’re going to help them learn it. Whatever your learning format and audience, try to use language that speaks to them and relates to their personal goals, as well as those of the wider training program.

Unsure about the learning goals or your audience? Check out this guide on how to run a training needs assessment to ensure you thoroughly understand your audience and their goals.

How to state objectives in instructor-led training 

  • Let your learners know the objectives of the training session early in the process. Practically, this looks like having a slide in your presentation to present the objectives or a handout which includes learning objectives close to the top. For some sessions, you might even state the objectives of the training in the invitation email or in a shareable agenda so participants can come to the session fully prepared. 
  • Try making objectives personal and aspirational. Statements such as "by the end of this training you will be able to:" can help make the goals of a session more concrete. Aspirational statements that invite participants to consider their personal goals can also be effective ways to motivate learners.
  • In live environments, it can also be effective to ask learners and trainees what their own objectives are at the beginning of a training session. This can help ensure alignment, create a participatory environment and also create the potential to cover peripheral topics (if able) that learners will respond to. 
  • Having learners share those desired learning outcomes with the rest of the group can also be an effective way to cover the objectives of the course and also begin the active learning process. Ask participants to share their own goals with the group and then add any core learning outcomes they’ve missed at the end.

How to state objectives in eLearning courses

  • A simple bullet point list or slide that tells learners exactly what they’ll learn on the course is a tried and tested method for stating objectives. This might also come in the form of a course outline where objectives are linked to the main sections of your course. 
  • Outlining before and after states is also an effective way to sell the self-paced course they’re on and carry them through the first screens and into the training proper. What will it look like after the course has been completed? How will the learnings benefit their day-to-day work?
  • A short video where the instructor introduces learning objectives works well. The human touch can help demystify tricky objectives or help demonstrate an ideal future state. For example, if you were running a course on improving facilitation skills, a video where an expert facilitator tells trainees how achieving learning objectives has improved their personal practice can do wonders to get participant buy-in. 
  • I’ve found that including an activity that asks participants to write down their personal objectives is a great way to start people on their learning journey. If you’ve gotten the right people on the course at the right time, most personal objectives will overlap with those of the course, but it’s an effective exercise to get participants thinking about what they most want to get out of the material ahead. 

3) Stimulate recall of prior learning

In Cagne’s principles of instructional design, the recall of previous learning is an integral part of the learning process. By creating connections between new material and their existing knowledge and experiences, it’s easier for learners to retain what they learn. 

In many cases, the recollection of previous material is also a great tool to allow a trainer to assess participants’ existing knowledge or skill level. You can use this to tailor the learning experience and to measure the impact of your course – for example, running a short quiz at the beginning and end of the training and seeing how the results change.
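As a rough illustration of that before-and-after measurement (this sketch is my own, not taken from any particular assessment tool), the change between a pre-course and post-course quiz can be reduced to a simple normalized-gain figure: how much of the *possible* improvement each learner actually achieved.

```python
def normalized_gain(pre: float, post: float, max_score: float = 100.0) -> float:
    """Hake-style normalized gain: improvement as a share of possible improvement."""
    if pre >= max_score:  # already at ceiling, no room left to improve
        return 0.0
    return (post - pre) / (max_score - pre)

# Hypothetical pre/post quiz scores for two trainees
scores = {"alice": (40, 85), "bob": (70, 88)}
gains = {name: round(normalized_gain(pre, post), 2)
         for name, (pre, post) in scores.items()}
# gains -> {"alice": 0.75, "bob": 0.6}
```

Here Bob's raw improvement (18 points) is smaller than Alice's (45 points), but normalizing against room-to-grow makes the comparison fairer: both closed a majority of their gap.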

Remember that previous learning doesn’t just mean “what previous courses or training have you taken on this subject?”

If the subject of your training is conflict resolution, it might be more effective to ask participants about recent conflicts and how they resolved them. Personal experience and parallels to real life situations can be very effective at stimulating the recall of prior knowledge. 

How to stimulate recall in instructor-led training

  • Group discussions where participants are encouraged to share their experiences around a core training topic are a highly effective method of stimulating recall. In my experience, 1-2-4 All provides the best structure for this kind of discussion. It allows trainees to have some personal time thinking about the subject before a pair and small group discussion. It also ensures that one person doesn't dominate a whole group discussion and that multiple viewpoints are shared. 
  • Experiential activities can also be a great way to stimulate recall. For example, a simple problem solving game might require participants to use skills they’ve learned in order to be successful. In my experience, using an energizer game as an opportunity for people to use problem solving skills related to the topic at hand can also help.
  • One simple way to stimulate recall is to simply ask all participants to summarize their knowledge on a chosen topic and present those summaries to the group. 

How to stimulate recall in eLearning courses

  • Quizzes and other interactive content are highly effective in a self-paced format. A short quiz can help you engage learners early on, providing variation in your course content while also allowing you to gauge their level of knowledge.  
  • Asking learners to recall prior knowledge and summarize can work in a self-paced format, though without peer feedback, it might not suit every training topic or learning format. That said, even as a self-reflection activity it can be effective. You can even begin the process of multimedia learning by asking participants to create a slide-deck or image to summarize their existing knowledge.  
  • Referencing previous learning content or well known material in your course material can help gently nudge learner recall. If you know your trainees have engaged in a previous course or will likely have read a well known book on the topic, organically referencing these in your course is a good idea.
  • Sending preparatory reading material to your trainees in your invitation or prep materials can give participants an opportunity to prepare and also give you something to refer back to later. Be aware that not every trainee will do this reading, but don’t be afraid to refer back to it to help stimulate recall either. 
  • A single great question can also pave the way for this kind of recall. Asking a question that invites participants to reflect (and take time to do so!) on a given topic or an inspiring subject can be all it takes to promote this learning principle. 

Visual representation of the ADDIE cycle - Analyze, Design, Develop, Implement, Evaluate.

4) Present the content

So this is the big one – actually presenting your learning content to your trainees. In most training sessions, this is where the bulk of time is spent. Here, you'll organize your learning content into a methodical, engaging learning flow that will help learners understand and engage with the learning material. 

In live training, presenting content can often look like a trainer running through a slide deck, asking questions from the group, encouraging reflection and perhaps including an experiential game to demonstrate some of the content in action. 

In eLearning environments, content will likely live in an LMS or learning platform, and be a sequence of interactive slides, games and other material. Using the key principles of multimedia learning and varying your content style is a great place to begin, though you’ll want to go further in order to produce a truly effective instructional design.

In any format, there are some solid best practices for ensuring your learning content is presented in an engaging fashion that will help learners move efficiently to the next stage. 

How to present content in instructor-led training

  • Try presenting your content in different ways to engage different learning styles. Standing at the front of the room and simply talking to your trainees without any variation or interactivity can quickly grow stale and lower engagement with your content. Use slides, videos, audio, handouts and images in your visual presentations to cater to different learning styles. You might also encourage active participation in the form of a training activity that involves your participants in presenting content.
  • Simplicity and legibility are important. Your content should follow a structured learning flow that makes it easy for learners to follow, understand and synthesize. Be sure to contextualize anything you present and ensure that it's suitable for the level of understanding your learners have. Using SessionLab to design your content flow and storyboards is one effective way of ensuring your content is well structured and follows a logical sequence.
  • Summarizing content and describing key points to learners either at the beginning or end of a training block can help switch your participants into the right mode for learning while also reinforcing the key takeaways. 
  • Relating your material to real world contexts can provide learners with a way to relate this new knowledge to their own experience. Try using multiple examples or even ask for examples from your group.
  • Encouraging note-taking is another effective method of helping participants engage with the content you’re presenting. In some training activities, you might ask learners to share notes with each other between learning blocks or to summarize the content you just presented using their own notes. 

How to present content in eLearning courses

  • Presenting content in different forms is especially important for keeping learners' attention in self-paced eLearning. Using a blend of text, video, audio, infographics, slides and other media helps create engagement.
  • Interactivity can be an effective way of presenting your content in a more memorable and experiential manner. Simple learning games developed with the help of subject matter experts can make all the difference when it comes to helping learners actively engage with your content.  
  • Ensure that any additional media or interactivity you add is relevant to the topic and learning goals. Making things visually appealing is a bonus, but adding heaps of images that aren’t relevant to your central content can distract the learner. As with everything in instructional design, balance is key. 
  • Segment your content into digestible chunks and add simple, measurable goals to each section. This can help keep your learners on track and ensure they don’t lose sight of why they’re being given a particular piece of content. 

A photograph of a speaker giving a presentation.

5) Provide learning guidance

Gagne’s fifth event of instruction is where the instructor or trainer provides learning guidance. This guidance can come in many forms, though it should always have the aim of helping learners better understand the material provided and helping them learn how to learn. 

In my experience, learning guidance comes in two main forms: 

  • Learning guidance that is baked into the content
  • Learning guidance offered alongside main content by the instructor or course

Making learning guidance present on a content level is often a design decision. For example, instructional designers will often start with simple material before increasing in complexity in order to facilitate learning.

They may distribute handouts which help guide a learner towards answering questions on the training content or include step by step instructions that facilitate deeper comprehension. They might also include practical examples of what is being learned in the form of a case study or training activity. 

Learning guidance offered alongside the main content is often about helping learners improve their own ability to learn.

Instructional designers might include a PDF on best practices for studying, taking breaks and keeping learning alive. In cohort based learning, that guidance might also look like providing office hours or online chat groups where participants can help one another learn too.

How to provide learning guidance in instructor-led training

  • Develop step-by-step lessons that start with simple, easy to understand concepts before moving towards more complex material. This allows learners to build on existing knowledge and develop their understanding as they go.
  • Break your content into small chunks and create opportunities for the group to ask simple qualifying questions at regular intervals.  You might also solicit relevant experiences from the group or step to one side and talk about how to best internalize the content effectively.
  • Using an instructional design model such as the ADDIE model to thoroughly understand your learners' needs can really help you choose the right method of learning guidance. Ensure you've investigated what will work best for your learners in order to produce the most effective instruction.
  • Talking through a question and how you might arrive at an answer for the group can be really effective at demonstrating a learning mindset. You might do this yourself or by doing a pop-quiz and asking the correct respondent to talk more about the process they went through when finding the answer.
  • Practical examples are a great tool for providing learning guidance. You might include a real world example or case study in your content that shows how someone might deploy the knowledge being learned in your training. You might also use a training activity involving role play that gives participants an opportunity to practice in a safe environment where you as a trainer can also provide guidance. 

How to provide learning guidance in eLearning courses

  • Most content authoring tools offer features like image hotspots or buttons that allow users to explore a subject in more detail. Add links to additional material where you can. Include tooltips on key terms and learning points so that those learners who may need extra help can find it without leaving your course or dropping out of their learning flow. 
  • Add a section with advice on how to get the most out of the course. Set expectations for how long learners should spend on each section, how much extra reading they should do and how they should approach the material. Even something as simple as a reminder to silence phones and give learning material their full attention can help here!
  • Peer-support and activities can be effective, even in a self-paced environment. With cohort based learning, give opportunities for learners to discuss material or complete a group task to support your self-paced material. Blended courses are a great way to make this dance between self-directed learning and group discussion a reality – see more in this blended course template! 


6) Elicit performance

Gagné’s sixth event of instruction is eliciting performance. This is typically where learners are able to practice new skills, demonstrate what they’ve learned and begin retaining information. Practical exercises, role playing simulations and quizzes are all common methods trainers and instructional designers will use in order to elicit performance from learners. 

By tapping into experiential learning methods, this stage of the learning process can help learners retain information and file it in their long term memory. 

This is arguably the most important step of the learning process. Whatever the topic or format of your training, you’ll want to ensure you give ample opportunity for participants to practice their skills and demonstrate their knowledge within your course – simply providing lots of informational content isn’t enough, however great that content might be. 

Eliciting performance is also an important step for the instructor. If learners are having continual difficulty with a particular concept, the instructor may want to revisit that topic in greater detail. In a self-paced format, the input you get from participants at this stage can also be used for improving your learning experiences. 

How to elicit performance in instructor-led training

  • Role-playing games and training activities where learners must deploy their new skills are great ways to elicit learner performance. In some scenarios such as soft skills training, participants are able to use what they've learned in a real-life situation immediately, while in others you may need to offer a simulation – such as for workers operating specialized machinery which may not be available on site. Wherever possible, consider how you can create opportunities to directly employ what's being learned in as true-to-life a manner as possible. 
  • Simple quizzes and Q&A sessions can also be an effective way to give participants a chance to show what they’ve learned. It’s often useful to go beyond repetition and ask learners how they arrived at an answer or how they might use their answer in the real world. 
  • Giving participants an opportunity to present what they've learned and demonstrate their understanding is another common method of eliciting performance. Put folks into groups and ask them to discuss what they've learned, how they might apply it, and then present those ideas to the rest of the cohort. This is an effective way of encouraging people to not only repeat what they heard, but to start putting those learnings into practice. 
  • In a live session, it’s important to consider how a balanced agenda can pave the way for effective practice. Add breaks to your SessionLab agenda and use the automatic timing calculations to ensure participants haven’t been digesting content for 3 hours straight before then being asked to demonstrate new knowledge!

How to elicit performance in eLearning courses

  • Interactive activities are the name of the game for this stage of the learning process. Use quizzes and games where participants need to demonstrate their knowledge in order to proceed. You can gate progress or create fail states so that participants can only go to the next step when they provide correct answers and demonstrate their understanding. 
  • Simulations are even better if they’re relevant to your learning objectives . For example, if you’re delivering a sales training course, you might simulate a few customer calls and ask respondents to select the best responses.
  • If you’re running cohort based training or a blended learning course, get participants to do an activity together or in a facilitated group activity . This provides a great opportunity to practice new skills with the guidance and feedback of peers and an instructor. 
  • In some scenarios, using open-ended questions and giving participants an opportunity to respond creatively to a problem and use their new skills can be effective. This approach requires peer or instructor feedback, and so is best used in a blended format, or at the end of a larger unit of self-paced training. 
  • It can also be effective to give participants some homework or ask them to practice what they've learned in a real-life environment between training content. Give participants a clear call to action on what to do next with some practical ideas for how to use what they've learned. Even with entirely self-directed learning, it's possible to give direction for employing new skills between training content and then ask participants to reflect on what they did when they come back for the next block.
  • Allow participants to retake or repeat key sections, particularly if they've not satisfied performance expectations. You might link back to sections contextually or simply provide an index or course overview so learners can go back over what they need to whenever necessary. 
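The "gating" idea mentioned above – only letting learners proceed once they demonstrate understanding – can be sketched in a few lines. This is an illustrative snippet of my own, not the API of any real authoring tool:

```python
def attempt_question(correct_answer: str, answers: list[str]) -> dict:
    """Step through a learner's successive answers until one is correct.

    Returns whether the learner may proceed and how many attempts it took,
    so the course can unlock the next section or route them to a retry.
    """
    for attempt, answer in enumerate(answers, start=1):
        if answer.strip().lower() == correct_answer.lower():
            return {"passed": True, "attempts": attempt}
    return {"passed": False, "attempts": len(answers)}

# A learner misses on the first try, then answers correctly
result = attempt_question("ADDIE", ["SAM", "addie"])
# result -> {"passed": True, "attempts": 2}
```

In a real course the attempt count would typically feed into the feedback stage too – for instance, revealing a hint after the first miss rather than simply repeating the question.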


7) Provide feedback

Gagné’s seventh event of instruction is providing feedback. This is where the instructor provides direct feedback on learner progress and how they’re performing in comparison to the desired learning goal. This kind of feedback is most often given in direct response to learner input, such as when they are answering questions about a new learning, conducting a practical exercise or practicing new skills. 

In a training context, feedback is most effective when given immediately following learner action. It should also provide enough detail for the learner to understand what went well or what needs improvement. The idea is not to just tell the learner why they were wrong but also to help them make adjustments and move towards the desired learning goal.

The best kind of feedback to give your learners is often dependent on context, where they are in the learning journey and the relative importance of a given point. Here are some of the different kinds of feedback you might provide to your learners:

  • Confirmation feedback: this kind of feedback lets the learner know they did the right thing or gave the right answer. This typically includes a positive affirmation that further encourages the learner.
  • Corrective feedback: this type of feedback tells learners that they did the wrong thing or gave an incorrect answer, and explains why. Remedial feedback will typically direct learners to where they can find the right answer or prompt them to try again. 
  • Evaluative feedback: this feedback method gives the learner a sense of how they performed, often in the form of a score. You might also include a description of what that score means, often in line with an assessment criteria document. This kind of feedback is often short and to the point, with learners expected to take some ownership of next steps based on the score they received. 
  • Descriptive feedback: descriptive feedback can be used in both correct and incorrect scenarios, giving participants a deeper level of feedback that often includes suggestions, additional information and next steps that will help learners improve their performance and progress on the learning journey. 
  • Peer feedback: peer feedback is an opportunity for learners to reflect on the performance of others and provide input to one another. This is especially useful during group activities or as a point of contact in a blended learning environment.
  • Self evaluation/self feedback: this kind of feedback method involves prompting the learner to self reflect on their progress or performance. Self reflection is a great habit to encourage at various points in the learning process.
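To make the first two types concrete, here's a minimal sketch (the function name, field names and messages are my own invention) of how an eLearning course might choose between confirmation and corrective feedback in direct response to learner input:

```python
def build_feedback(correct: bool, explanation: str, retry_hint: str = "") -> dict:
    """Select a feedback message matching the learner's response.

    Correct answers get confirmation feedback with a positive affirmation;
    incorrect answers get corrective feedback plus a next step, so the
    learner is guided rather than just told they were wrong.
    """
    if correct:
        return {"type": "confirmation",
                "message": f"Correct! {explanation}"}
    return {"type": "corrective",
            "message": f"Not quite. {explanation}",
            "next_step": retry_hint or "Try again."}

fb = build_feedback(False, "Feedback works best right after learner input.",
                    "Revisit section 7 and retry.")
# fb["type"] -> "corrective"; fb["next_step"] -> "Revisit section 7 and retry."
```

Note that even the corrective branch carries an explanation and a next step – mirroring the principle above that feedback should help learners adjust, not merely flag errors.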

How to provide feedback in instructor-led training

  • In a live environment, feedback is often given immediately following learner input or during a practical exercise. The faster you’re able to help learners correct their actions, the easier it is for them to make changes and incorporate the desired learning. 
  • Create space for learners to ask follow-up questions.  The best learning experiences are rarely one way and giving participants a deeper understanding of what to improve, change or why their answer was correct can help deepen the process.
  • In many cases, it’s vital for learners to understand why they were wrong, as well as being given the correct answer. Contextualize your feedback and where necessary, detail the process of finding the right answer. This can help ensure participants develop the skills they need, rather than just parroting the correct answer in a training context.  
  • When learners are practicing their skills or conducting role-play exercises, ensure there’s an opportunity to course correct and practice the ideal behaviour. This can help switch context from a potentially negative to positive relationship with the training material and help reinforce the desired outcome. 
  • Positive affirmation that helps reinforce ideal behaviour is as important as correcting undesired responses. Tell people when they’ve done well and explain why their response was ideal. In a group setting, it can also be helpful to share what a great response or effective application looks like.     

How to provide feedback in eLearning courses

  • It’s worth noting that giving people a chance to learn from their mistakes is especially important during eLearning. Just telling people they were wrong and then moving on isn’t an ideal flow for learning. After providing feedback on a wrong answer be sure to then provide the opportunity for participants to give the right answer or demonstrate their knowledge some other way. You might also offer a simpler or adjusted version of the simulation or provide a quiz that offers additional hints or tooltips. 
  • As with live training, any feedback should be given in a direct, immediate and clear manner. Your content authoring tool will offer everything from tooltips and pop-ups to audio cues. Leverage these tools thoughtfully to congratulate participants on a correct response or gently let them know that the response was incorrect and provide them with feedback that can help them do better next time. 
  • As a rule of thumb, try to ensure every point of learner input provides feedback of some kind. Whether it’s a positive affirmation of correct practice or an incorrect answer message, each point of input is an opportunity to guide participants to the ideal learning journey.  
  • Achieving clarity in a self-paced training course isn’t just about the text. Visual design is a vital element of providing feedback that is easy for the learner to understand and doesn’t create friction. Think about how to make feedback visually distinct from other learning material and try to employ a consistent method of delivering feedback throughout your course. 
  • Test your courses and explore how it feels to receive feedback on an incorrect response. If every incorrect answer triggers a warning klaxon and a wall of text, that's unlikely to feel good for your learner, and may negatively impact the learning journey. 
  • Remember that feedback is about guiding participants to the correct response and deepening the learning journey. Messages should let people know what went wrong but also guide them towards understanding. It's not fun to be told you're incorrect over and over again without context or support! 
  • Providing links to additional material or opportunities to revisit content is easily achieved in most content authoring tools . Giving learners an opportunity to improve their understanding by linking to supporting material can help ensure they get the right answer while also reinforcing key points. This can be an effective way of helping learners gain an understanding of the material, rather than just brute forcing your quiz. 


8) Assess performance

Gagné’s eight event of instruction is an assessment of learner performance. This is where trainers officially evaluate how well learners have performed against the desired learning objectives. In practice, this can look like a written or oral exam, practical demonstration, scored quiz or other form of assessment. 

For most learning scenarios, it’s important that trainers do not offer additional guidance or help while assessing performance. Participant ability will typically be measured on individual performance and with a pass/fail model. 

The results of these assessments are used in multiple ways. First, they’re often given back to participants to either congratulate them or provide an opportunity to retake an assessment or deepen their learning.

Assessments are also a great tool for trainers and instructional designers to improve the quality of their materials – if participants struggle with certain elements, it’s potentially a sign you need to make something clearer or cover certain topics in greater detail. 

How to assess performance in instructor-led training

  • Demonstrations and practical activities that are supervised or observed by the instructor are a common method of assessment in live training. Typically, those assessing the performance will score or grade each trainee as they progress through a pre-defined scenario. This is especially useful when training participants in practical skills.
  • A summative assessment in the form of a written or oral exam is also common. These often include a series of questions that are scored by the trainer in order to determine performance. 
  • Individual outputs such as essays, reports or creative products are another tried and tested assessment method – many university courses include essays and other personal outputs to assess learner progress and performance. Note that these can be more difficult and time consuming to assess, and require thorough assessment criteria used by every instructor in order to be fair and effective. 
  • Be sure to outline how performance will be assessed at the outset of the course and again just before an assessment. Trainees should know exactly how they’ll be assessed and there shouldn’t be any surprise criteria that doesn’t relate to what they learned. Include it in your training agenda and provide links to supporting material where appropriate.
  • In some cases, it can also be effective to assess participants before the course begins and then assess them again at the end. Measuring the improvement in skills or knowledge can provide a finer degree of assessment and also help the trainer understand the true impact of their material. 
  • Going further, it can also be helpful for learners to get used to being assessed in some small form throughout the course. You might sprinkle various assessment techniques such as quizzes and group questioning into your sessions to help you and your learners stay aware of performance as the course progresses. 

How to assess performance in eLearning courses

  • Scored quizzes are a common feature of self-paced courses for good reason. They provide an opportunity to cover many learning events in turn and effectively assess the performance of learners. 
  • Vary the format of your assessments so that they’re engaging and can’t be brute forced. Using a mix of multiple-choice questions, word games and other quiz formats can help you assess performance while avoiding burnout.
  • Challenges and simulations provide an experiential way to assess performance. Remember that even if your assessment method is gamified, participants still need to know how they are being assessed. Clear instructions and good feedback are key here. 
  • Include links to assessment criteria and supporting materials in your course introduction and ensure participants can access what they need when preparing for assessment. 
  • Clearly signpost when a section of your online course is part of the formal assessment of course progress. You might distinguish these sections visually while also clearly spelling out that this section is important. 
  • Milestone tests or short assessments spread throughout the course are especially important in a self-paced environment where the instructor does not have the ability to organically gauge performance. 
  • Pre-testing before the start of an eLearning course can be an effective way to tailor the experience for your learners. You might allow them to skip certain sections or draw more attention to others based on the results.
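The pre-testing idea above can be sketched in a few lines of plain Python. This is a minimal illustration, not a real LMS feature: the module names, the `plan_course` helper, and the 80% threshold are all assumptions made up for the example.

```python
# Sketch: pre-test gating for a self-paced course.
# Learners who score well on a module's pre-test may skip that module;
# the threshold and module names are illustrative assumptions.

PASS_THRESHOLD = 0.8  # score (0..1) at or above which a module may be skipped

def plan_course(pretest_scores, modules):
    """Return the modules a learner still needs, based on pre-test results.

    pretest_scores: dict mapping module name -> score between 0 and 1.
    Modules without a pre-test score are always included.
    """
    plan = []
    for module in modules:
        score = pretest_scores.get(module)
        if score is None or score < PASS_THRESHOLD:
            plan.append(module)
    return plan

modules = ["basics", "intermediate", "advanced"]
scores = {"basics": 0.9, "intermediate": 0.55}
print(plan_course(scores, modules))  # → ['intermediate', 'advanced']
```

The same logic could just as easily drive the reverse choice mentioned above: drawing extra attention to modules where the pre-test score was low, rather than hiding ones where it was high.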

9) Enhance retention and transfer

Gagné’s ninth event of instruction is about enhancing the transfer of knowledge and helping learners retain what they’ve learned during the course so they can apply it in real life. The goal of any learning experience isn’t just to help participants pass the course – it’s to equip them with skills and knowledge they’ll use from here on out.

Instructional designers tend to achieve this in two ways. First, by using activities that improve retention and knowledge transfer throughout the course, often in the form of simulations and practice exercises.

They’ll also provide resources to help participants continue learning once the training is over. Static resources like PDFs, checklists and job aids are helpful, though you might go further and offer feedback loops with line managers or group forums for peer support. 

How to enhance retention and transfer in instructor-led training

  • A summary of key points and core topics in the form of a one-pager can be a great resource to provide to learners at the end of a training session. A job-aid that helps demonstrate the connection between what’s been learned and how to apply it in day-to-day work is also an effective resource to share at the end of a course. 
  • End your training session with a final opportunity to practice key skills or demonstrate knowledge. You might do a final group role play, quick-fire quiz or practical exercise. 
  • Close the session with a group reflection or debrief. Giving everyone the opportunity to reflect on what they learned and share perspectives on how they’ll use their new skills or knowledge can be a great way to ensure next steps are taken and that learning is retained. Closing activities like Letter to Myself or I used to think…Now I think are proven methods you can use here.
  • Create opportunities to check in following the training session. You might have line managers or trainers check in with trainees to discuss progress and reinforce key learnings. Alternatively, create an accountability group where a cohort of trainees can share experiences and tips while keeping what they learned alive.
  • Have trainees create an action plan for how and when they’ll use their new skills following the workshop. Setting an intention for a real-life application of what’s been learned can ensure trainees are in a good position to retain material following the course. 

How to enhance retention and knowledge transfer in eLearning courses

  • The steps trainees take immediately following the completion of an online course are key. Encourage learners to think about how they’ll apply their new skills and knowledge throughout or ask them to create an action plan with next steps. 
  • Ask participants to create their own artefacts related to the course. You might have an activity where they create a one-pager with key points or create a visual that would help others (and themselves) to remember the most important elements. 
  • Remind learners of the journey they’ve been on and give them some guidance on what they might do next. If there’s a story at the heart of your training, you might use the end of your course to give that story a compelling ending or show how other learners have achieved great things following the course.
  • Links to further reading and interesting resources related to the course can encourage trainees to continue engaging with the material and go deeper. 
  • Repeatable simulations which trainees can use to practice their skills are a great method of encouraging knowledge retention. You might allow participants to simply repeat previous practical simulations or include a more difficult version that encourages them to go further. How about creating a scored simulation where trainees in a cohort might be encouraged to achieve and share a high score?  
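To make the scored, repeatable simulation idea concrete, here is a small sketch of how attempts and a cohort high-score board might be tracked. The `SimulationScores` class, its fields, and the trainee names are hypothetical, not part of any real LMS API:

```python
# Sketch: tracking repeatable simulation attempts and a shared
# high-score board for a cohort. All names are illustrative.

from collections import defaultdict

class SimulationScores:
    def __init__(self):
        self.attempts = defaultdict(list)  # trainee -> list of scores

    def record(self, trainee, score):
        """Log one simulation attempt for a trainee."""
        self.attempts[trainee].append(score)

    def best(self, trainee):
        """Return the trainee's personal best, or None if no attempts."""
        return max(self.attempts[trainee], default=None)

    def leaderboard(self, top=3):
        """Return the cohort's best scores, highest first."""
        bests = {t: max(s) for t, s in self.attempts.items() if s}
        return sorted(bests.items(), key=lambda kv: kv[1], reverse=True)[:top]

board = SimulationScores()
board.record("ana", 72)
board.record("ana", 88)   # repeat attempts encourage deliberate practice
board.record("ben", 81)
print(board.best("ana"))    # → 88
print(board.leaderboard())  # → [('ana', 88), ('ben', 81)]
```

Keeping every attempt (rather than only the best score) also lets you show trainees their own improvement over time, which is itself a retention aid.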

Now that we’ve explored these core instructional design principles, you might be wondering what’s next and how to go about using them to design effective learning experiences.

Beyond these core principles, most instructional designers will use a tool such as the ADDIE model to effectively project manage the process of creating a completed learning experience.

It’s also worth acknowledging that alternative principles of instructional design are out there.

Some learning designers prefer David Merrill’s First Principles of Instruction, which comprise five principles: task-centered, activation, demonstration, application, and integration. The successive approximation model (SAM) is also a popular method for creating a learning program.

I would recommend using these instructional design models to get a broader view of how you might progress from conducting a needs assessment to working with subject matter experts and sharing a completed course with participants. 

Whatever instructional design model you use, a storyboarding and learning design tool like SessionLab is a simple and effective way to go from an outline to a fully realized learning design while keeping these principles in mind.

You can invite your subject matter experts to collaborate on your design and attach materials to each learning block, ready for your content team to recreate in your LMS.

Want to learn more? Explore how learning designers at Vlerick Business School use SessionLab to design instructor-led training and eLearning courses at scale.

Working on a blended learning course? See how to apply instructional design principles in a blended environment with this in-depth guide to blended learning design .

Designing instructor-led training? You might also find this step-by-step process for creating a training session plan helpful. You’ll find tips on creating engagement and running a live training session with the help of a detailed agenda.

We hope that the above guide and these additional resources will help you take a systematic approach to learning design that also leaves space for your personal touch.

Did we miss anything or is there something we should explore further? Let us know in our community of facilitators and learning professionals!

James Smart is Head of Content at SessionLab. He’s also a creative facilitator who has run workshops and designed courses for establishments like the National Centre for Writing, UK. He especially enjoys working with young people and empowering others in their creative practice.


REVIEW article

Conceptualizations and instructional strategies on critical thinking in higher education: a systematic review of systematic reviews.

Paola Andreucci-Annunziata

  • Instituto de Investigación y Postgrado, Facultad de Ciencias de la Salud, Universidad Central de Chile, Santiago, Chile

Aim: This systematic review identified systematic reviews of quantitative and qualitative empirical studies on the promotion and development of critical thinking in higher education students, allowing us to answer the following research questions: (1) What are the main definitions of critical thinking found in systematic reviews of critical thinking in higher education, and what are their similarities and differences? and (2) What are the most commonly used teaching strategies in higher education for teaching or promoting critical thinking, and how effective have they proven to be?

Methods: Systematic reviews were selected according to the guidelines for systematic reviews and meta-analyses (PRISMA, 2020) and the eligibility criteria proposed by the PICOS strategy (population, interventions, comparators, outcomes and study design), based on 23 records identified in the Journal Citation Reports databases of the Web of Science.

Results: The bibliometric and systematic search of reviews of empirical studies on the topic allowed the selection of five systematic reviews. The results highlighted that conceptually critical thinking is related to both dispositions and skills, and that although there is no consensus on its definition, it is established that it is a higher-order cognitive process that can be trained. However, the results show that more studies have been conducted considering critical thinking as a skill than as a disposition, that the immersion approach has been widely used, and that some instructional strategies have shown greater effectiveness than others when the disciplines are evaluated independently.

Discussion: Despite the relative consensus on the importance of critical thinking for professional development in higher education, this review highlights some difficulties in conceptualizing critical thinking, in the relationship between dispositions and skills, and in its assessment in academic disciplines.

1. Introduction

How we think has become a fundamental pedagogical discussion, in terms of the kinds of thinking skills needed in particular societies, and the role and possibilities of education in developing or fostering these skills. In this context, critical thinking has become a central notion, understood in educational institutions in the Global North as a key necessity in contemporary societies. In this regard, the UN and UNESCO have gone so far as to define critical and creative thinking, which enables innovation and knowledge sharing, as a requirement for achieving the Sustainable Development Goals and therefore a priority for any educational institution ( Fejes, 2006 ; Beneitone and Yarosh, 2015 ; Organización de Naciones Unidas, 2018 ; Sabzalieva et al., 2021 ). As a result, various higher education (HE) institutions around the world have included critical thinking among their objectives ( Zahavi and Friedman, 2019 ; Cruz et al., 2021 ). However, despite broad agreement on its relevance, there is neither a single definition of critical thinking that satisfies the complex and diverse aspects that are part of critical thinking discussions, nor agreement on the best method for teaching or fostering critical thinking in HE, or on how to assess or measure it ( Halpern, 1998 ; Van Damme and Zahner, 2022 ). Moreover, recent studies show that even within HEIs that have established critical thinking as an explicit pedagogical objective and developed specific strategies for teaching it, students do not appear to become significantly more skilled as critical thinkers as a result of their education, with variables such as nationality, languages, gender and socio-economic background having varying degrees of impact in this regard. As suggested by van Damme and Zahner (2022) , given the importance that critical thinking has gained in higher education and the limited success of these critical thinking programmes, universities should make greater efforts in this regard.

In terms of its conceptualization, a specific link between critical thinking and education dates back to the beginning of the twentieth century. El Soufi and See (2019) noted that the Deweyan approach had already pointed to the strengthening of critical thinking among students as a key objective of education. More recently, Peter Facione led the Delphi Project (Facione, 1990). This was based, on the one hand, on the observation across various cases that students did not reason adequately and, on the other, on the identification of a lack of agreement about how critical thinking was defined, taught and assessed, despite its agreed relevance to higher education (Facione, 1990). The Delphi Project brought together 46 experts from around the world, including philosophers, scientists, and educators, with the aim of defining critical thinking and developing recommendations on how to teach and assess it (El Soufi and See, 2019).

The resulting definition – and one of the most widely quoted – referred to critical thinking as: “purposeful, self-regulating judgment that results in interpretation, analysis, evaluation, and conclusion, as well as an explanation of the evidential, conceptual, methodological, critical, or contextual considerations on which that judgment is based. Critical thinking is essential as a tool of inquiry. As such, critical thinking is a liberating force in education and a powerful resource in personal and civic life” ( Facione, 1990 , p. 651). However, despite this agreed definition, some authors have noted that there is still a lack of agreement on how to define and approach critical thinking ( Niu et al., 2013 ). For example, there is a debate about whether it is even possible to teach critical thinking. This discussion relates, on the one hand, to the argument that critical thinking is a socio-culturally specific practice that cannot be easily taught or learned ( Ramanathan and Kaplan, 1996 ; Atkinson, 1997 ). In this regard, variables such as nationality, culture, language and socio-economic background may be key to differentiating students’ critical thinking learning processes ( Giacomazzi et al., 2022 ; Van Damme and Zahner, 2022 ).

On the other hand, there is a discussion that relates academic talent to the creative perspective or the development of divergent thinking (Crossley-Frolick, 2010), distinguishing nativist, deterministic or dispositional approaches from others that are more developmental or related to formal and informal learning (Andreucci-Annunziata, 2012, 2016; Payan-Carreira et al., 2019). In this last sense, from a relational, socio-constructivist, dialogical, and critical conception, both academic talent and critical thinking are considered in terms of their possibilities and limitations in the field of pedagogical interaction and problem-solving (Andreucci-Annunziata, 2016; Ahern et al., 2019). Along these lines, Puig et al. (2019) suggest that the transition from ‘what to think’ to ‘how to think’ adequately summarizes the challenge of teaching critical thinking, a challenge that requires major transformations in instructional paradigms and that, in turn, questions the initial conceptions.

Given the polysemy of the concept and the divergences around it, critical thinking is generally understood as doubly constituted: on the one hand, as an ability (skill) and, on the other, as a disposition, both dimensions being closely related ( Dumitru et al., 2018 ). The former understands critical thinking as a cognitive skill, or a set of cognitive skills necessary to think critically. As a disposition, critical thinking refers to a set of basic, predetermining affective dispositions, toward life in general and toward specific thinking situations ( Cruz et al., 2021 ). These dispositions are considered necessary (as prerequisites) for the development of the cognitive skills that constitute critical thinking. Understood as dispositions, critical thinking is close to what Dewey (1910) calls “good mental habits” or what Siegel (1988) has conceptualized as “critical spirit.” Facione (1990) proposes a list of affective dispositions grouped into two categories: approaches to life in general (e.g., confidence in one’s own reasoning abilities, interest in keeping informed, openness to different world views, flexibility in considering other alternatives and opinions, etc.) and approaches to specific issues, questions or problems (e.g., clarity in formulating questions and concerns, diligence in seeking relevant information, etc.). The distinction between these two categories is important because it emphasizes that critical thinking is not developed exclusively in relation to specific aspects of reasoning but is rather a way of approaching different aspects of everyday life and questioning this process of approach ( Facione, 2000 ; Braun et al., 2020 ).

Simultaneously, critical thinking studies point out that it is not enough to teach cognitive skills, but that people should: “understand the value of critical thinking and have an interest and enthusiasm in applying it. While critical thinking skills can be explicitly taught, dispositions need to be modeled and nurtured so that students progressively adopt an identity as critical thinkers” ( Al-Ghadouni, 2021 , p. 241). However, while many educators agree that critical thinking is an important skill to teach, not all agree on the best way to teach it. The disagreement falls mainly on whether it is a generic skill that can be transferred between different dimensions and that can be taught independently of the subject or topic, or whether it is specific to each dimension and, therefore, requires positioning ( McPeck, 1981 ; Bailin et al., 1999 ; Moore, 2013 ). Therefore, a detailed analysis of how critical thinking is translated into teaching-learning processes shows several possible paths. Generally, however, there is agreement among educational researchers on the key principles that should shape teaching and learning processes to promote critical thinking, including: “facing open-ended problems, encountering real-world complexity, using multiple knowledge sources, developing knowledge artifacts to explicate thinking, utilizing collective efforts and group resources instead of favoring individual student work, and integrating rich use of modern technologies into the work processes” ( Hyytinen et al., 2019 , p. 71). Regarding these teaching-learning processes, three relevant concepts are identified in the literature: (1) approaches, (2) instructional strategies, and (3) learning materials.

The concept of approaches is usually used in critical thinking studies referring to Ennis (1989) ’s distinction between four different ways of teaching critical thinking mainly differentiated according to the explicit or implicit teaching of critical thinking ( Ahern et al., 2019 ; El Soufi and See, 2019 ). These pedagogical approaches to critical thinking have been synthesized into four types: general method; infusion; immersion and mixed method, which we briefly explain below ( Al-Ghadouni, 2021 ). The general method consists of the explicit teaching of critical thinking, to acquire or developing critical thinking skills as the sole focus. In the infusion method, critical thinking constitutes an explicit objective but in parallel to a specific topic of study. Critical thinking is taught in relation to the topic at hand, and students are encouraged to think critically about it, while the basic principles of critical thinking are explicitly taught as well. In the immersion approach, critical thinking is not an explicit teaching objective. The focus is on immersion in a specific theme or subject, which is taught in a way that provokes critical thinking. Critical thinking principles are not explicitly addressed, and students are not necessarily aware that they are being trained to think critically. Finally, the mixed method consists of a combination of the general method and the infusion or immersion method.

The second key concept in relation to critical teaching-learning processes is instructional strategies. These refer to more specific kinds of activities through which teachers expect students to develop and engage in critical thinking practices. Some of these strategies are: defining arguments, evaluating the reliability of sources, identifying fallacies and assumptions, using inductive and deductive logic, synthesizing information, making inferences, assessment techniques like peer-review, teacher evaluation, and self-evaluation, debates, brainstorming techniques, journal writing, scaffolding, active learning strategies, FRISCO ( Ennis, 1996 ), the guidelines of Elder and Paul (2003) , the ‘IDEALS’ technique of Facione (2011) , Lecture-Discussion Teaching (LDT), Problem-Based Learning (PBL) ( Ennis, 2016 ), problem-solving (inquiry), lecture discussions (argumentation), group work, role-play, self-study, self and peer-assessment, context-based learning ( Dominguez, 2018a ), constructing maps with structured arguments, concept mapping, dialog (learning through discussion), authentic instruction (presenting real problems, simulation, sequential assignments, and performance-based assessment).

The third concept, learning materials, is suggested by Puig et al. (2019) to identify relevant materials that are part of critical thinking teaching-learning processes, such as literary and narrative texts (articles, essays), E-learning activities, and authentic problems.

In addition to the conceptual and methodological discussion around the critical thinking pedagogical approach, critical thinking studies have also focused on discussing the possibility of evaluating it. Various instruments have been developed for this purpose, such as the California Test, which is based on the work of Facione (2000) and focuses on skills, or the Cornell Test, which is based on the work of Ennis and Weir (1985) and focuses on dispositions.

Given the current relevance of critical thinking in higher education, the breadth of its conceptual approaches, and the heterogeneity of pedagogical methods used to address it, this article discusses the results of a systematic review of systematic reviews that have addressed critical thinking in relation to higher education. This review responds to the need to identify the main definitions and didactic approaches that have emerged from the establishment of critical thinking as a pedagogical objective in different HE institutions worldwide, systematizing what has been learned in this process in order to facilitate the formulation of guidelines and to provide theoretical and methodological support to those academic institutions that intend to include critical thinking among their teaching objectives and hallmarks, now and in the future. The article thus develops by answering the following questions.

• What are the main definitions of critical thinking found in systematic reviews of critical thinking in higher education? What are their similarities and differences?

• What are the most commonly used teaching strategies in higher education to teach or promote critical thinking, and how effective have they been shown to be?

In what follows, the materials and methods of the systematic review are presented, and the findings are then presented and discussed.

2. Materials and methods

In this review, the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA, 2020) guidelines (Page et al., 2021a,b) were used, and the PICOS (participants, interventions, comparators, outcomes, and study design) strategy was used to establish the eligibility criteria for the articles (Methley et al., 2014). In addition, the initial search for articles was performed using bibliometric procedures (Porter et al., 2002). Systematic reviews of systematic reviews and bibliometrics have recently been used separately to address educational topics related to learning in general and critical thinking competencies in HE students (Djamnezhad et al., 2021; Pagán Castaño et al., 2022). Blending both methods allows for increased accuracy and replicability of the study (Andreucci-Annunziata et al., 2022).

A set of articles was used as a homogeneous citation base, thus avoiding the problem of comparing indexing databases that use different calculation bases to determine journals’ impact factors and quartiles (Bakkalbasi et al., 2006; Falagas et al., 2008; Chadegani et al., 2013; Harzing and Alakangas, 2016; Mongeon and Paul-Hus, 2016). We relied on the Web of Science (WoS) core collection, selecting articles published in journals indexed by WoS in the Science Citation Index Expanded (WoS-SCIE) and the Social Science Citation Index (WoS-SSCI), using the search vector on critical thinking TS = ((critical NEAR/0 (thinking OR perspective OR approach)) AND (Higher NEAR/0 Education)), without restricted temporal parameters, and performing the extraction on 3 October 2022. The following types of documents were included: articles and review articles.

A complementary bibliometric analysis was carried out on the set of articles obtained for the topic under study, using two fundamental bibliometric laws:

1. The exponential growth of science, or Price’s Law, through the degree of exponential fit of the annual growth of publications. A strong fit is taken as a measure of sustained interest among the scientific community in developing studies on critical thinking in HE, i.e., of a critical mass of researchers working on this knowledge topic (Price, 1976; Dobrov et al., 1979). This law was also used to determine the time median and the contemporary and obsolete periods of the literature.

2. Bradford’s Law, applied after excluding proceedings papers, book reviews, editorial materials, and documents in other languages, to estimate the concentration of publications in journals. Journals are distributed into thirds according to the decreasing number of documents published in them, with the nucleus established as the set of journals with the highest concentration that together cover at least 33% of the total publications (Bulik, 1978; Morse and Leimkuhler, 1979; Pontigo and Lancaster, 1986; Swokowski, 1988; Kumar, 2014).
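As an illustration of the first law, the exponential fit can be computed as an ordinary least-squares line on log-transformed annual counts, with the coefficient of determination (R²) measuring the quality of the fit. The sketch below uses invented publication counts for demonstration; they are not the WoS data analyzed in this review.

```python
# Sketch: Price's-Law check via a least-squares line on log(annual counts).
# The counts below are hypothetical, for illustration only.

import math

years  = [2018, 2019, 2020, 2021, 2022]
counts = [150, 180, 220, 260, 320]   # invented annual publication counts

xs = [y - years[0] for y in years]
ys = [math.log(c) for c in counts]

# Ordinary least squares on (x, log y): log-linear growth model.
n = len(xs)
mx, my = sum(xs) / n, sum(ys) / n
b = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / sum((x - mx) ** 2 for x in xs)
a = my - b * mx

# R^2 of the fit in log space.
pred = [a + b * x for x in xs]
ss_res = sum((y - p) ** 2 for y, p in zip(ys, pred))
ss_tot = sum((y - my) ** 2 for y in ys)
r2 = 1 - ss_res / ss_tot

print(f"annual growth rate ≈ {math.exp(b) - 1:.1%}, R^2 = {r2:.3f}")
```

A high R² in log space indicates that publications are growing approximately exponentially, which is the signature Price’s Law looks for.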

According to the checklist of the PRISMA 2020 guidelines ( Page et al., 2021a , b ), the following quality steps for systematic reviews were verified according to the following sections: 1 (title), 2 (structured abstract), 3 (rationale), 4 (objectives), 5 (eligibility criteria), 6 (sources of information), 7 (search strategy), 8 (selection process), 9 (data extraction process), 10a and 10b (data items), 16a and 16b (study selection), 17 (study characteristics), 19 (results of individual studies), 23 (discussion), 24 (registration and protocol), 25 (support), 26 (competing interests), and 27 (availability of data, code and other materials). The following sections were excluded because, as a review of reviews or umbrella review ( Aromataris et al., 2015 ), the data from each study to satisfy their criteria were not considered pertinent within the narrative synthesis of the present review, or were not available, or were presented only in a general way after having been part of a respective protocol: 11 (study risk of bias assessment), 12 (effect measures), 13 (synthesis methods), 14 (reporting bias assessment), 15 (certainty assessment), 18 (risk of bias in studies), 20 (results of syntheses), 21 (reporting biases), and 22 (certainty of evidence).

Through PRISMA guidelines, the selection of articles was specified based on eligibility criteria: the target population (participants), the interventions (methodological techniques), the elements of comparison of these studies, the outcomes of these studies, and the study designs (the criteria of the PICOS strategy as shown in Table 1 ). Screening of the preselected systematic reviews was first performed independently by the following authors, PA-A, AR, SC, AM, and AV-M. Then, the final review of the included reviews was done in the following pairs: PA-A, AM; AR, SC; and AV-M, AM. In case of doubt, it was decided to include a third reviewer among the six authors.

Table 1. Eligibility criteria using PICOS (participants, interventions, comparators, outcomes, and study design).

The bibliometric systematization over an unrestricted period in the WoS main collection resulted in 1,999 documents published between 1965 and 2022, with a continuous publication record from 1994 onwards. Figure 1 shows exponential publication growth between 1994 and 2022, with an R² fit of 78% (trend line and value in red). The figure also highlights the semi-period of more recent publications, between 2018 and 2022 (green shaded area), which reduces the analysis set to 1,084 documents.

Figure 1. Publications on critical thinking between 1965 and 2022.

After these exclusions, the remaining 847 documents were partitioned in search of the Bradford core (Table 2). This estimation narrows the core to 38 journals that concentrate 276 articles published between 2018 and 2022 (see detail in Table A1 in Appendix A and data in Supplementary Table S1).
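The Bradford partition described above can be sketched as follows: journals are ranked by article count, then split into three zones, each covering roughly a third of all articles. The journal names and counts below are invented for illustration.

```python
# Sketch: Bradford-zone partition of journals by article count.
# The journal counts are hypothetical, for illustration only.

def bradford_zones(journal_counts):
    """journal_counts: dict journal -> number of articles.
    Returns three lists of journals (nucleus, zone 2, zone 3)."""
    ranked = sorted(journal_counts.items(), key=lambda kv: kv[1], reverse=True)
    total = sum(journal_counts.values())
    zones, current, covered = [[], [], []], 0, 0
    for journal, count in ranked:
        zones[current].append(journal)
        covered += count
        # advance to the next zone once this one covers ~a third of articles
        if covered >= total * (current + 1) / 3 and current < 2:
            current += 1
    return zones

counts = {"J1": 40, "J2": 25, "J3": 10, "J4": 8, "J5": 7,
          "J6": 4, "J7": 3, "J8": 2, "J9": 1}
nucleus, zone2, zone3 = bradford_zones(counts)
print(nucleus)  # → ['J1']  (the few journals concentrating ~a third of articles)
```

Note how the zones hold progressively more journals for the same share of articles; this widening is the dispersion pattern Bradford’s Law describes.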

Table 2. Bradford zones estimation: articles by journal zones.

The absolute percentage error is estimated at 3%; therefore, the fit achieved by the nuclear zone is considered adequate (see Equation 1).

This 276-document set was entered as input to the PRISMA flow diagram (Figure 2), according to the eligibility criteria (PICOS) set out in Table 1.

Figure 2. PRISMA 2020 flow diagram. *SSCI, Social Sciences Citation Index; SCI-E, Science Citation Index Expanded; ESCI, Emerging Sources Citation Index; BKCI-SSH, Book Citation Index – Social Sciences & Humanities; A&HCI, Arts & Humanities Citation Index.

Thus, this search identified a total of 276 articles from five different databases in the collection Web of Science (SSCI, Social Sciences Citation Index; SCI-E, Science Citation Index Expanded; ESCI, Emerging Sources Citation Index; BKCI-SSH, Book Citation Index –Social Sciences & Humanities; A&HCI, Arts & Humanities Citation Index). Excluding records by type of document, particularly articles (224), book chapters (9), and early access (20), 23 records were obtained for the screening, corresponding only to systematic reviews of the subject.

Then, 17 systematic reviews were excluded because they presented literature reviews (6); critical reading and writing reviews (6); specific critical thinking teaching techniques, because they focus on how to implement a specific technique and marginally on the development of critical thinking (2) or were outside the focus of this review (3), reducing the corpus to be analyzed to six full-text systematic reviews in English, retrieved and screened using the selection criteria defined with the PICOS strategy. Finally, a last review that included studies on the assessment of critical thinking through standardized instruments was excluded at this stage. Thus, the screening made it possible to identify five systematic reviews that met the inclusion criteria, as shown in Figure 2 . A summary of the general characteristics of the included systematic reviews can be found in Table 3 .


Table 3 . Characteristics of the included reviews.

The selected reviews included studies with different methodological designs, both quantitative (2) and mixed quantitative and qualitative (3). The reviews covered 29.8 critical thinking studies on average, all selected following PRISMA 2020 guidelines. It was not possible to conduct a meta-analysis, mainly because of the heterogeneity of the studies included in the reviews. One review considered Hedges' g effect sizes, although not all of the studies it covered provided the data needed for the calculation ( El Soufi and See, 2019 ). Another review reported three types of statistically significant gains (general, specific, and no gain) assessed with standardized tests, but without giving values or effect sizes ( Payan-Carreira et al., 2019 ). Finally, the remaining reviews reported methodological limitations of the studies they selected and/or did not report specific statistical tests from those studies ( Ahern et al., 2019 ; Puig et al., 2019 ; Tuononen et al., 2022 ).
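Hedges' g, the effect size mentioned above, is the standardized mean difference between two groups with a small-sample bias correction. A minimal sketch of the standard formula (the example values are illustrative only, not data from the reviews):

```python
import math

def hedges_g(mean1, sd1, n1, mean2, sd2, n2):
    """Hedges' g: Cohen's d corrected for small-sample bias."""
    # Pooled standard deviation of the two groups
    sp = math.sqrt(((n1 - 1) * sd1**2 + (n2 - 1) * sd2**2) / (n1 + n2 - 2))
    d = (mean1 - mean2) / sp
    # Approximate small-sample correction factor J
    j = 1 - 3 / (4 * (n1 + n2) - 9)
    return d * j

# Illustrative values: two groups of 30, half a pooled SD apart
g = hedges_g(75.0, 10.0, 30, 70.0, 10.0, 30)
print(round(g, 3))  # → 0.494
```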

The narrative synthesis of the selected systematic reviews made it possible to answer the proposed research questions. For this purpose, we consulted the guidelines for narrative syntheses in systematic reviews ( Popay et al., 2006 ) suggested by the document PRISMA-P 2015 ( Shamseer et al., 2015 ).

A summary of the objectives, definition of critical thinking, associated concepts and variables, and background and/or assumptions of each of the selected reviews can be found in Table 4 , while Table 5 presents a summary of the relevance of critical thinking to HE, key findings and challenges for future research arising from each of the selected reviews.


Table 4 . Summary of the objectives, definition of critical thinking, associated concepts and variables, assumptions, and relevant authors of each of the reviews.


Table 5 . Summary of the relevance of critical thinking to HE, key findings, and challenges for future research of each of the reviews.

Table 6 synthesizes the findings of the approaches and strategies applied for the development of critical thinking in HE in each of the selected reviews.


Table 6 . Approaches and strategies applied for the development of critical thinking in HE from selected reviews.

One of the selected reviews sought to examine the teaching of generic competencies in HE ( Tuononen et al., 2022 ), and another examined critical thinking across disciplines such as biomedical sciences, STEM (science, technology, engineering, and mathematics), social sciences, and humanities ( Puig et al., 2019 ). The other three studies referred to the teaching of critical thinking in specific disciplines: English as a second language ( El Soufi and See, 2019 ), engineering ( Ahern et al., 2019 ), and health sciences ( Payan-Carreira et al., 2019 ).

Regarding the definition of critical thinking, in two of the five systematic reviews the definition used is taken verbatim from Facione (1990) , who led the Delphi project on this topic.

Reviews argue that the critical thinking literature treats critical thinking as both a disposition and a skill ( Ahern et al., 2019 ; Puig et al., 2019 ). However, our analysis of the five systematic reviews shows that the concept of skill is currently more prevalent in the literature than that of disposition. Two of the five reviews do not refer to dispositions at all ( El Soufi and See, 2019 ; Tuononen et al., 2022 ), and the others do so only narrowly ( Payan-Carreira et al., 2019 ; Puig et al., 2019 ). In contrast, all five systematic reviews highlight the skills aspect, and two of them go deeper into it, highlighting the specific role of cognitive skills ( Payan-Carreira et al., 2019 ; Tuononen et al., 2022 ).

The different existing conceptualizations of critical thinking in the academic field have in common that it is a type of thinking that enables a reflective process and the ability to make evidence-based judgments. In addition to reflexivity and judgment, other terms and verbs highlighted in the conceptualizations are competence, ability, disposition, understanding, analyzing, inferring, and concluding, among others.

Regarding the approaches and methodologies used to teach critical thinking, the first reassuring finding is that the greatest effect comes from the explicit teaching of general critical thinking skills ( El Soufi and See, 2019 ). In English language teaching, the methodologies identified as effective are the use of literary and narrative texts; assessment techniques such as peer review, teacher assessment, and self-assessment; and approaches such as debates, brainstorming, daily writing, scaffolding, and active learning strategies ( El Soufi and See, 2019 ). In engineering education, the conclusion is that to date there has been no quantifiable evaluation of interventions implemented to enhance critical thinking ( Ahern et al., 2019 ). The review that examined critical thinking across different professional fields concludes that the most commonly used teaching approach in all fields is the so-called immersion approach ( Puig et al., 2019 ). This finding suggests that the teaching of critical thinking is more effective when it is integrated transversally into the teaching of different fields than when it is treated as a separate subject. The reviews that addressed critical thinking in the health sector are consistent with this finding in highlighting the widespread use of the immersion approach; within this approach, the most effective strategies appear to be simulation, reflective writing, concept mapping, problem-based learning (PBL), and case-based learning (CBL) ( Payan-Carreira et al., 2019 ). Finally, the review that focused on the learning of generic skills in higher education shows that active learning methods, i.e., those that promote students' activity and role in their own learning process, enhance the learning of critical thinking ( Tuononen et al., 2022 ).

These systematic reviews agree that the development of critical thinking skills is a key objective of different higher education programmes. They also agree that critical thinking contributes to the integration and performance of professionals in different work settings. Two of the reviews offer arguments to support this relevance. Firstly, a pedagogical argument suggests that, given the large amount of information available today, it is important for students to be able to distinguish facts from opinions and to evaluate and judge the credibility of the evidence presented to them ( El Soufi and See, 2019 ). In the same vein, it is pointed out that health science students should complement scientific and technical knowledge with advanced thinking dispositions and reasoning and decision-making skills ( Payan-Carreira et al., 2019 ). A second, more technical argument relates to the requirements of university accreditation processes with assessment agencies ( Ahern et al., 2019 ).

4. Discussion

This systematic review of critical thinking in HE, conducted under PRISMA 2020 guidelines, identified the main definitions of critical thinking, their commonalities and differences, and the instructional strategies used and their effectiveness. The review was conducted on five reviews from WoS databases, which allowed the search to be focused according to the PICOS strategy ( Porter et al., 2002 ; Liberati et al., 2009 ; Moher et al., 2009 ; Methley et al., 2014 ; Andreucci-Annunziata et al., 2022 ).

This work has shown that there are several definitions of critical thinking, which has implications for the formulation of theoretical and methodological guidelines in the teaching and learning process in higher education. Through the analysis ( Table 4 ), we found that critical thinking involves complex cognitive activities, which in turn need to be applied to specific contexts in which HE students operate.

Facione’s (1990) definition appears to be the most comprehensive, emphasizing critical thinking as evaluation carried out in a self-regulatory manner through sequential cognitive processes. There are nuances in what constitutes a skill, which implies a situated and evaluative implementation ( Cruz et al., 2017 ; Tuononen et al., 2022 ). El Soufi and See’s (2019) definition is more focused on evidence-based reasoning. Cruz et al. (2017) emphasize dispositions that point to mental and character qualities inherent in a person, which extends the definition to look beyond cognitive abilities.

Comparing these definitions, there is no complete consensus on what it takes to think critically, beyond the fact that it involves higher-order cognitive processes. The literature emphasizes that students should move from what to learn to how to learn, from a socio-constructivist perspective ( Andreucci-Annunziata, 2012 , 2016 ). This means that students must be able to make sense of the task they are doing, because at this level of complex thinking it is not enough to follow instructions or perform tasks: critical thinking necessarily implies students' ability to evaluate.

Given the information in Table 4 , the question is how to approach critical thinking, considering two related aspects: one concerns the training of cognitive tasks in an instructional setting; the other involves aspects more linked to the affective/emotional being, a comprehensive quality shaped by each person's idiosyncrasy and background. The five selected papers do not provide a common answer on how to do this. Critical thinking is associated with formal education in certain fields, such as engineering and language teaching. This means that it is generally approached from specific problem situations and generalized to broader aspects where competences are demonstrated.

The review by El Soufi and See (2019) highlights specific teaching methods that enable critical thinking to be exercised. However, they suggest studies with larger populations and note that the studies do not share a common definition of critical thinking from which the different aspects of the process could be measured. Ahern et al. (2019) add that studies should be longer and should integrate critical thinking into the curriculum, which would make it possible to evaluate a whole period of training. They question the assessment of critical thinking in the absence of a more consensual definition of the term. Finally, they suggest involving stakeholders interested in demonstrating or assessing critical thinking, such as employers.

Payan-Carreira et al. (2019) also discuss the difficulties in studying critical thinking, arguing that no consistent results are obtained from studies using the same teaching strategies. Nor are conclusive results obtained from different strategies. Puig et al. (2019) state that the conceptualization of critical thinking as both a set of skills and a set of dispositions lacks more specific information on how and to what extent learning strategies enhance critical thinking skills and dispositions.

There are several unresolved issues. There is still no consensus on what is meant by critical thinking. On the one hand, reference is made to formal teaching factors provided by universities, which recommend different strategies for acquiring the necessary cognitive skills. On the other hand, there is recognition of defined dispositions, which are attributed to action tendencies, personality traits, and positive qualities of individuals. Although the authors agree that both exist, studies on strategies for training during higher education prevail, and the discussion of students' individual factors appears only under the heading of dispositions or aspects of them. From the selected reviews, it can be seen that the definition of critical thinking obtained by the Delphi project ( Facione, 2011 ) is still valid, although the project was carried out three decades ago. It is worth noting that in the current discussion of critical thinking, higher cognitive skills are mentioned more often than dispositions, which raises a question: is this because dispositions are more difficult to study or measure than skills?

It is recognized that critical thinking or reasoning requires dispositions; however, the relationship between dispositions and skills is not yet clear in light of these recent reviews. That is, critical thinking can be developed in students whose dispositions, in terms of personal attributions, favor this process ( Cruz et al., 2017 ; Wechsler et al., 2018 ). A question that arises is whether critical thinking skills develop from motivational, attitudinal, and other dispositions. From the perspective of individual development, there would be environmental conditions and aspects of people's emotional world that favor the acquisition of critical thinking.

Another relevant finding of our analysis is that several of the reviews emphasize the need for methodologically sound studies to advance knowledge about critical thinking in general and about how to teach it. For example, Tuononen et al. (2022) found that active learning methods support the learning of critical thinking, but they also found conflicting results stemming from methodological issues such as study design, methods, and sample size.

One question is whether there should be more research on the dispositional aspects of students who succeed at critical thinking, taking socio-cultural factors into account. For example, it is easier to compare individuals with similar educational opportunities (e.g., in Finland), as in the study included in this systematic review that alludes to methodological shortcomings ( Tuononen et al., 2022 ).

If a framework definition of critical thinking training for higher education students were to be proposed, a high level of training in cognitive skills and a complex, comprehensive view of the conditions that make this possible would be paramount. These, as well as aspects of human talent, have been addressed as conditions that favour the development of critical approaches whenever pedagogical scenarios make it possible ( Andreucci-Annunziata, 2012 , 2016 ).

Looking more closely at the strategies that promote the development of critical thinking, and with a view to contributing to theory building in this area, the emphasis on training in cognitive tasks in discipline-based teaching scenarios stands out in four of the five reviews examined. Turning to the second question guiding this review, Table 6 shows that, with the exception of Tuononen et al. (2022) , who do not mention this aspect, the authors agree on strategic approaches such as the general, infusion, immersion, or mixed approaches, depending on the specificity of the students.

When considering the specificity of the student, it seems appropriate not to forget the specificity of the teacher. Only the study by Ahern et al. (2019) shows that, from the perspective of the educator, there is a disconnect between the theory of critical thinking and the practice of teaching it in engineering. This seems relevant to rethinking teacher education beyond techniques. In other words, although some techniques have demonstrated their effectiveness, the interventions carried out in all areas, such as the immersion and infusion approaches ( Payan-Carreira et al., 2019 ; Puig et al., 2019 ), followed by general critical thinking skills ( El Soufi and See, 2019 ), operate within a specific interactional framework between teacher and student ( Andreucci-Annunziata, 2016 ; Salas et al., 2021 ).

This interactional framework seems to be a relevant subject for further research: it is within this framework that the teaching-learning process takes place. In turn, this teaching-learning process, of which the development of critical thinking is a fundamental part, is embedded in a defined institutional educational and strategic project with its own guidelines.

Guidelines for the restructuring and strategic planning of universities worldwide, and especially in Latin America, have emphasized reviewing how institutional educational projects are integrated into the general academic task. This has implications not only for academic quality objectives, but also for a rigorous analysis of the curricular models postulated in institutional educational projects. In this sense, approaches that attend to critical thinking, both as a result of and within the developmental process, focus on students and enable them to meet the challenges posed by global citizenship, the strengthening of academic skills (cognitive, affective, and/or relational) and life skills, sustainable development, the inclusion of diverse perspectives, and openness to internationalization ( Delors et al., 1996 ; Sabzalieva et al., 2021 ). According to Molina et al. (2018) , an educational model in a university setting expresses “synthetic visions of theories or pedagogical approaches that guide specialists and teachers from the development and analysis of study programmes to the systematization of the teaching-learning process in university classrooms” (p. 153). It is this last process that is particularly highlighted in this review.

5. Conclusion

Not surprisingly, since critical thinking is the foundation of integral education in complex times, there has been much research on this topic. A recent bibliometric analysis of critical thinking ( Pagán Castaño et al., 2022 ) allowed us to support this review of reviews with current, updated data. Our review shows that dispositions and skills are key concepts in the promotion of critical thinking, and Giancarlo and Facione (2001) point out that the disposition to think critically is conceptually different from having the skills to think critically. Although all the authors reviewed recognize the importance and influence of dispositions in the area of critical thinking, there has been more research on skills than on dispositions. Turning to teaching strategies for critical thinking, there is no consensus on how this should be done. In fact, the common recommendation to conduct further research on how to teach critical thinking raises the question of whether it is possible to teach this disposition or skill at all.

Further concerns arise about the conditions under which critical thinking can be developed in contexts that do not sufficiently validate it, or in higher education institutions that do not explicitly define it in their policies although they require it in academic outcomes, and vice versa. The strategies derived from the methodologies reviewed do not fully support the development of critical thinking, because they focus almost exclusively on the evaluation of outcomes rather than on the process of constructing this type of thinking and its applicability. It would be helpful to update the paradigms in this area that support both study and teaching practice. One alternative is to consider complex paradigms ( Delors et al., 1996 ; Elfert, 2015 ) that support life skills in the 21st century and place students at the center of their learning process, in close contact with their interactional dialog environment (family members, teachers, and classmates), which challenges them and proposes joint problem solving.

In the context of educational transformation, which is the purpose of this type of study, the elements to be considered are (1) the institutional educational project (mission, vision, objectives), (2) the institutional strategic plan (strategic quality objectives in the areas of teaching, management, research, and links with the environment), (3) the study plan (degree programmes, undergraduate and postgraduate programmes, and their respective curricula), and (4) the teaching-learning process. At this last level, which is also the first (the micro-genesis of educational transformations), the development of critical thinking is considered key in two senses: as training in cognitive tasks (instructional scenario) and as “training” in affective-relational attitudinal skills (expressive scenario). It is clear, in the opinion of the authors of this review, that this second approach is the one that requires further study and constitutes a line of research to be deepened and strengthened in future work. The conclusive analysis presented is consistent with the potential of complexity theory to address the challenges, at the micro- and macro-genetic levels, of establishing a new field of research in higher education from the perspective of educational psychology, and to provide possible solutions for the implementation of complex and creative thinking as a developmental goal for students and a strategic goal for higher education institutions ( Davis and Sumara, 2014 ; Scott et al., 2018 ; Harmat and Herbert, 2020 ).

On the other hand, the main limitation of this review is that there is not enough information to explore the relative weight of the methodologies implemented for developing cognitive, affective-attitudinal, creativity, talent, and academic performance skills in higher education academic programmes. Likewise, given the origin of the systematic reviews found and analyzed in this study, there is no information on the application of critical thinking conceptualizations and teaching practices in Latin America ( Beneitone and Yarosh, 2015 ), which constitutes a challenge and a line of research for a working team such as ours.

Author contributions

PA-A: original idea and institutional link. PA-A, AR, SC, and MR: conceptualization and writing—original draft preparation. AM and AV-M: methodology. AM and AR: formal analysis. PA-A, AM, and AV-M: writing—review and editing. PA-A: funding acquisition. PA-A: proofreading and final editing. All authors have read and agreed to the published version of the manuscript.

The article processing charge (APC) was funded by Instituto de Investigación y Postgrado, Facultad de Ciencias de la Salud, Universidad Central de Chile (Code: ACD 219201).

Conflict of interest

The authors declare that the research was conducted in the absence of any commercial or financial relationships that could be construed as a potential conflict of interest.

Publisher’s note

All claims expressed in this article are solely those of the authors and do not necessarily represent those of their affiliated organizations, or those of the publisher, the editors and the reviewers. Any product that may be evaluated in this article, or claim that may be made by its manufacturer, is not guaranteed or endorsed by the publisher.

Supplementary material

The Supplementary material for this article can be found online at: https://www.frontiersin.org/articles/10.3389/feduc.2023.1141686/full#supplementary-material

Aglen, B. (2016). Pedagogical strategies to teach bachelor students evidence-based practice: a systematic review. Nurse Educ. Today 36, 255–263. doi: 10.1016/j.nedt.2015.08.025


Ahern, A., Dominguez, C., McNally, C., O’Sullivan, J. J., and Pedrosa, D. (2019). A literature review of critical thinking in engineering education. Stud. High. Educ. 44, 816–828. doi: 10.1080/03075079.2019.1586325


Ahern, A., O’Connor, T., Mac Ruairc, G., McNamara, M., and O’Donnell, D. (2012). Critical thinking in the university curriculum: the impact on engineering education. Eur. J. Eng. Educ. 37, 125–132. doi: 10.1080/03043797.2012.666516

Al-Ghadouni, A. (2021). Instructional approaches to critical thinking: an overview of reviews. Rev. AR. de Clin. Psicol. 30, 240–246. doi: 10.24205/03276716.2020.2020

Andreucci-Annunziata, P. (2012). El Talento: Una Construcción en y Desde la Pedagogía Dialógica. Psicoperspectivas 11, 185–205. doi: 10.5027/psicoperspectivas-Vol11-Issue2-fulltext-200

Andreucci-Annunziata, P. (2016). Talento y argumentación: Una alianza dialógica en el aula. Profesorado Revista de Currículum y Formación de Profesorado 20, 2–17. doi: 10.30827/profesorado.v20i2.10405

Andreucci-Annunziata, P., Mellado, A., and Vega-Muñoz, A. (2022). Telesupervision in psychotherapy: a bibliometric and systematic review. Int. J. Environ. Res. Public Health 19:16366. doi: 10.3390/ijerph192316366

Aromataris, E., Fernandez, R., Godfrey, C. M., Holly, C., Khalil, H., and Tungpunkom, P. (2015). Summarizing systematic reviews: methodological development, conduct and reporting of an umbrella review approach. JBI Evid. Implement. 13, 132–140. doi: 10.1097/XEB.0000000000000055

Association of American Colleges and Universities (2004). Liberal Education Outcomes: A Preliminary Report on Student Achievement in College . Washington, DC, Association of American Colleges and Universities.


Association of American Colleges and Universities (2015). Falling Short? College Learning and Career Success . Washington, DC. Available at: http://www.aacu.org/sites/default/files/files/LEAP/2015employerstudentsurvey

Atkinson, D. (1997). A critical approach to critical thinking. TESOL Q. 31, 71–94. doi: 10.2307/3587975

Bailin, S., Case, R., Coombs, J., and Daniels, L. (1999). Common misconceptions of critical thinking. J. Curric. Stud. 31, 269–283. doi: 10.1080/002202799183124

Bakkalbasi, N., Bauer, K., Glover, J., and Wang, L. (2006). Three options for citation tracking: Google Scholar, Scopus and Web of Science. Biomed. Digit. Libr. 3:7. doi: 10.1186/1742-5581-3-7

Behar-Horenstein, L. S., and Niu, L. (2011). Teaching critical thinking skills in higher education: a review of the literature. J. Coll. Teach. Learn. 8, 25–42. doi: 10.19030/tlc.v8i2.3554

Beneitone, P., and Yarosh, M. (2015). Tuning impact in Latin America: is there implementation beyond design? Tuning. J. High. Educ. 3, 187–216. doi: 10.18543/tjhe-3(1)-2015pp187-216

Braun, H. I., Shavelson, R. J., Zlatkin-Troitschanskaia, O., and Borowiec, K. (2020). Performance assessment of critical thinking: conceptualization, design, and implementation. Front. Educ. 5:156. doi: 10.3389/feduc.2020.00156

Bulik, S. (1978). Book use as a Bradford-Zipf phenomenon. Coll. Res. Libr. 39, 215–219. doi: 10.5860/crl_39_03_215

Chadegani, A. A., Salehi, H., Yunus, M. M., Farhadi, H., Fooladi, M., Farhadi, M., et al. (2013). A comparison between two main academic literature collections: Web of Science and Scopus databases. Asian Soc. Sci. 9:5. doi: 10.5539/ass.v9n5p18

Claris, L., and Riley, D. (2012). Situation critical: critical theory and critical thinking in engineering education. Eng. Stud. 4, 101–120. doi: 10.1080/19378629.2011.649920

Coil, D., Wenderoth, M. P., Cunningham, M., and Dirks, C. (2010). Teaching the process of science: faculty perceptions and an effective methodology. CBE Life Sci. Educ. 9, 524–535. doi: 10.1187/cbe.10-01-0005

Crossley-Frolick, K. A. (2010). Beyond model UNITED NATIONS: simulating multi-level, multi-actor diplomacy using the millennium development goals. Int. Stud. Perspect. 11, 184–201. doi: 10.1111/j.1528-3585.2010.00401.x

Cruz, G., Payan-Carreira, R., and Dominguez, C. (2017). Critical thinking education in the Portuguese higher education institutions: a systematic review of educational practices. Rev. Lusofona Educ. 38, 43–61. doi: 10.24140/issn.1645-7250.rle38.03

Cruz, G., Payan-Carreira, R., Dominguez, C., Silva, H., and Morais, F. (2021). What critical thinking skills and dispositions do new graduates need for professional life? Views from Portuguese employers in different fields. High. Educ. Res. Dev. 40, 721–737. doi: 10.1080/07294360.2020.1785401

Davis, B., and Sumara, D. (2014). Complexity and Education: Inquiries into Learning, Teaching, and Research . Routledge.

Delors, J., Amagi, I., Carneiro, R., Chung, F., Geremek, B., Gorham, W., et al. (1996). La educación encierra un tesoro: informe para la UNESCO de la Comisión Internacional sobre la Educación para el Siglo Veintiuno . París, Santillana Ediciones UNESCO.

Dewey, J. (1910). How We Think . D.C. Heath & Co. Publishers: Boston, United States.

Djamnezhad, D., Koltcheva, N., Dizdarevic, A., Mujezinovic, A., Peixoto, C., Coelho, V., et al. (2021). Social and emotional learning in preschool settings: A systematic map of systematic reviews. Front. Educ. 6:691670. doi: 10.3389/feduc.2021.691670

Dobrov, G. M., Randolph, R. H., and Rauch, W. D. (1979). New options for team research via international computer networks. Scientometrics 1, 387–404. doi: 10.1007/bf02016658

Dominguez, C. (2018a). A European Review on Critical Thinking Educational Practices in higher Education Institutions. UTAD/EU: Vila Real

Dominguez, C. (2018b). A European Collection of the Critical Thinking Skills and Dispositions Needed in Different Professional Fields for the 21st Century . ERASMUS + Programme/EU UTAD: Vila Real

Dumitru, D., Bigu, D., Elen, J., Jiang, L., Railienè, A., Penkauskienè, D., et al. (2018). A European Collection of the Critical Thinking Skills and Dispositions Needed in Different Professional Fields for the 21st Century . UTAD: Vila Real, Portugal.

El Soufi, N., and See, B. H. (2019). Does explicit teaching of critical thinking improve critical thinking skills of English language learners in higher education? A critical review of causal evidence. Stud. Educ. Eval. 60, 140–162. doi: 10.1016/j.stueduc.2018.12.006

Elder, L., and Paul, R. (2003). Critical thinking: teaching students how to study and learn. J. Dev. Educ. 27, 36–38.

Elfert, M. (2015). UNESCO, the Faure report, the Delors report, and the political utopia of lifelong learning. Eur. J. Educ. 50, 88–100. doi: 10.1111/ejed.12104

Ennis, R. (1996). Critical Thinking . Upper Saddle River, NJ: Prentice Hall.

Ennis, R. H. (1989). Critical Thinking and Subject Specificity: Clarification and Needed Research. Educ. Res.. 18, 4–10. doi: 10.3102/0013189X018003004

Ennis, R. H. (2016). Critical Thinking Across the Curriculum: A Vision. Topoi 37, 165–184. doi: 10.1007/s11245-016-9401-4

Ennis, R. H., and Weir, E. (1985). The Ennis-Weir Critical Thinking Essay Test . Pacific Grove, CA: Critical Thinking Press and Software.

European Parliament Council (2008). The establishment of the European qualifications framework for lifelong learning. Official Journal of European Union: EU. Available at: https://www.cedefop.europa.eu/en/projects/european-qualifications-framework-eqf

Eurydice (2011). Science Education in Europe: National policies, practices and research . EACEA: Brussels, Belgium.

Facione, P. A. (1990). Critical Thinking: A Statement of Expert Consensus for Purposes of Educational Assessment and Instruction . The California Academic Press: Millbrae, CA, USA.

Facione, P. A. (2000). The disposition toward critical thinking: its character, measurement, and relationship to critical thinking skill. Informal Log. 20, 61–84. doi: 10.22329/il.v20i1.2254

Facione, P. A. (2011). Critical Thinking: What It Is and Why It Counts . San Jose: California Academic Press.

Facione, N. C., Facione, P. A., and Sanchez, C. A. (1994). Critical thinking disposition as a measure of competent clinical judgment: the development of the California critical thinking disposition inventory. J. Nurs. Educ. 33, 345–350. doi: 10.3928/0148-4834-19941001-05

Falagas, M. E., Pitsouni, E. I., Malietzis, G., and Pappas, G. (2008). Comparison of PubMed, Scopus, web of science, and Google scholar: strengths and weaknesses. FASEB J. 22, 338–342. doi: 10.1096/fj.07-9492LSF

Fejes, A. (2006). The Bologna process-governing higher education in Europe through standardisation. Revista Española de Educación Comparada 12, 203–232.

Giacomazzi, M., Fontana, M., and Camilli Trujillo, C. (2022). Contextualization of critical thinking in sub-Saharan Africa: a systematic integrative review. Think. Ski. Creat 43:100978. doi: 10.1016/j.tsc.2021.100978

Giancarlo, C. A., and Facione, P. A. (2001). A look across four years at the disposition toward critical thinking among undergraduate students. J. Gen. Educ. 50, 29–55. doi: 10.1353/jge.2001.0004

Halpern, D. F. (1998). Teaching critical thinking for transfer across domains. Dispositions, skills, structure training, and metacognitive monitoring. Am. Psychol. 53, 449–455. doi: 10.1037//0003-066x.53.4.449

Halpern, D. F. (2001). Assessing the effectiveness of critical thinking instruction. J. Gen. Educ. 50, 270–286. doi: 10.1353/jge.2001.0024

Halpern, D. F. (2014). Thought and knowledge: An introduction to critical thinking , 5th. Psychology Press: New York, United States.

Harmat, L., and Herbert, A. (2020). Complexity thinking as a tool for understanding the didactics of psychology. Front. Psychol. 11:542446. doi: 10.3389/fpsyg.2020.542446

Harzing, A. W., and Alakangas, S. (2016). Google scholar, scopus and the web of science: A longitudinal and cross-disciplinary comparison. Scientometrics 106, 787–804. doi: 10.1007/s11192-015-1798-9

Hildenbrand, K. J., and Schultz, J. A. (2012). Development of a rubric to improve critical thinking. Athl. Train. Educ. J. 7, 86–94. doi: 10.4085/070386

Hoskins, B., and Deacon Crick, R. (2010). Competences for learning to learn and active citizenship: different currencies or two sides of the same coin? Eur. J. Educ. 45, 121–137. doi: 10.1111/j.1465-3435.2009.01419.x

Hyytinen, K., Saari, E., and Elg, M. (2019). “Human-centered co-evaluation method as a means for sustainable service innovations,” in Human-centered digitalization and services . eds. M. Toivonen and E. Saari (Springer), 57–75.

Kumar, S. (2014). Application of Bradford’s law to human-computer interaction research literature. DESIDOC J. Libr. Inf. Technol. 34, 223–231.

Liberati, A., Altman, D. G., Tetzlaff, J., Mulrow, C., Gøtzsche, P. C., Ioannidis, J. P., et al. (2009). The PRISMA statement for reporting systematic reviews and meta-analyses of studies that evaluate health care interventions: explanation and elaboration. J. Clin. Epidemiol. 62, e1–e34. doi: 10.7326/0003-4819-151-4-200908180-00136

Liu, O. L., Frankel, L., and Roohr, K. C. (2014). Assessing critical thinking in higher education: current state and directions for next-generation assessment. ETS Res. Rep. Ser. 2014, 1–23. doi: 10.1002/ets2.12009

McPeck, J. E. (1981). Critical Thinking and Education . St. Martin's Press: New York, United States.

Methley, A. M., Campbell, S., Chew-Graham, C., McNally, R., and Cheraghi-Sohi, S. (2014). PICO, PICOS and SPIDER: a comparison study of specificity and sensitivity in three search tools for qualitative systematic reviews. BMC Health Serv. Res. 14:579. doi: 10.1186/s12913-014-0579-0

Mitchell, R., Myles, F., Johnston, B., and Ford, P. (2003). “Criticality and the ‘key skills’ agenda in undergraduate linguistics” in Notes of Talk given at Subject Centre for Languages, Linguistics, and Area Studies Seminar: ‘Key Skills Linguistics’. University of Southampton, London (Southampton, London, England: CILT)

Moher, D., Liberati, A., Tetzlaff, J., and Altman, D. G. (2009). The PRISMA Group. Preferred reporting items for sstematic reviews and meta-analyses: the PRISMA statement. PLoS Med. 6:e1000097. doi: 10.1371/journal.pmed.1000097

Molina, J. M., Lavandero, J., and Hernández, L. M. (2018). El modelo educativo como fundamento del accionar universitario: Experiencia de la Universidad Técnica de Manabí, Ecuador [the educational model as the Foundation of University Action.: the experience of the technical University of Manabi, Ecuador]. Revista Cubana de Educación Superior 37, 151–164.

Mongeon, P., and Paul-Hus, A. (2016). The journal coverage of web of science and scopus: a comparative analysis. Scientometrics 106, 213–228. doi: 10.1007/s11192-015-1765-5

Moore, T. (2013). Critical thinking: seven definitions in search of a concept. Stud. High. Educ. 38, 506–522. doi: 10.1080/03075079.2011.586995

Moore, T. (2014). Wittgenstein, Williams and the terminologies of higher education: a case study of the term ‘critical’. J. Acad. Lang. Learn. 8, A95–A108.

Morse, P. M., and Leimkuhler, F. F. (1979). Technical note—exact solution for the Bradford distribution and its use in modeling infor-mational data. Oper. Res. 27, 187–198. doi: 10.1287/opre.27.1.187

Naimpally, A., Ramachandran, H., and Smith, C. (2012). Lifelong Learning for Engineers and Scientists in the Information Age . Elsevier: Amsterdam, Holland.

Niu, L., Behar-Horenstein, L., and Garvan, C. W. (2013). Do instructional interventions influence college students’ critical thinking skills? A meta-analysis. Educ. Res. Rev. 9, 114–128. doi: 10.1016/j.edurev.2012.12.002

Organización de Naciones Unidas (2018). Objetivos de Desarrollo Sostenible. Available at: https://unstats.un.org/sdgs/files/report/2018/TheSustainableDevelopmentGoalsReport2018-es.pdf

Pagán Castaño, J., Arnal-Pastor, M., Pagán-Castaño, E., and Guijarro-García, M. (2022). Bibliometric analysis of the literature on critical thinking: an increasingly important competence for higher education students. Econ. Res-Ekonomska Istraživanja , 1–22. doi: 10.1080/1331677X.2022.2125888

Page, M. J., McKenzie, J. E., Bossuyt, P. M., Boutron, I., Hoffmann, T. C., Mulrow, C. D., et al. (2021a). The PRISMA 2020 statement: an updated guideline for reporting systematic reviews. Int. J. Surg. 88:105906. doi: 10.1016/j.ijsu.2021.105906

Page, M. J., Moher, D., Bossuyt, P. M., Boutron, I., Hoffmann, T. C., Mulrow, C. D., et al. (2021b). PRISMA 2020 explanation and elaboration: updated guidance and exemplars for reporting systematic reviews. BMJ 372:n160. doi: 10.1136/bmj.n160

Partnership for 21st Century Skills (2003). Learning for the 21st Century . Partnership for 21st Century Learning. Washington, DC, USA.

Payan-Carreira, R., Cruz, G., Papathanasiou, I. V., Fradelos, E., and Jiang, L. (2019). The effectiveness of critical thinking instructional strategies in health professions education: a systematic review. Stud. High. Educ. 44, 829–843. doi: 10.1080/03075079.2019.1586330

Pontigo, J., and Lancaster, F. W. (1986). Qualitative aspects of the Bradford distribution. Scientometrics 9, 59–70. doi: 10.1007/BF02016608

Popay, J., Roberts, H., Sowden, A., Petticrew, M., Arai, L., Rodgers, M., et al. (2006). Guidance on the conduct of narrative synthesis in systematic reviews: a product from the ESRC Methods Programme . Institute for Health Research, University of Lancaster.

Porter, A. L., Kongthon, A., and Lu, J. C. (2002). Research profiling: improving the literature review. Scientometrics 53, 351–370. doi: 10.1023/A:1014873029258

Price, D. D. S. (1976). A general theory of bibliometric and other cumulative advantage processes. J. Am. Soc. Inf. Sci. 27, 292–306. doi: 10.1002/asi.4630270505

Puig, B., Blanco-Anaya, P., Bargiela, I., and Crujeiras Pérez, B. (2019). A systematic review on critical thinking intervention studies in higher education across professional fields. Stud. High. Educ. 44, 860–869. doi: 10.1080/03075079.2019.1586333

Ramanathan, V., and Kaplan, R. B. (1996). Audience and voice in current L1 composition texts: some implications for ESL student writers. J. Second. Lang. Writ. 5, 21–34. doi: 10.1016/S1060-3743(96)90013-2

Sabzalieva, E., Chacon, E., Bosen, L. L., Morales, D., Mutize, T., Nguyen, H., et al. (2021). Thinking higher and beyond perspectives on the futures of higher education to 2050. UNESCO IESALC. ISBN: 978-980-7175-57-9. Available at: https://www.voced.edu.au/content/ngv:90677

Saiz, C., and Rivas, S. (2008). Assessment in critical thinking: a proposal for differentiating ways of thinking. Ergo Nueva Época 22, 25–66.

Salas, M., Díaz, A., and Medina, L. (2021). Mentorías en Chile: De la política diseñada a la puesta en acto [Mentoring in Chile: from policy design to implementation.]. Rev. Mex. Investig. Educ. 26, 449–474.

Scott, A., Woolcott, G., Keast, R., and Chamberlain, D. (2018). Sustainability of collaborative networks in higher education research projects: why complexity? Why now? Public Manag. Rev. 20, 1068–1087. doi: 10.1080/14719037.2017.1364410

Shamseer, L., Moher, D., Clarke, M., Ghersi, D., Liberati, A., Petticrew, M., et al. (2015). Preferred reporting items for systematic review and meta-analysis protocols (PRISMA-P) 2015: elaboration and explanation. BMJ 349:g7647. doi: 10.1136/bmj.g7647

Siegel, H. (1988). Educating Reason: Rationality, Critical Thinking, and Education . Routledge: New York, United States.

Swokowski, E.W. (1988). Calculus with Analytic Geometry , 4th; Grupo Editorial Planeta: Mexico City, Mexico.

Tuononen, T., Hyytinen, H., Kleemola, K., Hailikari, T., Männikkö, I., and Toom, A. (2022). Systematic review of learning generic skills in higher education—enhancing and impeding factors. Front. Educ. 7:885917. doi: 10.3389/feduc.2022.885917

Van Damme, D., and Zahner, D. (2022). Does Higher Education Teach Students to Think Critically? OECD Publishing: Paris, France

Wechsler, S. M., Saiz, C., Rivas, S. F., Vendramini, C. M. M., Almeida, L. S., Mundim, M. C., et al. (2018). Creative and critical thinking: independent or overlapping components? Think. Skills Creat. 27, 114–122. doi: 10.1016/j.tsc.2017.12.003

Zahavi, H., and Friedman, Y. (2019). The Bologna process: an international higher education regime. Eur. J. High. Educ. 9, 23–39. doi: 10.1080/21568235.2018.1561314

Keywords: critical thinking, higher education, teaching strategies, skills, dispositions

Citation: Andreucci-Annunziata P, Riedemann A, Cortés S, Mellado A, del Río MT and Vega-Muñoz A (2023) Conceptualizations and instructional strategies on critical thinking in higher education: A systematic review of systematic reviews. Front. Educ . 8:1141686. doi: 10.3389/feduc.2023.1141686

Received: 10 January 2023; Accepted: 20 February 2023; Published: 09 March 2023.




An Overview of How to Design Instruction Using Critical Thinking Concepts



Instructional design involves two deeply interrelated parts: structures and tactics. In this article we focus on structures. Structures involve the "what" of the course: What am I going to teach? What content am I going to teach? What questions or problems will be central to the course? What concepts will be fundamental? How much information will students need to access? What point of view or frame of reference do they need to learn to reason within? What is my concept of the course? What overall plan shall I adopt? What requirements shall I set up? What grading requirements? What performance profiles?



We suggest that for every course you teach, there are five defining dimensions you should carefully think through. Note that each of these "structures" has a "tactical" dimension: something of the "how" (how you will teach) is implicit in each decision about the "what" (what you will cover). They are:



  • Open access
  • Published: 05 September 2024

Exploring instructional design in K-12 STEM education: a systematic literature review

  • Suarman Halawa,
  • Tzu-Chiang Lin &
  • Ying-Shao Hsu (ORCID: orcid.org/0000-0002-1635-8213)

International Journal of STEM Education, volume 11, Article number: 43 (2024)


This study aimed to analyze articles published in the Web of Science database from 2012 to 2021 to examine the educational goals and instructional designs for STEM education. We selected articles based on the following criteria: (a) empirical research; (b) incorporating instructional design and strategies into STEM teaching; (c) including intervention; (d) focusing on K-12 education and on assessment of learning outcomes; and (e) excluding higher education and STEAM education. Based on the criteria, 229 articles were selected for coding educational goals and instructional designs for STEM education. The aspects of STEM educational goals were coded, including engagement and career choice, STEM literacy, and twenty-first century competencies. The categories of instructional designs for STEM education were examined, including design-based learning, inquiry-based learning, project-based learning, and problem-based learning. The results showed that engagement and career choice and STEM literacy were mainly emphasized in STEM education. Design-based learning was adopted more than inquiry-based, project-based, or problem-based learning, and this instructional design was mainly used to achieve STEM literacy. It is suggested that twenty-first century competencies may require more attention in future STEM education research.

Introduction

Emphasizing STEM (science, technology, engineering, and mathematics) has been a main focus of policy makers in many countries (English, 2016; National Academy of Engineering & National Research Council, 2014; National Research Council, 2012, 2013) seeking to meet economic challenges (Kelley & Knowles, 2016). Educational systems are accordingly prioritizing STEM to prepare students for a workplace shaped by sophisticated technologies and a competitive economy (Kayan-Fadlelmula et al., 2022). Hence, students are expected to be interested in STEM so that they will engage in and pursue careers in STEM-related fields (Lie et al., 2019; Struyf et al., 2019). Moreover, we need a new generation with the abilities to develop proficient knowledge, to apply such knowledge to solve problems, and to face existing and upcoming issues of the twenty-first century (Bybee, 2010).

Although STEM education has been shown to benefit students, there is a lack of understanding of instructional design for STEM education, even though such understanding is critical to research and to classroom practice. Limited understanding of relevant instructional design may lead to problems in implementing STEM education in the classroom. There is hence a need to examine the educational goals, specific designs, and features of the instructional designs consistently and specifically documented in the STEM education literature. Therefore, this study conducted a systematic analysis of the literature to understand the educational goals and instructional designs for STEM education. Based on the analysis, we present a thorough picture of how researchers have developed instructional designs for STEM education.

Despite the fact that many researchers have promoted STEM education, the definition of STEM education has not reached a consensus in the literature, and there is a certain degree of disagreement in the scientific community. Lamb et al. (2015) defined STEM as a broad area encompassing many disciplines and epistemological practices. Other researchers, such as Breiner et al. (2012), defined STEM as applying transdisciplinary knowledge and skills in solving real-world problems. A similar definition established by Shaughnessy (2013) regarding STEM education is problem solving based on science and mathematics concepts that incorporate engineering strategies and technology. Another study defined STEM education as teaching approaches based on technology and engineering design that integrate the concepts and practices of science and mathematics (Sanders & Wells, 2006). In this study, we define STEM education as an approach that utilizes integrations of knowledge and skills from science, technology, engineering, and/or mathematics to solve real-world problems that help students to succeed in school learning, future careers, and/or society.

The definition of STEM as an integrated approach involving science, technology, engineering, and mathematics raises several pertinent questions about its composition and expectations. First, the requirement for all four disciplines to be present in order to qualify an educational program or project as “STEM” is debatable. Conceptually, integrating any two or more fields helps foster the interdisciplinary learning that is the hallmark of STEM education. This flexibility allows educators to tailor their programs to match the available resources and specific learning outcomes without necessarily incorporating all four disciplines in every instance. Regarding the classification of “science” within STEM, it is more a conglomerate of disciplines—such as biology, chemistry, physics, and earth sciences—than a single field. This diversity within science enriches STEM education, providing a broader knowledge base and problem-solving skills. Each scientific discipline brings a unique perspective and set of tools to the interdisciplinary mix, enhancing the complexity and richness of STEM learning experiences.

Furthermore, previous studies have identified several challenges to the implementation of STEM education in the classroom, including poor motivation of students, weak connection with individual learners, little support from the school system, poor content without integration across disciplines, lack of quality assessments, poor facilities, and lack of hands-on experience (Ejiwale, 2013; Hsu & Fang, 2019; Margot & Kettler, 2019). To help teachers face challenges in the advancement of STEM education, Hsu and Fang (2019) proposed a 5-step STEM curriculum design framework and provided examples of how to apply it to a lesson plan to help teachers design their instruction. This previous study also suggested that researchers conduct more investigations related to instructional design to enrich our understanding of various aspects of STEM education. Teachers of STEM require more opportunities to construct their perspective and a vision of STEM education as well as to conduct appropriate instructional designs. Moreover, from review articles published from 2000 to 2016, Margot and Kettler (2019) found that in multiple studies concerning similar challenges and supports, teachers believed that the availability of a quality curriculum would enhance the success of STEM education. Teachers need to provide and use an appropriate instructional design for STEM education and understand the educational goals. Therefore, we see the need to conduct research related to STEM education, especially on instructional design, because identifying and using a quality instructional design could increase the effectiveness of STEM education.

According to the previous literature review, educational goals for instructional design were highlighted in STEM education. First, engagement and career choice need to be emphasized in STEM learning to improve students’ interest and self-efficacy (Vongkulluksn et al., 2018 ). Students need to engage in STEM education to raise their interest and engagement in STEM and to increase and develop a STEM-capable workforce (Honey et al., 2014 ; Hsu & Fang, 2019 ; Schütte & Köller, 2015 ). Engaging students in STEM education could improve their attitudes (Vossen et al., 2018 ) and their interest in STEM fields, and encourage them to pursue STEM careers (Means et al., 2017 ).

Second, STEM literacy needs to be promoted in K-12 schools (Falloon et al., 2020 ; Jackson et al., 2021 ) to develop students’ ability to encounter global challenges (Bybee, 2010 ). Students need to have the ability to apply concepts from science, technology, engineering, and mathematics, and skills to solve problems related to social, personal, and global issues in society (Bybee, 2010 ; Jackson et al., 2021 ). Besides, improving students’ STEM literacy is needed for their decision-making, participation in civic and cultural affairs, and economic productivity (National Academy of Engineering & National Research Council, 2014 ; National Research Council, 2011 ).

Last, regarding the twenty-first century competencies, students are anticipated to have abilities of creativity and innovation, problem solving, critical thinking, collaboration and communication (Boon, 2019 ) as citizens, workers, and leaders in the twenty-first century (Bryan et al., 2015 ; National Academy of Engineering & National Research Council, 2014 ; Stehle & Peters-Burton, 2019 ). These abilities are critical for students to adapt and thrive in a changing world (National Research Council, 2013 ). Also, students need to have the abilities to adapt to the twenty-first century in order to succeed in the new workforce (Bybee, 2013 ).

Considering the achievement of students' engagement, motivation, STEM literacy, as well as twenty-first century competencies, many countries have significantly increased funding for research and education relevant to STEM (Sanders, 2009). One of the strands of the existing research is to help teachers know how to implement STEM education in schools (Aranda, 2020; Barak & Assal, 2018; English, 2017). Researchers have proposed instructional designs for STEM education including design-based learning (Kelley & Knowles, 2016; Yata et al., 2020), inquiry-based learning (Bybee, 2010), project-based learning (Capraro et al., 2013), and problem-based learning (Capraro & Slough, 2013).

Design-based learning focuses on technological and engineering design. This instructional design engages students in learning about engineering design practices (Fan et al., 2021 ; Guzey et al., 2016 ; Hernandez et al., 2014 ) through the steps of designing, building, and testing (Yata et al., 2020 ). Design-based learning promotes problem solving, design, building, testing, and communication skills (Johnson et al., 2015 ) and improves students’ interest in STEM activities (Vongkulluksn et al., 2018 ). Also, design-based learning improves students’ engineering abilities and twenty-first century competencies (Wu et al., 2019 ) and attitudes (Vossen et al., 2018 ), and engages them in understanding core disciplinary ideas (Guzey et al., 2016 ).

Inquiry-based learning focuses on engaging students in hands-on activities to investigate scientific phenomena (Lederman & Lederman, 2012 ) and to construct their new knowledge (Bybee, 2010 ; Halawa et al., 2020 ). Students are encouraged to plan and design their experiments, analyze and interpret data, argue, and communicate their findings (Halawa et al., 2023 ; National Research Council, 2012 , 2013 ). Inquiry-based learning is also deemed to improve students’ knowledge, interest, engagement (Sinatra et al., 2017 ) and creativity (Smyrnaiou et al., 2020 ). Besides, researchers have noticed the importance of inquiry-based learning for improving students’ attitudes toward science-related careers (Kim, 2016 ). Although inquiry-based learning mainly focuses on science education to engage students in authentic learning (Halawa et al., 2024 ), it has been known to share common goals and characteristics with mathematics, technology, and engineering (Grangeat et al., 2021 ; Lin et al., 2020 ). Common elements in STEM education are engaging students in asking questions and testing their ideas in a systematic and interactive way (Grangeat et al., 2021 ).

Project-based learning and problem-based learning, both instructional designs, engage students in experiential and authentic learning with open-ended and real-world problems (English, 2017). Yet, project-based learning tends to be of longer duration, occurring over an extended period of time (Wilson, 2021), while problem-based learning is usually embedded in multiple problems (Capraro & Slough, 2013). STEM project-based learning focuses on engaging students in an ill-defined task within a well-defined outcome situated in a contextually rich task, requiring them to solve certain problems (Capraro et al., 2013). Project-based learning and problem-based learning are both used to develop students' problem-solving, creativity, and collaboration skills (Barak & Assal, 2018), and attitudes (Preininger, 2017).

According to previous studies, researchers have adopted STEM instructional designs to achieve certain educational goals. For instance, in the aspects of engagement and career choice, Sullivan and Bers (2019) used design-based learning to improve students' interest in engineering and students' performance in elementary school. Kang et al. (2021) adopted inquiry-based learning for secondary school by embedding careers education to foster the students' interest in science. Vallera and Bodzin (2020) adopted project-based learning at a primary school in the northeastern United States to improve students' STEM literacy and attitudes. Preininger (2017) used problem-based learning to influence students' attitudes toward mathematics and careers involving mathematics. In the aspect of STEM literacy, King and English (2016) adopted design-based learning to enable students to apply STEM concepts to the model of the construction of an optical instrument. Han et al. (2015) adopted STEM project-based learning to improve the performance of low-performing students in mathematics. Lastly, regarding the twenty-first century competencies, English et al. (2017) adopted design-based learning to improve students' capabilities in handling the complexity of the task.

In conclusion, studies have grown to explore educational goals related to instructional designs for STEM education. However, consistent and systematic reviews related to instructional designs in K-12 STEM education are comparatively scarce. Although there are some reviews of the STEM education literature (Andrews et al., 2022; Gladstone & Cimpian, 2021; Kayan-Fadlelmula et al., 2022; López et al., 2022; Margot & Kettler, 2019; Martín-Páez et al., 2019; Nguyen et al., 2021), it is noteworthy that previous studies only explored undergraduate instruction in STEM education (Andrews et al., 2022; Henderson et al., 2011; Nguyen et al., 2021). Therefore, to fill the research gap, this current study conducted a systematic analysis of literature to understand the educational goals and instructional designs for K-12 STEM education from articles published between 2012 and 2021. The research questions of this study were formulated as follows:

What STEM education goals were more focused on in the reviewed articles? What was the trend of educational goals in the reviewed articles?

What instructional designs were more focused on in the reviewed articles? What was the trend of instructional designs in the reviewed articles?

What instructional designs were more focused on to achieve certain educational goals in the reviewed articles?

What features of instructional designs were more focused on in the reviewed articles?

Data collection

To identify the target literature for further analysis, this study conducted several rounds of searching the Web of Science (WOS) database for articles (Gough et al., 2012; Møller & Myles, 2016). A systematic literature review using the PRISMA guidelines was used for article selection (Møller & Myles, 2016). First, we searched for articles using the keyword "STEM Education" along with "learning", "teaching", "curriculum", and "professional development" to refine the search results. The search identified a total of 1,531 articles published in the Web of Science from 2012 to 2021 (Fig. 1). We initially excluded duplicated articles, leaving a total of 1,513 articles. We then screened the titles, abstracts, and keywords of the articles based on the following criteria: (a) empirical research; (b) incorporating instructional design and strategies into STEM teaching; (c) including intervention; (d) focusing on K-12 education and on assessment of learning outcomes; and (e) excluding higher education and STEAM education. During this screening, we discussed which articles met the criteria through round-table discussions and determined the preliminary target candidates, composed of 394 articles. A full-text examination was then conducted; in this round, we removed the articles without clear information about the educational goals and instructional designs related to STEM education. Finally, a corpus of literature comprising 229 articles was formed for further analysis.
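The screening funnel described above (1,531 retrieved, 1,513 after deduplication, 394 after title/abstract screening, 229 after full-text review) amounts to a simple filtering pipeline. The sketch below illustrates the idea; the record fields and criteria flags are hypothetical assumptions for illustration, not the authors' actual coding sheet.

```python
# Illustrative PRISMA-style screening pipeline (hypothetical record fields).

def deduplicate(records):
    """Drop duplicate records, keyed here by DOI (assumed unique per article)."""
    seen, unique = set(), []
    for rec in records:
        if rec["doi"] not in seen:
            seen.add(rec["doi"])
            unique.append(rec)
    return unique

def passes_screening(rec):
    """Apply the review's inclusion/exclusion criteria to one record."""
    return (rec["empirical"]              # (a) empirical research
            and rec["has_intervention"]   # (c) includes an intervention
            and rec["k12"]                # (d) K-12 with assessed outcomes
            and not rec["steam"])         # (e) exclude STEAM (and higher ed)

records = [
    {"doi": "10.1/a", "empirical": True, "has_intervention": True,  "k12": True,  "steam": False},
    {"doi": "10.1/a", "empirical": True, "has_intervention": True,  "k12": True,  "steam": False},  # duplicate
    {"doi": "10.1/b", "empirical": True, "has_intervention": False, "k12": True,  "steam": False},
    {"doi": "10.1/c", "empirical": True, "has_intervention": True,  "k12": False, "steam": False},
]

unique = deduplicate(records)
included = [r for r in unique if passes_screening(r)]
print(len(unique), len(included))  # 3 unique records, 1 included
```

In practice the title/abstract screen and the full-text screen would use separate predicates, since the full-text pass checks information (clear educational goals and instructional design) not visible at the abstract stage.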

Fig. 1: PRISMA flow diagram of article selection

Data analysis

According to the research questions, for this study we developed a coding framework to conduct content analysis and to categorize the target literature. We first selected paradigmatic references on STEM education and instructional design from high-quality publications. These articles provided sets of core concepts and terms to shape the provisional coding categories. We then repeatedly reviewed the paradigmatic references and discussed them to improve the coding scheme. The final analytic framework with coding categories was developed as follows. The first category, STEM educational goals, includes engagement and career choice (Honey et al., 2014; Hsu & Fang, 2019), STEM literacy (Falloon et al., 2020; Jackson et al., 2021), and twenty-first century competencies (Boon, 2019) (see Appendix 1). The second category, instructional design, includes design-based learning (Yata et al., 2020), inquiry-based learning (Bybee, 2010; Halawa et al., 2020), project-based learning (Capraro & Slough, 2013), and problem-based learning (Priemer et al., 2020). From the reviewed articles, we found that 6E-oriented STEM (engage, explore, explain, engineer, enrich, and evaluate) and game-based learning were also used for STEM education, so these two instructional designs were added to our coding scheme. Articles that did not specify the instructional design were coded as "others". We then analyzed the outcomes to see whether an instructional design successfully improved STEM educational goals. We analyzed design-based, inquiry-based, and project-based learning with respect to achieving engagement and career choice, STEM literacy, and a combination of the two, because the selected articles mainly concentrated on them. We categorized the outcomes as positively improved, partially improved, and none (Amador et al., 2021). An instructional design that successfully increased the targeted STEM educational goals was categorized as positively improved; one that increased only part of them was categorized as partially improved; and if the instructional design did not increase the targeted goals, we categorized it as none.
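The three-way outcome coding just described (positively improved / partially improved / none) reduces to checking how many of a study's targeted goals showed gains. A minimal sketch, with hypothetical per-goal results:

```python
def code_outcome(goal_results):
    """Categorize a study's outcome from per-goal booleans
    (True = the targeted STEM educational goal improved)."""
    improved = sum(goal_results.values())
    if improved == len(goal_results):
        return "positively improved"
    if improved > 0:
        return "partially improved"
    return "none"

print(code_outcome({"engagement": True, "STEM literacy": True}))   # positively improved
print(code_outcome({"engagement": True, "STEM literacy": False}))  # partially improved
print(code_outcome({"engagement": False}))                         # none
```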

We then extended our coding scheme to identify the features of design-based, inquiry-based, and project-based learning. We focused on these three instructional designs because the selected articles mainly adopted them. Yata et al. (2020) proposed designing, building, and testing as the features of design-based learning. Other features, including questioning or identifying problems, experimenting, analyzing, explaining, collaborating, communicating, and reflecting, have been proposed as features of inquiry-based learning (Bybee, 2010; Halawa et al., 2020) and project-based learning (Capraro et al., 2013). From the reviewed articles, we found that redesigning was also a feature of instructional design and so added it to the coding scheme. These features of instructional designs were adopted for our coding scheme: questioning or identifying problems, designing, building, testing, experimenting, analyzing, collaborating, reflecting, communicating, and redesigning (Appendix 2). We then calculated the number of articles that adopted these features of instructional designs and summarized the features that were frequently used in the selected articles.
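Counting how many articles exhibit each instructional-design feature is a straightforward tally over per-article feature lists. The coded articles below are invented for illustration:

```python
from collections import Counter

# Hypothetical coding results: each article mapped to the features it exhibits.
coded_articles = {
    "A1": ["designing", "building", "testing", "redesigning"],
    "A2": ["questioning or identifying problems", "experimenting", "analyzing"],
    "A3": ["designing", "building", "testing", "communicating"],
}

# Tally how many articles use each feature.
feature_counts = Counter(f for feats in coded_articles.values() for f in feats)
for feature, n in feature_counts.most_common():
    print(f"{feature}: {n} article(s)")
```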

To ensure that the coding process was reliable, we conducted a trial coding by randomly selecting 40 articles and individually categorizing them into the aforementioned categories: (a) STEM educational goal and (b) instructional design. Interrater reliability, calculated as percent agreement, reached an acceptable level of 0.85 (McHugh, 2012). Discrepancies between the authors were negotiated and resolved through discussion. The NVivo 11 software was used to code the remaining articles. We then calculated and reported descriptive statistics of the coded data as the analytic results.
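Percent agreement is simply the fraction of items on which two coders assign the same category. A minimal sketch (assuming each coder's labels are stored in a list, one entry per article):

```python
def percent_agreement(coder_a: list, coder_b: list) -> float:
    """Interrater reliability as simple percent agreement:
    the proportion of items both coders labeled identically."""
    if len(coder_a) != len(coder_b) or not coder_a:
        raise ValueError("coders must rate the same non-empty set of items")
    matches = sum(a == b for a, b in zip(coder_a, coder_b))
    return matches / len(coder_a)
```

With 40 trial articles, agreement on 34 of them yields 34/40 = 0.85, the level reported above.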

Engagement and career choice as the main focused STEM educational goals

Table 1 shows that more articles focused on engagement and career choice (64 articles) and STEM literacy (61 articles) than on twenty-first century competencies (16 articles). Many articles also focused on a combination of engagement and career choice and STEM literacy (47 articles) or a combination of engagement and career choice and twenty-first century competencies (18 articles). Nine articles addressed all three learning goals: engagement and career choice, STEM literacy, and twenty-first century competencies.

Table 1 also shows the number of articles addressing each STEM educational goal in each 2-year period. The number of articles per period increased from 2012 to 2021. The trend analysis indicated that engagement and career choice and STEM literacy increased greatly from 2014 to 2021. The numbers of articles focused on the combination of two educational goals (STEM literacy and twenty-first century competencies) and on all three learning goals (engagement and career choice, STEM literacy, and twenty-first century competencies) from 2016 to 2021 are also presented.
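The 2-year tabulation underlying such a trend analysis can be sketched as follows. This is a hypothetical illustration of the counting step only; the record layout (year, goal) is an assumption, not the study's actual data structure.

```python
from collections import Counter

def count_per_period(records, start=2012, width=2):
    """Tabulate article counts per 2-year window, keyed by (period, goal).

    records: iterable of (publication_year, goal_label) pairs.
    """
    counts = Counter()
    for year, goal in records:
        lo = start + ((year - start) // width) * width  # floor to window start
        counts[(f"{lo}-{lo + width - 1}", goal)] += 1
    return counts
```

Summing the resulting counts per period then gives the overall growth from 2012 to 2021 described above.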

Design-based and inquiry-based learning as the main instructional designs for STEM

Table 2 reveals the numbers of articles that used each instructional design for STEM education. Design-based, inquiry-based, project-based, and problem-based learning were the main instructional designs and continued to be used throughout the study period. The trend analysis indicated a sharp increase in design-based, inquiry-based, and project-based learning from 2018 to 2021.

Table 2 also shows the instructional designs and educational goals for STEM from the reviewed papers. Most articles adopted design-based (80 articles), inquiry-based (46 articles), project-based (42 articles), or problem-based (27 articles) learning.

Design-based learning mainly used to achieve STEM literacy

The findings in Table 3 indicate that STEM instructional designs were used differently to achieve engagement and career choice, STEM literacy, and the combination of engagement and career choice and STEM literacy. We found that design-based learning was mainly adopted to achieve STEM literacy (28 articles), while inquiry-based learning was mainly used to achieve engagement and career choice (14 articles) and the combination of engagement and career choice and STEM literacy (14 articles). Also, more articles (15 articles) adopted project-based learning to achieve engagement and career choice. Furthermore, more articles adopted design-based learning (7 articles) and problem-based learning (4 articles) than inquiry-based learning (2 articles) and project-based learning (1 article) to achieve twenty-first century competencies.

Because a major portion of the articles adopting design-based, inquiry-based, and project-based learning focused on engagement and career choice, STEM literacy, or a combination of the two (see Table 3), we further analyzed the outcomes of STEM educational goals in those articles. The total number of selected articles was 124, of which 54 adopted design-based learning, 37 inquiry-based learning, and 33 project-based learning (Table 4).

We categorized the outcomes of STEM educational goals into three categories (positively improved, partially improved, and none) (Amador et al., 2021). Table 4 shows that in the majority of selected articles, design-based, inquiry-based, and project-based learning positively improved STEM educational goals. Most selected articles found that design-based learning positively improved engagement and career choice (10 articles), STEM literacy (26 articles), and a combination of engagement and career choice and STEM literacy (15 articles). Likewise, most selected articles indicated that inquiry-based learning had a positive impact on engagement and career choice (14 articles), STEM literacy (7 articles), and a combination of engagement and career choice and STEM literacy (13 articles). Project-based learning also demonstrated a beneficial impact across the selected literature: 12 articles documented enhanced engagement and career choice, nine the advancement of STEM literacy, and six a combined effect on engagement and career choice and STEM literacy.

Frequently used features of STEM instructional designs

To identify the frequently used features of STEM instructional design, we further explored the activities in the selected articles. As the previous results show that most articles adopted design-based, inquiry-based, or project-based learning, we further analyzed the frequently used features of these instructional designs in articles focused on engagement and career choice, STEM literacy, or a combination of the two (see Table 3). We selected 54 articles that adopted design-based learning, 37 that adopted inquiry-based learning, and 33 that adopted project-based learning (Table 5).

Frequently used features of design-based learning

Based on the findings, a large portion of the selected articles (54 articles) adopted design-based learning for STEM education. Table 5 shows the features adopted to implement design-based learning. More than half of the selected articles adopted designing, building, testing, collaborating, experimenting, and reflecting. Building (88.9%), designing (87.0%), and testing (70.4%) were used to engage students in engineering (Yata et al., 2020). Moreover, engaging students in these activities required them to use their knowledge and skills (Kelley & Knowles, 2016). For example, Aranda et al. (2020) and Lie et al. (2019) implemented design-based learning by asking students to design a process to both prevent and test for cross-pollination of non-GMO from GMO fields. In these selected articles, the curricula focused on helping students with designing, building, and testing.

Collaborating, which engages students in working with their classmates during design-based learning, was also widely emphasized in the selected articles (64.8%). For instance, English and King (2019) asked students to work in groups to discuss possible designs for a bridge. Researchers also emphasized experimenting (53.7%) to engage students in design-based learning. English (2019) engaged students in investigating their feet and shoes: students collected, represented, and analyzed data, and drew conclusions from their findings. Lie et al. (2019) helped students conduct an investigation to prevent cross-contamination of non-GMO from GMO corn fields. The last critical feature of design-based learning is reflecting (51.9%), in which students assess their solutions against a set of criteria and constraints and generate and evaluate solutions (Cunningham et al., 2019). By reflecting, students have an opportunity to improve their design and choose their best strategy (Aranda et al., 2020; Lie et al., 2019).

Frequently used features of inquiry-based learning

As shown in Table 5, the inquiry-based learning approach was frequently adopted by researchers for STEM education. The features of this approach applied to achieve specific STEM education goals (e.g., engagement and career choice, and STEM literacy) included experimenting (91.9%), collaborating (83.8%), reflecting (62.2%), and communicating (51.4%) (see Table 5). This finding indicated that the top three frequently used features of inquiry-based learning in STEM were experimenting, collaborating, and reflecting, which play an essential role when learners try out their ideas about a real-world problem related to STEM. For example, a four-phase inquiry (clarifying the situation, hands-on experiments, representing and analyzing the produced data, and reporting/whole-class discussions) for authentic modeling tasks guided students to develop their sense of the tasks' credibility and to acquire STEM knowledge (Carreira & Baioa, 2018).

Frequently used features of project-based learning

As previously mentioned, project-based learning is one of the major approaches supporting instructional design in the reviewed STEM education studies. The results in Table 5 further indicate the features that researchers tended to integrate into instructional design for project-based learning. More than half (51.5%) of the selected articles reported "reflecting" as a pivotal part of teaching that triggered students' project-based learning. Reflecting captures learners' active perception of and deliberation on what they encounter and what they are doing, which may contribute to their competence in retrieving appropriate information, providing feedback, and revising the project underlying their learning. For example, in Dasgupta et al.'s (2019) study, a design journal was used to support students' reflection on what they knew, what they needed to know, and their learning outcomes. Vallera and Bodzin (2020) also addressed the critical design features of their curriculum, which helped students obtain, evaluate, and communicate information in a learning project based on real-world contexts.

In addition, researchers studying project-based learning in STEM tended to foster students' learning via "identifying problems" (48.5%). These studies can be divided into two types according to whether the researchers provided a driving question for the learning project. In Vallera and Bodzin's (2020) study, the instructional design provided a clear-cut driving question to guide students' thinking about helping farmers prepare products for sale in a farmers' market. This led students to extend their thinking and identify further problems while solving the driving question. In Barak and Assal's (2018) study, by contrast, the instructional design provided open-ended tasks and ill-defined problems, arrangements deemed to support students' learning through problem defining and learning-objective setting.

It is also noteworthy that the percentages of "experimenting" and "collaborating" in studies using project-based learning were lower than in studies using design-based or inquiry-based learning. Nevertheless, researchers interested in STEM project-based learning would still, to some extent, endorse instructional designs that provide students with opportunities to engage in authentic scientific activities and social communication.

Discussion

This study analyzed the STEM educational goals and instructional designs adopted in articles published from 2012 to 2021. The findings present knowledge and understanding of the educational goals that need to be considered in STEM education and of how these goals could be achieved by adopting various STEM instructional designs.

Educational goals for STEM education

The majority of reviewed articles adopted instructional designs to achieve the goals of engagement, career choice, and STEM literacy, whereas few articles focused on twenty-first century competencies. This is unsurprising, because many recent studies in the STEM education field have emphasized economic viewpoints and workplace-readiness outcomes (Cheng et al., 2021; Kelley & Knowles, 2016). The aspects of engagement and career choice were frequently considered in many previous studies on STEM education (Struyf et al., 2019; Vongkulluksn et al., 2018; Vossen et al., 2018), indicating that engagement and career choice are important goals for STEM education (Honey et al., 2014; Hsu & Fang, 2019; Kelley & Knowles, 2016). Engaging and motivating students in STEM education is necessary to enhance their understanding of their future careers (Fleer, 2021) and to encourage them to continue STEM learning (Maltese et al., 2014); students who are motivated and interested in STEM education tend to pursue STEM careers (Maltese & Tai, 2011). The aspects of STEM literacy are also addressed in the reviewed articles. STEM literacy (e.g., knowledge and capabilities) is deemed important for students' productive engagement with STEM studies, issues, and practices (Falloon et al., 2020), and its focus encourages students to apply their knowledge to life situations and solve problems (Bybee, 2010). The importance of STEM literacy has been highlighted in several national documents (e.g., Committee on STEM Education of the National Science & Technology Council, 2018; National Research Council, 2011; U.S. Department of Education, 2016). These findings provide insights into the teaching goals that have been emphasized in STEM education. For instance, engagement and career choice have been a main focus because STEM teaching was designed to connect to students' real-world experiences or future professional situations (Strobel et al., 2013). Such authentic and meaningful experiences can engage and motivate students in the activity, and students may later pursue careers related to what they have learned.

However, few selected articles focused on twenty-first century competencies, although many previous studies have considered them important goals for students. Some studies have advocated engaging students with interdisciplinary sets of complex problems and encouraging them to use critical thinking and to develop creativity, innovation, and collaboration (Finegold & Notabartolo, 2010; Jang, 2016). Engaging students in STEM education focused on twenty-first century competencies could prepare them for the workplace and help them become successful in STEM-related fields (Jang, 2016). Future researchers should consider integrating twenty-first century competencies into STEM education to complement the existing focus on engagement, career choice, and STEM literacy, preparing students with the broader range of skills necessary for the modern workforce.

Instructional design for STEM education

Although the reviewed articles adopted various instructional designs for STEM education, they mostly adopted design-based rather than inquiry-based, project-based, or problem-based learning. The findings are in accordance with the existing literature on STEM education; notably, they corroborate the conclusions of a comprehensive systematic review by Mclure et al. (2022). Design-based learning was adopted to achieve the goals of STEM literacy and of engagement and career choice, and the trend analysis showed that this instructional design tended to be used increasingly often, indicating that it is considered a main instructional design for STEM education. It has become an essential approach to engaging K-12 students in STEM education (Bybee, 2013; National Academy of Engineering & National Research Council, 2014; National Research Council, 2013). Some researchers claim that students who participate in design-based learning can make meaningful connections between knowledge and skills by solving problems (English & King, 2019; Kelley et al., 2010). Design-based learning engages students in authentic problems and challenges, which increases their level of engagement (Sadler et al., 2000), helps them learn fundamental scientific principles (Mehalik et al., 2008), and builds on their natural and intuitive experience (Fortus et al., 2004). In the process of design, students learn the concepts of science, technology, and mathematics while designing, building, or testing products (Yata et al., 2020). For instance, students have to learn the concept of energy to design a house that produces more renewable energy than it consumes over a period of one year (Zheng et al., 2020). We also found that the majority of selected articles adopting design-based learning successfully improved learners' engagement, career choice, and STEM literacy (Table 4).

These results align with the findings of a previous meta-analysis of STEM education at the middle school level (Thomas & Larwin, 2023). The selected articles reported that design-based learning conducted in K-12 settings successfully improved students' STEM learning. For example, Cunningham et al. (2019) successfully implemented design-based learning to improve elementary students' learning outcomes, while Fan et al. (2018) found that design-based learning positively improved secondary students' conceptual knowledge and attitudes.

However, the selected articles did not use the features of design-based learning, such as collaborating, reflecting, and redesigning, equally. We identified that the selected articles mainly used designing, building, and testing to engage students in engineering activities. One explanation for this finding is that researchers may face challenges in implementing a full cycle of design-based learning because of limited instructional time, so they focused only on designing, building, and testing. Collaborating, reflecting, and redesigning should also be emphasized in effective design-based learning, because students can solve complex problems by collaborating with others. Through collaboration, students can learn and solve problems through discussion within the group, sharing new ideas and debating with others to generate solutions. Reflecting on data and experience allows students to improve their model and leads them to redesign it to produce a better one, a process that can also grow students' science knowledge (Fortus et al., 2004). We therefore suggest that future studies and educators place greater emphasis on collaborating, reflecting, and redesigning in design-based learning for STEM instruction.

Moreover, inquiry-based, project-based, and problem-based learning were adopted in some selected articles. Inquiry-based learning is considered to enable and promote connections within and across curriculum disciplines and to improve students' engagement in STEM education (Attard et al., 2021). Project-based and problem-based learning can be used to engage students in authentic problems (Blumenfeld et al., 1991) and to improve their engagement in STEM education (Beckett et al., 2016). Furthermore, we identified that inquiry-based learning mainly engages students in experimenting, collaborating, and reflecting (Kim, 2016), while project-based learning mainly engages students in identifying problems and reflecting (Han et al., 2015). These findings reveal the frequently used features of inquiry-based and project-based learning, and teachers could draw on these components when preparing STEM instruction. Given these findings, it is advisable to explore integrating inquiry-based, project-based, and problem-based learning alongside design-based learning in STEM education. Such an approach may enhance the effectiveness of STEM education by providing a more comprehensive strategy to improve STEM literacy, engagement, and career choice among K-12 students.

However, we identified that some essential elements of these instructional designs were not included in the selected articles. For instance, studies adopting inquiry-based learning rarely asked students to propose their own questions, although questioning is one of the essential features of inquiry (National Research Council, 2012, 2013). One possible explanation is that students may lack experience with inquiry learning, not know how to formulate meaningful questions, and tend to propose low-level factual questions related to their personal interests (Krajcik et al., 1998). Moreover, STEM education requires students to engage with complex real-world problems, which demands sufficient ability to propose meaningful questions. Nevertheless, we expect future studies and teachers to encourage students to propose their own questions, because questioning improves students' creativity, critical thinking, and problem-solving skills (Hofstein et al., 2005). Teachers could begin asking students to propose their own questions once students have the experience and ability to propose good ones. Krajcik et al. (1998) suggested providing situations in which students receive informative and critical feedback from teachers, classmates, and others so that they can propose their own significant questions.

Conclusions

From an instructional design perspective, this study provides crucial insights into practical STEM education approaches. The findings underscore the importance of aligning instructional designs with specific STEM educational goals. The trend analysis revealed a significant increase in focus on engagement, career choice, and STEM literacy from 2014 to 2021, with a particularly sharp rise between 2018 and 2021. Each instructional design demonstrated unique strengths: design-based learning fostered STEM literacy, while inquiry-based and project-based learning effectively enhanced engagement and career choice. The study also delineates specific features of these instructional designs that contribute to their success, such as building and testing in design-based learning, experimenting and collaborating in inquiry-based learning, and reflecting and problem identification in project-based learning.

Furthermore, this study advocates for a deliberate and systematic application of inquiry-based and project-based learning alongside design-based learning. Such integration is likely to cultivate a more dynamic and interactive learning environment that encourages critical thinking, problem-solving, and collaborative skills among students. The integration of twenty-first century competencies in the instructional design of STEM, though less frequently represented, suggests a potential research space for further exploration of STEM teaching. This study recommends an expanded focus on incorporating these competencies to ensure a holistic educational approach that addresses immediate educational goals and equips students with essential skills for future challenges.

Teachers’ limited understanding of STEM instructional design also presents a significant challenge, necessitating targeted professional development initiatives. Educators must comprehend and implement a comprehensive approach that aligns educational goals with appropriate instructional designs to optimize STEM learning outcomes. This approach involves clearly defining learning objectives, such as STEM literacy, selecting suitable instructional designs, and effectively guiding students through the chosen learning process.

The findings in this study furnish instructional designers and educators with a clear framework for developing targeted STEM curricula. The research accentuates the importance of aligning instructional design features with specific educational goals, suggesting that a nuanced, goal-oriented approach to STEM instruction can significantly enhance student outcomes in literacy, engagement, and career readiness. These insights offer a robust foundation for refining and optimizing instructional design strategies in STEM education.

Availability of data and materials

Not applicable.

References

Amador, J. M., Bragelman, J., & Superfine, A. C. (2021). Prospective teachers’ noticing: A literature review of methodological approaches to support and analyze noticing. Teaching and Teacher Education . https://doi.org/10.1016/j.tate.2020.103256

Andrews, T. C., Speer, N. M., & Shultz, G. V. (2022). Building bridges: A review and synthesis of research on teaching knowledge for undergraduate instruction in science, engineering, and mathematics. International Journal of STEM Education, 9 (1), 66. https://doi.org/10.1186/s40594-022-00380-w

Aranda, M. L., Guzey, S. S., & Moore, T. J. (2020). Multidisciplinary discourses in an engineering design-based science curricular unit. International Journal of Technology and Design Education, 30 (3), 507–529. https://doi.org/10.1007/s10798-019-09517-5

Attard, C., Berger, N., & Mackenzie, E. (2021). The positive influence of inquiry-based learning teacher professional learning and industry partnerships on student engagement with STEM. Frontiers in Education . https://doi.org/10.3389/feduc.2021.693221

Barak, M., & Assal, M. (2018). Robotics and STEM learning: Students’ achievements in assignments according to the P3 Task Taxonomy-practice, problem solving, and projects. International Journal of Technology and Design Education, 28 (1), 121–144. https://doi.org/10.1007/s10798-016-9385-9

Beckett, G. H., Hemmings, A., Maltbie, C., Wright, K., Sherman, M., & Sersion, B. (2016). Urban high school student engagement through cincySTEM iTEST projects. Journal of Science Education and Technology, 25 (6), 995–1007. https://doi.org/10.1007/s10956-016-9640-6

Berland, L., Steingut, R., & Ko, P. (2014). High school student perceptions of the utility of the engineering design process: Creating opportunities to engage in engineering practices and apply math and science content. Journal of Science Education and Technology, 23 (6), 705–720. https://doi.org/10.1007/s10956-014-9498-4

Blumenfeld, P. C., Soloway, E., Marx, R. W., Krajcik, J. S., Guzdial, M., & Palincsar, A. (1991). Motivating project-based learning: Sustaining the doing, supporting the learning. Educational Psychologist, 26 (3–4), 369–398. https://doi.org/10.1080/00461520.1991.9653139

Boon, N. S. (2019). Exploring STEM competences for the 21st century . Current and Critical Issues in Curriculum, Learning and Assessment 30. Geneva: IBEUNESCO. Retrieved June 19, 2022, from https://unesdoc.unesco.org/ark:/48223/pf0000368485.locale=en

Breiner, J. M., Harkness, S. S., Johnson, C. C., & Koehler, C. M. (2012). What is STEM? A discussion about conceptions of STEM in education and partnerships. School Science and Mathematics, 112 (1), 3–11. https://doi.org/10.1111/j.1949-8594.2011.00109.x

Bryan, L. A., Moore, T. J., Johnson, C. C., & Roehrig, G. H. (2015). Integrated STEM education. In C. C. Johnson, E. E. Peters-Burton, & T. J. Moore (Eds.), STEM roadmap: A framework for integration (pp. 23–37). Taylor & Francis.

Bybee, R. W. (2010). Advancing STEM education: A 2020 vision. Technology and Engineering Teacher, 70 (1), 30–35.

Bybee, R. W. (2013). The case for STEM education . NSTA press.

Capraro, R. M., Capraro, M. M., & Morgan, J. (Eds.). (2013). Project-based learning: An integrated science, technology, engineering, and mathematics (STEM) approach (2nd ed.). Sense.

Capraro, R. M., & Slough, S. W. (2013). Why PBL? Why STEM? Why now? an introduction to STEM project-based learning. In R. M. Capraro, M. M. Capraro, & J. R. Morgan (Eds.), STEM Project-based learning: An integrated Science, Technology, Engineering, and Mathematics (STEM) approach (pp. 1–5). Sense Publishers.

Carreira, S., & Baioa, A. M. (2018). Mathematical modelling with hands-on experimental tasks: On the student’s sense of credibility. ZDM-Mathematics Education, 50 (1–2), 201–215. https://doi.org/10.1007/s11858-017-0905-1

Chen, C. S., & Lin, J. W. (2019). A practical action research study of the impact of maker-centered STEM-PjBL on a rural middle school in Taiwan. International Journal of Science and Mathematics Education, 17 , S85–S108. https://doi.org/10.1007/s10763-019-09961-8

Cheng, L., Antonenko, P., Ritzhaupt, A. D., & MacFadden, B. (2021). Exploring the role of 3D printing and STEM integration levels in students’ STEM career interest. British Journal of Educational Technology, 52 (3), 1262–1278. https://doi.org/10.1111/bjet.13077

Committee on STEM Education of the National Science & Technology Council. (2018). Charting a course for success: America’s strategy for STEM education . Executive Office of the President National Science and Technology Council.

Cunningham, C. M., Lachapelle, C. P., Brennan, R. T., Kelly, G. J., Tunis, C. S. A., & Gentry, C. A. (2019). The impact of engineering curriculum design principles on elementary students’ engineering and science learning. Journal of Research in Science Teaching, 57 (3), 423–453. https://doi.org/10.1002/tea.21601

Dasgupta, C., Magana, A. J., & Vieira, C. (2019). Investigating the affordances of a CAD enabled learning environment for promoting integrated STEM learning. Computers & Education, 129 , 122–142. https://doi.org/10.1016/j.compedu.2018.10.014

Ejiwale, J. (2013). Barriers to successful implementation of STEM education. Journal of Education and Learning, 7 (2), 63–74.

English, L. D. (2016). STEM education K-12: Perspectives on integration. International Journal of STEM Education, 3 (1), 3. https://doi.org/10.1186/s40594-016-0036-1

English, L. D. (2017). Advancing elementary and middle school STEM education. International Journal of Science and Mathematics Education, 15 (1), 5–24. https://doi.org/10.1007/s10763-017-9802-x

English, L. D. (2019). Learning while designing in a fourth-grade integrated STEM problem. International Journal of Technology and Design Education, 29 (5), 1011–1032. https://doi.org/10.1007/s10798-018-9482-z

English, L. D., & King, D. (2019). STEM integration in sixth grade: Designing and constructing paper bridges. International Journal of Science and Mathematics Education, 17 (5), 863–884. https://doi.org/10.1007/s10763-018-9912-0

English, L. D., King, D., & Smeed, J. (2017). Advancing integrated STEM learning through engineering design: Sixth-grade students’ design and construction of earthquake resistant buildings. Journal of Educational Research, 110 (3), 255–271. https://doi.org/10.1080/00220671.2016.1264053

Falloon, G., Hatzigianni, M., Bower, M., Forbes, A., & Stevenson, M. (2020). Understanding K-12 STEM education: A framework for developing STEM literacy. Journal of Science Education and Technology, 29 (3), 369–385. https://doi.org/10.1007/s10956-020-09823-x

Fan, S.-C., Yu, K.-C., & Lin, K.-Y. (2021). A framework for implementing an engineering-focused stem curriculum. International Journal of Science and Mathematics Education, 19 (8), 1523–1541. https://doi.org/10.1007/s10763-020-10129-y

Fan, S.-C., Yu, K.-C., & Lou, S.-J. (2018). Why do students present different design objectives in engineering design projects? International Journal of Technology and Design Education . https://doi.org/10.1007/s10798-017-9420-5

Finegold, D., & Notabartolo, A. S. (2010). 21st century competencies and their impact: an interdisciplinary literature review. In D. Finegold, M. Gatta, H. Salzman, & S. J. Schurman (Eds.), Transforming the US workforce development system (pp. 19–56). Labor and Employment Relations Association.

Fleer, M. (2021). When preschool girls engineer: Future imaginings of being and becoming an engineer. Learning Culture and Social Interaction, 30 , 100372. https://doi.org/10.1016/j.lcsi.2019.100372

Fortus, D., Dershimer, R. C., Krajcik, J., Marx, R. W., & Mamlok-Naaman, R. (2004). Design-based science and student learning. Journal of Research in Science Teaching, 41 (10), 1081–1110. https://doi.org/10.1002/tea.2004

Gladstone, J. R., & Cimpian, A. (2021). Which role models are effective for which students? A systematic review and four recommendations for maximizing the effectiveness of role models in STEM. International Journal of STEM Education, 8 (1), 59. https://doi.org/10.1186/s40594-021-00315-x

Gough, D., Oliver, S., & Thomas, J. (2012). Introducing systematic reviews. In D. Gough, S. Oliver, & J. Thomas (Eds.), An introduction to systematic reviews (pp. 1–16). Sage.

Grangeat, M., Harrison, C., & Dolin, J. (2021). Exploring assessment in STEM inquiry learning classrooms. International Journal of Science Education, 43 (3), 345–361. https://doi.org/10.1080/09500693.2021.1903617

Guzey, S. S., Moore, T. J., Harwell, M., & Moreno, M. (2016). STEM integration in middle school life science: Student learning and attitudes. Journal of Science Education and Technology, 25 (4), 550–560. https://doi.org/10.1007/s10956-016-9612-x

Halawa, S., Hsu, Y.-S., & Zhang, W.-X. (2023). Analysis of physics textbooks through the lens of inquiry practices. The Asia-Pacific Education Researcher, 32 (4), 497–506. https://doi.org/10.1007/s40299-022-00671-4

Halawa, S., Hsu, Y.-S., & Zhang, W.-X. (2024). Inquiry activity design from Singaporean and Indonesian physics textbooks. Science & Education, 33 (3), 581–607. https://doi.org/10.1007/s11191-022-00396-2

Halawa, S., Hsu, Y.-S., Zhang, W.-X., Kuo, Y.-R., & Wu, J.-Y. (2020). Features and trends of teaching strategies for scientific practices from a review of 2008–2017 articles. International Journal of Science Education, 42 (7), 1183–1206. https://doi.org/10.1080/09500693.2020.1752415

Han, S., Capraro, R., & Capraro, M. M. (2015). How science, technology, engineering, and mathematics (STEM) project-based learning (PBL) affects high, middle, and low achievers differently: The impact of student factors on achievement. International Journal of Science and Mathematics Education, 13 (5), 1089–1113. https://doi.org/10.1007/s10763-014-9526-0

Henderson, C., Beach, A., & Finkelstein, N. (2011). Facilitating change in undergraduate STEM instructional practices: An analytic review of the literature. Journal of Research in Science Teaching, 48 (8), 952–984. https://doi.org/10.1002/tea.20439

Hernandez, P. R., Bodin, R., Elliott, J. W., Ibrahim, B., Rambo-Hernandez, K. E., Chen, T. W., & de Miranda, M. A. (2014). Connecting the STEM dots: Measuring the effect of an integrated engineering design intervention. International Journal of Technology and Design Education, 24 (1), 107–120. https://doi.org/10.1007/s10798-013-9241-0

Hofstein, A., Navon, O., Kipnis, M., & Mamlok-Naaman, R. (2005). Developing students’ ability to ask more and better questions resulting from inquiry-type chemistry laboratories. Journal of Research in Science Teaching, 42 (7), 791–806. https://doi.org/10.1002/tea.20072

Honey, M., Pearson, G., & Schweingruber, H. (Eds.). (2014). STEM integration in K-12 education: Status, prospects and an agenda for research . National Academies Press.

Hsu, Y.-S., & Fang, S.-C. (2019). Opportunities and challenges of STEM education. In Y.-S. Hsu & Y.-F. Yeh (Eds.), Asia-Pacific STEM teaching practices: From theoretical frameworks to practices (pp. 1–16). Springer Singapore.

Jackson, C., Mohr-Schroeder, M. J., Bush, S. B., Maiorca, C., Roberts, T., Yost, C., & Fowler, A. (2021). Equity-oriented conceptual framework for k-12 STEM literacy. International Journal of STEM Education, 8 (1), 38. https://doi.org/10.1186/s40594-021-00294-z

Jang, H. (2016). Identifying 21st century STEM competencies using workplace data. Journal of Science Education and Technology, 25 (2), 284–301. https://doi.org/10.1007/s10956-015-9593-1

Johnson, C. C., Peters-Burton, E. E., & Moore, T. J. (2015). STEM road map: A framework for integrated STEM Education . Routledge.

Kang, J., Salonen, A., Tolppanen, S., Scheersoi, A., Hense, J., Rannikmae, M., & Keinonen, T. (2021). Effect of embedded careers education in science lessons on students’ interest, awareness, and aspirations. International Journal of Science and Mathematics Education . https://doi.org/10.1007/s10763-021-10238-2

Kayan-Fadlelmula, F., Sellami, A., Abdelkader, N., & Umer, S. (2022). A systematic review of STEM education research in the GCC countries: Trends, gaps and barriers. International Journal of STEM Education, 9 (1), 2. https://doi.org/10.1186/s40594-021-00319-7

Kelley, T. R., Brenner, D. C., & Pieper, J. T. (2010). Two approaches to engineering design: Observations in STEM education. Journal of STEM Teacher Education, 47 (2), 5–40.

Kelley, T. R., & Knowles, J. G. (2016). A conceptual framework for integrated STEM education. International Journal of STEM Education, 3 (1), 11. https://doi.org/10.1186/s40594-016-0046-z

Kim, H. (2016). Inquiry-based science and technology enrichment program for middle school-aged female students. Journal of Science Education and Technology, 25 (2), 174–186. https://doi.org/10.1007/s10956-015-9584-2

Krajcik, J., Blumenfeld, P. C., Marx, R. W., Bass, K. M., Fredricks, J., & Soloway, E. (1998). Inquiry in project-based science classrooms: Initial attempts by middle school students. Journal of the Learning Sciences, 7 (3–4), 313–350. https://doi.org/10.1080/10508406.1998.9672057

Lamb, R., Akmal, T., & Petrie, K. (2015). Development of a cognition-priming model describing learning in a STEM classroom. Journal of Research in Science Teaching, 52 (3), 410–437. https://doi.org/10.1002/tea.21200

Lederman, N. G., & Lederman, J. S. (2012). Nature of scientific knowledge and scientific inquiry: Building instructional capacity through professional development. In B. J. Fraser (Ed.), Second international handbook of science education (pp. 335–359). Springer.

Lie, R., Guzey, S. S., & Moore, T. J. (2019). Implementing engineering in diverse upper elementary and middle school science classrooms: Student learning and attitudes. Journal of Science Education and Technology, 28 (2), 104–117. https://doi.org/10.1007/s10956-018-9751-3

Lin, K.-Y., Hsiao, H.-S., Williams, P. J., & Chen, Y.-H. (2020). Effects of 6E-oriented STEM practical activities in cultivating middle school students’ attitudes toward technology and technological inquiry ability. Research in Science & Technological Education, 38 (1), 1–18. https://doi.org/10.1080/02635143.2018.1561432

López, N., Morgan, D. L., Hutchings, Q. R., & Davis, K. (2022). Revisiting critical STEM interventions: A literature review of STEM organizational learning. International Journal of STEM Education, 9 (1), 39. https://doi.org/10.1186/s40594-022-00357-9

Maltese, A. V., Melki, C. S., & Wiebke, H. L. (2014). The nature of experiences responsible for the generation and maintenance of interest in STEM. Science Education, 98 (6), 937–962. https://doi.org/10.1002/sce.21132

Maltese, A. V., & Tai, R. H. (2011). Pipeline persistence: Examining the association of educational experiences with earned degrees in STEM among US students. Science Education, 95 (5), 877–907. https://doi.org/10.1002/sce.20441

Margot, K. C., & Kettler, T. (2019). Teachers’ perception of STEM integration and education: A systematic literature review. International Journal of STEM Education, 6 (1), 2. https://doi.org/10.1186/s40594-018-0151-2

Martín-Páez, T., Aguilera, D., Perales-Palacios, F. J., & Vílchez-González, J. M. (2019). What are we talking about when we talk about STEM education? A Review of Literature. Science Education, 103 (4), 799–822. https://doi.org/10.1002/sce.21522

McHugh, M. L. (2012). Interrater reliability: The kappa statistic. Biochemia Medica, 22 , 276–282.

McLure, F. I., Tang, K.-S., & Williams, P. J. (2022). What do integrated STEM projects look like in middle school and high school classrooms? A systematic literature review of empirical studies of iSTEM projects. International Journal of STEM Education, 9 (1), 73. https://doi.org/10.1186/s40594-022-00390-8

Means, B., Wang, H., Wei, X., Lynch, S., Peters, V., Young, V., & Allen, C. (2017). Expanding STEM opportunities through inclusive STEM-focused high schools. Science Education, 101 (5), 681–715. https://doi.org/10.1002/sce.21281

Mehalik, M. M., Doppelt, Y., & Schunn, C. D. (2008). Middle-school science through design-based learning versus scripted inquiry: Better overall science concept learning and equity gap reduction. Journal of Engineering Education, 97 (1), 71–85. https://doi.org/10.1002/j.2168-9830.2008.tb00955.x

Møller, A. M., & Myles, P. S. (2016). What makes a good systematic review and meta-analysis? BJA British Journal of Anaesthesia, 117 (4), 428–430. https://doi.org/10.1093/bja/aew264

National Academy of Engineering and National Research Council. (2014). STEM integration in K-12 education: Status, prospects, and an agenda for research . National Academies Press.

National Research Council. (2011). Successful K-12 STEM education: Identifying effective approaches in science, technology, engineering, and mathematics . National Academies Press.

National Research Council. (2012). A framework for K-12 science education: Practices, crosscutting concepts, and core ideas . The National Academies Press.

National Research Council. (2013). Monitoring progress toward successful K-12 STEM education: A nation advancing? National Academies Press. https://doi.org/10.17226/13509

Nguyen, K. A., Borrego, M., Finelli, C. J., DeMonbrun, M., Crockett, C., Tharayil, S., Shekhar, P., Waters, C., & Rosenberg, R. (2021). Instructor strategies to aid implementation of active learning: A systematic literature review. International Journal of STEM Education, 8 (1), 9. https://doi.org/10.1186/s40594-021-00270-7

Preininger, A. M. (2017). Embedded mathematics in chemistry: A case study of students’ attitudes and mastery. Journal of Science Education and Technology, 26 (1), 58–69. https://doi.org/10.1007/s10956-016-9651-3

Priemer, B., Eilerts, K., Filler, A., Pinkwart, N., Rösken-Winter, B., Tiemann, R., & Zu Belzen, A. U. (2020). A framework to foster problem-solving in STEM and computing education. Research in Science & Technological Education, 38 (1), 105–130. https://doi.org/10.1080/02635143.2019.1600490

Sadler, P. M., Coyle, H. P., & Schwartz, M. (2000). Engineering competitions in the middle school classroom: Key elements in developing effective design challenges. Journal of the Learning Sciences, 9 (3), 299–327. https://doi.org/10.1207/S15327809JLS0903_3

Sanders, M. (2009). STEM, STEM education, STEMmania. The Technology Teacher, 68 (4), 20–26.

Sanders, M. E., & Wells, J. (2006). Integrative STEM education course syllabi & instructional materials: STEM education foundations . In STEM Education Trends & Issues, STEM Education Seminar.

Schütte, K., & Köller, O. (2015). ‘Discover, understand, implement, and transfer’: Effectiveness of an intervention programme to motivate students for science. International Journal of Science Education, 37 (14), 2306–2325. https://doi.org/10.1080/09500693.2015.1077537

Shaughnessy, J. M. (2013). Mathematics in a STEM context. Mathematics Teaching in the Middle School, 18 (6), 324. https://doi.org/10.5951/mathteacmiddscho.18.6.0324

Sinatra, G. M., Mukhopadhyay, A., Allbright, T. N., Marsh, J. A., & Polikoff, M. S. (2017). Speedometry: A vehicle for promoting interest and engagement through integrated STEM instruction. Journal of Educational Research, 110 (3), 308–316. https://doi.org/10.1080/00220671.2016.1273178

Smyrnaiou, Z., Georgakopoulou, E., & Sotiriou, S. (2020). Promoting a mixed-design model of scientific creativity through digital storytelling—the CCQ model for creativity. International Journal of STEM Education, 7 (1), 25. https://doi.org/10.1186/s40594-020-00223-6

Stehle, S. M., & Peters-Burton, E. E. (2019). Developing student 21st Century skills in selected exemplary inclusive STEM high schools. International Journal of STEM Education, 6 (1), 39. https://doi.org/10.1186/s40594-019-0192-1

Strobel, J., Wang, J., Weber, N. R., & Dyehouse, M. (2013). The role of authenticity in design-based learning environments: The case of engineering education. Computers & Education, 64 , 143–152. https://doi.org/10.1016/j.compedu.2012.11.026

Struyf, A., De Loof, H., Boeve-de Pauw, J., & Van Petegem, P. (2019). Students’ engagement in different STEM learning environments: Integrated STEM education as promising practice? International Journal of Science Education, 41 (10), 1387–1407. https://doi.org/10.1080/09500693.2019.1607983

Sullivan, A., & Bers, M. U. (2019). Investigating the use of robotics to increase girls’ interest in engineering during early elementary school. International Journal of Technology and Design Education, 29 (5), 1033–1051. https://doi.org/10.1007/s10798-018-9483-y

Thomas, D. R., & Larwin, K. H. (2023). A meta-analytic investigation of the impact of middle school STEM education: Where are all the students of color? International Journal of STEM Education, 10 (1), 43. https://doi.org/10.1186/s40594-023-00425-8

U.S. Department of Education. (2016). STEM 2026: A vision for innovation in STEM education . U.S. Department of Education.

Vallera, F. L., & Bodzin, A. M. (2020). Integrating STEM with AgLIT (Agricultural literacy through innovative technology): The efficacy of a project-based curriculum for upper-primary students. International Journal of Science and Mathematics Education, 18 (3), 419–439. https://doi.org/10.1007/s10763-019-09979-y

Vongkulluksn, V. W., Matewos, A. M., Sinatra, G. M., & Marsh, J. A. (2018). Motivational factors in makerspaces: A mixed methods study of elementary school students’ situational interest, self-efficacy, and achievement emotions. International Journal of STEM Education, 5 (1), 43. https://doi.org/10.1186/s40594-018-0129-0

Vossen, T. E., Henze, I., Rippe, R. C. A., Van Driel, J. H., & De Vries, M. J. (2018). Attitudes of secondary school students towards doing research and design activities. International Journal of Science Education, 40 (13), 1629–1652. https://doi.org/10.1080/09500693.2018.1494395

Wilson, K. (2021). Exploring the challenges and enablers of implementing a STEM project-based learning programme in a diverse junior secondary context. International Journal of Science and Mathematics Education, 19 (5), 881–897. https://doi.org/10.1007/s10763-020-10103-8

Wu, Y., Guo, S., & Zhu, L. (2019). Design and implementation of data collection mechanism for 3D design course based on xAPI standard. Interactive Learning Environments, 28 (5), 602–619. https://doi.org/10.1080/10494820.2019.1696842

Yata, C., Ohtani, T., & Isobe, M. (2020). Conceptual framework of STEM based on Japanese subject principles. International Journal of STEM Education, 7 (1), 12. https://doi.org/10.1186/s40594-020-00205-8

Zheng, J., Xing, W., Zhu, G., Chen, G., Zhao, H., & Xie, C. (2020). Profiling self-regulation behaviors in STEM learning of engineering design. Computers & Education . https://doi.org/10.1016/j.compedu.2019.103669


Acknowledgements

The authors express their sincere gratitude to the editors and reviewers for their invaluable inputs and suggestions, which have significantly enhanced the quality of this work.

This work was financially supported by the Institute for Research Excellence in Learning Sciences of National Taiwan Normal University (NTNU) from The Featured Areas Research Center Program within the framework of the Higher Education Sprout Project by the Ministry of Education (MOE) in Taiwan.

Author information

Authors and Affiliations

Research Center for Education, National Research and Innovation Agency, Jakarta, Indonesia

Suarman Halawa

Center for Teacher Education, National Kaohsiung University of Science and Technology, Kaohsiung, Taiwan

Tzu-Chiang Lin

Center for the Liberal Arts, National Kaohsiung University of Science and Technology, Kaohsiung, Taiwan

Graduate Institute of Science Education, National Taiwan Normal University, Taipei, Taiwan

Ying-Shao Hsu


Contributions

SH contributed to the conception of the study, research question, methods, analysis, and interpretation of the data. TC contributed to the data collection, analysis and interpretation of data, and editing of the manuscript. YS contributed to the conception of the study, data analysis and interpretation, and editing of the manuscript. All authors equally contributed to writing, reading, and approving the manuscript.

Corresponding author

Correspondence to Ying-Shao Hsu .

Ethics declarations

Ethics approval and consent to participate

Not applicable.

Consent for publication

Competing interests

The authors declare that they have no competing interests.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Description of STEM education goals

  • Engagement and career choice: The goals of instruction focus on students' emotional responses to learning STEM subjects and on pursuing a professional degree in one of the STEM fields (representational article: Fan et al.)
  • STEM literacy: The goals of instruction focus on students' ability to apply concepts from science, technology, engineering, and mathematics to solve problems that cannot be solved within a single subject (representational article: Vallera and Bodzin, 2020)
  • 21st-century competencies: The goals of instruction focus on students' critical thinking, creativity, innovation, leadership, and adaptability, abilities needed to adapt in the twenty-first century (representational article: Chen and Lin)

Description of the elements of instructional design for STEM education

  • Questioning or identifying problems: Students propose questions or identify problems in the STEM activity (representational article: Vallera and Bodzin, 2020)
  • Designing: Students design their model (representational article: Aranda et al.)
  • Building: Students build a prototype based on their model (representational article: English, 2019)
  • Testing: Students test their design and prototype (representational article: Zheng et al., 2020)
  • Redesigning: Students redesign their model after they test it (representational article: Lie et al., 2019)
  • Experimenting: Students engage in hands-on activities in the STEM activity (representational article: Kim, 2016)
  • Analyzing: Students use mathematics to analyze the data from the STEM activity (representational article: Berland et al.)
  • Collaborating: Students interact or collaborate with other students to solve problems in the STEM activity (representational article: English and King, 2019)
  • Reflecting: Students evaluate and assess their experience in the STEM activity (representational article: Dasgupta et al.)
  • Communicating: Students present and share their work with the whole class (representational article: Chen and Lin)

Rights and permissions

Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/ .


About this article

Cite this article.

Halawa, S., Lin, TC. & Hsu, YS. Exploring instructional design in K-12 STEM education: a systematic literature review. IJ STEM Ed 11 , 43 (2024). https://doi.org/10.1186/s40594-024-00503-5


Received : 15 April 2024

Accepted : 24 August 2024

Published : 05 September 2024

DOI : https://doi.org/10.1186/s40594-024-00503-5


Keywords

  • STEM education
  • Engagement and career choice
  • STEM literacy


Making the Link Between Design Thinking and Instructional Design

Wed Jul 29 2020


Design thinking is a human-centered (as opposed to business-centric) approach to solving complex business problems. To help practitioners apply it, learning and development experts Sharon Boller and Laura Fletcher have written Design Thinking for Training and Development: Creating Learning Journeys That Get Results (ATD Press, June 2020).

Sharon Boller is a managing director at TiER1 Performance and a frequent speaker at industry conferences on topics such as performance-focused learning design, UX, technology and trends, learning game design, and design thinking. Laura Fletcher is a seasoned learning consultant with 15 years of experience in learning and development.

In the book, Boller and Fletcher explain that design thinking features five core steps:

Empathize with users—for instance, with those affected by a situation or in need.

Define the problem to be solved.

Ideate with target users to come up with possible “solves”.

Craft and test quick and dirty prototypes of potential solutions.

Iterate on and refine the prototypes based on testing outcomes.

What’s more, these same steps can be applied to instructional design (ID) and other talent development efforts. Here are some insights from Boller and Fletcher about how TD pros can apply this approach.

Is there a link between design thinking and instructional design?

Boller: The five steps of design thinking are both similar to and different from traditional ID steps, and they offer ways to improve those traditional steps. The first key difference is the impetus for even starting the process. In ID, we react to a problem someone presents to us. In design thinking, we empathize with users and see if we can distill the problem after building empathy. If we flip the traditional instructional design framework known as ADDIE to a “learning experience framework” rooted in design thinking, we shift from “audience analysis” to “insight gathering.” We use design thinking tools and techniques such as experience mapping and empathy mapping to gain perspective. Such tools give us far greater insight into learners and their needs than demographic info gathering or pure task analysis does.

Armed with this perspective, we can refine the problem (instead of defining, which is a DT step). With the problem clarified, we can then proceed to ideate and co-create potential solutions with our learners, inviting them to help us shape prototypes that we can test. We can think in terms of an entire learning journey rather than just developing an event-focused solution such as a workshop or an e-learning course.

Traditional instructional design frameworks follow design with development and development with implementation, usually done first as a pilot. Design thinking offers a testing approach as well—but that testing starts earlier when it is cheaper to do. The goal is to build rapid, cheap prototypes and test those prototypes before proceeding to a full buildout. This early testing leads to iteration before things get expensive—something ID frameworks could benefit from. Waiting until a solution is fully built out to test it is costly and time-consuming. It also means we can be reluctant to make changes because we’ve invested so heavily by the time we pilot.

The link between ID and DT lies in the similarity of the steps and the shared intent to be iterative. Design thinking has some fantastic tools that can seriously enhance the ID process, with experience mapping and empathy mapping being two of the big ones.

What’s the difference between learning experience design (LXD) and instructional design?

Boller: Learning experience design is focused on the entire learning experience: how I notice a need to learn, commit to that learning, do the learning, build memory and proficiency over time, deepen and expand proficiency through exploration and reflection, and sustain performance for the long haul. Instructional design is task focused: it analyzes tasks and figures out ways to teach those tasks. I think of LXD as imagining the experience I want a hotel guest to have while in my hotel, and ID as the blueprint for the hotel itself. I don’t think there is a role of “learning experience designer,” because just as design thinking is executed by a cross-functional team, LXD is best done as a team sport. It’s not done by an individual. Learning experience design is about what you are creating, not the role you perform.

Is there an easy way to get our feet wet with design thinking?

Fletcher: The first time we experimented with design thinking, we invited learners to the design meeting to create an empathy map and persona. That was it—no elaborate brainstorming, no prototyping, just extra focus on the learner during design. That mindset of gaining the perspective of the learner is a great place to start, whether you use an empathy map, an experience map, or a focus group. The tools themselves are easy to facilitate, but the insight they generate can have a huge effect on the ultimate solution.

Boller: We also experimented early with experience mapping, using no other tools. We blocked out the steps of a process and invited people to consider the thoughts/feelings happening as part of each step as well as the magical and miserable moments associated with doing each step. The insights from that were huge, including recognizing that the sales process was not at all as the subject matter experts in the room assumed it to be.

Are design thinking tools effective virtually?

Boller: The best thing about empathy mapping and persona development is the ease with which you can create them via virtual means. Tools like Miro, Fun Retro, Mural, or even a whiteboard in Teams can be used to create collaborative workspaces without people physically being in the same room.

Fletcher: It has always been hard to get the right people in a room together, even before social distancing and travel restrictions, so we have had ample opportunity to experiment using the tools virtually. We have found that live collaboration works better than asynchronous. Early on, we tried getting input from learners asynchronously and found the results were not as good. The advantage of being live while building an empathy map or experience map, for example, is that it allows learners to build on each other’s responses and enables the designer to facilitate and ask probing questions. But live doesn’t have to mean face-to-face. We’ve had good luck pairing voice-to-voice connection with real-time collaboration tools. Virtual collaboration can be just as effective as sticky notes on the wall and is more efficient for learners who can’t leave their home office.

For more guidance and advice on how to apply this approach in your work, check out Design Thinking for Training and Development: Creating Learning Journeys That Get Results .


Developing disposition to critical thinking and problem-solving perception in instructional design projects for producing digital materials

  • Original Research Article
  • Published: 23 January 2021
  • Volume 32, pages 1267–1292 (2022)

Cite this article


  • Sacip Toker   ORCID: orcid.org/0000-0003-1437-6642 1 &
  • Meltem Huri Baturay 2  


This study investigated the development of perceptions of critical thinking and problem-solving skills among students taking part in instructional design projects to produce digital materials using different instructional design models. The participants were students from a computer science teaching department who were enrolled in an instructional design course. Participants were divided into two groups according to instructional design model: the rapid prototyping model (RPM) group consisted of 47 students working in 9 teams on an assignment to develop an e-book for educational use, and the Dick and Carey model (DCM) group consisted of 37 students working in 7 teams on an assignment to design digital materials to enrich courses on a specific subject. Student perceptions of the development of their critical thinking and problem-solving skills were analyzed using a causal-comparative approach, with the Big Five personality traits as covariates. The RPM group reported significant improvements in perceived problem-solving skills, particularly in their confidence in undertaking tasks, whereas the DCM group perceived significant improvements in their disposition to critical thinking, particularly in self-confidence and analyticity. Openness to Experience was a significant covariate for the self-confidence sub-factor of both skills, as was Extraversion. The findings are discussed in detail, along with recommendations for further research.



Clifford, J. S., Boufal, M. M., & Kurtz, J. E. (2004). Personality traits and critical thinking skills in college students: Empirical tests of a two-factor theory. Assessment, 11, 169–176.

Cole, R. S., & Todd, J. B. (2003). Effects of web-based multimedia homework with immediate rich feedback on student learning in general chemistry. Journal of Chemical Education, 80 (11), 1338–1343.

Copeland, W. D., Birmingham, C., de la Cruz, E., & Lewin, B. (1993). The reflective practitioner in teaching: Toward a research agenda. Teaching and Teacher Education, 9 (4), 347–359.

Costa, P. T., & MacCrae, R. R. (1992). Revised NEO personality inventory (NEO PI-R) and NEO five-factor inventory (NEO-FFI): Professional manual . Odessa: Psychological Assessment Resources, Incorporated.

Dean, D., & Kuhn, D. (2003). Metacognition and critical thinking.

Dick, W., Carey, L., & Carey, J. O. (2005). The systematic design of instruction (6th ed.). Boston: Pearson Allyn & Bacon.

Dick, W., & Carey, L. (1996). The systematic design of instruction . Chicago: Scott, Foresman.

Dihoff, R. E., Brosvic, G. M., Epstein, M. L., & Cook, M. J. (2004). Provision of feedback during preparation for academic testing: Learning is enhanced by immediate but not delayed feedback. The Psychological Record, 54 (2), 207.

Elliott, T. R., Herrick, S. M., MacNair, R. R., & Harkins, S. W. (1994). Personality correlates of self-appraised problem solving ability: Problem orientation and trait affectivity. Journal of Personality Assessment, 63 (3), 489–505.

El-sayed, R. S., Sleem, W. F., El-sayed, N. M., & Ramada, F. A. (2011). Disposition of staff nurses’ critical thinking and its relation to quality of their performance at Mansoura University Hospital. Journal of American Science, 7, 388–395.

Ennis, R. H. (1987). A taxonomy of critical thinking dispoisitions and abilities. In J. B. Baron & J. J. Stem Berg (Eds.), Teaching thinking skills: Theory and practice (pp. 9–26). New York: Freeman.

Epstein, M. L., Lazarus, A. D., Calvano, T. B., & Matthews, K. A. (2002). Immediate feedback assessment technique promotes learning and corrects inaccurate first responses. The Psychological Record, 52 (2), 187.

Ernst, J., & Monroe, M. (2006). The effects of environment-based education on students’ critical thinking skills and disposition toward critical thinking: Reprinted from Environmental Education Research (2004) 10(4), pp. 507–522. Environmental Education Research, 12 (3–4), 429–443.

Eysenck, H. J. (1959). Personality and the estimation of time. Perceptual and Motor Skills, 9, 405–406.

Facione, N. C., Facione, P. A., & Sanchez, C. A. (1994). Critical thinking disposition as a measure of competent clinical judgment: the development of the California Critical Thinking Disposition Inventory. The Journal of Nursing Education, 33 (8), 345–350.

Facione, P., & Gittens, C. A. (2015). Think critically . New York: Pearson Inc.

Facione, P. A., Facione, N. C., & Giancarlo, C. A. (1997). Professional judgment and the disposition toward critical thinking. Retrieved Nov 21, 2020 from https://insightassessment.com/wp-content/uploads/ia/pdf/Prof_Jdgmnt__Dsp_CT_97_Frnch1999.pdf .

Facione, P. A., Sanchez, C. A., Facione, N. C., & Gainen, J. (1995). The disposition toward critical thinking. The Journal of General Education, 44 (1), 1–25.

Fahim, M., & Teimourtash, M. (2012). A critical look at the notion of critical thinking from a new personality trait perspective: “Midtrovert”.

Field, A. (2009). Discovering statistics using SPSS . London: Sage Publications.

Fraenkel, J. R., Wallen, N. E., & Hyun, H. H. (2011). How to design and evaluate research in education . New York: McGraw-Hill Humanities/Social Sciences/Languages.

Gagne, R. M. (1980). The conditions of learning (3rd ed.). New York: Holt, Rinehart & Winston.

Gagne, R., Briggs, L., & Wager, W. (1992). Principles of instructional design (4rth ed.). New York: Holt, Rinehart & Winston.

Garside, C. (1996). Look who’s talking: A comparison of lecture and group discussion teaching strategies in developing critical thinking skills. Communication Education, 45 (3), 212–227.

Ge, X., Chen, C. H., & Davis, K. A. (2005). Scaffolding novice instructional designers’ problem-solving processes using question prompts in a web-based learning environment. Journal of Educational Computing Research, 33 (2), 219–248.

Gellin, A. (2003). The effect of undergraduate student involvement on critical thinking: A meta-analysis of the literature 1991–2000. Journal of college student development, 44 (6), 746–762.

Gerlach, V. S., & Ely, D. P. (1980). Teaching and media: A systematic approach (2nd ed.). Englewood Cliffs: Prentice-Hall.

Glaser, R. (1991). The maturing of the relationship between the science of learning and cognition and educational practice. Learning and Instruction, 1 (2), 129–144.

Gokhale, A. A. (1995). Collaborative learning enhances critical thinking. Journal of Technology Education , 7 (1). http://scholar.lib.vt.edu/ejournals/JTE/v7n1/gokhale.jte-v7n1.html?ref=Sawos.Org .

Goldberg, L. R. (1993). The structure of phenotypic personality traits. American Psychologist, 48 (1), 26.

Griffin, M. L. (2003). Using critical incidents to promote and assess reflective thinking in preservice teachers. Reflective practice, 4 (2), 207–220.

Halpern, D. F. (1998). Teaching critical thinking for transfer across domains: Disposition, skills, structure training, and metacognitive monitoring. American Psychologist, 53 (4), 449.

Harris, I. (1993). New expectations for professional competence. In L. Curry & J. F. Wergin (Eds.), Educating professionals: Responding to new expectations for competence and accountability . San Francisco: Jossey-Bass.

Haynes, T., & Bailey, G. (2003). Are you and your basic business students asking the right questions? Business Education Forum, 57 (3), 33–37.

Heppner, P. P., & Petersen, C. H. (1982). The development and implications of a personal problem-solving inventory. Journal of Counseling Psychology, 29 (1), 66.

Houtz, J. C., Ponterotto, J. G., Burger, C., & Marino, C. (2010). Problem-solving style and multicultural personality dispositions: A study of construct validity. Psychological Reports, 106 (3), 927–938. https://doi.org/10.2466/pr0.106.3.927-938 .

IBSTPI, International Board of Standards for Training, Performance, and Instruction. (1994). Instructional design competencies: The standards. International Board of Standards for Training, Performance, and Instruction.

Jonassen, D. (2002). Integration of problem solving into instructional design. In R. Reiser & J. Dempsey (Eds.), Trends and issues in instructional design and technology (pp. 107–120). Upper Saddle River: Merrill/Prentice Hall.

Jonassen, D. H. (2000). Toward a design theory of problem solving. Educational Technology Research and Development, 48 (4), 63–85.

Jones, T. S., & Richey, R. C. (2000). Rapid prototyping methodology in action: A developmental study. Educational Technology Research and Development, 48 (2), 63–80.

Kirmizi, F. S., Saygi, C., & Yurdakal, I. H. (2015). Determine the relationship between the disposition of critical thinking and the perception about problem solving skills. Procedia-Social and Behavioral Sciences, 191, 657–661.

Kökdemir, D. (2003). Belirsizlik Durumlarında Karar Verme ve Problem Çözme [Decision making and problem solving under uncertainty]. Unpublished doctoral dissertation, Ankara University Social Sciences Institute, Social Psychology Division. Ankara.

Lee, H.-J. (2000). The nature of the changes in reflective thinking in preservice mathematics teachers engaged in student teaching field experience in Korea. Paper presented at the Annual Meeting of the America Educational Research Association (AERA), New Orleans, LA, April 24–28, 2000.

Lee, J., & Jang, S. (2014). A methodological framework for instructional design model development: Critical dimensions and synthesized procedures. Educational Technology Research and Development, 62 (6), 743–765.

Mayer, R. E. (1998). Cognitive, metacognitive, and motivational aspects of problem solving. Instructional Science, 26 (1–2), 49–63.

McMillan, J. H. (1987). Enhancing college students’ critical thinking: A review of studies. Research in Higher Education, 26 (1), 3–29.

Morrison, G., Ross, S., & Kemp, J. (2001). Designing effective instruction (3rd ed.). New York: John Wiley & Sons Inc.

Mount, M. K., & Barrick, M. R. (1995). The Big Five personality dimensions: Implications for research and practice in human resources management. Research in Personnel and Human Resources Management, 13 (3), 153–200.

Moursund, D., & Bielefeldt, T. (1999). Will new teachers be prepared to teach in a digital age? In A national survey on information technology in teacher education. Santa Monica: Milken Family Foundation.

Nelson, C. E. (1994). Critical thinking and collaborative learning. New Directions for Teaching and Learning., 1994 (59), 45–58.

Nelson, W. A., Macliaro, S., & Sherman, T. M. (1988). The intellectual content of instructional design. Journal of Instructional Development, 2 (1), 29–35.

Paul, R., & Scriven, M. (1987). Critical thinking as defined by the national council for excellence in critical thinking. In 8th Annual international conference on critical thinking and education reform , Berkeley.

Perez, R. S., & Emery, C. D. (1995). Designer thinking: How novices and experts think about instructional design. Performance Improvement Quarterly, 8 (3), 80–95.

Phelps, E., & Damon, W. (1989). Problem solving with equals: Peer collaboration as a context for learning mathematics and spatial concepts. Journal of Educational Psychology, 81 (4), 639.

Posner, G. J., & Rudnitsky, A. N. (1994). Course design: A guide to curriculum development for teachers . White Plains: Longman.

Quinn, J. (1994). Connecting education and practice in an instructional design graduate program. Educational Technology Research and Development, 42 (3), 71–82.

Reigeluth, C. M. (1999). What is instructional design theory? In C. M. Reigeluth (Ed.), Instructional design theories and models: A new paradigm of instructional theory (Vol. 2, pp. 5–29). Mahwah: Lawrence Erlbaum Associates.

Reiser, R. A., & Dick, W. (1996). Instructional planning: A guide for teachers . Boston: Allyn and Bacon.

Richey, R. C., Fields, D. C., & Foxon, M. (2001). Instructional design competencies: The standards (3rd ed.). Syracuse: ERIC Clearinghouse on Information and Technology.

Savery, J. R., & Duffy, T. M. (1995). Problem Based learning: An instructional model and its constructivist framework. Educational Technology, 35 (5), 31–38.

Scheeler, M. C., Congdon, M., & Stansbery, S. (2010). Providing immediate feedback to co-teachers through bug-in-ear technology: An effective method of peer coaching in inclusion classrooms. Teacher Education and Special Education: The Journal of the Teacher Education Division of the Council for Exceptional Children, 33 (1), 83–96.

Schenker, J. D., & Rumrill, P. D., Jr. (2004). Causal-comparative research designs. Journal of Vocational Rehabilitation, 21 (3), 117–121.

Sepahvand, E., Shehni Yailagh, M., Allipour Birgany, S., & Behroozi, N. (2017). Testing a model of causal relationships of family communication patterns, metacognition and personality traits with critical thinking disposition, mediated by epistemic beliefs of female high school students in Ahvaz. Journal of Psychological Achievements, 24 (1), 23–44.

Shambaugh, R. N., & Magliaro, S. G. (2001). A reflexive model for teaching instructional design. Educational Technology Research and Development, 49 (2), 69–91.

Simon, A., & Ward, L. O. (1974). The performance on the Watson-Glaser Critical Thinking Appraisal of university students classified according to sex, type of course pursued, and personality score category. Educational and Psychological Measurement, 34 (4), 957–960.

Snyder, L. G., & Snyder, M. J. (2008). Teaching critical thinking and problem solvingproblem-solving skills. The Journal of Research in Business Education, 50 (2), 90.

Sosu, E. M. (2013). The development and psychometric validation of a Critical Thinking Disposition Scale. Thinking Skills and Creativity, 9, 107–119.

Soto, V. J. (2013). Which instructional design models are educators using to design virtual world instruction. MERLOT Journal of Online Learning and Teaching, 9 (3), 364–375.

Spector, J. M., Muraida, D. J., & Dallman, B. E. (1990). Establishing instructional strategies for advanced interactive technologies. In Proceedings of the psychology in the DoD symposium (vol. 12, pp. 347–352).

Spector, J. M., Muraida, D. J., & Marlino, M. R. (1992). Cognitively based models of courseware development. Educational Technology Research and Development, 40 (2), 45–54.

Squires, D. (1999). Educational software and learning: Subversive use and volatile design. Educational technology, 39 (3), 48–54.

Swanson, H. L. (1990). Influence of metacognitive knowledge and aptitude on problem solving. Journal of Educational Psychology, 82 (2), 306.

Taylan, S. (1990). Heppner’in problem çözme envanterinin uyarlama, güvenirlik ve geçerlik çalışmaları. [An adaptation, reliability and validity studies of Heppner’s problem-solving inventory]. Yayınlanmamış Yüksek Lisans Tezi, Ankara Üniversitesi, Ankara. [Unpublished master’s thesis, Ankara University, Ankara.].

Terenzini, P. T., Springer, L., Pascarella, E. T., & Nora, A. (1995). Influences affecting the development of students’ critical thinking skills. Research in Higher Education, 36 (1), 23–39.

Tessmer, M., & Wedman, J. (1995). Context-sensitive instructional design models: A response to design research, studies, and criticism. Performance Improvement Quarterly, 8 (3), 38–54.

Titzer, J. L., Swenty, C. F., & Hoehn, W. G. (2012). An interprofessional simulation promoting collaboration and problem solving among nursing and allied health professional students. Clinical Simulation in Nursing, 8 (8), e325–e333.

Tripp, S. D., & Bichelmeyer, B. (1990). Rapid prototyping: An alternative instructional design strategy. Educational Technology Research and Development, 38 (1), 31–44.

Visscher-Voerman, I., & Gustafson, K. L. (2004). Paradigms in the theory and practice of education and training design. Educational Technology Research and Development, 52 (2), 69–89.

Visscher-Voerman, I., Gustafson, K., & Plomp, T. (1999). Educational design and development: An overview of paradigms. In J. Van der Akker, R. M. Branch, K. Gustafson, N. Nieveen, & T. Plomp (Eds.), Design approaches and tools in education and training (pp. 15–28). Netherlands: Springer.

Chapter   Google Scholar  

Waite, S., & Davis, B. (2006). Collaboration as a catalyst for critical thinking in undergraduate research. Journal of Further and Higher Education, 30 (4), 405–419.

Wedman, J., & Tessmer, M. (1993). Instructional designers decisions and priorities: A survey of design practice. Performance Improvement Quarterly, 6 (2), 43–57.

WienerVazquez-Abad, L. R. J. (1995). The present and future of ID practice. Performance Improvement Quarterly, 8 (3), 55–67.

Winn, W. (1989). Toward a rationale and theoretical basis for educational technology. Educational Technology Research and Development, 37 (1), 35–46.

Winn, W. (1997). Advantages of a theory based curriculum in instructional technology. Educational Technology, 37 (1), 34–41.

Yang, Y. T. C., Newby, T. J., & Bill, R. L. (2005). Using Socratic questioning to promote critical thinking skills through asynchronous discussion forums in distance learning environments. The American Journal of Distance Education, 19 (3), 163–181.

York, C. S., & Ertmer, P. A. (2016). Examining instructional design principles applied by experienced designers in practice. Performance Improvement Quarterly, 29 (2), 169–192.

Young, M. F. (1993). Instructional design for situated learning. Educational Technology Research and Development, 41 (1), 43–85.

Download references

Author information

Sacip Toker and Meltem Huri Baturay contributed equally to this work.

Authors and Affiliations

Information Systems Engineering, College of Engineering, Atilim University, Kizilcasar Mah., Incek, Ankara, Turkey

Sacip Toker

Center for Teaching and Learning, Atilim University, Kizilcasar Mah., Incek, Ankara, Turkey

Meltem Huri Baturay


Corresponding author

Correspondence to Sacip Toker .

Ethics declarations

Conflict of interest.

The authors declare that they have no conflict of interest.

Additional information

Publisher's note.

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.


About this article

Toker, S., Baturay, M.H. Developing disposition to critical thinking and problem-solving perception in instructional design projects for producing digital materials. Int J Technol Des Educ 32 , 1267–1292 (2022). https://doi.org/10.1007/s10798-020-09646-2


Accepted : 10 December 2020

Published : 23 January 2021

Issue Date : April 2022

DOI : https://doi.org/10.1007/s10798-020-09646-2


  • Instructional design projects
  • E-book design
  • Digital material design
  • Disposition to critical thinking
  • Problem solving inventory

The eLearning Coach

For designing effective learning experiences

Connie Malamed

How To Use Design Thinking In Learning Experience Design

by Connie Malamed


In light of these concerns, you may find that a Design Thinking model suits your needs. Design Thinking is an approach for deeply understanding the audience and their challenges, in order to generate creative and effective solutions. It resembles Agile models in its methods of prototyping and testing. It differs in its emphasis on human-centered solutions.

Design Thinking is Human-centered

It places great value on empathy for your users. The practice of Design Thinking seems to be sorely missing from instructional design university programs, professional training, and workplace practice.

If Design Thinking has the potential to help us come up with better design solutions, then let’s make room for it as we design and promote learning experiences.

Solutions for the 21st Century

Those of us who feel hampered by current models may already practice some Design Thinking techniques. The more we understand these practices as a framework, the more likely we are to leverage Design Thinking in our daily work.

Innovation is critical now because the solution to many problems may be more complex, broader, and more integrated into work than one training course can provide. Perhaps the solution to a problem requires developing a community of practice—not formal training. Maybe there is a need for performance support combined with a user interface redesign. Perhaps a change in organizational processes will supplement interactive training.

I often speak of the practitioners in our field as “solution finders” rather than course builders. Design Thinking aligns more with the complex problems a learning experience designer might need to solve.

A Design Thinking Process

There are many variations of the Design Thinking framework; most have between three and six steps. Here is one of the original methods, as taught by Stanford’s d.school, modified with an approach that works for learning experience design. You can extend the Define phase to include your instructional design practice of identifying and writing measurable performance objectives.


EMPATHIZE

Empathy involves more than just analyzing an audience or users. This is one of the keys to Design Thinking. Empathy is about experiencing the feelings of others. You are attempting to understand what it is like to be in their job and to have their challenges.

Like an ethnographer, you research the audience, studying and recording the viewpoints of a group of people. In the workplace context, the aim is to discover the needs of a target population and explore their universe.

Through the empathetic experience, you can create more effective solutions than when you are simply an order taker. Empathy may involve collaborating and co-designing with the audience.

Industrial designer and design educator Paul Backett writes that “Great designers are great empathizers. It’s what separates a design that has soul from one that’s simply well-realized.”

Tools that will help you research users and their challenges:

  • Field Research: Talk to and observe audience members in their jobs and imagine what it would be like to have that job. Discover less apparent problems.
  • Interviews: Speak with both supervisors and staff to understand their issues and the characteristics of the people you want to help.
  • Personas: Capture the collective characteristics and attributes of your audience or a subset of your audience (see Learner Personas for Instructional Design).
  • Empathy Maps: A visual tool for collecting what the persona thinks, feels, says, and does when faced with the challenge of your focus.
  • Attitude Research: Run focus groups to determine what motivates and demotivates the audience.

During this phase, you can go ahead with a traditional analysis too. The empathy and research you do here will likely change the perspective you bring to that analysis.
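As a concrete illustration, the four quadrants of an empathy map can be captured in a small data structure so field-research notes stay organized per persona. This is a hypothetical sketch, not any standard tool; the persona and observations are invented examples:

```python
from dataclasses import dataclass, field

@dataclass
class EmpathyMap:
    """Four-quadrant empathy map for one persona (hypothetical sketch)."""
    persona: str
    thinks: list = field(default_factory=list)
    feels: list = field(default_factory=list)
    says: list = field(default_factory=list)
    does: list = field(default_factory=list)

    def add(self, quadrant: str, observation: str) -> None:
        """File a field-research observation under one quadrant."""
        if quadrant not in ("thinks", "feels", "says", "does"):
            raise ValueError(f"unknown quadrant: {quadrant}")
        getattr(self, quadrant).append(observation)

# Example: notes gathered while shadowing an imagined new sales rep
em = EmpathyMap(persona="New sales rep")
em.add("says", "I can never find the spec sheet when a customer asks.")
em.add("feels", "anxious during live product demos")
```

Even a lightweight structure like this keeps observations tied to a persona, which helps when several designers pool their field notes.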

DEFINE THE PROBLEM 

Research and empathy ultimately help you define the real problem. How often have days of training been developed based on an incorrect understanding of a problem? In Design Thinking, the problem space begins to evolve by understanding the challenges of the target audience.

If you spend time specifically defining a problem, you may discover that a simple solution, like one training course, will not be effective. That’s why defining the problem through research—looking at it from many angles and perspectives—can set you on the right track. Without correctly defining a problem, it’s nearly impossible to generate a corresponding solution.

Once you define the problem, I recommend writing it as a measurable performance goal. Then you can determine if training is an appropriate solution or partial solution. If so, apply your go-to instructional design method, such as SAM, Action Mapping, or ADDIE to tease out the learning objectives that will help people reach the performance goal.

Tools that will help you define the problem:

  • Root Cause Analysis: Method for understanding the underlying cause of an incident or event
  • “How Might We” statements: Statements that redefine the problem from another viewpoint, such as “How might we help staff remember to wash their hands?” or “How might we help sales reps quickly access the information they need?” or “How might we help international students feel comfortable at our school?”
  • Instructional design analysis practices that lead to measurable performance objectives, such as Action Mapping, Information Processing Analysis, Task Analysis, etc.

IDEATE

The practice of conceiving ideas, or ideation, is a critical step of Design Thinking. This is where you and, ideally, a cross-disciplinary team generate potential solutions to the performance problem defined during research. Although many companies won’t allow for it, getting ideas from audience members is very valuable. Try to generate as many ideas as possible, because more ideas mean more potential solutions. All ideas are considered, and there are no constraints or restrictions.

Tools to help you generate possible solutions include:

  • Brainstorming with Sticky Notes: Brainstorming involves conceiving lots of ideas while suspending judgement. The technique has its critics. But brainstorming with sticky notes is a different matter. Team members write potential solutions on sticky notes and post them on a board or wall. The process of writing and posting continues until there are no more ideas left. This approach is more anonymous than brainstorming, as everyone is busy writing and posting. At the end, the team organizes the sticky notes into some type of coherent structure and discusses all of the ideas. It’s a fun and energetic approach to finding solutions.
  • Sketching: For many people, sketching short-circuits the judgement side of the brain and helps them tap into a flow of ideas. Sketching is visual brainstorming. Using stick figures and geometric shapes is completely acceptable and gets the job done. Sketching is exploration.
  • Manipulative Verbs: From the creator of brainstorming, Alex Osborn, comes an exercise using a list of action verbs that are applied to various ideas or problems. This works particularly well when sketching. You can make up a long list of verbs and see what ideas are generated from this approach.
  • Mind Maps: Mind maps are radiant drawings of connected ideas, good for exploring the many aspects of a problem. You can create them alone or with a team.
  • “How Might We” statements (see above)

PROTOTYPE

A prototype is a preliminary model of an approach. Prototyping involves hands-on exploration. It provides a way to rapidly try out ideas without a large investment of time and money. Think of a prototype as a low-resolution or low-fidelity version of an idea.

In the world of industrial design, a prototype might be constructed from cardboard. For graphic design, a prototype might include a series of sketches. In learning experience design, a prototype could involve storyboarding an interaction. They say it’s better to fail early and often with your prototypes because with each failure comes a better understanding of what will work.

Some ways to prototype or to create form include:

  • Sketching: Using pencil and paper or a digital drawing tool, prototype sketches are more involved than in the previous phase. They might include storyboarding a scenario or visualizing all possible responses to an interaction.
  • Mock-ups: A mock-up is a simulated version of an idea that replicates how it will look and behave. Mock-ups can range from a Styrofoam model to a working user interface.
  • Small Implementation: If your idea involves something that isn’t physical, such as learning with social media or face-to-face instruction, build a very small, rough implementation of the approach for a small group of people. In the case of face-to-face instruction, create a short pilot program and test it on sample audience members.

TEST

Testing is all about seeing what works in the real world, getting feedback from learners and stakeholders, and refining (or ditching) prototypes. It’s important to test your innovative approach with the target population through all of its iterations. You can see how Design Thinking is an iterative process that involves lots of testing and adapting.

Some ways to test include:

  • Provide learners/users with a task and watch as they perform it
  • Ask users to think aloud as they work
  • Prepare a list of questions and discuss the person’s reactions to a program
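The prototype-test-refine cycle described above can be sketched as a simple loop. This is a minimal illustration under stated assumptions: the `build`, `get_feedback`, and `good_enough` callables are hypothetical placeholders for your own build and feedback steps, not part of any Design Thinking toolkit:

```python
def iterate_prototypes(ideas, build, get_feedback, good_enough, max_rounds=5):
    """Prototype -> test -> refine loop (a sketch; the callables are yours).

    build(spec)         -> a low-fidelity prototype from an idea or feedback
    get_feedback(proto) -> (score, feedback) from learners and stakeholders
    good_enough(score)  -> True when the prototype is ready to scale up
    """
    history = []
    for idea in ideas:
        prototype = build(idea)
        for _ in range(max_rounds):
            score, feedback = get_feedback(prototype)
            history.append((score, feedback))
            if good_enough(score):
                return prototype, history
            # fold the feedback into the next, refined version
            prototype = build(feedback)
    # no idea survived testing: go back to Define/Ideate
    return None, history
```

The point of the sketch is the shape of the process: early failures are cheap, every round records what was learned, and a dead end sends you back to an earlier phase rather than forward to full production.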

Closing Words

Design Thinking isn’t a silver bullet, but it’s one model for dealing with the “be creative on demand” requirements of our line of work. And it might provide valuable solutions for the learning problems of the 21st century. You can join me at a full-day Design Thinking pre-conference workshop at Learning Solutions on March 30th.

Would love to hear your thoughts. Comment below.


March 12, 2020 at 3:14 pm

This is the way I am working with courses because it really encompasses the entire learner experience.


March 1, 2020 at 9:13 pm

Sounds good! Would love to chat if I’m there. Connie


February 25, 2020 at 6:05 pm

Good work, Connie! Sue Czeropski and I have been working on a model that integrates similar processes into an adaptation of the HPT Model; we call it “Transformative HPT.” I’d love to chat with you about our model at DevLearn ’20!


October 11, 2018 at 11:29 pm

Hi Connie, so very well stated. Thank you.

March 7, 2018 at 8:52 pm

Hi Tania, That is an interesting dilemma, but I think you are on the right track. I gather you are helping managers to improve on the job, so the common denominator of pain points seems like a good place to start. Also, take advantage of the diversity of industries, which can bring different perspectives to a project. Enable collaboration through group problem-solving activities. Knowing how to work with a diversity of opinions is also part of design thinking, and it’s an important skill for the modern workplace as cross-disciplinary teams become more common.


March 7, 2018 at 11:34 am

Love the idea of looking through the lens of empathy. My challenge is how to design for the learner’s problem when you don’t know who your learners will be. I design open-enrollment programs that can attract individuals from different industries, roles, levels, etc. I therefore seem to rely on more generic learning objectives based on best guesses, i.e., common pain points managers face. Any suggestions?

February 19, 2018 at 11:14 am

Thanks for your comment, Slawomir. The Learning Battle Cards are pretty interesting. Glad they are working for you. Connie


February 19, 2018 at 6:44 am

For me, the main challenge is to make us (educators) understand that empathizing is not about assuming what people need, but about implementing activities to gather that knowledge from the field. Working with educators in workshops, we use the LBC Canvas; the first columns are Analysis and Awareness, and people have some problems with planning activities there: they would prefer to assume that we already know it. The Ideation phase is very important, because one needs to get out of one’s own routine to fully benefit from the potential of blended learning. We do this with Learning Battle Cards, trying to inspire people to look for non-obvious solutions. And it works.

February 14, 2018 at 7:11 am

Hi Claudia, thanks. You know, IDEO, the product design company that promotes design thinking, has a lot of info for educators. See this: https://designthinkingforeducators.com/ . Pass it on! Best, Connie


February 13, 2018 at 3:49 pm

Dear Connie, your article perfectly fits one of the principles I use in my course, Self-Access Learning Materials. Though we have different purposes and words, your post definitely helped me get my graduate students to consider what is needed to design a good learning object.

I think your post addresses a current topic that should be kept in mind by all teachers.

[…] in the chat: https://theelearningcoach.com/ https://theelearningcoach.com/elearning_design/design-thinking-for-instructional-design/ https://theelearningcoach.com/category/podcasts/ […]

[…] Is Design Thinking Missing From ADDIE?: This is a must-read post by Connie Malamed that brilliantly points out the missing practice of design thinking from our current models. Not just stopping with that, it elaborates on an approach to incorporate this aspect into Instructional Design, and suggests some valuable resources on the subject for further reading. […]

[…] Is Design Thinking Missing From ADDIE? […]

[…] Design thinking certainly isn’t taught in most instructional design programs, if any. And it’s the black box of the ADDIE model. […]


Transforming Teachers’ Instructional Design for Enhancing Critical Thinking in Ugandan Schools: Assessment through Rubric


The needs of our society are quickly evolving, and soft or transferable skills are key to lifelong learning and the creation of an adaptable and resilient workforce. There is an ever-growing demand for individuals who can process data, evaluate concepts, and develop arguments; the development of critical thinking skills is crucial. This study shows the effectiveness of a professional development model aimed at improving teachers’ instructional design skills for nurturing critical thinking in the classroom. The study adopted a quantitative research approach to identify and assess the transformation in teachers’ pedagogical practices as they developed lesson plans designed to elicit and nurture critical thinking among their learners. The study focused on a sample of 16 teachers at a secondary school in Central Uganda. The researcher purposefully selected the teachers, who specialised in three different subjects: English (5), mathematics (5), and history (6). The teachers who participated in the study were on average 32 years old and had 8 years of teaching experience. To evaluate the effectiveness of the lesson plans that the participants designed, the researcher developed a contextualised rubric, validated by experts, to assess the teachers’ improvements in designing lessons for critical thinking enhancement. The findings confirmed that after the training intervention, the teachers showed a greater ability to differentiate between cognitive processing and mere rote learning, helping them to elicit critical thinking in their students. At the end of the process, the lesson plans designed were clearer and more coherent, incorporating activities that could improve the learners’ critical thinking skills. This study provides an important contribution in terms of how to promote contextually appropriate and innovative pedagogical strategies.
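The expert validation of the rubric described above is commonly quantified with a content validity index (CVI): the proportion of experts who rate an item as relevant. A minimal sketch, assuming a 4-point relevance scale where ratings of 3 or 4 count as "relevant"; the five experts, four rubric items, and all ratings below are invented for illustration, not data from the study:

```python
# Hypothetical ratings: 5 experts x 4 rubric items, each rated 1-4 for relevance.
ratings = [
    [4, 3, 2, 4],
    [3, 4, 1, 4],
    [4, 4, 3, 3],
    [3, 2, 4, 4],
    [4, 3, 2, 4],
]

n_experts = len(ratings)
n_items = len(ratings[0])

# Item-level CVI: proportion of experts rating the item 3 or 4.
i_cvi = [sum(1 for row in ratings if row[j] >= 3) / n_experts for j in range(n_items)]

# Scale-level CVI (averaging method): mean of the item-level values.
s_cvi_ave = sum(i_cvi) / n_items

print("I-CVI per item:", [round(v, 2) for v in i_cvi])  # [1.0, 0.8, 0.4, 1.0]
print(f"S-CVI/Ave: {s_cvi_ave:.2f}")                    # 0.80
```

Under the usual convention, an item with I-CVI below about 0.78 (item 3 here) would be revised or dropped before the rubric is finalized.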


This work is licensed under a Creative Commons Attribution 4.0 International License.

Copyright (c) 2023 Mauro Giacomazzi, Edimond Serwanga, Gillian Atuheire



  • Open access
  • Published: 31 August 2024

Development and validation of a higher-order thinking skills (HOTS) scale for major students in the interior design discipline for blended learning

  • Dandan Li, Xiaolei Fan & Lingchao Meng

Scientific Reports, volume 14, Article number: 20287 (2024)

Subjects: Environmental social sciences

Assessing and cultivating students’ HOTS are crucial for interior design education in a blended learning environment. However, current research has focused primarily on the impact of blended learning instructional strategies, learning tasks, and activities on the development of HOTS, whereas few studies have specifically addressed the assessment of these skills through dedicated scales in the context of blended learning. This study aimed to develop a comprehensive scale for assessing HOTS in interior design major students within the context of blended learning. Employing a mixed methods design, the research involved in-depth interviews with 10 education stakeholders to gather qualitative data, which informed the development of a 66-item assessment scale. The scale was administered to a purposive sample of 359 undergraduate students enrolled in an interior design program at a university in China. Exploratory and confirmatory factor analyses were then conducted to evaluate the underlying factor structure of the scale. The findings revealed a robust four-factor model encompassing critical thinking skills, problem-solving skills, teamwork skills, and practical innovation skills. The scale demonstrated high internal consistency (Cronbach's alpha = 0.948–0.966) and satisfactory convergent and discriminant validity. The scale provides a valuable instrument for assessing and cultivating HOTS among interior design major students in blended learning environments. Future research can use the scale to examine the factors influencing the development of these skills and to inform instructional practices in the field.
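Internal-consistency figures like the Cronbach's alpha range reported above (0.948–0.966) come from a standard formula over raw item responses: alpha = k/(k-1) x (1 - sum of item variances / variance of total scores). A minimal sketch with invented Likert responses; the six students and four items below are illustrative assumptions, not the study's data:

```python
from statistics import variance

# Hypothetical responses: 6 students x 4 Likert items (scored 1-5).
data = [
    [4, 5, 4, 4],
    [3, 3, 2, 3],
    [5, 5, 5, 4],
    [2, 2, 3, 2],
    [4, 4, 4, 5],
    [3, 4, 3, 3],
]

k = len(data[0])                                   # number of items
item_vars = [variance(col) for col in zip(*data)]  # sample variance of each item
total_var = variance([sum(row) for row in data])   # variance of each student's total score
alpha = k / (k - 1) * (1 - sum(item_vars) / total_var)
print(f"Cronbach's alpha = {alpha:.3f}")           # Cronbach's alpha = 0.931
```

Values rise as items covary more strongly; on real scale data the per-factor alphas would be computed over the items loading on each of the four factors.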


Introduction

In the contemporary landscape of the twenty-first century, students face numerous challenges that necessitate the development of competitive skills, with particular emphasis on the cultivation of HOTS 1, 2, 3; this has become a crucial objective of educational reform. Notably, the National Education Association (NEA, 2012) has identified critical thinking and problem-solving, communication, collaboration, creativity, and innovation as key competencies that students must possess in the current era; these are considered important components of twenty-first-century skills 4, 5, 6, 7. As learners in the fields of creativity and design, students in the interior design profession also need HOTS to address complex design problems and the evolving demands of the industry 8, 9.

Currently, blended learning has become an important instructional model in interior design education 10, 11. It is a teaching approach that combines traditional face-to-face instruction with online learning, providing students with a more flexible and personalized learning experience 12, 13. Several scholars have recognized the benefits of blended learning in providing students with diverse learning resources, activities, and opportunities for interaction, thereby fostering HOTS 14, 15, 16, 17. For example, studies conducted by Anthony et al. 10 and Castro 11 have demonstrated its efficacy in enhancing students' HOTS: the integration of online resources, virtual practices, and online discussions fosters active student engagement and improves critical thinking, problem-solving, and creative thinking skills. Teachers therefore need to determine appropriate assessment methods and construct corresponding assessment tasks to assess students' expected learning outcomes. This requires teachers to have a clear understanding of students' learning progress and skill development, whereas students typically know only their scores and lack awareness of their individual skill development 18, 19.

Nevertheless, precisely assessing students' HOTS in the blended learning milieu poses a formidable challenge. The dearth of empirically validated assessment tools impedes researchers from effectively discerning students' levels of cognitive aptitude and developmental growth within the blended learning realm 20, 21, 22. In addition, current studies on blended learning focus mainly on the "concept, characteristics, mechanisms, models, and supporting technologies of blended learning" 23. Research on measuring students' HOTS in blended learning is relatively limited, with most of the focus on elementary, middle, and high school students 24, 25. Few studies have specifically examined HOTS measurement in the context of university students 26, 27, particularly in practical disciplines such as interior design. Bervell et al. 28, for example, suggested that the lack of high-quality assessment scales inevitably impacts the quality of research. Additionally, Schmitt 29 proposed the "Three Cs" principle for measurement: clarity, coherence, and consistency. He highlighted that high-quality assessment scales should possess clear and specific measurement objectives, logically coherent items, and consistent measurement results to ensure the reliability and validity of the data. This reflects the importance of aligning a scale's measurement goals with the research questions and the content of the discipline.

The development of an assessment scale within the blended learning environment is expected to address the existing gap in measuring and assessing HOTS scores in interior design education. This scale not only facilitates the assessment of students' HOTS but also serves as a guide for curriculum design, instructional interventions, and student support initiatives. Ultimately, the integration of this assessment scale within the blended learning environment has the potential to optimize the development of HOTS among interior design students, empowering them to become adept critical thinkers, creative problem solvers, and competent professionals in the field.

Therefore, this study follows a scientific scale development procedure to develop an assessment scale specifically designed to measure the HOTS of interior design students in blended learning environments. This endeavor aims to provide educators with a reliable instrument for assessing students' progress in cultivating and applying HOTS, thus enabling the implementation of more effective teaching strategies and enhancing the overall quality of interior design education. The research questions are as follows:

What key dimensions should be considered when developing a HOTS assessment scale to accurately capture students' HOTS in an interior design major blended learning environment?

How can a HOTS assessment scale for blended learning in interior design be developed?

How can the reliability and validity of the HOTS assessment scale be verified and ensured, and is it reliable and effective in the interior design of major blended learning environments?

Key dimensions of HOTS assessment scale in an interior design major blended learning environment

The research results indicate that, in the blended learning environment of interior design, 16 initial codes represent key dimensions for enhancing students' HOTS. These codes were further categorized into 8 main categories and 4 overarching themes: critical thinking, problem-solving, teamwork skills, and practical innovation skills. They provide valuable insights for data comprehension and analysis, serving as a comprehensive framework for the HOTS scale. Analyzing category frequency and assessing its significance and universality in a qualitative dataset holds significant analytical value 30, 31; high-frequency terms indicate the central position of specific categories in participants' narratives, texts, and other data forms 32. In the interviews with interior design experts and teachers, all core categories were mentioned more than 20 times, providing compelling evidence of their universality and importance within the HOTS dimensions of interior design, as shown in Table 1.
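The category-frequency step described above (counting how often each code appears across the coded interview data) is mechanically simple. A sketch in which the code labels mirror the themes in the text, but the segment list and counts are invented for illustration:

```python
from collections import Counter

# Hypothetical list of codes assigned to interview segments during qualitative coding.
coded_segments = [
    "logical reasoning and judgment", "skepticism and reflection",
    "logical reasoning and judgment", "identifying and defining problems",
    "teamwork", "logical reasoning and judgment",
    "skepticism and reflection", "identifying and defining problems",
]

# Tally each code and report them from most to least frequent.
freq = Counter(coded_segments)
for code, count in freq.most_common():
    print(f"{code}: {count}")
```

On the real dataset, codes whose counts clear a threshold (here, 20 mentions) would be retained as core categories for the scale.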

Theme 1: critical thinking skills

Critical thinking skills constitute a key core category in blended learning environments for interior design and are crucial for cultivating students' HOTS. This finding emphasizes the importance of critical thinking in interior design learning. The theme mainly comprises the categories of logical reasoning and judgment, and skepticism and reflection, each with a frequency above 8, underscoring the importance of critical thinking skills. A detailed discussion of each category is therefore warranted, as shown in Table 2.

Category 1: logical reasoning and judgment

The research results indicate that, in a blended learning environment for interior design, logical reasoning and judgment play a key role in cultivating critical thinking skills. Logical reasoning refers to inferring reasonable conclusions from information through analysis and evaluation 33; judgment is decision-making and evaluation based on logic and evidence. The importance of these concepts lies in their impact on the development and enhancement of students' HOTS. According to the research results, interior design experts and teachers unanimously believe that logical reasoning and judgment are very important. For example, as noted by Interviewee 1, “For students, logical reasoning skills are still very important. Especially in indoor space planning, students use logical reasoning to determine whether the layout of different functional areas is reasonable”. Similarly, Interviewee 2 stated that “logical reasoning can help students conduct rational analysis of various design element combinations during the conceptual design stage, such as color matching, material selection, and lighting application”.

As emphasized by interviewees 1 and 2, logical reasoning and judgment are among the core competencies of interior designers in practical applications. These abilities enable designers to analyze and evaluate design problems and derive reasonable solutions from them. In the interior design industry, being able to conduct accurate logical reasoning and judgment is one of the key factors for success. Therefore, through targeted training and practice, students can enhance their logical thinking and judgment, thereby better addressing design challenges and providing innovative solutions.

Category 2: skepticism and reflection

Skepticism and reflection play crucial roles in cultivating students' critical thinking skills in a blended learning environment for interior design. Skepticism prompts students to question and explore information and viewpoints, whereas reflection helps students think deeply about and evaluate their own thinking processes 34 . These abilities are crucial for cultivating students' higher-order thinking skills. According to the research findings, most interior design experts and teachers agree that skepticism and reflection are crucial. For example, as noted by Interviewee 3, “Sometimes, when facing learning tasks, students will think about how to better meet the needs of users”. Interviewee 4 agreed with this viewpoint. As Interviewees 3 and 4 emphasized, skepticism and reflection are among the core competencies of interior designers in practical applications. These abilities enable designers to question existing perspectives and practices and to propose innovative design solutions through in-depth thinking and evaluation. Therefore, in the interior design industry, designers who are able to question and reflect are better positioned to respond to complex design needs and provide clients with unique and valuable design solutions.

Theme 2: problem-solving skills

The research findings indicate that problem-solving skills constitute a key core category in blended learning environments for interior design and are crucial for cultivating students' HOTS. This finding emphasizes the importance of problem-solving skills in interior design learning. Specifically, the categories of identifying and defining problems and of developing and implementing plans were each mentioned more than 8 times, highlighting the importance of problem-solving skills. Each category is therefore discussed in detail below to better understand and cultivate students' problem-solving skills, as shown in Table 3 .

Category 1: identifying and defining problems

The research findings indicate that in a blended learning environment for interior design, identifying and defining problems play a crucial role in fostering students' problem-solving skills. Identifying and defining problems require students to possess the ability to analyze and evaluate problems, enabling them to accurately determine the essence of the problems and develop effective strategies and approaches to solve them 35 . Interior design experts and teachers widely recognize the importance of identifying and defining problems as core competencies in interior design practice. For example, Interviewee 5 emphasized the importance of identifying and defining problems, stating, "In interior design, identifying and defining problems is the first step in addressing design challenges. Students need to be able to clearly identify the scope, constraints, and objectives of the problems to engage in targeted thinking and decision-making in the subsequent design process." Interviewee 6 also supported this viewpoint. As stressed by Interviewees 5 and 6, identifying and defining problems not only require students to possess critical thinking abilities but also necessitate broad professional knowledge and understanding. Students need to comprehend principles of interior design, spatial planning, human behavior, and other relevant aspects to accurately identify and define problems associated with design tasks.

Category 2: developing and implementing a plan

The research results indicate that in a blended learning environment for interior design, developing and implementing plans plays a crucial role in cultivating students' problem-solving abilities. Developing and implementing a plan refers to students identifying and defining problems, devising specific solutions, and translating them into concrete implementation plans. Specifically, after determining the design strategy, students refine it into specific implementation steps and timelines, including drawing design drawings, organizing PPT reports, and presenting design proposals. For example, Interviewee 6 noted, “Students usually break down design strategies into specific tasks and steps by refining them.” The other interviewees unanimously supported this viewpoint. As Interviewee 6 emphasized, developing and implementing plans helps students remain organized, systematic, and goal-oriented when solving problems, thereby enhancing their problem-solving skills.

Theme 3: teamwork skills

The research results indicate that teamwork skills constitute a key core category in blended learning environments for interior design and are crucial for cultivating students' HOTS. This finding emphasizes the importance of teamwork skills in interior design learning. The category mainly includes communication and coordination, and division of labor and collaboration, both of which are mentioned frequently in the interview documents. Each category is therefore discussed in detail below to better understand and cultivate students' teamwork skills, as shown in Table 4 .

Category 1: communication and coordination

The research results indicate that communication and coordination play crucial roles in cultivating students' teamwork abilities in a blended learning environment for interior design. Communication and coordination refer to the ability of students to effectively share information, understand each other's perspectives, and work together to solve problems 36 . Specifically, team members need to understand each other's resource advantages and to integrate and share these resources to improve work efficiency and project quality. For example, Interviewee 7 noted, “In interior design, one member may be skilled in spatial planning, while another member may be skilled in color matching. Through communication and collaboration, team members can collectively utilize this expertise to improve work efficiency and project quality.” The other interviewees unanimously agreed that such communication and coordination promote students' teamwork skills and, in turn, the development of their HOTS. As these interviewees emphasized, communication and coordination enable team members to collectively solve problems and overcome challenges. Through effective communication, team members can exchange opinions and suggestions, propose different solutions, and make joint decisions, and their collaboration contributes to brainstorming and finding the best solution.

Category 2: division of labor and collaboration

The research results indicate that in the blended learning environment of interior design, division of labor and collaboration play crucial roles in cultivating students' teamwork ability. Division of labor and collaboration refer to the ability of team members to assign different tasks and roles in a project based on their respective expertise and responsibilities and to work together to complete the project 37 . For example, Interviewee 8 noted, “In an interior design project, some students are responsible for space planning, some for color matching, and some for rendering production.” The other interviewees also supported this viewpoint. As Interviewee 8 emphasized, division of labor and collaboration help team members fully utilize their respective expertise and abilities, promote resource integration and complementarity, and cultivate a spirit of teamwork in which members collaborate, support, and trust each other to achieve project goals together.

Theme 4: practical innovation skills

The research results indicate that practical innovation skills constitute a key core category in blended learning environments for interior design and are crucial for cultivating students' HOTS. This finding emphasizes the importance of practical innovation skills in interior design learning. The category mainly includes creative conception and design expression, as well as the innovative application of materials and technology, both of which are frequently mentioned in the interview documents. Each category is therefore discussed in detail below to better understand and cultivate students' practical innovation skills, as shown in Table 5 .

Category 1: creative conception and design expression

The research results indicate that in the blended learning environment of interior design, creative ideation and design expression play crucial roles in cultivating students' practical and innovative skills. Creative ideation and design expression refer to students' ability to break free from traditional thinking frameworks and try different design ideas and methods; such ideation helps stimulate their creativity and cultivates their ability to think independently and solve problems. For example, Interviewee 10 noted that "blended learning environments combine online and offline teaching modes, allowing students to acquire knowledge and skills more flexibly. Through learning and practice, students can master various expression tools and techniques, such as hand-drawn sketches, computer-aided design software, and model making, thereby more accurately conveying their design concepts." The other interviewees likewise stressed that the importance of creative ideation and design expression in blended learning environments cannot be ignored. As Interviewee 10 emphasized, creative ideation and design expression in the blended learning environment of interior design not only enhance students' creative thinking and problem-solving abilities but also strengthen their application skills in practical projects through diverse expression tools and techniques. The cultivation of these skills is crucial for students' success in their future careers.

Category 2: innovative application of materials and technology

Research findings indicate that the innovative application of materials and technology plays a crucial role in developing students' practical and creative skills within a blended learning environment for interior design. The innovative application of materials and technology refers to students' exploration and utilization of new materials and advanced technologies, enabling them to overcome the limitations of traditional design thinking and experiment with diverse design methods and approaches. This process not only stimulates their creativity but also significantly enhances their problem-solving skills. Specifically, it involves students gaining a deep understanding of the properties of new materials and their application methods in design, as well as becoming proficient in advanced technological tools and equipment, such as 3D printing, virtual reality (VR), and augmented reality (AR). These skills enable students to realize their design concepts more accurately and apply them effectively in real-world projects.

For example, Interviewee 1 stated, "The blended learning environment combines online and offline teaching modes, allowing students to flexibly acquire the latest knowledge on materials and technology and apply these innovations in real projects." The other interviewees also emphasized this view; the importance of the innovative application of materials and technology in a blended learning environment therefore cannot be underestimated. As Interviewee 1 emphasized, this process not only enables students to flexibly acquire the latest material and technical knowledge but also allows them to apply these innovations in practical projects, thereby improving their practical abilities and professional competence.

In summary, through the investigation of research question 1, the dimensions of the HOTS assessment scale for blended learning in interior design were found to include four main aspects: critical thinking skills, problem-solving skills, teamwork skills, and practical innovation skills. Building on assessment scales developed by previous scholars for these dimensions, the researcher developed a HOTS assessment scale suitable for blended learning environments in interior design and collected feedback from interior design experts through interviews.

Development of the HOTS assessment scale

The above research results indicate that the dimensions of the HOTS scale mainly include critical thinking skills, problem-solving skills, teamwork skills, and practical innovation skills. The dimensions of a scale represent the abstract characteristics and structure of the concept being measured. Since these dimensions are often abstract and difficult to measure directly, they need to be converted into several concrete indicators that can be directly observed or self-reported 38 . These concrete indicators, known as dimension items, operationalize the abstract dimensions, allowing the various aspects of the concept to be measured and evaluated. The following content draws on the results of research question 1 to develop the HOTS assessment scale for blended learning in interior design.

Dimension 1: critical thinking skills

The research results indicate that critical thinking skills constitute a key core category in blended learning environments for interior design and are crucial for cultivating students' HOTS. Critical thinking skills refer to the ability to analyze information objectively and make reasoned judgments 39 . Scholars tend to characterize this concept in terms of general skepticism, rational thinking, and self-reflection 7 , 40 . For example, Goodsett 26 suggested that critical thinking should be based on rational skepticism and careful thought about external matters as well as open self-reflection about internal thoughts and actions. Moreover, the California Critical Thinking Disposition Inventory (CCTDI) is widely used to measure critical thinking dispositions, including dimensions such as truth-seeking, self-confidence, inquisitiveness, open-mindedness, analyticity, systematicity, and maturity 41 . Maturity here refers to the continuous adjustment and improvement of a person's cognitive system and learning activities through ongoing awareness, reflection, and self-awareness 42 . Moreover, Nguyen 43 confirmed that critical thinking and cognitive maturity can be achieved through these activities, emphasizing that critical thinking includes cognitive skills such as analysis, synthesis, and evaluation, as well as affective dispositions such as curiosity and openness.

In addition, in a blended learning environment for interior design, critical thinking skills help students better understand, evaluate, and apply design knowledge and skills, cultivating independent thinking and innovation abilities 44 . Students who lack these skills may accept superficial information and solutions without sufficient thought and evaluation, overlooking important details or selecting inappropriate solutions in the design process. Therefore, the measurement of critical thinking skills should focus on cognitive skills such as analysis, synthesis, and evaluation, as well as curiosity and open-mindedness. The specific items for critical thinking skills are shown in Table 6 .

Dimension 2: problem-solving skills

Problem-solving skills constitute a key core category in blended learning environments for interior design and are crucial for cultivating students' HOTS. Problem-solving skills involve the ability to analyze and solve problems by understanding them, identifying their root causes, and developing appropriate solutions 45 . According to the 5E-based STEM education approach, problem-solving skills encompass the following abilities: problem identification and definition, formulation of problem-solving strategies, problem representation, resource allocation, and monitoring and evaluation of solution effectiveness 7 , 46 . Moreover, D'Zurilla and Nezu 47 and Tan 48 indicated that problem-solving ability is demonstrated through attitudes, beliefs, and knowledge and skills during problem solving, as well as through the quality of proposed solutions and observable outcomes. In addition, D'Zurilla and Nezu devised the Social Problem-Solving Inventory (SPSI), which comprises seven subscales: cognitive response, emotional response, behavioral response, problem identification, generation of alternative solutions, decision-making, and solution implementation. Based on these research results, the problem-solving skills dimension items designed in this study are shown in Table 7 .

Dimension 3: teamwork skills

The research results indicate that teamwork skills constitute a key core category in blended learning environments for interior design and are crucial for cultivating students' HOTS. Teamwork skills refer to the ability to effectively collaborate, coordinate, and communicate with others in a team environment 49 . For example, the Teamwork Skills Assessment Tool (TWKSAT) developed by Stevens and Campion 50 identifies five core dimensions of teamwork: conflict management, collaborative problem-solving, communication, goal setting and performance management, and planning and task coordination. The design of this tool highlights the essential skills in teamwork and provides a structured approach for evaluating them. In addition, they indicated that successful teams need a range of problem-solving skills, including situational control, conflict management, decision-making and coordination, monitoring and feedback, and an open mindset. These skills help team members effectively address complex challenges and demonstrate the team's collaboration and flexibility. Therefore, the assessment of learners' teamwork skills needs to cover the above aspects, as shown in Table 8 .

Dimension 4: practical innovation skills

The research results indicate that practical innovation skills constitute a key core category in blended learning environments for interior design and are crucial for cultivating students' HOTS. Practical innovation skills encompass the utilization of creative cognitive processes and problem-solving strategies to facilitate the generation of original ideas, solutions, and approaches 51 . These skills place significant emphasis on two critical aspects: creative conception and design expression, and the innovative application of materials and technology. Tang et al. 52 indicated that creative conception and design expression involve the generation and articulation of imaginative and inventive ideas within a given context. With the introduction of concepts such as 21st-century learning skills, the "5C" competency framework, and core student competencies, blended learning has emerged as a goal and direction of educational reform. It aims to promote the development of students' HOTS, equipping them with the essential qualities and key abilities needed for lifelong development and societal advancement. Blended learning not only emphasizes the mastery of core learning content but also requires students to develop critical thinking, complex problem-solving, creative thinking, and practical innovation skills. To adapt to the changes and developments in the blended learning environment, this study designed 13 preliminary test items based on 21st-century learning skills, the "5C" competency framework, core student competencies, and the TTCT assessment scale developed by Torrance 53 . These items assess students' practical innovation skills within a blended learning environment, as shown in Table 9 .

The results indicate a consensus among the interviewed experts that the structural integrity of the scale is satisfactory and does not require modification. However, certain measurement items were identified as problematic and require revision. The primary recommendations are as follows. Within the domain of problem-solving skills, the item "I usually conduct classroom and online learning with questions and clear goals" was deemed biased because of its emphasis on the "online" environment. Consequently, the evaluation panel advised splitting this item into two separate components: (1) "I am adept at frequently adjusting and reversing a negative team atmosphere" and (2) "I consistently engage in praising and encouraging others, fostering harmonious relationships." After these revisions and adjustments to specific items, a pilot test scale of 66 observable items was formed from the original 65 items. In addition, there were other suggestions regarding linguistic formulation and phraseology, which are not expounded upon here.

Verify the effectiveness of the HOTS assessment scale

The research results indicate significant differences in the average scores across the four dimensions of HOTS: critical thinking skills (items A1–A24), problem-solving skills (items B1–B13), teamwork skills (items C1–C16), and practical innovation skills (items D1–D13). This also suggests that each item has discriminative power, which is explained through the following aspects.

Project analysis based on the CR value

The critical ratio (CR) method, which uses the CR (decision) value to remove measurement items with poor discrimination, is the most commonly used method in item analysis. First, the modified pilot test scale data were aggregated and sorted. Respondents in the top and bottom 27% of the score distribution were then selected, yielding 66 respondents in each group: the high-score group comprised individuals with a total score of 127 or above (including 127), whereas the low-score group comprised individuals with a total score of 99 or below (including 99). Finally, an independent-samples t test was conducted to determine whether the mean score of each item differed significantly between the high-score and low-score groups. The statistical results are presented in Table 10 .

The above table shows that independent-samples t tests were conducted for all items; the t values were greater than 3 and the p values less than 0.001, indicating that the difference between the top and bottom 27% of the sample was significant and that each item has discriminative power.
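The CR screening described above can be sketched in a few lines of Python. This is an illustrative sketch, not the study's analysis: the pooled-variance t statistic is one standard formulation of the independent-samples t test, and the item scores below are hypothetical Likert responses for the two 27% groups.

```python
import math

def critical_ratio(high, low):
    """Pooled-variance independent-samples t statistic comparing the
    high-scoring and low-scoring groups on a single item."""
    n1, n2 = len(high), len(low)
    m1, m2 = sum(high) / n1, sum(low) / n2
    v1 = sum((x - m1) ** 2 for x in high) / (n1 - 1)
    v2 = sum((x - m2) ** 2 for x in low) / (n2 - 1)
    pooled = ((n1 - 1) * v1 + (n2 - 1) * v2) / (n1 + n2 - 2)
    return (m1 - m2) / math.sqrt(pooled * (1 / n1 + 1 / n2))

# Hypothetical 1-5 Likert scores on one item for the two 27% groups
high_group = [5, 5, 4, 5, 4, 5, 4, 4, 5, 5]
low_group = [2, 3, 2, 1, 3, 2, 2, 3, 2, 2]
cr = critical_ratio(high_group, low_group)
# An item is retained when its CR (t) exceeds 3 with p < 0.001
```

In practice this computation is run once per item, and items whose CR falls below the cut-off are dropped before factor analysis.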

In summary, based on previous research and relevant theories, the HOTS scale for interior design was revised. This process involved interviews with interior design experts, teachers, and students, followed by item examination and homogeneity testing via the critical ratio (CR) method. The results revealed significant correlations ( p  < 0.01) between all the items and the total score, with correlation coefficients (r) above 0.4. The scale therefore exhibits good accuracy and internal consistency in measuring HOTS. These findings provide a reliable foundation for further research and practical applications.

Pilot study exploratory factor analysis

This study used SPSS (version 28) to conduct the KMO and Bartlett tests on the scale. KMO values and Bartlett's tests of sphericity were first calculated for the total HOTS test scale as well as for the four subscales to ensure that the sample data were suitable for factor analysis 7 . The overall KMO value is 0.946, indicating that the data are highly suitable for factor analysis, and Bartlett's test of sphericity was significant ( p  < 0.05), further supporting the appropriateness of factor analysis. The KMO values of all four subscales are above 0.7, indicating that their data are also suitable for factor analysis. According to Javadi et al. 54 , these results suggest the presence of shared factors among the items within the subscales, as shown in Table 11 .
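Bartlett's test of sphericity checks whether the correlation matrix differs from an identity matrix. A minimal pure-Python sketch follows; the three-item correlation matrix and the sample size of 232 are hypothetical, and the chi-square formula is the standard one, −(n − 1 − (2p + 5)/6)·ln|R| with p(p − 1)/2 degrees of freedom.

```python
import math

def det(m):
    """Determinant by Laplace expansion (adequate for small matrices)."""
    if len(m) == 1:
        return m[0][0]
    return sum((-1) ** j * m[0][j] * det([r[:j] + r[j + 1:] for r in m[1:]])
               for j in range(len(m)))

def bartlett_sphericity(R, n):
    """Bartlett's sphericity statistic for a p x p correlation matrix R
    estimated from n observations; returns (chi-square, degrees of freedom)."""
    p = len(R)
    chi2 = -(n - 1 - (2 * p + 5) / 6) * math.log(det(R))
    return chi2, p * (p - 1) // 2

# Hypothetical correlations among three items, n = 232 respondents
R = [[1.00, 0.60, 0.50],
     [0.60, 1.00, 0.55],
     [0.50, 0.55, 1.00]]
chi2, df = bartlett_sphericity(R, 232)
# A chi-square far above df rejects the identity-matrix hypothesis,
# supporting the use of factor analysis
```

SPSS reports exactly this statistic alongside the KMO value in its "KMO and Bartlett's Test" table.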

For each subscale, exploratory factor analysis was conducted to extract factors with eigenvalues greater than 1 while eliminating items with communalities less than 0.30, loadings less than 0.50, or loadings on more than one common factor 55 , 56 . Additionally, items inconsistent with the assumed structure of the measure were identified and eliminated to ensure the best structural validity. These principles were applied to the factor analysis of each subscale, ensuring that the extracted factor structure and observed items are consistent with the hypothesized measurement structure and analysis results 55 , 58 . In the exploratory factor analysis (EFA), the latent variables were clearly interpretable, with the cumulative explained variance of the common factors exceeding 60%. This finding confirms the alignment between the scale structure, comprising the remaining items, and the initial theoretical framework proposed in this study. Additionally, the items were systematically reorganized to construct the final questionnaire: items A1 to A24 were associated with the critical thinking skills dimension, items B25 to B37 with problem-solving skills, items C38 to C53 with teamwork skills, and items D54 to D66 with practical innovation skills, as shown in Table 12 .

In addition, the criterion for extracting principal components in factor analysis is typically based on eigenvalues, with values greater than 1 indicating greater explanatory power than individual variables. The variance contribution ratio reflects the proportion of variance explained by each principal component relative to the total variance and signifies the ability of the principal component to capture comprehensive information. The cumulative variance contribution ratio measures the accumulated proportion of variance explained by the selected principal components, aiding in determining the optimal number of components to retain while minimizing information loss. The above table shows that four principal components can be extracted from the data, and their cumulative variance contribution rate reaches 59.748%.
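The eigenvalue-greater-than-1 (Kaiser) criterion and the cumulative variance contribution described above amount to a short calculation. The eigenvalues below are illustrative placeholders chosen to mimic the reported 59.748% cumulative contribution, not the study's actual extraction results.

```python
# Illustrative eigenvalues from a principal-component extraction over
# 66 standardized items (not the study's actual values)
eigenvalues = [18.2, 9.4, 6.8, 5.0, 0.9, 0.8, 0.7]
total_variance = 66  # each standardized item contributes variance 1

retained = [ev for ev in eigenvalues if ev > 1]          # Kaiser criterion
contributions = [ev / total_variance for ev in retained]  # per-component ratio
cumulative = sum(contributions)                           # cumulative contribution
```

With these placeholder values, four components are retained and their cumulative contribution is roughly 0.597, matching the order of magnitude reported in the text.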

However, the scree plot (as shown in Fig.  1 ) shows that the slope flattens from the fifth factor onward, indicating that no distinct factors can be extracted beyond that point; retaining four factors is therefore appropriate. The factor loading matrix is the core of factor analysis: the values in the matrix represent the loading of each item on the common factors, with larger absolute values indicating a stronger correlation between the item and the factor. For ease of interpretation, this study used the maximum variance (varimax) method to rotate the initial factor loading matrix, redistributing the variance among the factors so that each item loads strongly on as few factors as possible. Factor loadings with absolute values less than 0.4 were filtered out. According to the analysis results, the items of the HOTS assessment scale can be divided into four dimensions, which is consistent with theoretical expectations.

figure 1

Scree plot of factors.

Through the pretest of the scale and selection of measurement items, 66 measurement items were ultimately determined. On this basis, a formal scale for assessing HOTS in a blended learning environment was developed, and the reliability and validity of the scale were tested to ultimately confirm its usability.

Confirmatory factor analysis of final testing

For the final test, AMOS (version 26.0) was employed to conduct a confirmatory factor analysis (CFA) on the retested sample data to validate the stability of the HOTS structural model obtained through exploratory factor analysis. This analysis assessed the fit between the measurement model and the actual data, confirming the robustness of the derived HOTS structure and its alignment with the empirical data. The model was constructed based on the factor structure and observed variables of each component obtained through EFA, and the model fit indices are presented in Fig.  2 (where A represents critical thinking skills, B problem-solving skills, C teamwork skills, and D practical innovation skills). The model strongly supports the four-dimensional structure of HOTS, which comprises four first-order factors: critical thinking skills, problem-solving skills, teamwork skills, and practical innovation skills. Critical thinking skills play a pivotal role in the blended learning environment of interior design, connecting problem-solving skills, teamwork skills, and innovative practice. These four dimensions form the assessment structure of HOTS, with critical thinking skills serving as the core element that prompts individuals to assess problems and propose innovative solutions. By providing appropriate learning resources, diverse learning activities, and learning tasks, together with carefully designed assessment items, it is possible to measure and develop HOTS in the field of interior design and to guide educational and organizational practice. This comprehensive approach to learning and assessment helps cultivate students' HOTS and lays a solid foundation for their comprehensive abilities in interior design.
Thus, the CFA structural model provides strong support for the HOTS assessment structure hypothesized in this study, as shown in Fig.  2 .

figure 2

Confirmatory factor analysis based on 4 dimensions. *A represents the dimension of critical thinking. B represents the dimension of problem-solving skills. C represents the dimension of teamwork skills. D represents the dimension of practical innovation skills.

Additionally, the fit values of χ2/df, RMSEA, and SRMR are all below their thresholds, whereas the fit values of the other indicators are all above their thresholds, indicating that the model fits well, as shown in Table 13 .
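The threshold logic behind this fit judgment can be made explicit in a small check. The numbers below are illustrative, not the study's reported values, and the "other indicators" are assumed here to be CFI and TLI (the paper does not name them); the cut-offs (χ2/df < 3, RMSEA < 0.08, SRMR < 0.08, CFI/TLI > 0.90) are conventional rules of thumb.

```python
# Illustrative fit values checked against commonly used cut-offs;
# CFI/TLI are assumed index names, not taken from the paper
fit = {"chi2/df": 1.8, "RMSEA": 0.045, "SRMR": 0.038, "CFI": 0.95, "TLI": 0.94}

acceptable = (fit["chi2/df"] < 3 and
              fit["RMSEA"] < 0.08 and
              fit["SRMR"] < 0.08 and
              fit["CFI"] > 0.90 and
              fit["TLI"] > 0.90)
```

A model is typically reported as fitting well only when every index clears its respective threshold, as in this sketch.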

Reliability and validity analysis

The reliability and validity of the scale were assessed after the model fit had been established through confirmatory factor analysis 57 . Following Marsh et al. 57 , the following conclusions can be drawn. The standardized factor loadings of the items range from 0.700 to 0.802, all greater than or equal to 0.7, indicating a strong correspondence between the observed items and each latent variable. Furthermore, Cronbach's α coefficients, which assess the internal consistency of the scale, range from 0.948 to 0.966 across the dimensions, indicating a high level of reliability. The composite reliabilities range from 0.948 to 0.967, exceeding the 0.6 threshold and demonstrating a substantial level of consistency (as shown in Table 14 ).
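Cronbach's α, used above as the internal-consistency measure, can be computed directly from raw item scores. This is a generic sketch of the standard formula, α = k/(k − 1) · (1 − Σ item variances / total-score variance), with hypothetical data rather than the study's responses.

```python
def cronbach_alpha(items):
    """items: one list of scores per item, all over the same respondents
    in the same order."""
    k, n = len(items), len(items[0])

    def var(xs):
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

    # Total score per respondent across all items
    totals = [sum(item[i] for item in items) for i in range(n)]
    return k / (k - 1) * (1 - sum(var(it) for it in items) / var(totals))

# Hypothetical scores for three items answered by five respondents
alpha = cronbach_alpha([[1, 2, 3, 4, 5],
                        [1, 2, 3, 4, 5],
                        [2, 2, 3, 4, 4]])
```

Highly consistent items, as in this example, push α toward 1; values above roughly 0.7 are conventionally taken as acceptable reliability.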

Additionally, the diagonal bold font represents the square root of the AVE for each dimension. All the dimensions have average variance extracted (AVE) values ranging from 0.551 to 0.589, all of which are greater than 0.5, indicating that the latent variables have strong explanatory power for their corresponding items. These results suggest that the scale structure constructed in this study is reliable and effective. Furthermore, according to the results presented in Table 15 , the square roots of the AVE values for each dimension are greater than the absolute values of the correlations with other dimensions, indicating discriminant validity of the data. Therefore, these four subscales demonstrate good convergent and discriminant validity, indicating that they are both interrelated and independent. This implies that they can effectively capture the content required to complete the HOTS test scale.
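The AVE and composite reliability figures discussed above follow directly from the standardized loadings. A sketch of the standard formulas is shown below; the loadings are hypothetical values chosen within the 0.700–0.802 range reported for this scale.

```python
import math

def ave(loadings):
    """Average variance extracted: mean of squared standardized loadings."""
    return sum(l ** 2 for l in loadings) / len(loadings)

def composite_reliability(loadings):
    """CR = (sum of loadings)^2 / ((sum of loadings)^2 + sum of error variances),
    where each item's error variance is 1 - loading^2."""
    s = sum(loadings)
    errors = sum(1 - l ** 2 for l in loadings)
    return s ** 2 / (s ** 2 + errors)

# Hypothetical standardized loadings within the reported range (0.700-0.802)
loadings = [0.70, 0.72, 0.75, 0.78, 0.80]
ave_value = ave(loadings)                   # convergent validity if > 0.5
cr_value = composite_reliability(loadings)  # acceptable if > 0.6
sqrt_ave = math.sqrt(ave_value)             # compared against inter-dimension
                                            # correlations for discriminant validity
```

The Fornell-Larcker check in the text corresponds to comparing `sqrt_ave` for each dimension against that dimension's correlations with the others.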

Discussion and conclusion

The assessment scale for HOTS in interior design blended learning encompasses four dimensions: critical thinking skills, problem-solving skills, teamwork skills, and practical innovation skills. These dimensions were selected on the basis of the characteristics and requirements of the interior design discipline, with the aim of comprehensively evaluating the HOTS students demonstrate in blended learning environments and thereby better preparing them to address complex design projects in practice. Notably, multiple studies have shown that HOTS include critical thinking, problem-solving, creative thinking, and decision-making skills, which are considered crucial in fields such as education, business, and engineering 20 , 59 , 60 , 61 . These dimensions largely mirror previous research outcomes, with notable distinctions in the emphasis on teamwork skills and practical innovation skills 62 , 63 . Teamwork skills underscore the critical importance of collaboration in contemporary design work, particularly in interior design 64 : effective communication and coordination among team members are imperative for achieving collective design objectives.

Moreover, practical innovation skills capture students' capacity to apply theoretical knowledge creatively in practical design settings. Innovation is a key driver of advancement in interior design, and industry success requires students to possess innovative acumen and adaptability to evolving design trends. Evaluating practical innovation skills therefore aims to motivate students toward innovative thinking, the exploration of novel concepts, and the development of unique design solutions, consistent with the dynamic and evolving nature of the sector. Prior research suggests a close interplay among critical thinking, problem-solving abilities, teamwork competencies, and creative thinking, with teamwork skills acting as a regulatory factor for critical and creative thought processes 7 , 65 . This interconnected nature of HOTS provides theoretical support for constructing and validating a holistic assessment framework.

After review by the interior design expert panel, one item needed to be split into two. For the critical ratio (CR) analysis of the scale items, independent-samples t tests were conducted on all items between the top and bottom 27% of the sample. All t values exceeded 3, with p values below 0.001, indicating significant differences between the two groups and demonstrating the discriminative power of each item. These results confirm that the items effectively distinguish between sample groups with respect to the target characteristics, provide a sound basis for further refinement and optimization of the scale, and underscore the importance of careful scale design for data interpretation and analysis.
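The critical-ratio procedure described here (splitting the sample at the top and bottom 27% of total scores and t testing each item between the groups) can be sketched as follows, with simulated Likert data standing in for the study's responses:

```python
import math
import random

def welch_t(a, b):
    """Welch's t statistic for two independent samples."""
    ma, mb = sum(a) / len(a), sum(b) / len(b)
    va = sum((x - ma) ** 2 for x in a) / (len(a) - 1)
    vb = sum((x - mb) ** 2 for x in b) / (len(b) - 1)
    return (ma - mb) / math.sqrt(va / len(a) + vb / len(b))

def item_critical_ratio(item, total, frac=0.27):
    """CR of one item: t test between the top and bottom 27% on total score."""
    order = sorted(range(len(total)), key=lambda i: total[i])
    k = round(frac * len(total))
    low = [item[i] for i in order[:k]]
    high = [item[i] for i in order[-k:]]
    return welch_t(high, low)

# simulated 5-point Likert responses: an item that tracks the total score
random.seed(1)
total = [random.randint(20, 100) for _ in range(200)]
item = [min(5, max(1, round(t / 20) + random.choice([-1, 0, 0, 1])))
        for t in total]
cr = item_critical_ratio(item, total)  # a well-discriminating item gives t >> 3
```

An item whose scores barely differ between high- and low-total respondents yields a small CR and is a candidate for removal; the t > 3 rule applied in the study is a common rule of thumb for retaining items.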

The measurement scale was then evaluated through EFA: the explained variance of each subscale reached 59.748%, and the CR, AVE, Cronbach's α, and Pearson correlation values of the total scale and subscales were satisfactory, strongly supporting the structural, discriminant, and convergent validity of the scale 57 .
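To illustrate where an explained-variance figure of this kind comes from, the sketch below simulates item responses driven by two correlated factors and reads off the proportion of variance captured by the first two eigenvalues of the item correlation matrix (a principal-component approximation; all data and loadings are invented, not the study's):

```python
import numpy as np

rng = np.random.default_rng(0)
# simulate 300 respondents x 8 items loading on two correlated factors
f = rng.normal(size=(300, 2)) @ np.array([[1.0, 0.5], [0.5, 1.0]])
loadings = np.array([[0.8, 0.0]] * 4 + [[0.0, 0.8]] * 4)
x = f @ loadings.T + rng.normal(scale=0.6, size=(300, 8))

# eigenvalues of the item correlation matrix, largest first
eig = np.linalg.eigvalsh(np.corrcoef(x, rowvar=False))[::-1]
explained = eig[:2].sum() / eig.sum() * 100  # % variance, first two factors
```

In a full EFA the factors would also be extracted and rotated, but the eigenvalue ratio already conveys how much of the item variance the retained factors account for.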

The scale structure and items of this study are reliable and effective, meaning that students in the field of interior design can use the scale to test their HOTS level and assess their qualities and abilities. In addition, scholars can use it to explore the relationships between students' HOTS and external factors, personal characteristics, and so on, and thereby identify methods and strategies for developing and improving HOTS.

Limitations and future research

The blended learning HOTS assessment scale for interior design developed here also has limitations that future research should address. First, the practical innovation skills dimension requires students to have some practical experience and innovative ability. First-year students usually have not yet had sufficient learning and practice opportunities, so their abilities on this dimension may not be assessed effectively. When the scale is used, students' year of study and learning experience must therefore be considered to ensure the applicability and accuracy of the assessment. For first-year students, other assessment tools suited to their developmental stage and learning experience may be needed to evaluate other aspects of their HOTS 7 . Future research should expand the scope of this dimension to ensure broader applicability.

Second, the sample comes from an ordinary private undergraduate university in central China rather than from national public or key universities, so the data may carry regional characteristics. The improved model should therefore be validated with samples covering a wider range of regions, a more comprehensive hierarchy of institutions, and a larger size. Third, the findings are derived from self-reported survey data; the literature cautions against relying heavily on such data, as perception does not always equate to action 66 . In addition, future research can draw on this scale to evaluate the HOTS of interior design students, explore the factors that affect their development, identify training and improvement paths, and cultivate skilled talent for the twenty-first century.

This study adopts a mixed methods research approach, combining qualitative and quantitative methods to achieve a comprehensive understanding of the phenomenon 67 . By integrating the two, mixed methods research provides a comprehensive and detailed exploration of research questions, using multiple data sources and analytical methods to obtain accurate and meaningful answers 68 . To increase the quality of the research, the entire study followed the scale development procedure outlined by Professor Li after the data were obtained, as shown in Fig. 3.

Figure 3

Scale development procedure.

Theoretical basis

This study is guided by educational objectives such as twenty-first-century learning skills, the "5C" competency framework, and students' core abilities 4 . The construction of the scale rests on theoretical foundations including Bloom's taxonomy. Drawing on existing instruments such as the CCTDI 41 , SPSI 69 , and TWKSAT scales, the dimensions and preliminary items of the scale were developed; the preliminary items were primarily adapted or directly referenced from existing research findings. Additionally, to enhance the validity and reliability of the scale, dimensions related to HOTS in interior design were obtained through semi-structured interviews. On this basis, the study takes "critical thinking skills, problem-solving skills, teamwork skills, and practical innovation skills" as the four basic dimensions of the scale.

Participants and procedures

This study builds on previous research to develop a HOTS assessment scale that measures the thinking levels of interior design students in blended learning. By investigating the challenges and opportunities students encounter in blended learning environments and exploring the complexity and diversity of their HOTS, it aims to obtain comprehensive insights. For research question 1, purposive sampling was used to select 10 senior experts and teachers in the field of interior design, all holding the rank of associate professor or above (5 males and 5 females), who were interviewed using a semi-structured interview method to investigate the dimensions and evaluation indicators of HOTS in blended learning for interior design (see Table 16).

For research questions 2 and 3, the research was conducted at an undergraduate university in China, in the field of interior design and within a blended learning environment. All experimental plans were approved by the authorized committee of Zhengzhou College of Finance and Economics; the methods used were in accordance with relevant guidelines and regulations, and informed consent was obtained from all participants. The Interior Design Blended Learning HOTS assessment scale was developed based on sample data from 350 students who completed a pretest and a retest. The participants were second-, third-, and fourth-year students who had taken at least one blended learning course, with sample sizes of 115, 118, and 117 for the respective years, totaling 350 individuals. Among them were 218 male and 132 female students, all aged 19–22 years. Through purposive sampling, this study ensured the involvement of relevant participants and focused on a specific university environment with diverse demographic characteristics and rich educational resources.

This approach enhances the reliability and generalizability of the research and contributes to a deeper understanding of the research question (as shown in Table 17 ).

Instruments

The tools used in this study include the semi-structured interview guidelines and the HOTS assessment scale developed by the researchers. For research question 1, the interview guidelines were reviewed by interior design experts to ensure the accuracy and appropriateness of their content and questions. For research questions 2 and 3, the items of the HOTS assessment scale were checked via the critical ratio (CR) method to assess their consistency and reliability and to validate their effectiveness.

Data analysis

For research question 1, the researcher used NVivo (version 14) to conduct thematic analysis on the data obtained through semi-structured interviews. Thematic analysis is a commonly used qualitative method that aims to identify and categorize the themes, concepts, and perspectives that emerge within a dataset 70 . NVivo allows researchers to organize and manage large amounts of textual data efficiently and to extract themes and patterns from them.

For research question 2, the critical ratio (CR) method was employed to conduct item analysis and homogeneity testing on the items of the pilot test questionnaire. The CR method allows for the assessment of each item's contribution to the total score and the evaluation of the interrelationships among the items within the questionnaire. These analytical techniques served to facilitate the evaluation and validation of the scale's reliability and validity.

For research question 3, this study used SPSS (version 26) to conduct confirmatory factor analysis (CFA) on the confirmatory sample data via maximum likelihood estimation. The purpose of this analysis was to verify whether the hypothesized factor structure of the questionnaire aligned with the actual survey data. Finally, several indices, including composite reliability (CR), average variance extracted (AVE), Cronbach's α, and the Pearson correlation coefficient, were computed to assess the reliability and validity of the developed scale.

In addition, exploratory factor analysis (EFA) and confirmatory factor analysis (CFA) are commonly used techniques in questionnaire development and adaptation research 31 , 70 . The statistical packages SPSS and AMOS are frequently employed to implement them 71 , 72 , 73 . EFA is a data-driven approach to factor generation that does not require a predetermined number of factors or prespecified relationships with the observed variables; its focus lies in the numerical characteristics of the data. Therefore, prior to conducting CFA, questionnaires are typically explored through EFA to reveal the underlying structure and the relationships between the observed variables and the latent structure.

In contrast, CFA tests a hypothesized model structure under specific theoretical assumptions or structural hypotheses, including the interrelationships among factors and a known number of factors; its purpose is to validate that hypothesized structure. Thus, the initial validity of the questionnaire structure established through EFA requires further confirmation through CFA 57 , 70 . A sample size of at least 200 is recommended for confirmatory factor analysis; in this study, CFA was performed on a sample of 317.

Data availability

All data generated or analyzed during this study are included in this published article. All the experimental protocols were approved by the Zhengzhou College of Finance and Economics licensing committee.

Hariadi, B. et al. Higher order thinking skills based learning outcomes improvement with blended web mobile learning Model. Int. J. Instr. 15 (2), 565–578 (2022).


Sagala, P. N. & Andriani, A. Development of higher-order thinking skills (HOTS) questions of probability theory subject based on bloom’s taxonomy. J. Phys. Conf. Ser. https://doi.org/10.1088/1742-6596/1188/1/012025 (2019).


Yudha, R. P. Higher order thinking skills (HOTS) test instrument: Validity and reliability analysis with the rasch model. Eduma Math. Educ. Learn. Teach. https://doi.org/10.24235/eduma.v12i1.9468 (2023).

Leach, S. M., Immekus, J. C., French, B. F. & Hand, B. The factorial validity of the Cornell critical thinking tests: A multi-analytic approach. Think. Skills Creat. https://doi.org/10.1016/j.tsc.2020.100676 (2020).

Noroozi, O., Dehghanzadeh, H. & Talaee, E. A systematic review on the impacts of game-based learning on argumentation skills. Entertain. Comput. https://doi.org/10.1016/j.entcom.2020.100369 (2020).

Supena, I., Darmuki, A. & Hariyadi, A. The influence of 4C (constructive, critical, creativity, collaborative) learning model on students’ learning outcomes. Int. J. Instr. 14 (3), 873–892. https://doi.org/10.29333/iji.2021.14351a (2021).

Zhou, Y., Gan, L., Chen, J., Wijaya, T. T. & Li, Y. Development and validation of a higher-order thinking skills assessment scale for pre-service teachers. Think. Skills Creat. https://doi.org/10.1016/j.tsc.2023.101272 (2023).

Musfy, K., Sosa, M. & Ahmad, L. Interior design teaching methodology during the global COVID-19 pandemic. Interiority 3 (2), 163–184. https://doi.org/10.7454/in.v3i2.100 (2020).

Yong, S. D., Kusumarini, Y. & Tedjokoesoemo, P. E. D. Interior design students’ perception for AutoCAD SketchUp and Rhinoceros software usability. IOP Conf. Ser. Earth Environ. Sci. https://doi.org/10.1088/1755-1315/490/1/012015 (2020).

Anthony, B. et al. Blended learning adoption and implementation in higher education: A theoretical and systematic review. Technol. Knowl. Learn. 27 (2), 531–578. https://doi.org/10.1007/s10758-020-09477-z (2020).

Castro, R. Blended learning in higher education: Trends and capabilities. Edu. Inf. Technol. 24 (4), 2523–2546. https://doi.org/10.1007/s10639-019-09886-3 (2019).

Alismaiel, O. Develop a new model to measure the blended learning environments through students’ cognitive presence and critical thinking skills. Int. J. Emerg. Technol. Learn. 17 (12), 150–169. https://doi.org/10.3991/ijet.v17i12.30141 (2022).

Gao, Y. Blended teaching strategies for art design major courses in colleges. Int. J. Emerg. Technol. Learn. https://doi.org/10.3991/ijet.v15i24.19033 (2020).

Banihashem, S. K., Kerman, N. T., Noroozi, O., Moon, J. & Drachsler, H. Feedback sources in essay writing: peer-generated or AI-generated feedback?. Int. J. Edu. Technol. Higher Edu. 21 (1), 23 (2024).

Ji, J. A Design on Blended Learning to Improve College English Students’ Higher-Order Thinking Skills. https://doi.org/10.18282/l-e.v10i4.2553 (2021).

Noroozi, O. The role of students’ epistemic beliefs for their argumentation performance in higher education. Innov. Edu. Teach. Int. 60 (4), 501–512 (2023).

Valero Haro, A., Noroozi, O., Biemans, H. & Mulder, M. First- and second-order scaffolding of argumentation competence and domain-specific knowledge acquisition: A systematic review. Technol. Pedag. Edu. 28 (3), 329–345. https://doi.org/10.1080/1475939x.2019.1612772 (2019).

Narasuman, S. & Wilson, D. M. Investigating teachers’ implementation and strategies on higher order thinking skills in school based assessment instruments. Asian J. Univ. Edu. https://doi.org/10.24191/ajue.v16i1.8991 (2020).

Valero Haro, A., Noroozi, O., Biemans, H. & Mulder, M. Argumentation competence: Students’ argumentation knowledge, behavior and attitude and their relationships with domain-specific knowledge acquisition. J. Constr. Psychol. 35 (1), 123–145 (2022).

Johansson, E. The Assessment of Higher-order Thinking Skills in Online EFL Courses: A Quantitative Content Analysis (2020).

Noroozi, O., Kirschner, P. A., Biemans, H. J. A. & Mulder, M. Promoting argumentation competence: Extending from first- to second-order scaffolding through adaptive fading. Educ. Psychol. Rev. 30 (1), 153–176. https://doi.org/10.1007/s10648-017-9400-z (2017).

Noroozi, O., Weinberger, A., Biemans, H. J. A., Mulder, M. & Chizari, M. Facilitating argumentative knowledge construction through a transactive discussion script in CSCL. Comput. Educ. 61 , 59–76. https://doi.org/10.1016/j.compedu.2012.08.013 (2013).

Noroozi, O., Weinberger, A., Biemans, H. J. A., Mulder, M. & Chizari, M. Argumentation-based computer supported collaborative learning (ABCSCL): A synthesis of 15 years of research. Educ. Res. Rev. 7 (2), 79–106. https://doi.org/10.1016/j.edurev.2011.11.006 (2012).

Setiawan, Baiq Niswatul Khair, Ratnadi Ratnadi, Mansur Hakim, & Istiningsih, S. Developing HOTS-Based Assessment Instrument for Primary Schools (2019).

Suparman, S., Juandi, D., & Tamur, M. Does Problem-Based Learning Enhance Students’ Higher Order Thinking Skills in Mathematics Learning? A Systematic Review and Meta-Analysis 2021 4th International Conference on Big Data and Education (2021).

Goodsett, M. Best practices for teaching and assessing critical thinking in information literacy online learning objects. J. Acad. Lib. https://doi.org/10.1016/j.acalib.2020.102163 (2020).

Putra, I. N. A. J., Budiarta, L. G. R., & Adnyayanti, N. L. P. E. Developing Authentic Assessment Rubric Based on HOTS Learning Activities for EFL Teachers. In Proceedings of the 2nd International Conference on Languages and Arts across Cultures (ICLAAC 2022) (pp. 155–164). https://doi.org/10.2991/978-2-494069-29-9_17 .

Bervell, B., Umar, I. N., Kumar, J. A., Asante Somuah, B. & Arkorful, V. Blended learning acceptance scale (BLAS) in distance higher education: Toward an initial development and validation. SAGE Open https://doi.org/10.1177/21582440211040073 (2021).

Byrne, D. A worked example of Braun and Clarke’s approach to reflexive thematic analysis. Qual. Quant. 56 (3), 1391–1412 (2022).

Xu, W. & Zammit, K. Applying thematic analysis to education: A hybrid approach to interpreting data in practitioner research. Int. J. Qual. Methods 19 , 1609406920918810 (2020).

Braun, V. & Clarke, V. Conceptual and design thinking for thematic analysis. Qual. Psychol. 9 (1), 3 (2022).

Creswell, A., Shanahan, M., & Higgins, I. Selection-inference: Exploiting large language models for interpretable logical reasoning. arXiv:2205.09712 (2022).

Baron, J. Thinking and Deciding 155–156 (Cambridge University Press, 2023).


Silver, N., Kaplan, M., LaVaque-Manty, D. & Meizlish, D. Using Reflection and Metacognition to Improve Student Learning: Across the Disciplines, Across the Academy (Taylor & Francis, 2023).

Oksuz, K., Cam, B. C., Kalkan, S. & Akbas, E. Imbalance problems in object detection: A review. IEEE Trans. Pattern Anal. Mach. Intell. 43 (10), 3388–3415 (2020).

Saputra, M. D., Joyoatmojo, S., Wardani, D. K. & Sangka, K. B. Developing critical-thinking skills through the collaboration of jigsaw model with problem-based learning model. Int. J. Instr. 12 (1), 1077–1094 (2019).

Imam, H. & Zaheer, M. K. Shared leadership and project success: The roles of knowledge sharing, cohesion and trust in the team. Int. J. Project Manag. 39 (5), 463–473 (2021).

DeCastellarnau, A. A classification of response scale characteristics that affect data quality: A literature review. Qual. Quant. 52 (4), 1523–1559 (2018).


Haber, J. Critical Thinking 145–146 (MIT Press, 2020).

Hanscomb, S. Critical Thinking: The Basics 180–181 (Routledge, 2023).

Sulaiman, W. S. W., Rahman, W. R. A. & Dzulkifli, M. A. Examining the construct validity of the adapted California critical thinking dispositions (CCTDI) among university students in Malaysia. Proc. Social Behav. Sci. 7 , 282–288 (2010).

Jaakkola, N. et al. Becoming self-aware—How do self-awareness and transformative learning fit in the sustainability competency discourse?. Front. Educ. https://doi.org/10.3389/feduc.2022.855583 (2022).

Nguyen, T. T. B. Critical thinking: What it means in a Vietnamese tertiary EFL context. English For. Language Int. J. 2 (3), 4–23 (2022).

Henriksen, D., Gretter, S. & Richardson, C. Design thinking and the practicing teacher: Addressing problems of practice in teacher education. Teach. Educ. 31 (2), 209–229 (2020).

Okes, D. Root cause analysis: The core of problem solving and corrective action 179–180 (Quality Press, 2019).

Eroğlu, S. & Bektaş, O. The effect of 5E-based STEM education on academic achievement, scientific creativity, and views on the nature of science. Learn. Individual Differ. 98 , 102181 (2022).

Dzurilla, T. J. & Nezu, A. M. Development and preliminary evaluation of the social problem-solving inventory. Psychol. Assess. J. Consult. Clin. Psychol. 2 (2), 156 (1990).

Tan, O.-S. Problem-based learning innovation: Using problems to power learning in the 21st century. Gale Cengage Learning (2021).

Driskell, J. E., Salas, E. & Driskell, T. Foundations of teamwork and collaboration. Am. Psychol. 73 (4), 334 (2018).

Lower, L. M., Newman, T. J. & Anderson-Butcher, D. Validity and reliability of the teamwork scale for youth. Res. Social Work Pract. 27 (6), 716–725 (2017).

Landa, R. Advertising by design: generating and designing creative ideas across media (Wiley, 2021).

Tang, T., Vezzani, V. & Eriksson, V. Developing critical thinking, collective creativity skills and problem solving through playful design jams. Think. Skills Creat. 37 , 100696 (2020).

Torrance, E. P. Torrance tests of creative thinking. Educational and psychological measurement (1966).

Javadi, M. H., Khoshnami, M. S., Noruzi, S. & Rahmani, R. Health anxiety and social health among health care workers and health volunteers exposed to coronavirus disease in Iran: A structural equation modeling. J. Affect. Disord. Rep. https://doi.org/10.1016/j.jadr.2022.100321 (2022).


Hu, L. & Bentler, P. M. Cutoff criteria for fit indexes in covariance structure analysis: Conventional criteria versus new alternatives. Struct. Equ. Model. Multidiscip. J. 6 (1), 1–55. https://doi.org/10.1080/10705519909540118 (1999).

Matsunaga, M. Item parceling in structural equation modeling: A primer. Commun. Methods Measures 2 (4), 260–293. https://doi.org/10.1080/19312450802458935 (2008).

Marsh, H. W., Morin, A. J., Parker, P. D. & Kaur, G. Exploratory structural equation modeling: An integration of the best features of exploratory and confirmatory factor analysis. Ann. Rev. Clin. Psychol. 10 (1), 85–110 (2014).

Song, Y., Lee, Y. & Lee, J. Mediating effects of self-directed learning on the relationship between critical thinking and problem-solving in student nurses attending online classes: A cross-sectional descriptive study. Nurse Educ. Today https://doi.org/10.1016/j.nedt.2021.105227 (2022).

Chu, S. K. W., Reynolds, R. B., Tavares, N. J., Notari, M., & Lee, C. W. Y. 21st century skills development through inquiry-based learning from theory to practice . Springer (2021).

Eliyasni, R., Kenedi, A. K. & Sayer, I. M. Blended learning and project based learning: the method to improve students’ higher order thinking skill (HOTS). Jurnal Iqra’: Kajian Ilmu Pendidikan 4 (2), 231–248 (2019).

Yusuf, P. & Istiyono. Blended learning: Its effect towards higher order thinking skills (HOTS). J. Phys. Conf. Ser. https://doi.org/10.1088/1742-6596/1832/1/012039 (2021).

Byron, K., Keem, S., Darden, T., Shalley, C. E. & Zhou, J. Building blocks of idea generation and implementation in teams: A meta-analysis of team design and team creativity and innovation. Personn. Psychol. 76 (1), 249–278 (2023).

Walid, A., Sajidan, S., Ramli, M. & Kusumah, R. G. T. Construction of the assessment concept to measure students’ high order thinking skills. J. Edu. Gift. Young Sci. 7 (2), 237–251 (2019).

Alawad, A. Evaluating online learning practice in the interior design studio. Int. J. Art Des. Edu. 40 (3), 526–542. https://doi.org/10.1111/jade.12365 (2021).

Awuor, N. O., Weng, C. & Militar, R. Teamwork competency and satisfaction in online group project-based engineering course: The cross-level moderating effect of collective efficacy and flipped instruction. Comput. Educ. 176 , 104357 (2022).

Noroozi, O., Alqassab, M., Taghizadeh Kerman, N., Banihashem, S. K. & Panadero, E. Does perception mean learning? Insights from an online peer feedback setting. Assess. Eval. Higher Edu. https://doi.org/10.1080/02602938.2024.2345669 (2024).

Creswell, J. W. A Concise Introduction to Mixed Methods Research 124–125 (SAGE Publications, 2021).

Tashakkori, A., Johnson, R. B. & Teddlie, C. Foundations of Mixed Methods Research: Integrating Quantitative and Qualitative Approaches in the Social and Behavioral Sciences 180–181 (Sage Publications, 2020).

Jiang, X., Lyons, M. D. & Huebner, E. S. An examination of the reciprocal relations between life satisfaction and social problem solving in early adolescents. J. Adolescence 53 (1), 141–151. https://doi.org/10.1016/j.adolescence.2016.09.004 (2016).

Orcan, F. Exploratory and confirmatory factor analysis: Which one to use first. Egitimde ve Psikolojide Olçme ve Degerlendirme Dergisi https://doi.org/10.21031/epod.394323 (2018).

Asparouhov, T. & Muthén, B. Exploratory structural equation modeling. Struct. Eq. Model. Multidiscip. J. 16 (3), 397–438 (2009).


Finch, H., French, B. F., & Immekus, J. C. Applied psychometrics using spss and amos. IAP (2016).

Marsh, H. W., Guo, J., Dicke, T., Parker, P. D. & Craven, R. G. Confirmatory factor analysis (CFA), exploratory structural equation modeling (ESEM), and Set-ESEM: Optimal balance between goodness of fit and parsimony. Multivar. Behav. Res. 55 (1), 102–119. https://doi.org/10.1080/00273171.2019.1602503 (2020).


Acknowledgements

Thanks to the editorial team and reviewers of Scientific Reports for their valuable comments.

Author information

Authors and affiliations.

Faculty of Education, SEGI University, 47810 Petaling Jaya, Selangor, Malaysia

Department of Art and Design, Zhengzhou College of Finance and Economics, Zhengzhou, 450000, Henan, China

Xiaolei Fan

Faculty of Humanities and Arts, Macau University of Science and Technology, Avenida Wai Long, 999078, Taipa, Macao, Special Administrative Region of China

Lingchao Meng


Contributions

D.L. conceptualized the experiment and wrote the main manuscript text. D.L. and X.F. conducted the experiments; D.L., X.F. and L.M. analyzed the results. L.M. contributed to the conceptualization, methodology, and editing and critically reviewed the manuscript. All authors have reviewed the manuscript.

Corresponding author

Correspondence to Lingchao Meng .

Ethics declarations

Competing interests.

The authors declare no competing interests.

Additional information

Publisher's note.

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Rights and permissions

Open Access This article is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License, which permits any non-commercial use, sharing, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if you modified the licensed material. You do not have permission under this licence to share adapted material derived from this article or parts of it. The images or other third party material in this article are included in the article’s Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article’s Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by-nc-nd/4.0/ .


About this article

Cite this article.

Li, D., Fan, X. & Meng, L. Development and validation of a higher-order thinking skills (HOTS) scale for major students in the interior design discipline for blended learning. Sci Rep 14 , 20287 (2024). https://doi.org/10.1038/s41598-024-70908-3


Received : 28 February 2024

Accepted : 22 August 2024

Published : 31 August 2024

DOI : https://doi.org/10.1038/s41598-024-70908-3


  • Assessment scale
  • Higher-order thinking skills
  • Interior design
  • Blended learning



RELATED RESOURCES

  1. PDF An Overview of How to Design Instruction Using Critical Thinking Concepts

    In sum, instructional design involves a teacher thinking about instruction in both structural and tactical ways. Overall structural thinking (for example, about the concept for the course) can help free a teacher from the Didactic Model into which we have been conditioned and the ineffective teaching that invariably accompanies it. ...

  2. Eight Instructional Strategies for Promoting Critical Thinking

    Learn how to promote critical thinking in your classroom with eight instructional strategies from experts and teachers. Explore how to use current events, data analysis, and questioning techniques ...

  3. The Instructional Designer's Guide to Critical Thinking

    As instructional designers, we should think about instruction in both structural and strategic ways. This will enable us to move away from the didactic method and the ineffective teaching that invariably accompanies it toward active learning through critical thinking. However, our learning solutions will not be transformed simply because we ...

  4. The Link Between Critical Thinking and Effective Design

    Brigg Patten, Oct. 18, 2016. Curriculum design is a hot topic these days, especially with the debates over the Common Core teaching strategies in the United States. Often, many critics find that programs like the Common Core aren't helping students learn ...

  5. 13 Towards a Critical Instructional Design Framework

    A practitioner-researcher explores how to critically analyze an existing course design based on power, process, and positionality. The chapter provides a framework for design justice in instructional design contexts and reflects on the role and challenges of instructional designers.

  6. PDF Instructional Design and Facilitation Approaches that Promote Critical Thinking in Asynchronous Online Discussions: A Review of the Literature

    Instructional Design and Facilitation Approaches that Promote Critical Thinking in Asynchronous Online Discussions: A Review of the Literature ...

  7. PDF Designing Learning Environments for Critical Thinking: Examining

    Ennis (1989) refers to such instructional strategies as an Immersion approach. Advocates of this approach (e.g., McPeck, 1990a, 1990b) assume that well-designed subject-matter instruction is sufficient to promote the development of CT skills and equip students to competently perform CT tasks across domains.

  8. PDF Designing Instruction for Critical Thinking: A Case of a Graduate ...

    The emphasis on cultivating critical thinking (CT) skills in students across all ages has been growing in the past decade. Educational standards for K-12 education emphasize improved CT as an outcome (e.g., Common Core Standards and 21st Century Skills), and it is also relevant during and after postsecondary education.

  9. Tools That Teach: Lessons for Critical Instructional Design

    How can educators use skeuomorphism, the design of tools that mimic physical objects, to teach online learning? This article explores the benefits and drawbacks of skeuomorphism and how it affects rituals and processes in digital environments.

  10. 9 Instructional design principles and how to use them

    Learn from Robert Gagne's nine-step process for creating effective learning experiences, whatever your learning format. Find practical tips and examples for applying these instructional strategies in your eLearning courses, training sessions or blended environments.

  11. Instructional Design as Critical and Creative Thinking: A Journey

    The role of critical and creative thinking has been debated within the field of instructional design. Through an instructional design and development project we have identified how critical and creative thinking are essential to the instructional design process. This paper highlights a recent project focused on a virtual Native American village and the development of supporting instructional ...

  12. Conceptualizations and instructional strategies on critical thinking in higher education

    Citation: Andreucci-Annunziata P, Riedemann A, Cortés S, Mellado A, del Río MT and Vega-Muñoz A (2023) Conceptualizations and instructional strategies on critical thinking in higher education: A systematic review of systematic reviews. Front. Educ. 8:1141686. doi: 10.3389/feduc.2023.1141686

  13. Instructional Design and Facilitation Approaches that Promote Critical

    Laura A. Schindler and Gary J. Burkholder, Higher Learning Research Communications, December 2014, Vol. 4, No. 4. A review of the literature on instructional design and facilitation approaches that promote critical thinking in asynchronous online discussions.

  15. Designing Learning Environments for Critical Thinking: Examining

    Fostering the development of students' critical thinking (CT) is regarded as an essential outcome of higher education. However, despite the large body of research on this topic, there has been little consensus on how educators best support the development of CT. In view of some of the controversies surrounding the teaching of CT skills in higher education, this study examined the effects of ...

  16. Exploring instructional design in K-12 STEM education: a systematic

    This study aimed to analyze articles published in the Web of Science database from 2012 to 2021 to examine the educational goals and instructional designs for STEM education. We selected articles based on the following criteria: (a) empirical research; (b) incorporating instructional design and strategies into STEM teaching; (c) including intervention; (d) focusing on K-12 education and on ...

  17. Making the Link Between Design Thinking and Instructional Design

    In the book, Boller and Fletcher explain that design thinking features five core steps: empathize with users (for instance, those affected by a situation or in need); define the problem to be solved; ideate with target users to come up with possible "solves"; craft and test quick-and-dirty prototypes of potential solutions. ...

  18. Design Thinking For Instructional Design

    Learn how to use design thinking techniques to create learning products that meet the needs of the learners and differentiate from artificial intelligence. This article is the first part of a four-part series on design thinking for instructional design.

  19. Developing disposition to critical thinking and problem-solving

    This study investigated the development of perceptions of critical thinking and problem-solving skills among a group of students taking part in instructional design projects to produce digital materials using different instructional design models. The study participants were students from a computer science teaching department who were enrolled in an instructional design course. Participants ...

  20. How to Use Design Thinking in Instructional Design

    The practice of Design Thinking seems to be sorely missing from instructional design university programs, professional training, and workplace practices. ... The practice of conceiving ideas, or ideation, is a critical step of Design Thinking. This is where you and, ideally, a cross-disciplinary team generate potential solutions to the ...

  21. Transforming Teachers' Instructional Design for Enhancing Critical

    The needs of our society are quickly evolving and soft or transferable skills are key to lifelong learning and the creation of an adaptable and resilient workforce. There is an ever-growing demand for individuals who can process data, evaluate concepts, and develop arguments; the development of critical thinking skills is crucial. This study shows the effectiveness of a professional ...

  22. Development and validation of a higher-order thinking skills (HOTS) scale for major students in the interior design discipline for blended learning

    Theme 1: critical thinking skills. Critical thinking skills constitute a core category in blended learning environments for interior design and are crucial for cultivating students' HOTS.

  23. Enhancing EFL students' critical thinking and writing ...

    This study introduced an instructional pattern that integrated the framework of the International Critical Thinking Reading and Writing Test (ICTRWT), designed by Paul and Elder, into a tertiary ...