Metacognition

Metacognitive thinking skills are important for instructors and students alike. This resource provides instructors with an overview of the what and why of metacognition and general “getting started” strategies for teaching for and with metacognition.

On this page:

What is metacognition?
Why use metacognition?
Getting started: How to teach both for and with metacognition
Metacognition at Columbia


The Center for Teaching and Learning encourages instructors to teach metacognitively, that is, to teach “with and for metacognition.” To teach with metacognition involves instructors “thinking about their own thinking regarding their teaching” (Hartman, 2001: 149). To teach for metacognition involves instructors thinking about how their instruction helps to elucidate learning and problem-solving strategies for their students (Hartman, 2001).

Learners with metacognitive skills are:

Instructors who teach metacognitively / think about their teaching are:

Teaching for metacognition — Metacognitive strategies that serve students and their learning:

Design homework assignments that ask students to focus on their learning process. This includes having students monitor progress, identify and correct mistakes, and plan next steps.

Provide structures to guide students in creating implementable action plans for improvement.

Show students how to move stepwise from reflection to action.

Use appropriate technology to support student self-regulation. Many platforms, such as CourseWorks, provide tools that students can use to keep up with their coursework and monitor their progress.

Teaching with metacognition — Metacognitive strategies that serve the course and the instructor’s teaching practice:

Create a plan to periodically evaluate one’s teaching and course design, setup, and content.

Structure the course to provide time for students to give feedback on the course and teaching.

Evaluate course progress and successes of teaching: use course and instructional objectives to measure progress.

Schedule mid-course feedback surveys with students.

Request a mid-course review (offered as a service for graduate students).

Review end-of-course evaluations and reflect on the changes that will be made to maximize student learning.

Build in time for metacognitive work: set aside time before, during, and after a course to reflect on one’s teaching practice, relationship with students, course climate and dynamics, as well as assumptions about the course material and its accessibility to students.

Metacognition and Memory Lab  |  Dr. Janet Metcalfe (Professor of Psychology and of Neurobiology and Behavior) runs a lab that focuses on how people use their metacognition to improve self-awareness and to guide their own learning and behavior. Dr. Metcalfe is the co-author, with John Dunlosky, of Metacognition: A Textbook for Cognitive, Educational, Life Span & Applied Psychology (2009).

In Fall 2018, the CTL and the Science of Learning Research (SOLER) initiative co-organized the inaugural Science of Learning Symposium, “Metacognition: From Research to Classroom,” which brought together Columbia faculty, staff, graduate students, and experts in the science of learning to share research on metacognition in learning and to translate it into strategies that maximize student learning. View the video recording of the event here.

Ambrose, S. A., Lovett, M., Bridges, M. W., DiPietro, M., & Norman, M. K. (2010). How Learning Works: Seven Research-Based Principles for Smart Teaching. San Francisco: John Wiley & Sons.

Dunlosky, J. and Metcalfe, J. (2009). Metacognition. Thousand Oaks, CA: Sage.

Flavell, J.H. (1976). Metacognitive Aspects of Problem Solving. In L.B. Resnick (Ed.), The Nature of Intelligence (pp. 231-236). Hillsdale, NJ: Erlbaum.

Hacker, D.J. (1998). Definitions and empirical foundations. In D.J. Hacker, J. Dunlosky, & A.C. Graesser (Eds.), Metacognition in Educational Theory and Practice. Mahwah, NJ: Routledge.

Hartman, H.J. (2001). Teaching metacognitively. In H.J. Hartman (Ed.), Metacognition in Learning and Instruction (pp. 149-172). Kluwer Academic Publishers.

Lai, E.R. (2011). Metacognition: A Literature Review. Pearson’s Research Reports. Retrieved from https://images.pearsonassessments.com/images/tmrs/Metacognition_Literature_Review_Final.pdf

McGuire, S.Y. (2015). Teach Students How to Learn: Strategies You Can Incorporate Into Any Course to Improve Student Metacognition, Study Skills, and Motivation. Sterling, VA: Stylus.

National Research Council (2000). How People Learn: Brain, Mind, Experience, and School. Expanded Edition. Washington, DC: The National Academies Press. https://doi.org/10.17226/9853

Nilson, L. (2013). Creating Self-Regulated Learners: Strategies to Strengthen Students’ Self-Awareness and Learning Skills. Sterling, VA: Stylus.

Schraw, G. and Dennison, R.S. (1994). Assessing metacognitive awareness. Contemporary Educational Psychology, 19(4), 460-475.


Center for Teaching

Metacognition

Thinking about One’s Thinking | Putting Metacognition into Practice

Thinking about One’s Thinking


Metacognition was initially studied for its development in young children (Baker & Brown, 1984; Flavell, 1985); researchers soon began to look at how experts display metacognitive thinking and how these thought processes can be taught to novices to improve their learning (Hatano & Inagaki, 1986). In How People Learn, the National Academy of Sciences’ synthesis of decades of research on the science of learning, one of the three key findings is the effectiveness of a “‘metacognitive’ approach to instruction” (Bransford, Brown, & Cocking, 2000, p. 18).

Metacognitive practices increase students’ abilities to transfer or adapt their learning to new contexts and tasks (Bransford, Brown, & Cocking, p. 12; Palincsar & Brown, 1984; Scardamalia et al., 1984; Schoenfeld, 1983, 1985, 1991). They do this by gaining a level of awareness above the subject matter: they also think about the tasks and contexts of different learning situations and about themselves as learners in these different contexts. When Pintrich (2002) asserts that “Students who know about the different kinds of strategies for learning, thinking, and problem solving will be more likely to use them” (p. 222), notice that the students must “know about” these strategies, not just practice them. As Zohar and David (2009) explain, there must be a “conscious meta-strategic level of H[igher] O[rder] T[hinking]” (p. 179).

Metacognitive practices help students become aware of their strengths and weaknesses as learners, writers, readers, test-takers, group members, etc.  A key element is recognizing the limit of one’s knowledge or ability and then figuring out how to expand that knowledge or extend the ability. Those who know their strengths and weaknesses in these areas will be more likely to “actively monitor their learning strategies and resources and assess their readiness for particular tasks and performances” (Bransford, Brown, & Cocking, p. 67).

The absence of metacognition connects to the research by Dunning, Johnson, Ehrlinger, and Kruger on “Why People Fail to Recognize Their Own Incompetence” (2003). They found that “people tend to be blissfully unaware of their incompetence,” lacking “insight about deficiencies in their intellectual and social skills.” They identified this pattern across domains: from test-taking, writing grammatically, and thinking logically, to recognizing humor, to hunters’ knowledge about firearms and medical lab technicians’ knowledge of medical terminology and problem-solving skills (pp. 83-84). In short, “if people lack the skills to produce correct answers, they are also cursed with an inability to know when their answers, or anyone else’s, are right or wrong” (p. 85). This research suggests that increased metacognitive abilities—learning specific (and correct) skills, how to recognize them, and how to practice them—are needed in many contexts.

Putting Metacognition into Practice

In “Promoting Student Metacognition,” Tanner (2012) offers a handful of specific activities for biology classes, but they can be adapted to any discipline. She first describes four assignments for explicit instruction (p. 116):


Next are recommendations for developing a “classroom culture grounded in metacognition” (pp. 116-118):

To facilitate these activities, she also offers three useful tables:

Weimer’s “Deep Learning vs. Surface Learning: Getting Students to Understand the Difference” (2012) offers additional recommendations for developing students’ metacognitive awareness and improvement of their study skills:

“[I]t is terribly important that in explicit and concerted ways we make students aware of themselves as learners. We must regularly ask, not only ‘What are you learning?’ but ‘How are you learning?’ We must confront them with the effectiveness (more often ineffectiveness) of their approaches. We must offer alternatives and then challenge students to test the efficacy of those approaches.” (emphasis added)

She points to a tool developed by Stanger-Hall (2012, p. 297) for her students to identify their study strategies, which she divided into “cognitively passive” (“I previewed the reading before class,” “I came to class,” “I read the assigned text,” “I highlighted the text,” et al.) and “cognitively active study behaviors” (“I asked myself: ‘How does it work?’ and ‘Why does it work this way?’” “I wrote my own study questions,” “I fit all the facts into a bigger picture,” “I closed my notes and tested how much I remembered,” et al.). The specific focus of Stanger-Hall’s study is tangential to this discussion,1 but imagine giving students lists like hers adapted to your course and then, after a major assignment, having students discuss which ones worked and which types of behaviors led to higher grades. Even further, follow Lovett’s advice (2013) by assigning “exam wrappers,” which include students reflecting on their previous exam-preparation strategies, assessing those strategies, and then looking ahead to the next exam and writing an action plan for a revised approach to studying. A common assignment in English composition courses is the self-assessment essay, in which students apply course criteria to articulate their strengths and weaknesses within single papers or over the course of the semester. These activities can be adapted to assignments other than exams or essays, such as projects, speeches, discussions, and the like.

As these examples illustrate, for students to become more metacognitive, they must be taught the concept and its language explicitly (Pintrich, 2002; Tanner, 2012), though not in a content-delivery model (simply a reading or a lecture) and not in one lesson. Instead, the explicit instruction should be “designed according to a knowledge construction approach,” that is, students need to recognize, assess, and connect new skills to old ones, “and it needs to take place over an extended period of time” (Zohar & David, p. 187). This kind of explicit instruction will help students expand or replace existing learning strategies with new and more effective ones, give students a way to talk about learning and thinking, compare strategies with their classmates’ and make more informed choices, and render learning “less opaque to students, rather than being something that happens mysteriously or that some students ‘get’ and learn and others struggle and don’t learn” (Pintrich, 2002, p. 223).


What would such a handout look like for your discipline?

Students can even be metacognitively prepared (and then prepare themselves) for the overarching learning experiences expected in specific contexts. Salvatori and Donahue’s The Elements (and Pleasures) of Difficulty (2004) encourages students to embrace difficult texts (and tasks) as part of deep learning, rather than as an obstacle. Their “difficulty paper” assignment helps students reflect on and articulate the nature of the difficulty and work through their responses to it (p. 9). Similarly, in courses with sensitive subject matter, a different kind of learning occurs, one that involves complex emotional responses. In “Learning from Their Own Learning: How Metacognitive and Meta-affective Reflections Enhance Learning in Race-Related Courses” (Chick, Karis, & Kernahan, 2009), students were informed about the common reactions to learning about racial inequality (Helms, 1995; Adams, Bell, & Griffin, 1997; see student handout, Chick, Karis, & Kernahan, pp. 23-24) and then regularly wrote about their cognitive and affective responses to specific racialized situations. The students with the most developed metacognitive and meta-affective practices at the end of the semester were able to “clear the obstacles and move away from” oversimplified thinking about race and racism “to places of greater questioning, acknowledging the complexities of identity, and redefining the world in racial terms” (p. 14).

Ultimately, metacognition requires students to “externalize mental events” (Bransford, Brown, & Cocking, p. 67), such as articulating what it means to learn, becoming aware of one’s strengths and weaknesses with specific skills or in a given learning context, planning what is required to accomplish a specific learning goal or activity, identifying and correcting errors, and preparing ahead for learning processes.

————————

1 Students who were tested with short answer in addition to multiple-choice questions on their exams reported more cognitively active behaviors than those tested with just multiple-choice questions, and these active behaviors led to improved performance on the final exam.


Guiding Students to Ask Questions and Define Problems in Science

Teachers can use these strategies to help students in grades 6 to 12 develop skills that are crucial to scientific study and exploration.


Without clear questions and well-defined problems, scientific investigations lack direction and focus, leading to inconclusive or irrelevant results. That’s why the first science and engineering practice is asking questions and defining problems. This is the foundation upon which scientific inquiry and problem-solving are built.

More important, asking questions and defining problems are essential skills in life that enable individuals to think critically, solve problems, and make informed decisions.

As STEAM professional development specialists, we’re fortunate to work with hundreds of teachers and students across the STEAM subjects. We’ve identified several reasons why students have a hard time asking questions and defining problems.

We’ve witnessed a lack of confidence, limited prior exposure, fixed mindsets, overemphasis on answers, fear of judgment, and students not seeing the relevance to their lives. 

Supporting Students in Developing These Skills

Teachers of grades six through 12 play a crucial role in helping students develop their ability to ask questions and define problems in science. To begin, we employ the 5E Model of Instruction (engage, explore, explain, elaborate, evaluate) with our teachers.

In this student-centered model, instruction begins with students asking questions and exploring phenomena rather than teachers delivering content or information. Even if teachers don’t employ the 5E Model, we work with them on practical applications to have students ask and explore before the teachers explain.

We’ve identified four overarching ideas that teachers can start working on tomorrow with their students.

In addition to the four overarching ideas, we’ve put together four practical techniques and routines to support our colleagues and students. The links below can show how to specifically employ the techniques and routines in the classroom. Although our perspective is STEAM, the routines and techniques can be used in a variety of settings.

The ability to ask questions and define problems is a critical skill in the field of science. Whether working in a laboratory, conducting field research, or simply trying to understand the world around us, asking questions and defining problems are the first steps toward knowledge and discovery, and they’re essential skills in many other aspects of life as well.

Marilyn Price-Mitchell, Ph.D.

What Is Metacognition? How Does It Help Us Think?

Metacognitive strategies like self-reflection empower students for a lifetime.

Posted  October 9, 2020 | Reviewed by Abigail Fagan


Metacognition is a higher-order thinking skill that is emerging from the shadows of academia to take its rightful place in classrooms around the world. As online classrooms extend into homes, this is an important time for parents and teachers to understand metacognition and how metacognitive strategies affect learning. These skills enable children to become better thinkers and decision-makers.

Metacognition: The Neglected Skill Set for Empowering Students is a new research-based book by educational consultants Dr. Robin Fogarty and Brian Pete that not only gets to the heart of why metacognition is important but gives teachers and parents insightful strategies for teaching metacognition to children from kindergarten through high school. This article summarizes several concepts from their book and shares three of their thirty strategies to strengthen metacognition.

What Is Metacognition?

Metacognition is the practice of being aware of one’s own thinking. Some scholars refer to it as “thinking about thinking.” Fogarty and Pete give a great everyday example of metacognition:

Think about the last time you reached the bottom of a page and thought to yourself, “I’m not sure what I just read.” Your brain just became aware of something you did not know, so instinctively you might reread the last sentence or rescan the paragraphs of the page. Maybe you will read the page again. In whatever ways you decide to capture the missing information, this momentary awareness of knowing what you know or do not know is called metacognition.

When we notice ourselves having an inner dialogue about our thinking and it prompts us to evaluate our learning or problem-solving processes, we are experiencing metacognition at work. This skill helps us think better, make sound decisions, and solve problems more effectively. In fact, research suggests that as a young person’s metacognitive abilities increase, they achieve at higher levels.

Fogarty and Pete outline three aspects of metacognition that are vital for children to learn: planning, monitoring, and evaluation. They convincingly argue that metacognition is best when it is infused in teaching strategies rather than taught directly. The key is to encourage students to explore and question their own metacognitive strategies in ways that become spontaneous and seemingly unconscious.

Metacognitive skills provide a basis for broader psychological self-awareness, including how children gain a deeper understanding of themselves and the world around them.

Metacognitive Strategies to Use at Home or School

Fogarty and Pete successfully demystify metacognition and provide simple ways teachers and parents can strengthen children’s abilities to use these higher-order thinking skills. Below is a summary of metacognitive strategies from the three areas of planning, monitoring, and evaluation.

1. Planning Strategies

As students learn to plan, they learn to anticipate the strengths and weaknesses of their ideas. Planning strategies used to strengthen metacognition help students scrutinize plans at a time when they can most easily be changed.

One of ten metacognitive strategies outlined in the book is called “Inking Your Thinking.” It is a simple writing log that requires students to reflect on a lesson they are about to begin. Sample starters may include: “I predict…” “A question I have is…” or “A picture I have of this is…”

Writing logs are also helpful in the middle or end of assignments. For example, “The homework problem that puzzles me is…” “The way I will solve this problem is to…” or “I’m choosing this strategy because…”

2. Monitoring Strategies

Monitoring strategies used to strengthen metacognition help students check their progress and review their thinking at various stages. Different from scrutinizing, this strategy is reflective in nature. It also allows for adjustments while the plan, activity, or assignment is in motion. Monitoring strategies encourage recovery of learning, as in the example cited above when we are reading a book and notice that we forgot what we just read. We can recover our memory by scanning or re-reading.

One of many metacognitive strategies shared by Fogarty and Pete, called the “Alarm Clock,” is used to recover or rethink an idea once the student realizes something is amiss. The idea is to develop internal signals that sound an alarm. This signal prompts the student to recover a thought, rework a math problem, or capture an idea in a chart or picture. Metacognitive reflection involves thinking about “What I did,” then reviewing the pluses and minuses of one’s action. Finally, it means asking, “What other thoughts do I have moving forward?”

Teachers can easily build monitoring strategies into student assignments. Parents can reinforce these strategies too. Remember, the idea is not to tell children what they did correctly or incorrectly. Rather, help children monitor and think about their own learning. These are formative skills that last a lifetime.

3. Evaluation Strategies

According to Fogarty and Pete, the evaluation strategies of metacognition “are much like the mirror in a powder compact. Both serve to magnify the image, allow for careful scrutiny, and provide an up-close and personal view. When one opens the compact and looks in the mirror, only a small portion of the face is reflected back, but that particular part is magnified so that every nuance, every flaw, and every bump is blatantly in view.” Having this enlarged view makes inspection much easier.

When students inspect parts of their work, they learn about the nuances of their thinking processes. They learn to refine their work. They grow in their ability to apply their learning to new situations. “Connecting Elephants” is one of many metacognitive strategies to help students self-evaluate and apply their learning.

In this exercise, the metaphor of three imaginary elephants is used. The elephants walk together in a circle, each connected to the next by trunk and tail. The three elephants represent three vital questions: 1) What is the big idea? 2) How does this connect to other big ideas? 3) How can I use this big idea? Using the image of a “big idea” helps students magnify and synthesize their learning. It encourages them to think about big ways their learning can be applied to new situations.

Metacognition and Self-Reflection

Reflective thinking is at the heart of metacognition. In today’s world of constant chatter, technology and reflective thinking can be at odds. In fact, mobile devices can prevent young people from seeing what is right before their eyes.

John Dewey, a renowned psychologist and education reformer, claimed that experiences alone were not enough. What is critical is an ability to perceive and then weave meaning from the threads of our experiences.

The function of metacognition and self-reflection is to make meaning. The creation of meaning is at the heart of what it means to be human.

Everyone can help foster self-reflection in young people.


Marilyn Price-Mitchell, Ph.D., is an Institute for Social Innovation Fellow at Fielding Graduate University and author of Tomorrow’s Change Makers.


The Sourcebook for Teaching Science


Metacognition

24.2.4 – Metacognition: Teaching Students to Think About Their Thinking

John Flavell argues that learning is maximized when students learn to think about their thinking and consciously employ strategies to maximize their reasoning and problem-solving capabilities. A metacognitive thinker knows when and how he learns best, and employs strategies to overcome barriers to learning. As students learn to regulate and monitor their thought processes and understanding, they learn to adapt to new learning challenges. Expert problem solvers first seek to develop an understanding of problems by thinking in terms of core concepts and major principles (6.1-4, 7.1-7, 11.1-4). By contrast, novice problem solvers have not learned this metacognitive strategy, and are more likely to approach problems simply by trying to find the right formulas into which they can insert the right numbers. A major goal of education is to prepare students to be flexible for new problems and settings. The ability to transfer concepts from school to the work or home environment is a hallmark of a metacognitive thinker (6.4).

Metacognition & Intelligence

"Metacognition, or the ability to control one's cognitive processes (self-regulation) has been linked to intelligence (Borkowski et al., 1987; Brown, 1987; Sternberg, 1984, 1986a, 1986b). Sternberg refers to these executive processes as "metacomponents" in his triarchic theory of intelligence (Sternberg, 1984, 1986a, 1986b). Metacomponents are executive processes that control other cognitive components as well as receive feedback from these components. According to Sternberg, metacomponents are responsible for "figuring out how to do a particular task or set of tasks, and then making sure that the task or set of tasks are done correctly" (Sternberg, 1986b, p. 24). These executive processes involve planning, evaluating and monitoring problem-solving activities. Sternberg maintains that the ability to appropriately allocate cognitive resources, such as deciding how and when a given task should be accomplished, is central to intelligence." (1997 by Jennifer A. Livingston )


Metacognition & Problem Solving

April 8, 2019

Successful learners use metacognition to facilitate their problem solving. This is one of the key findings of the National Academy of Sciences’ synthesis of decades of research on the science of learning, explained in How People Learn: Brain, Mind, Experience, and School.

Below we explain metacognition and provide the vocabulary to teach it. In part two of this series we will focus on strategy selection. If you’d like to try our full metacognition approach, please contact us here.

Start with Cognition

Cognition is how you learn. Depending on the topic, the context, personal experiences and genetics, each of us relies on different proportions of cognitive skills to understand and remember what we read, see or hear.

We begin learning the moment we are born and we never stop. And while neuroscientists have proven that our learning capabilities can and do change over time based on experiences and effort (see Figure 1), it’s not easy.

Figure 1: Neuroplasticity

By the age of 10, it requires significant practice and effort to change the brain. And neuroscientists don’t yet have a prescription for how to do that (though they are working on it). But that’s where metacognition comes in. If you can teach students to become more aware of their cognition, you can teach them how to learn in any situation.

Cognition versus Metacognition

Metacognition is the conscious awareness of how you learn. When students recognize how they learn, what feels natural, what requires effort and why, you can teach them how to choose effective strategies. Students will need to learn to adapt their strategy choice for the specific topic, situation, and their individual strengths and needs. Given learner variability, each student will need to develop their personal set of go-to strategies.

Embrace Learner Variability

In order for students to use metacognition, they must be taught the concept and its language explicitly. Below are the names and short definitions of ten core skills of cognition. Our definitions highlight where the skill has the greatest academic impact.

In part two of this blog series we will talk about how to choose learning strategies based on self-awareness. You’ll need to teach students to choose strategies based on the topic and on whether the student is strong in, or has difficulty with, the cognitive skill that is typically core to the learning situation. Teaching metacognition is not simple, but it is the key to lifelong learning. So it’s definitely worth it.

Complex Reasoning

Executive Functions

Speed/Efficiency

Look for Part II of this blog series next month.


Fostering Metacognition to Support Student Learning and Performance

*Address correspondence to: Julie Dangremond Stanton.

Department of Cellular Biology, University of Georgia, Athens, GA 30602


Department of Biology, Saint Louis University, St. Louis, MO 63103

Department of Psychological Sciences, Kent State University, Kent, OH 44240

Metacognition is awareness and control of thinking for learning. Strong metacognitive skills have the power to impact student learning and performance. While metacognition can develop over time with practice, many students struggle to meaningfully engage in metacognitive processes. In an evidence-based teaching guide associated with this paper ( https://lse.ascb.org/evidence-based-teaching-guides/student-metacognition ), we outline the reasons metacognition is critical for learning and summarize relevant research on this topic. We focus on three main areas in which faculty can foster students’ metacognition: supporting student learning strategies (i.e., study skills), encouraging monitoring and control of learning, and promoting social metacognition during group work. We distill insights from key papers into general recommendations for instruction, as well as a special list of four recommendations that instructors can implement in any course. We encourage both instructors and researchers to target metacognition to help students improve their learning and performance.

INTRODUCTION

Supporting the development of metacognition is a powerful way to promote student success in college. Students with strong metacognitive skills are positioned to learn more and perform better than peers who are still developing their metacognition (e.g., Wang et al. , 1990 ). Students with well-developed metacognition can identify concepts they do not understand and select appropriate strategies for learning those concepts. They know how to implement strategies they have selected and carry out their overall study plans. They can evaluate their strategies and adjust their plans based on outcomes. Metacognition allows students to be more expert-like in their thinking and more effective and efficient in their learning. While collaborating in small groups, students can also stimulate metacognition in one another, leading to improved outcomes. Ever since metacognition was first described ( Flavell, 1979 ), enthusiasm for its potential impact on student learning has remained high. In fact, as of today, the most highly cited paper in CBE—Life Sciences Education is an essay on “Promoting Student Metacognition” ( Tanner, 2012 ).

Despite this enthusiasm, instructors face several challenges when attempting to harness metacognition to improve their students’ learning and performance. First, metacognition is a term that has been used so broadly that its meaning may not be clear ( Veenman et al. , 2006 ). We define metacognition as awareness and control of thinking for learning ( Cross and Paris, 1988 ). Metacognition includes metacognitive knowledge , which is your awareness of your own thinking and approaches for learning. Metacognition also includes metacognitive regulation , which is how you control your thinking for learning ( Figure 1 ). Second, metacognition includes multiple processes and skills that are named and emphasized differently in the literature from various disciplines. Yet upon examination, the metacognitive processes and skills from different fields are closely related, and they often overlap (see Supplemental Figure 1). Third, metacognition consists of a person’s thoughts, which may be challenging for that person to describe. The tacit nature of metacognitive processes makes it difficult for instructors to observe metacognition in their students, and it also makes metacognition difficult for researchers to measure. As a result, classroom intervention studies of metacognition—those that are necessary for making the most confident recommendations for promoting student metacognition—have lagged behind foundational and laboratory research on metacognitive processes and skills.

FIGURE 1. Metacognition framework commonly used in biology education research (modified from Schraw and Moshman, 1995 ). This theoretical framework divides metacognition into two components: metacognitive knowledge and metacognitive regulation. Metacognitive knowledge includes what you know about your own thinking and what you know about strategies for learning. Declarative knowledge involves knowing about yourself as a learner, the demands of the task, and what learning strategies exist. Procedural knowledge involves knowing how to use learning strategies. Conditional knowledge involves knowing when and why to use particular learning strategies. Metacognitive regulation involves the actions you take in order to learn. Planning involves deciding what strategies to use for a future learning task and when you will use them. Monitoring involves assessing your understanding of concepts and the effectiveness of your strategies while learning. Evaluating involves appraising your prior plan and adjusting it for future learning.

How do undergraduate students develop metacognitive skills?

To what extent do active learning and generative work 1 promote metacognition?

To what extent do increases in metacognition correspond to increases in achievement in science courses?

FIGURE 2. (A) Landing page for the Student Metacognition guide. The landing page provides a map with sections an instructor can click on to learn more about how to support students’ metacognition. (B) Example paper summary showing instructor recommendations. At the end of each summary in our guide, we used italicized text to point out what instructors should know based on the paper’s results.

The organization of this essay reflects the organization of our evidence-based teaching guide. In the guide, we first define terms and provide important background from papers that highlight the underpinnings and benefits of metacognition ( https://lse.ascb.org/evidence-based-teaching-guides/student-metacognition/benefits-definitions-underpinnings ). We then explore metacognition research by summarizing both classic and recent papers in the field and providing links for readers who want to examine the original studies. We consider three main areas related to metacognition: 1) student strategies for learning, 2) monitoring and control of learning, and 3) social metacognition during group work.

SUPPORTING STUDENTS TO USE EFFECTIVE LEARNING STRATEGIES

What Strategies Do Students Use for Learning?

First, our teaching guide examines metacognition in the context of independent study (https://lse.ascb.org/evidence-based-teaching-guides/student-metacognition/supporting-student-learning-strategies). When students transition to college, they have increased responsibility for directing their learning, which includes making important decisions about how and when to study. Students rely on their metacognition to make those decisions, and they also use metacognitive processes and skills while studying on their own. Empirical work has confirmed what instructors observe about their own students’ studying—many students rely on passive strategies for learning. Students focus on reviewing material as it is written or presented, as opposed to connecting concepts and synthesizing information to make meaning. Some students use approaches that engage their metacognition, but they often do so without a full understanding of the benefits of these approaches (Karpicke et al., 2009). Students also tend to study based on exam dates and deadlines, rather than planning out when to study (Hartwig and Dunlosky, 2012). As a result, they tend to cram, which is also known in the literature as massing their study. Students continue to cram because this approach is often effective for boosting short-term performance, although it does not promote long-term retention of information.

Which Strategies Should Students Use for Learning?

Here, we make recommendations about what students should do to learn, as opposed to what they typically do. In our teaching guide, we highlight three of the most effective strategies for learning: 1) self-testing, 2) spacing, and 3) interleaving (https://lse.ascb.org/evidence-based-teaching-guides/student-metacognition/supporting-student-learning-strategies/#whatstudentsshould). These strategies are not yet part of many students’ metacognitive knowledge, but they should know about them and be encouraged to use them while metacognitively regulating their learning. Students self-test when they use flash cards and answer practice questions in an attempt to recall information. Self-testing provides students with opportunities to monitor their understanding of material and identify gaps in their understanding. Self-testing also allows students to activate relevant knowledge and encode prompted information so it can be more easily accessed from their memory in the future (Dunlosky et al., 2013).

Students space their studying when they spread their learning of the same material over multiple sessions. This approach requires students to intentionally plan their learning instead of focusing only on what is “due” next. Spacing can be combined with retrieval practice , which involves recalling information from memory. For example, self-testing is a form of retrieval practice. Retrieval practice with spacing encourages students to actively recall the same content across several study sessions, which is essential for consolidating information from prior study periods ( Dunlosky et al. , 2013 ). Importantly, when students spread their learning over multiple sessions, they are less susceptible to superficial familiarity with concepts, which can mislead them into thinking they have learned concepts based on recognition alone ( Kornell and Bjork, 2008 ).

Students interleave when they alternate studying of information from one category with studying of information from another category. For example, when students learn categories of amino acid side groups, they should alternate studying nonpolar amino acids with polar amino acids. This allows students to discriminate across categories, which is often critical for correctly solving problems ( Rohrer et al. , 2020 ). Interleaving between categories also supports student learning because it usually results in spacing of study.
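To make these three strategies concrete, here is a minimal illustrative sketch of our own (not from the teaching guide) that contrasts a massed, blocked study plan with a spaced, interleaved one and queues missed items for restudy after self-testing; the deck contents and session days are hypothetical.

```python
from itertools import chain

# Hypothetical flash cards for the two amino acid categories in the example above.
nonpolar = ["glycine", "alanine", "valine", "leucine"]
polar = ["serine", "threonine", "cysteine", "asparagine"]

# Massed (blocked) study: one long cram session, one category at a time.
blocked_session = nonpolar + polar

# Interleaved study: alternate between categories within a session,
# which pushes students to discriminate across categories.
interleaved_session = list(chain.from_iterable(zip(nonpolar, polar)))

# Spaced study: revisit the same interleaved deck across several sessions
# (hypothetical days 1, 3, and 5) instead of a single massed session.
study_plan = {f"day_{day}": interleaved_session for day in (1, 3, 5)}

# Self-testing: after attempting to recall each card, anything missed is
# queued for restudy in the next spaced session.
def restudy_queue(deck, recalled_correctly):
    return [card for card in deck if card not in recalled_correctly]

print(study_plan["day_1"])
print(restudy_queue(interleaved_session, recalled_correctly={"glycine", "serine"}))
```

The point of the sketch is simply that the same material can be ordered and scheduled in very different ways; the interleaved, spaced plan is the one the guide recommends.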

How are students enacting specific learning strategies, and do different students enact them in different ways?

To what extent do self-testing, spacing, and interleaving support achievement in the context of undergraduate science courses?

What can instructors do to increase students’ use of effective learning strategies?

What Factors Affect the Strategies Students Should Use to Learn?

Next, we examined the factors that affect what students should do to learn. Although we recommend three well-established strategies for learning, other appropriate strategies can vary based on the learning context. For example, the nature of the material, the type of assessment, the learning objectives, and the instructional methods can render some strategies more effective than others ( Scouller, 1998 ; Sebesta and Bray Speth, 2017 ). Strategies for learning can be characterized as deep if they involve extending and connecting ideas or applying knowledge and skills in new ways ( Baeten et al. , 2010 ). Strategies can be characterized as surface if they involve recalling and reproducing content. While surface strategies are often viewed negatively, there are times when these approaches can be effective for learning ( Hattie and Donoghue, 2016 ). For example, when students have not yet gained background knowledge in an area, they can use surface strategies to acquire the necessary background knowledge. They can then incorporate deep strategies to extend, connect, and apply this knowledge. Importantly, surface and deep strategies can be used simultaneously for effective learning. The use of surface and deep strategies ultimately depends on what students are expected to know and be able to do, and these expectations are set by instructors. Openly discussing these expectations with students can enable them to more readily select effective strategies for learning.

What Challenges Do Students Face in Using Their Metacognition to Enact Effective Strategies?

How can students address challenges they will face when using effective—but effortful—strategies for learning?

What approaches can instructors take to help students overcome these challenges?

ENCOURAGING STUDENTS TO MONITOR AND CONTROL THEIR LEARNING FOR EXAMS

Metacognition can be investigated in the context of any learning task, but in the sciences, metacognitive processes and skills are most often investigated in the context of high-stakes exams. Because exams are a form of assessment common to nearly every science course, in the next part of our teaching guide, we summarized some of the vast research focused on monitoring and control before, during, and after an exam ( https://lse.ascb.org/evidence-based-teaching-guides/student-metacognition/encouraging-students-monitor-control-learning ). In the following section, we demonstrate the kinds of monitoring and control decisions learners make by using an example of introductory biology students studying for an exam on cell division. The students’ instructor has explained that the exam will focus on the stages of mitosis and cytokinesis, and the exam will include both multiple-choice and short-answer questions.

How Should Students Use Metacognition while Preparing for and Taking an Exam?

As students prepare for an exam, they can use metacognition to inform their learning. Students can consider how they will be tested, set goals for their learning, and make a plan to meet their goals. It is expected that students who set specific goals while planning for an exam will be more effective in their studying than students who do not make specific goals. For example, a student who sets a specific goal to identify areas of confusion each week by answering end-of-chapter questions each weekend is expected to do better than a student who sets a more general goal of staying up-to-date on the material. Although some studies include goal setting and planning as one of many metacognitive strategies introduced to students, the influence of task-specific goal setting on academic achievement has not been well studied on its own in the context of science courses.

As students study, it is critical that they monitor both their use of learning strategies and their understanding of concepts. Yet many students struggle to accurately monitor their own understanding ( de Carvalho Filho, 2009 ). In the example we are considering, students may believe they have already learned mitosis because they recognize the terms “prophase,” “metaphase,” “anaphase,” and “telophase” from high school biology. When students read about mitosis in the textbook, processes involving the mitotic spindle may seem familiar because of their exposure to these concepts in class. As a result, students may inaccurately predict that they will perform well on exam questions focused on the mitotic spindle, and their overconfidence may cause them to stop studying the mitotic spindle and related processes ( Thiede et al. , 2003 ). Students often rate their confidence in their learning based on their ability to recognize, rather than recall, concepts.

Instead of focusing on familiarity, students should rate their confidence based on how well they can retrieve relevant information to correctly answer questions. Opportunities for practicing retrieval, such as self-testing, can improve monitoring accuracy. Instructors can help students monitor their understanding more accurately by encouraging students to complete practice exams and giving students feedback on their answers, perhaps in the form of a key or a class discussion ( Rawson and Dunlosky, 2007 ). Returning to the example, if students find they can easily recall the information needed to correctly answer questions about cytokinesis, they may wisely decide to spend their study time on other concepts. In contrast, if students struggle to remember information needed to answer questions about the mitotic spindle, and they answer these questions incorrectly, then they can use this feedback to direct their efforts toward mastering the structure and function of the mitotic spindle.
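As an illustration only (our own sketch, not from the papers cited), the idea of monitoring accuracy can be made concrete by comparing hypothetical confidence judgments against practice-test results: confidence that consistently outruns performance signals overconfidence and marks topics to restudy.

```python
# Hypothetical practice-exam data: each item pairs a student's confidence
# judgment (0.0-1.0) with whether the practice answer was actually correct.
practice_items = [
    {"topic": "mitotic spindle", "confidence": 0.9, "correct": False},
    {"topic": "mitotic spindle", "confidence": 0.8, "correct": False},
    {"topic": "cytokinesis",     "confidence": 0.7, "correct": True},
    {"topic": "cytokinesis",     "confidence": 0.6, "correct": True},
]

def calibration_bias(items):
    """Mean confidence minus mean accuracy: positive values suggest
    overconfidence, negative values suggest underconfidence."""
    mean_confidence = sum(i["confidence"] for i in items) / len(items)
    mean_accuracy = sum(i["correct"] for i in items) / len(items)
    return mean_confidence - mean_accuracy

def topics_to_restudy(items, threshold=0.0):
    """Flag topics where confidence outruns demonstrated performance."""
    topics = {i["topic"] for i in items}
    return sorted(
        t for t in topics
        if calibration_bias([i for i in items if i["topic"] == t]) > threshold
    )

print(f"Overall calibration bias: {calibration_bias(practice_items):+.2f}")
print("Restudy:", topics_to_restudy(practice_items))
```

In this made-up data, the student is overconfident about the mitotic spindle (high confidence, wrong answers) and well calibrated on cytokinesis, mirroring the scenario described above.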

While taking a high-stakes exam, students can again monitor their performance on a single question, a set of questions, or an entire exam. Their monitoring informs whether they change an answer, with students tending to change answers they judge as incorrect. Accordingly, the accuracy of their monitoring will influence whether their changes result in increased performance ( Koriat and Goldsmith, 1996 ). In some studies, changing answers on an exam has been shown to increase student performance, in contrast to the common belief that a student’s first answer is usually right ( Stylianou-Georgiou and Papanastasiou, 2017 ). Changing answers on an exam can be beneficial if students return to questions they had low confidence in answering and make a judgment on their answers based on the ability to retrieve the information from memory, rather than a sense of familiarity with the concepts. Two important open questions are:

What techniques can students use to improve the accuracy of their monitoring, while preparing for an exam and while taking an exam?

How often do students monitor their understanding when studying on their own?

How Should Students Use Metacognition after Taking an Exam?

How do students develop metacognitive regulation skills such as evaluation?

To what extent does the ability to evaluate affect student learning and performance?

When students evaluate the outcome of their studying and believe their preparation was lacking, to what degree do they adopt more effective strategies for the next exam?

PROMOTING SOCIAL METACOGNITION DURING GROUP WORK

Next, our teaching guide covers a relatively new area of inquiry in the field of metacognition called social metacognition, which is also known as socially shared metacognition (https://lse.ascb.org/evidence-based-teaching-guides/student-metacognition/promoting-social-metacognition-group-work). Science students are expected to learn not only on their own, but also in the context of small groups. Understanding social metacognition is important because it can support effective student learning during collaborations both inside and outside the classroom. While individual metacognition involves awareness and control of one’s own thinking, social metacognition involves awareness and control of others’ thinking. For example, social metacognition happens when students share ideas with peers, invite peers to evaluate their ideas, and evaluate ideas shared by peers (Goos et al., 2002). Students also use social metacognition when they assess, modify, and enact one another’s strategies for solving problems (Van De Bogart et al., 2017). While enacting problem-solving strategies, students can evaluate their peers’ hypotheses, predictions, explanations, and interpretations. Importantly, metacognition and social metacognition are expected to positively affect one another (Chiu and Kuo, 2009).

How do social metacognition and individual metacognition affect one another?

How can science instructors help students to effectively use social metacognition during group work?

CONCLUSIONS

We encourage instructors to support students’ success by helping them develop their metacognition. Our teaching guide ends with an Instructor Checklist of actions instructors can take to include opportunities for metacognitive practice in their courses (https://lse.ascb.org/wp-content/uploads/sites/10/2020/12/Student-Metacognition-Instructor-Checklist.pdf). We also provide a list of the most promising approaches instructors can take, called Four Strategies to Implement in Any Course (https://lse.ascb.org/wp-content/uploads/sites/10/2020/12/Four-Strategies-to-Foster-Student-Metacognition.pdf). We not only encourage instructors to consider using these strategies, but given that more evidence for their efficacy is needed from classroom investigations, we also encourage instructors to evaluate and report how well these strategies are improving their students’ achievement. By exploring and supporting students’ metacognitive development, we can help them learn more and perform better in our courses, which will enable them to develop into lifelong learners.

1 Generative work “involves students working individually or collaboratively to generate ideas and products that go beyond what has been presented to them” (Andrews et al., 2019, p. 2). Generative work is often stimulated by active-learning approaches.

ACKNOWLEDGMENTS

We are grateful to Cynthia Brame, Kristy Wilson, and Adele Wolfson for their insightful feedback on this paper and the guide. This material is based upon work supported in part by the National Science Foundation under grant number 1942318 (to J.D.S.). Any opinions, findings, and conclusions or recommendations expressed in this material are those of the author(s) and do not necessarily reflect the views of the National Science Foundation.


© 2021 J. D. Stanton et al. CBE—Life Sciences Education © 2021 The American Society for Cell Biology. This article is distributed by The American Society for Cell Biology under license from the author(s). It is available to the public under an Attribution–Noncommercial–Share Alike 3.0 Unported Creative Commons License (http://creativecommons.org/licenses/by-nc-sa/3.0).


TEAL Center Fact Sheet No. 4: Metacognitive Processes

Metacognition is one’s ability to use prior knowledge to plan a strategy for approaching a learning task, take necessary steps to problem solve, reflect on and evaluate results, and modify one’s approach as needed. It helps learners choose the right cognitive tool for the task and plays a critical role in successful learning.

What Is Metacognition?

Metacognition refers to awareness of one’s own knowledge—what one does and doesn’t know—and one’s ability to understand, control, and manipulate one’s cognitive processes (Meichenbaum, 1985). It includes knowing when and where to use particular strategies for learning and problem solving as well as how and why to use specific strategies. Metacognition is the ability to use prior knowledge to plan a strategy for approaching a learning task, take necessary steps to problem solve, reflect on and evaluate results, and modify one’s approach as needed. Flavell (1976), who first used the term, offers the following example: “I am engaging in metacognition if I notice that I am having more trouble learning A than B; if it strikes me that I should double-check C before accepting it as fact” (p. 232).

Cognitive strategies are the basic mental abilities we use to think, study, and learn (e.g., recalling information from memory, analyzing sounds and images, making associations between or comparing/contrasting different pieces of information, and making inferences or interpreting text). They help an individual achieve a particular goal, such as comprehending text or solving a math problem, and they can be individually identified and measured. In contrast, metacognitive strategies are used to ensure that an overarching learning goal is being or has been reached. Examples of metacognitive activities include planning how to approach a learning task, using appropriate skills and strategies to solve a problem, monitoring one’s own comprehension of text, self-assessing and self-correcting in response to the self-assessment, evaluating progress toward the completion of a task, and becoming aware of distracting stimuli.

Elements of Metacognition

Researchers distinguish between metacognitive knowledge and metacognitive regulation (Flavell, 1979, 1987; Schraw & Dennison, 1994). Metacognitive knowledge refers to what individuals know about themselves as cognitive processors, about different approaches that can be used for learning and problem solving, and about the demands of a particular learning task. Metacognitive regulation refers to adjustments individuals make to their processes to help control their learning, such as planning, information management strategies, comprehension monitoring, de-bugging strategies, and evaluation of progress and goals. Flavell (1979) further divides metacognitive knowledge into three categories: knowledge of person variables, knowledge of task variables, and knowledge of strategy variables.

Livingston (1997) provides an example of all three variables: “I know that I ( person variable ) have difficulty with word problems ( task variable ), so I will answer the computational problems first and save the word problems for last ( strategy variable ).”

Why Teach Metacognitive Skills?

Research shows that metacognitive skills can be taught to students to improve their learning (Nietfeld & Schraw, 2002; Thiede, Anderson, & Therriault, 2003).

Constructing understanding requires both cognitive and metacognitive elements. Learners “construct knowledge” using cognitive strategies, and they guide, regulate, and evaluate their learning using metacognitive strategies. It is through this “thinking about thinking,” this use of metacognitive strategies, that real learning occurs. As students become more skilled at using metacognitive strategies, they gain confidence and become more independent as learners.

Individuals with well-developed metacognitive skills can think through a problem or approach a learning task, select appropriate strategies, and make decisions about a course of action to resolve the problem or successfully perform the task. They often think about their own thinking processes, taking time to think about and learn from mistakes or inaccuracies (North Central Regional Educational Laboratory, 1995). Some instructional programs encourage students to engage in “metacognitive conversations” with themselves so that they can “talk” with themselves about their learning, the challenges they encounter, and the ways in which they can self-correct and continue learning.

Moreover, individuals who demonstrate a wide variety of metacognitive skills perform better on exams and complete work more efficiently—they use the right tool for the job, and they modify learning strategies as needed, identifying blocks to learning and changing tools or strategies to ensure goal attainment. Because metacognition plays a critical role in successful learning, it is imperative that instructors help learners develop metacognitively.

What’s the Research?

Metacognitive strategies can be taught (Halpern, 1996), and they are associated with successful learning (Borkowski, Carr, & Pressley, 1987). Successful learners have a repertoire of strategies to select from and can transfer them to new settings (Pressley, Borkowski, & Schneider, 1987). Instructors need to set tasks at an appropriate level of difficulty (i.e., challenging enough so that students need to apply metacognitive strategies to monitor success but not so challenging that students become overwhelmed or frustrated), and instructors need to prompt learners to think about what they are doing as they complete these tasks (Biemiller & Meichenbaum, 1992). Instructors should take care not to do the thinking for learners or tell them what to do because this runs the risk of making students experts at seeking help rather than experts at thinking about and directing their own learning. Instead, effective instructors continually prompt learners, asking “What should you do next?”

McKeachie (1988) found that few college instructors explicitly teach strategies for monitoring learning. They assume that students have already learned these strategies in high school. But many have not and are unaware of the metacognitive process and its importance to learning. Rote memorization is the usual—and often the only—learning strategy employed by high school students when they enter college (Nist, 1993). Simpson and Nist (2000), in a review of the literature on strategic learning, emphasize that instructors need to provide explicit instruction on the use of study strategies. The implication for adult basic education (ABE) programs is that ABE learners likely need explicit instruction in both cognitive and metacognitive strategies. They need to know that they have choices about the strategies they can employ in different contexts, and they need to monitor their use of and success with these strategies.

Recommended Instructional Strategies

Instructors can encourage ABE learners to become more strategic thinkers by helping them focus on the ways they process information. Self-questioning, reflective journal writing, and discussing their thought processes with other learners are among the ways that teachers can encourage learners to examine and develop their metacognitive processes.

Fogarty (1994) suggests that metacognition is a process that spans three distinct phases and that, to be successful thinkers, students must plan before approaching a learning task, monitor their understanding during the task, and evaluate their thinking after completing the task.

Instructors can model the application of questions, and they can prompt learners to ask themselves questions during each phase. They can incorporate into lesson plans opportunities for learners to practice using these questions during learning tasks, as illustrated in the following examples:

Rather than viewing reading, writing, science, social studies, and math only as subjects or content to be taught, instructors can see them as opportunities for learners to reflect on their learning processes. Examples follow for each content area:

The goal of teaching metacognitive strategies is to help learners become comfortable with these strategies so that they apply them automatically to learning tasks, focusing their attention, deriving meaning, and making adjustments if something goes wrong. They do not think about these skills while performing them but, if asked what they are doing, they can usually accurately describe their metacognitive processes.

Biemiller, A., & Meichenbaum, D. (1992). The nature and nurture of the self-directed learner. Educational Leadership, 50, 75–80.

Borkowski, J., Carr, M., & Pressley, M. (1987). “Spontaneous” strategy use: Perspectives from metacognitive theory. Intelligence, 11, 61–75.

Flavell, J. H. (1979). Metacognition and cognitive monitoring: A new area of cognitive-developmental inquiry. American Psychologist, 34, 906–911.

Flavell, J. H. (1976). Metacognitive aspects of problem solving. In L. B. Resnick (Ed.), The nature of intelligence (pp. 231–236). Hillsdale, NJ: Lawrence Erlbaum Associates.

Flavell, J. H. (1987). Speculations about the nature and development of metacognition. In F. E. Weinert & R. H. Kluwe (Eds.), Metacognition, motivation, and understanding (pp. 21–29). Hillsdale, NJ: Lawrence Erlbaum Associates.

Fogarty, R. (1994). How to teach for metacognition. Palatine, IL: IRI/Skylight Publishing.

Halpern, D. F. (1996). Thought and knowledge: An introduction to critical thinking. Mahwah, NJ: Lawrence Erlbaum Associates.

Livingston, J. A. (1997). Metacognition: An overview. Retrieved December 27, 2011 from http://gse.buffalo.edu/fas/shuell/CEP564/Metacog.htm

McKeachie, W. J. (1988). The need for study strategy training. In C. E. Weinstein, E. T. Goetz, & P. A. Alexander (Eds.), Learning and study strategies: Issues in assessment, instruction, and evaluation (pp. 3–9). New York: Academic Press.

Meichenbaum, D. (1985). Teaching thinking: A cognitive-behavioral perspective. In S. F. Chipman, J. W. Segal, & R. Glaser (Eds.), Thinking and learning skills, Vol. 2: Research and open questions. Hillsdale, NJ: Lawrence Erlbaum Associates.

North Central Regional Educational Laboratory. (1995). Strategic teaching and reading project guidebook. Retrieved December 27, 2011

Nietfeld, J. L., & Schraw, G. (2002). The effect of knowledge and strategy explanation on monitoring accuracy. Journal of Educational Research, 95, 131–142.

Nist, S. (1993). What the literature says about academic literacy. Georgia Journal of Reading, Fall-Winter, 11–18.

Pressley, M., Borkowski, J. G., & Schneider, W. (1987). Cognitive strategies: Good strategy users coordinate metacognition and knowledge. In R. Vasta, & G. Whitehurst (Eds.), Annals of child development, 4, 80–129. Greenwich, CT: JAI Press.

Schraw, G., & Dennison, R. S. (1994). Assessing metacognitive awareness. Contemporary Educational Psychology, 19, 460–475.

Simpson, M. L., & Nist, S. L. (2000). An update on strategic learning: It’s more than textbook reading strategies. Journal of Adolescent and Adult Literacy, 43(6), 528–541.

Thiede, K. W., Anderson, M. C., & Therriault, D. (2003). Accuracy of metacognitive monitoring affects learning of texts. Journal of Educational Psychology, 95, 66–73.

Authors: TEAL Center staff

Reviewed by: David Scanlon, Boston College

About the TEAL Center: The Teaching Excellence in Adult Literacy (TEAL) Center is a project of the U.S. Department of Education, Office of Career, Technical, and Adult Education (OCTAE), designed to improve the quality of teaching in adult education in the content areas.


Computer Science > Computation and Language

Title: Tree of Thoughts: Deliberate Problem Solving with Large Language Models

Abstract: Language models are increasingly being deployed for general problem solving across a wide range of tasks, but are still confined to token-level, left-to-right decision-making processes during inference. This means they can fall short in tasks that require exploration, strategic lookahead, or where initial decisions play a pivotal role. To surmount these challenges, we introduce a new framework for language model inference, Tree of Thoughts (ToT), which generalizes over the popular Chain of Thought approach to prompting language models, and enables exploration over coherent units of text (thoughts) that serve as intermediate steps toward problem solving. ToT allows LMs to perform deliberate decision making by considering multiple different reasoning paths and self-evaluating choices to decide the next course of action, as well as looking ahead or backtracking when necessary to make global choices. Our experiments show that ToT significantly enhances language models' problem-solving abilities on three novel tasks requiring non-trivial planning or search: Game of 24, Creative Writing, and Mini Crosswords. For instance, in Game of 24, while GPT-4 with chain-of-thought prompting only solved 4% of tasks, our method achieved a success rate of 74%. Code repo with all prompts: this https URL .
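To make the search idea in the abstract concrete, the following is a minimal, hypothetical sketch of a Tree-of-Thoughts-style breadth-first search; it is not the authors’ released code. The function names and the stub generator/evaluator are assumptions—in the paper’s setting both would be language-model calls.

```python
# Minimal sketch of a Tree-of-Thoughts-style breadth-first search (illustrative only).
# propose_thoughts and score_thought are stand-in stubs for language-model calls.

from dataclasses import dataclass, field

@dataclass
class Node:
    thought: str                       # partial solution accumulated so far
    score: float = 0.0                 # evaluator's estimate of how promising it is
    children: list = field(default_factory=list)

def propose_thoughts(state: str, k: int = 3) -> list:
    """Stub generator: in practice, prompt an LM for k candidate next steps."""
    return [f"{state} -> step{i}" for i in range(k)]

def score_thought(state: str) -> float:
    """Stub evaluator: in practice, ask an LM to rate the partial solution."""
    return float(len(state) % 7)       # placeholder heuristic

def tree_of_thoughts(root_state: str, depth: int = 3, breadth: int = 2) -> str:
    """Expand each frontier state, score candidates, keep the best `breadth` per level."""
    frontier = [Node(root_state)]
    for _ in range(depth):
        candidates = []
        for node in frontier:
            for thought in propose_thoughts(node.thought):
                child = Node(thought, score_thought(thought))
                node.children.append(child)
                candidates.append(child)
        # Deliberate choice step: prune to the most promising states before going deeper.
        frontier = sorted(candidates, key=lambda n: n.score, reverse=True)[:breadth]
    return max(frontier, key=lambda n: n.score).thought

if __name__ == "__main__":
    print(tree_of_thoughts("make 24 from 4 4 6 8"))
```

In this reading, lookahead corresponds to scoring candidate states before committing to them, and backtracking corresponds to abandoning pruned branches in favor of other states still on the frontier.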



Cognitive, metacognitive, and motivational aspects of problem solving

Instructional Science, volume 26, pages 49–63 (1998)


This article examines the role of cognitive, metacognitive, and motivational skills in problem solving. Cognitive skills include instructional objectives, components in a learning hierarchy, and components in information processing. Metacognitive skills include strategies for reading comprehension, writing, and mathematics. Motivational skills include motivation based on interest, self-efficacy, and attributions. All three kinds of skills are required for successful problem solving in academic settings.



Author information

Richard E. Mayer, University of California, Santa Barbara, U.S.A.


Mayer, R.E. Cognitive, metacognitive, and motivational aspects of problem solving. Instructional Science 26, 49–63 (1998). https://doi.org/10.1023/A:1003088013286


“Oh, that makes sense”: Social Metacognition in Small-Group Problem Solving

Stephanie M. Halmo

† Department of Cellular Biology, University of Georgia, Athens, GA 30602

Emily K. Bremers

‡ Department of Biochemistry and Molecular Biology, University of Georgia, Athens, GA 30602

Sammantha Fuller

Julie Dangremond Stanton

Stronger metacognition, or awareness and regulation of thinking, is related to higher academic achievement. Most metacognition research has focused at the level of the individual learner. However, a few studies have shown that students working in small groups can stimulate metacognition in one another, leading to improved learning. Given the increased adoption of interactive group work in life science classrooms, there is a need to study the role of social metacognition, or the awareness and regulation of the thinking of others, in this context. Guided by the frameworks of social metacognition and evidence-based reasoning, we asked: 1) What metacognitive utterances (words, phrases, statements, or questions) do students use during small-group problem solving in an upper-division biology course? 2) Which metacognitive utterances are associated with small groups sharing higher-quality reasoning in an upper-division biology classroom? We used discourse analysis to examine transcripts from two groups of three students during breakout sessions. By coding for metacognition, we identified seven types of metacognitive utterances. By coding for reasoning, we uncovered four categories of metacognitive utterances associated with higher-quality reasoning. We offer suggestions for life science educators interested in promoting social metacognition during small-group problem solving.

INTRODUCTION

As researchers investigate ways to support life science instructors’ use of interactive learning in their classes ( Wilson et al. , 2018 ), there is a parallel need to uncover processes that help students fully benefit from these increasing opportunities. Successful interactive group work includes collaboration, or engagement in a coordinated effort to reach a shared goal. Social metacognition, or awareness and regulation of the thinking of others, can increase effective student collaboration during interactive group work (e.g., Kim and Lim, 2018 ). To support life science students’ use of social metacognition during group work, we need to characterize social metacognition in the context of the life sciences. Then we can use our understanding of social metacognition in the life sciences to provide guidance, such as prompts to pose during group work, that helps students fully benefit from opportunities to collaborate with their peers.

In this study, we characterize the unprompted social metacognition life science undergraduates use when they work in small groups to solve problems. We investigate the aspects of their social metacognition that are associated with higher-quality reasoning, which we define as reasoning that is correct, backed by evidence, and generated by more than one individual. Our analysis of student conversation during small-group problem solving draws upon several guiding frameworks. In the following sections, we present relevant background information on the frameworks we use to guide our study of interactive group work, social metacognition, and reasoning.

Increasing Adoption of Interactive Group Work

Interactive group work is increasingly being adopted in college science classrooms ( Wilson et al. , 2018 ). Having students work in small groups to solve problems helps students develop essential skills, like collaboration, which are valued in the sciences ( Kuhn, 2015 ; National Research Council [NRC], 2015 ). The adoption of group work also aligns with the view that knowledge construction is a socially shared activity, rather than an individual one. Social cognitive theory, or the idea that learning occurs in a social context from which it cannot be separated ( Vygotsky, 1978 ; Bandura, 1986 ), forms the foundation behind the promotion of interactive group work.

One framework for studying interactive group work is the ICAP framework, which hypothesizes that learning improves as students’ cognitive engagement progresses from passive to active to constructive to interactive, with the deepest level of understanding occurring in the interactive mode ( Chi and Wylie, 2014 ). The interactive mode occurs when students take frequent turns in dialogue with one another by interjecting to ask questions, make clarifications, and explain ideas ( Chi and Wylie, 2014 ). Through this exchange of dialogue, students are able to infer new knowledge from prior knowledge in an iterative and cooperative manner as they take conversational turns. When students work in groups to solve problems, they can employ the social practice of conversation or discourse ( Cameron, 2001 ; Rogers, 2004 ) to co-construct knowledge in the interactive mode ( Chi and Wylie, 2014 ). As more science instructors implement group work in their courses, we need to better understand how social learning contexts impact important aspects of learning, like metacognition.

Metacognition in Social Learning Contexts

Metacognition, or the awareness and control of thinking for the purpose of learning, is linked to higher academic achievement and can be engaged at the individual or social level. Metacognition is composed of two components: metacognitive knowledge and metacognitive regulation ( Schraw and Moshman, 1995 ). Metacognitive knowledge consists of what one knows about their own thinking and what they know about strategies for learning. Metacognitive regulation consists of the actions one takes to learn, including planning strategy use for future learning, monitoring understanding and the effectiveness of strategy use during learning, and evaluating plans and adjusting strategies based on past learning ( Schraw and Moshman, 1995 ). Metacognition gained prominence in cognitive science and education over the last 50 years because of its relationship to enhanced individual learning ( Tanner, 2012 ; Stanton et al. , 2021 ). For example, students with stronger metacognitive skills learn more and perform better than peers who are less metacognitive (e.g., Wang et al. , 1990 ).

Most research on metacognition has focused at the level of the individual learner. Metacognition was initially conceptualized as an individual process, because discussions on learning were influenced by Piaget’s individual-based theory of cognitive development ( Brown, 1978 ). Since then, some researchers have conceived of metacognition more broadly as people’s thoughts about their own thinking and the thinking of others ( Jost et al. , 1998 ). In essence, metacognition includes both individual and social components. Individual metacognition is one’s awareness and regulation of one’s own thinking for the purpose of learning, and social metacognition is awareness and regulation of others’ thinking for the purpose of learning ( Stanton et al. , 2021 ). While the theoretical boundaries between individual and social metacognition are clear, distinguishing between the two in practice can be challenging. For example, during small group work, it can be difficult to know whether a student’s spoken metacognition is directed inward versus outward (i.e., a reflection of individual vs. social metacognition). However, when a student shares their metacognition in this way, it could potentially stimulate metacognition in another group member. For this reason, we operationally define social metacognition as metacognition that is shared verbally during collaborative work.

Social metacognition, also known as socially shared metacognition, has been explored in just a few disciplines, such as mathematics ( Goos et al. , 2002 ; Smith and Mancy, 2018 ), physics ( Lippmann Kung and Linder, 2007 ; Van De Bogart et al. , 2017 ), and the learning sciences ( Siegel, 2012 ; De Backer et al. , 2015 , 2020 ). Social metacognition researchers have focused on identifying the metacognitive “utterances,” or words, phrases, statements, or questions, students use during small-group problem solving. Metacognitive utterances are identified through discourse analysis, which is the investigation of socially situated language ( Cameron, 2001 ; Rogers, 2004 ). From these foundational discourse analyses in other disciplines, a conceptual framework of social metacognition emerged. Social metacognition can happen when students share or disclose their ideas to peers, invite their peers to evaluate their ideas, or evaluate ideas shared by peers ( Goos et al. , 2002 ). Social metacognition also occurs when students enact, modify, or assess their peers’ strategies for problem solving ( Van De Bogart et al. , 2017 ).

Research on social metacognition in other disciplines has shown that students working in small groups can stimulate metacognitive processes in one another, leading to improved learning. For example, researchers found that one variation in social metacognitive dialogue, which they called “interrogative” (i.e., evoked by a thought-provoking trigger and generally followed by elaborative reactions), was positively related to college students’ individual performance on a learning sciences knowledge test ( De Backer et al. , 2020 ). Middle school students who came up with a correct solution as a group had higher levels of metacognitive interactions or made more metacognitive utterances during group problem solving ( Artz and Armour-Thomas, 1992 ). Support for this finding was provided by a comparison of successful versus unsuccessful problem solving in a high school math class. Successful problem solving (i.e., working together as a group to come to a correct solution on a math problem) involved students assessing one another’s ideas, correcting incorrect ideas, and endorsing correct ideas, while during unsuccessful problem solving, students lacked critical engagement with one another’s thinking ( Goos et al. , 2002 ).

Although metacognitive utterances have been identified in a few disciplinary contexts, the metacognitive utterances that college students use during small-group problem solving in the life sciences have yet to be documented. Social cognitive theory posits that learning is socially situated, meaning it is specific to the context and social environment in which it is embedded ( Bandura, 1986 ). This means learning is not easily transferable from one context to another. For example, the nature of metacognition that occurs in a high school calculus class using problem-based learning could differ from that which occurs in a college biology class using process-oriented guided inquiry learning (POGIL). Additionally, just because a student can use metacognition in their calculus class does not necessarily mean they will employ the same metacognition in their biology class. Given that the metacognition students use can differ based on context, defining the metacognitive utterances life science majors use during small-group problem solving is an important first step for understanding social metacognition in the life sciences. We can then use this understanding to provide guidance to students as they work together in groups.

Social metacognition has also been linked to reasoning. In physics labs, metacognitive utterances impacted learning behavior by helping students transition from logistical to reasoning behavior. For example, a metacognitive utterance helped students transition from recording data (logistical behavior) to assessing their experimental design (reasoning behavior). The action the group takes after metacognitive utterances seems to be what matters most for successful problem solving in physics labs ( Lippmann Kung and Linder, 2007 ). Research in grade school mathematics classrooms indicated a positive association between metacognitive talk and transactive talk, or reasoning that operates on the reasoning of another. Results from this study suggest that metacognitive talk is more likely to be preceded or followed by reasoning ( Smith and Mancy, 2018 ). These promising results indicate that social metacognition is associated with improved reasoning in other disciplinary contexts.

Reasoning in Social Learning Contexts

The skill or practice of scientific reasoning is a valued outcome of science education and a focus of major science education reform efforts ( NRC, 2007 ; American Association for the Advancement of Science, 2011 ). Scientific reasoning reflects the disciplinary practices of scientists and can create a more scientifically literate society. Scientific reasoning is the process of constructing an explanation for observed phenomena or constructing an argument that justifies a claim. Scientific reasoning skills include identifying patterns in data, making inferences, resolving uncertainty, coordinating theory with evidence, and constructing evidence-based explanations of phenomena and arguments that justify their validity ( Osborne, 2010 ).

One important framework for reasoning is Toulmin’s argument pattern. Toulmin described an argument as the relationship between a claim, the information that supports the claim, and an explanation for why the claim flows logically or causally from the information ( Toulmin, 2003 ). Toulmin’s argument pattern framework is domain general in nature and therefore does not include an assessment of whether an argument is coherent or accurate. Many studies have used Toulmin’s argument pattern to guide analysis of scientific reasoning in group discourse ( Osborne et al. , 2004 ; Sampson and Clark, 2008 ; Knight et al. , 2013 , 2015 ; Paine and Knight, 2020 ). A key adaptation of Toulmin’s argument pattern is the evidence-based reasoning framework ( Brown et al. , 2010 ).

The evidence-based reasoning framework is “intended to help researchers and practitioners identify the presence and form of scientific argumentation in student work and classroom discourse” ( Brown et al. , 2010 , p. 134). The evidence-based reasoning framework draws distinctions between the component parts of scientific reasoning by identifying claims, premises, rules, evidence, and data (Supplemental Figure 1). A “claim” is a statement about a specific outcome phrased as either a prediction, observation, or conclusion. A “premise” is a statement about the circumstances or input that results in the output described by the claim. A “rule” is a statement describing a general relationship or principle that links the premise to the claim. “Evidence” is a statement about an observed relationship, and “data” are reports of discrete observations. Together, rules, evidence, and data can be considered forms of backing ( Furtak et al. , 2010 ). For example, as someone is driving, they may think, “This traffic light is yellow. Yellow lights quickly turn to red, so I will slow down.” In this example, the statement “This traffic light is yellow” is a premise, “Yellow lights quickly turn to red” is backing (specifically, a rule), and “I will slow down” is a claim. The evidence-based reasoning framework suggests that more sophisticated scientific reasoning occurs when students make a claim supported by backing ( Brown et al. , 2010 ; Furtak et al. , 2010 ). Reasoning quality is relative and likely occurs on a qualitative continuum from lower to higher quality. One of our goals was to characterize this reasoning quality continuum. In this study, we use the evidence-based reasoning framework as a starting point to identify instances of complete reasoning during student discourse or conversation. We expand the continuum so that higher-quality reasoning also includes the consideration of correctness and the transactive nature of the reasoning (i.e., whether the reasoning was generated by more than one individual).

Research Questions

With the increasing adoption of interactive group work in undergraduate life science classrooms, there is a need to study the role of social metacognition, or metacognition that occurs out loud in a social learning context, and its relationship to reasoning. To address this gap, we used discourse analysis of small-group problem solving from an upper-division biology course to address the following qualitative research questions: 1) What metacognitive utterances (words, phrases, statements, or questions) do students use during small-group problem solving in an upper-division biology course? 2) Which metacognitive utterances are associated with small groups sharing higher-quality reasoning in an upper-division biology classroom?

Context and Data Collection

This study was conducted at a large, public, research-intensive university in the southeastern United States. Participants were recruited from an upper-level cell biology course taken by life science majors in 2018. Average enrollment in this course was ~80 students per section. The course consisted of an interactive lecture 3 days a week and a smaller breakout session once per week (~40 students per breakout session). The breakout sessions were held in a SCALE-UP classroom designed to facilitate group work with multiple monitors and round tables where students sat in groups of three ( Beichner et al. , 2007 ). During weekly breakout sessions, students worked in small groups of three to solve problem sets in-person using a pen-and-paper format.

The problem sets were designed using guided-inquiry principles ( Moog et al. , 2006 ). The problems scaffolded student learning about a cell biology concept and asked students to analyze relevant published scientific data. The first problem set covered import of proteins into the nucleus (“nuclear import”), and the second problem set focused on transcription. The breakout session problem sets were formative assessments and were not letter graded but were highly aligned to the learning objectives and the exams in the course. Approximately 40% of the exam points covered material from the breakout sessions. In lieu of being letter graded, a graduate teaching assistant provided written feedback to the groups on their completed problem sets, similar to the feedback that would be provided if the problem set was an exam.

To form groups during the breakout sessions, participants were allowed to pick their own group of three or they could opt to be randomly assigned to a group. The groups of three did not change during the course of the study. In each group, there were three roles: manager, presenter, and recorder ( Stanton and Dye, 2017 ). The recorder was responsible for writing and turning in the group’s answers for a participation grade and feedback. The presenter was responsible for sharing group results during the whole-class discussion at the end of the breakout session and for sharing group solutions on dry-erase boards during the breakout session. The manager was responsible for keeping the group on task during the allotted time. Group roles were randomly assigned at the start of each breakout session and rotated week to week. The study was classified by the University of Georgia’s Institutional Review Board as exempt (STUDY00006457).

Four groups of three students each agreed to be audio-recorded during two consecutive breakout sessions. Each participant was compensated $20 for participation in the study, and all participants provided written consent. To accurately record individuals in a group setting, each group member was individually microphoned using 8W-1KU UHF Octo Receiver System equipment (Nady Systems, Inc.). After the audio recordings were collected, the individual recording tracks were synced and merged into one recording per group per breakout session using Steinberg Cubase software.

The audio recordings were professionally transcribed (Rev.com), and the transcripts were checked to ensure accuracy before analysis. The following transcription conventions were used: 1) speaker turns were arranged in vertical format, with all speaker turns arranged in a single column one above another to reflect the equal status of each speaker as students; 2) utterances ending in a sharp rising intonation were considered questions and were signified with a question mark (?); 3) a single dash (-) following a word was used to indicate interrupted, truncated, or cut-off words or phrases; 4) an ellipsis (…) was used to indicate pauses in speech or when a speaker trailed off; and 5) when available, researchers provided interpretations of nonspecific pronouns (e.g., this and that), which are indicated in brackets ([]) ( Du Bois et al. , 1992 ; Edwards, 2005 ). We did not use transcription conventions to signify overlapping stretches of speech, but these instances were coded. Participant names were changed to pseudonyms in the transcripts. The transcripts and accompanying audio serve as the primary data for this study.

Here, we report on data from two of the four groups during two consecutive breakout sessions. The data from the third group were excluded from analysis because one group member was absent the day of the second recording. The data from the fourth group were excluded from further analysis because the group spent a significant amount of time looking through their notes and reading them to one another rather than discussing their thoughts about the problem set. Our rationale for this decision was that we could not study social metacognition if it was not evident. We will refer to the two groups analyzed in this study as Group A and Group B. The groups attended separate breakout sessions on the same afternoons. Group A consisted of three women: Bella, Catherine, and Michelle, who elected to work together. Group B consisted of one woman and two men: Molly, Adam, and Oscar, who were assigned to work together. The group roles of the participants for each problem set can be found in Supplemental Table 1. Our sample size, while small, is in alignment with sample sizes for foundational discourse analysis ( Cameron, 2001 ; Rogers, 2004 ). For example, one study on social metacognition involved the analysis of one pair of students working on a single physics laboratory problem ( Van De Bogart et al. , 2015 ).

Timeline Creation and Analysis of Silence

Each transcript was analyzed to create a timeline of each breakout session. Start and end times of on-task work, which we defined as students directly working on the problem set, and off-task work, which we defined as students discussing ideas unrelated to the problem set, were recorded. Next, two researchers (S.M.H. and E.K.B.) listened to all transcripts and recorded the start and end time of all silences equal to or greater than 5 seconds in length. The duration of the silences was summed for each transcript. Using this information, a percentage of time spent in silence was calculated for each transcript as follows:

Percentage of time spent in silence = (total duration of silences ≥ 5 seconds ÷ total on-task working time) × 100
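As a minimal, illustrative sketch of this calculation (not the authors’ analysis code), the snippet below computes the percentage from hand-recorded silence intervals; the interval format and the assumption that the denominator is a group’s total on-task working time are ours, based on the description above.

```python
# Illustrative sketch: percent of working time spent in silence.
# silence_intervals: (start_s, end_s) pairs for silences of 5 seconds or longer.
# working_time_s: total on-task working time for the breakout session, in seconds.

def percent_silence(silence_intervals, working_time_s):
    total_silence = sum(end - start for start, end in silence_intervals)
    return 100.0 * total_silence / working_time_s

# Example: three recorded silences (110 s total) in a 50-minute (3,000 s) session.
print(round(percent_silence([(120, 140), (900, 960), (1800, 1830)], 3000), 1))  # 3.7
```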

Qualitative Discourse Analysis

Transcripts of student discourse were analyzed by a team of researchers using MaxQDA 2020. Our basic unit of analysis was an utterance. An utterance is either a word, phrase, statement, or question that an individual or group of students makes while collaborating. A single line of speech from a single student could contain multiple utterances. For example, one student could say, “Yeah, I don’t know. What do you think?,” and this could be broken into three separate utterances with “Yeah” as a single word utterance, “I don’t know” as a phrase utterance, and “What do you think?” as a question utterance. Alternatively, multiple lines of speech from multiple students could compose a single utterance. For instance, when one student interrupts another to complete the other’s thought, the combined statement could be considered an utterance composed by two individuals. Our qualitative discourse analysis of these utterances occurred in multiple, iterative cycles.

First-Cycle Coding.

First-cycle coding began with open, initial coding of all the transcripts ( Saldaña, 2021 ). The goal of our initial coding process was to begin identifying the utterances from our data set that related to social metacognition and reasoning. Two coding schemes were developed to investigate our research questions. The first codebook was developed to capture metacognitive utterances from student interactions. The second codebook was developed to capture the reasoning quality present in the discussion. This dual-coding methodology meant utterances could be double coded as both metacognitive and as a part of reasoning. Data were first coded as metacognitive utterances and then coded for reasoning quality ( Figure 1 ). We discuss the development of each codebook in the following sections.

Figure 1.

Schematic of dual-coding process. In step 1, types of metacognitive utterances throughout each transcript were coded. As a simplified example, the image under step 1 depicts five lines of a single transcript, three of which contain a metacognitive utterance. Metacognitive utterances are words, phrases, statements, or questions students made that were related to their awareness and control of thinking for the purposes of learning. In step 2, each transcript was segmented into problem episodes. Problem episodes consisted of sections of the transcript in which students were solving problems that dealt with cell biology data, e.g., “Explain the results in the YRC panel.” The image under step 2 shows that the first four lines of the example transcript make up a single problem episode (green box). In step 3, the problem episodes were further segmented into reasoning units. Reasoning units consisted of a chunk of discourse in which a student or students were discussing a single collection of connected ideas. The image under step 3 shows that the first two lines of the example transcript make up one reasoning unit, and the fourth line is a separate reasoning unit (purple boxes). In step 4, each reasoning unit was assigned a reasoning code (Supplemental Table 2). The image under step 4 shows the first reasoning unit was assigned with the level 7 code, and the second reasoning unit was assigned with the level 0 code (Supplemental Table 2). Attribution for images: Profile and Profile Woman by mikicon from the Noun Project.

Social Metacognition.

All four authors coded the transcripts for social metacognition, wrote analytic memos, and then met to discuss emergent ideas. Deductive codes originated from prior work on social metacognition: self-disclosure, feedback request, and other monitoring ( Goos et al. , 2002 ). We developed inductive, or emergent, codes based on the utterances present in our data and our knowledge of metacognition as a construct. We refined these codes through discussion, listening to the audio, and careful consideration of which codes aligned with our research questions. Two researchers (S.M.H. and E.K.B.) then coded all four transcripts individually and subsequently met as a team to discuss how each researcher applied the codes. These discussions led us to add, remove, or redefine our existing codes, further refining the codebook. Through these iterations, our codebook stabilized. We then revisited segments of the data that were selected for reasoning analysis and coded them to consensus with the stabilized social metacognition codebook until all discrepancies were resolved. Attribute coding, or the notation of participant characteristics ( Saldaña, 2021 ), was also employed to take note of the group roles during the first-cycle coding for social metacognition. For example, in the nuclear import transcript for Group A, every line from Michelle was coded as a line from the manager, which was her assigned group role that day.

A reliable, systematic methodology for 1) identifying reasoning and 2) assessing its quality was needed. Once sections of the transcripts related to reasoning about scientific data were identified in initial coding, the transcripts were broken into problem episodes. Problem episodes consisted of all utterances in a transcript in which students discussed an answer to a specific problem in the problem set ( Figure 2 , green boxes). For example, there was one problem episode from Group A for problem 2 in the transcription problem set. Given the emphasis on backing (rules, evidence, and data) in the evidence-based reasoning framework (Supplemental Figure 1), we selected only those problems that required students to analyze a diagram or data figure (questions 2 and 4 for the nuclear import problem set and questions 2, 3, and 5 for the transcription problem set).

Figure 2.

Timelines of small group work. Group work timelines are represented as bar graphs with time (in minutes) on the x -axis. The timelines show how each group spent each breakout session. White sections of the timeline indicate when groups were silent. Black sections of the timeline indicate when groups were discussing the problem set. Blue sections of the timeline indicate when groups were discussing ideas unrelated to the problem set. Problem episodes selected for reasoning analysis are indicated on the timelines in green boxes. Groups did not finish the problem set in the same amount of time. For example, Group B finished the problem set for breakout session 1 early, whereas Group A did not complete the problem set in the allotted class time for breakout session 2.

Problem episodes were further parsed into reasoning units. Reasoning units are defined as a conversational chunk of discourse in which a student or students are discussing a single collection of connected ideas. Two researchers (S.M.H. and E.K.B.) then evaluated each reasoning unit against a list of a priori codes consisting of structural reasoning components (premise, claim, backing) derived from the evidence-based reasoning framework ( Brown et al. , 2010 ; Furtak et al. , 2010 ) and correct and incorrect scientific ideas. The evidence-based reasoning framework does not account for accuracy of scientific information, but it was critical in our analysis of the data to consider accuracy (correct vs. incorrect scientific ideas) of the structural reasoning components. Another researcher (J.D.S.) provided insight into what counted as backing and correct scientific ideas throughout this process because of their expertise in the course content and context. We undertook this part of first-cycle coding together and discussed all discrepancies until consensus was reached.

Second-Cycle Coding.

Second-cycle coding began with establishing reasoning quality codes using our first-cycle codes for reasoning. The purpose was to assess the quality of reasoning that was occurring across three dimensions: a reasoning unit’s 1) transactive nature, 2) completeness, and 3) correctness. First, reasoning units were either transactive, meaning two or more students participated in reasoning and one of those students clarified, elaborated, or justified the reasoning of another student(s); or they were nontransactive, in which individual students share their reasoning but their reasoning is not operated on by another student ( Kruger, 1993 ). Second, the reasoning units were also either complete or incomplete. Complete reasoning units were defined by the presence of at least one, clear claim, a premise, and some form of backing that connected the premise to the claim through data, evidence, or a rule ( Brown et al. , 2010 ; Furtak et al. , 2010 ). Incomplete reasoning units were defined as having a claim and/or a premise but lacked a form of backing such as data, evidence, or a rule. Third, the reasoning units were either correct or mixed. A correct reasoning unit could either solely consist of correct scientific ideas, or it could contain an incorrect scientific idea, as long as that incorrect idea was ultimately corrected within the reasoning unit. In contrast, a mixed reasoning unit contained both correct and incorrect scientific ideas, but the incorrect ideas were never corrected. We chose the word “mixed” instead of “incorrect,” because no reasoning units were wholly incorrect.

Combining these three binary parameters resulted in eight codes that were ordered by first prioritizing transactive over nontransactive behavior (Supplemental Table 2). This decision to view transactive behavior as more beneficial than nontransactive behavior aligns with the continuum outlined in the ICAP framework ( Chi and Wylie, 2014 ) and the view that exchanges of reasoning are of higher quality ( Knight et al. , 2013 ). The ordering of the reasoning codes also reflects our decision to prioritize completeness of reasoning units over correctness. We made this choice, because it may be easier for instructors and students to correct ideas rather than to push students to reason with backing. This prioritization and ordering resulted in the view that transactive, complete, and correct reasoning units represent higher-quality reasoning, and nontransactive, incomplete, and/or mixed reasoning units represent lower-quality reasoning. These mutually exclusive codes were then applied to predefined reasoning units in selected problem episodes of the transcripts ( Figure 1 ).
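As a rough illustration of how three binary parameters yield eight ordered codes, the sketch below enumerates levels by weighting transactive over complete over correct; the specific level assignments in Supplemental Table 2 are the authors’, and the bit-weighting here is our assumption, chosen to be consistent with the prioritization described above (e.g., level 7 is transactive, complete, and correct; level 6 is transactive, complete, and mixed).

```python
# Illustrative enumeration of reasoning-quality levels from three binary parameters.
# The ordering weights transactive (4) over complete (2) over correct (1), matching
# the stated prioritization; the authors' actual table may label levels differently.

from itertools import product

def reasoning_level(transactive: bool, complete: bool, correct: bool) -> int:
    return 4 * transactive + 2 * complete + 1 * correct

for t, comp, corr in product([True, False], repeat=3):
    label = (("transactive" if t else "nontransactive") + ", "
             + ("complete" if comp else "incomplete") + ", "
             + ("correct" if corr else "mixed"))
    print(f"level {reasoning_level(t, comp, corr)}: {label}")
```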

To identify the metacognitive utterances that co-occurred with higher-quality reasoning, we examined the results from our dual-coding process, both metacognitive utterance types and reasoning codes, together in the last phase of the second cycle. We relied on pattern coding as our selected second-cycle coding method to uncover categories in our dual-coded data ( Saldaña, 2021 ). Pattern coding is a way to group the results from first-cycle coding into larger categories. Specifically, we gathered all reasoning units with higher-level codes (level 6 and level 7) and looked for patterns among the metacognitive utterances in these higher-quality reasoning units. We investigated both level 6 and level 7 reasoning units during pattern coding, because these reasoning units were both transactive and complete. The only difference was that level 6 reasoning units contained mixed ideas. This process was facilitated by complex code querying in MaxQDA 2020. In addition to first- and second-cycle coding, we also relied on our analytic memos about the data to inform our coding decisions ( Saldaña, 2021 ).

We first present an overview of how the two groups spent their time during the breakout sessions to provide the reader with important context for the analysis that follows. Next, we demonstrate the results from our dual-coding scheme. To address our first research question (What metacognitive utterances do students use during small-group problem solving in an upper-division biology course?), we define the metacognitive utterances students used during small-group problem solving. Then we provide an analysis of reasoning that occurred during small-group problem solving, which is required to answer our second research question. Finally, by tying our two coding schemes together, we address our second research question (Which metacognitive utterances are associated with small groups sharing higher-quality reasoning in an upper-division biology classroom?) by presenting the metacognitive utterances that were associated with higher-quality reasoning in our data set.

How Students Spent Their Time during Group Work

Students spent the majority of the breakout sessions working directly on the problem set. Little to no time was spent off-task discussing ideas unrelated to the problem sets. Group A was rarely off-task and Group B was never off-task ( Figure 2 ). Group B spent more of their time in silence. On average, Group B spent 40% of their working time in silence, whereas Group A spent 14% of their working time in silence. Despite spending a larger percentage of their time in silence, Group B completed the problem sets faster than Group A. When Group B finished early during the first breakout session, they spent the remainder of the group work time getting to know one another, because they had never met. In contrast, the members of Group A knew one another and had previously met. Overall, both groups were on-task and focused on the problem sets during the breakout sessions but approached their work together differently.

What Metacognitive Utterances Do Students Use during Small-Group Problem Solving?

We identified several types of metacognitive utterances that upper-division biology students used during group work throughout the breakout sessions analyzed in this study. Metacognitive utterances are words, phrases, statements, or questions students made that were related to their awareness and control of thinking for the purposes of learning. The metacognitive utterances we identified included “planning,” “statements to monitor understanding,” “corrections of another student,” “questions to monitor understanding,” “requests for information,” “evaluations of self,” and “evaluations of others,” which can all be mapped to the individual metacognitive regulation skills of planning, monitoring, and evaluating ( Table 1 ). While metacognitive utterances related to planning may play an important role in small-group problem solving, they did not directly impact reasoning in this data set and thus were not investigated further.

Types of metacognitive utterances in small-group problem solving

Metacognitive Utterances Related to Monitoring.

The individual metacognitive regulation skill of monitoring involves assessing one’s understanding of concepts while learning ( Stanton et al. , 2021 ). The metacognitive utterances related to monitoring that we identified in our data took the form of both statements (statements to monitor understanding and corrections of another student) and questions (questions to monitor understanding and requests for information) ( Table 1 ). These metacognitive utterances are related to monitoring, because they involve assessing either one’s own or a group member’s understanding of concepts. For example, statements to monitor understanding included assessments of one’s own conceptual understanding through self-corrections or self-explanations, whereas corrections of another student involved assessments of a group member’s conceptual understanding.

Students also used questions to monitor understanding to assess their knowledge of concepts. These questions helped them clarify their own understanding or their group members’ understanding of a concept. These questions were closed in nature, meaning they could be answered with a simple one-word response like “yes,” “no,” or “correct.” Questions of this type included follow-up questions, questions to make sure group members were following along, or requests for confirmation on a shared idea (Table 1). In contrast to the closed nature of questions to monitor understanding, we also found evidence of students using more open-ended questions to request access to their group mates’ thinking (requests for information). Requesting access to a group mate’s thinking is related to metacognition, because one must first be aware of the thinking of others in order to act on it or regulate it. Students made requests for information when they asked group mates to disclose their knowledge and information about a concept beyond a simple yes or no question. These requests for information only occurred when a student was asking for information they had not already supplied themselves and often centered on an interrogative word such as “who,” “what,” “when,” “where,” “why,” or “how” (Table 1).

Metacognitive Utterances Related to Evaluating.

The individual metacognitive regulation skill of evaluating involves appraising one’s plan or approach for learning (Stanton et al., 2021). Two of the metacognitive utterances we identified, evaluations of self and evaluations of others, are related to evaluating, because they involve appraising either one’s own or a group member’s thinking or approach and whether it is effective or relevant to the problem. Evaluations of others took the form of both statements and questions (Table 1). When in statement form, evaluations of others were matter-of-fact critiques that appraised the group’s current solution to a problem in the problem set, like Oscar’s statement “I don’t know how you can even say that there’s DNA present. If it wasn’t immunoprecipitated, it would’ve been washed away.” The audio does not reveal how Oscar’s group mates felt about this critique, nor does Oscar’s statement explicitly invite his group members to engage with it. In contrast, evaluations of others that were in question form requested some sort of engagement with the appraisal from the group. Evaluations of others posed as questions either invited pauses for clarification, reoriented the group back to the problem set, or challenged an idea, all of which can more directly impact reasoning. For example, Michelle’s inquiry, “Does it answer the question why?,” reoriented the group back to the question posed in the problem set.

The frequency of the metacognitive utterance types we identified by individual participant is provided in Table 2 . The most frequently used metacognitive utterance types were questions to monitor understanding, evaluation of others, statements to monitor understanding, and requests for information.

Table 2. Frequency of metacognitive utterances by individual group members

Note. The percentages were calculated based on the number of each utterance type made by an individual divided by the individual’s total number of metacognitive utterances. The cell with each individual’s most frequently used type of metacognitive utterance is highlighted in dark purple. The cell with each individual’s second most frequently used type of metacognitive utterance is highlighted in light purple.
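To make the percentage calculation in the table note concrete, the short sketch below computes per-individual percentages from hypothetical utterance counts. The counts are illustrative placeholders, not values from Table 2.

```python
# Sketch of the percentage calculation described in the Table 2 note.
# Counts are hypothetical and for illustration only.
utterance_counts = {
    "planning": 2,
    "statements to monitor understanding": 10,
    "corrections of another student": 1,
    "questions to monitor understanding": 14,
    "requests for information": 6,
    "evaluations of self": 3,
    "evaluations of others": 8,
}

total = sum(utterance_counts.values())
percentages = {utt: round(100 * n / total, 1) for utt, n in utterance_counts.items()}

# The two most frequent types correspond to the dark- and light-purple cells.
top_two = sorted(percentages, key=percentages.get, reverse=True)[:2]
print(percentages)
print(top_two)
```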

Reasoning in Small-Group Problem Solving

To answer our second research question regarding the metacognitive utterances associated with higher-quality reasoning, we needed to first analyze the quality of reasoning that occurred when participants solved problems in small groups. Guided by the evidence-based reasoning framework, we identified reasoning components in our data and rated them using the reasoning coding scheme described in the Methods (Supplemental Table 2). Overall, the reasoning displayed by both groups in the problem episodes analyzed (green boxes in Figure 2 ) was high in quality (Supplemental Table 2). Group A had nearly double the number of reasoning units compared with Group B (Supplemental Table 2), which aligns with the finding that Group A spent more time talking compared with Group B ( Figure 2 ). In this section, we use examples from participant discourse to illustrate the reasoning coding scheme (Supplemental Table 2) by presenting an example of lower-quality reasoning and then an example of higher-quality reasoning. For each example, we share an excerpt of group discourse with a line-by-line analysis of the discourse dynamics and then offer an analysis of the reasoning units using the coding scheme ( Russ et al. , 2008 ).

An Example of Lower-Quality Reasoning.

In the segment of discourse presented in Figure 3 , Group A is solving problem 2b in the transcription problem set. They are discussing why it is important to control fragment length in a chromatin immunoprecipitation (ChIP) protocol. The discourse in Figure 3 is one conversation composed of two reasoning units (purple boxes), both of which are examples of lower-quality reasoning. Lower-quality reasoning units are nontransactive, incomplete, and/or mixed, meaning students are not operating on one another’s reasoning, they do not provide backing to support linking their premises to a claim, and/or incorrect ideas are present.

Figure 3. An example of lower-quality reasoning. An excerpt of Group A’s discourse or conversation while solving problem 2 in the transcription problem set. The excerpt consists of two lower-quality reasoning units, outlined in purple boxes. Lower-quality reasoning units are nontransactive, incomplete, and mixed, meaning that students are not operating on one another’s reasoning, they do not provide backing to support linking their premises to a claim, and incorrect ideas are present. Each line corresponds to a speaker turn. Line-by-line analysis of the discourse dynamics is provided.

Reasoning Unit Analysis.

In Figure 3 , there are two reasoning units (purple boxes). The first reasoning unit is nontransactive, because no group member directly acts on the reasoning Catherine shares in line 1. On the surface, it may appear as though Michelle and Bella are providing Catherine with confirmation on her reasoning in lines 2 and 3; however, they are not elaborating on the content of Catherine’s reasoning. Therefore, this reasoning unit is considered nontransactive. In terms of structure, Catherine shares a claim about the regulatory proteins breaking or splitting in half. She also shares two premises about 1) the fragments being too small and 2) the goal of ChIP as figuring out “where the proteins bind to DNA.” Given that Catherine’s reasoning lacks backing in the form of data, evidence, or a rule, her reasoning in this first unit is incomplete. Additionally, Catherine’s ideas about the proteins breaking or splitting in half are incorrect. Thus, this first reasoning unit is also mixed. Taken together, the first reasoning unit in this excerpt of discourse was coded as the lowest-quality reasoning (level 0: nontransactive, incomplete, mixed reasoning).

The next reasoning unit in Figure 3 begins with Bella asking Catherine to repeat what she said previously and ends with Bella’s decision about what to write down for this problem and to move on to the next problem in the set. At the start of this unit, Bella is interested in what Catherine said earlier and asks Catherine to repeat herself. Catherine reshares her incorrect claim, and Bella presents her own incorrect claim. Catherine extends her own reasoning, and Michelle agrees with it. Bella asserts her own incorrect claim again. Catherine agrees with Bella in words, yet continues to extend the incorrect claim that she and Michelle agree on. Bella does not write down Catherine and Michelle’s incorrect claim, but moves the group on to the next problem. Despite the fact that Bella initially elicits reasoning from Catherine, she does not seem to be acting on the reasoning shared by her peers. Michelle acts on Catherine’s reasoning once, and Catherine appears to act on Bella’s reasoning twice by confirming and agreeing with her ideas, albeit superficially. Because they are acting on one another’s reasoning, this second reasoning unit is considered transactive. Given the presence of incorrect ideas that go uncorrected by peers and that Group A is solely sharing premises and claims in this section, this unit is considered incomplete and mixed (level 4: transactive, incomplete, mixed reasoning). This is an improvement from Catherine sharing her siloed reasoning in the first unit, but there were missed opportunities to constructively reason as a group ( Figure 3 ).

An Example of Higher-Quality Reasoning.

In the segment of discourse presented in Figure 4 , Group B is solving problem 4a in the nuclear import problem set. They are considering fluorescence resonance energy transfer (FRET) data from a published figure and are tasked with explaining the results in one of the figure’s panels. The discourse in Figure 4 is an example of higher-quality reasoning. Higher-quality reasoning units are transactive, complete, and correct, meaning students are operating on one another’s reasoning, they provide backing to link their premises to a claim, and no incorrect ideas are present, or if they are, they are ultimately corrected.

Figure 4. An example of higher-quality reasoning. An excerpt of Group B’s discourse or conversation while solving problem 4a in the nuclear import problem set. The excerpt consists of two higher-quality reasoning units, outlined in purple boxes. Higher-quality reasoning units are transactive, complete, and correct, meaning that students are operating on one another’s reasoning, they provide backing to support linking their premises to a claim, and no incorrect ideas are present, or if they are, they are ultimately corrected. Each line corresponds to a speaker turn. Line-by-line analysis of the discourse dynamics is provided.

In the first reasoning unit shown in Figure 4 , Oscar shares his idea that something in the figure only makes sense if Ran-GTP is bound. Molly verbalizes that she was thinking the same thing as Oscar and explains what should be occurring if Ran-GTP is not bound in terms of fluorescence output for FRET, and she ties that to what the observed ratio should then be. Adam questions Molly’s reasoning, pondering if both the yellow fluorescent protein (YFP) and cyan fluorescent protein (CFP) molecules should fluoresce when Ran-GTP is not bound. This question triggers Molly to share more of her reasoning, including an accurate description of how FRET works. Oscar then clarifies her idea by stating that you would get more YFP, not necessarily only YFP emission when FRET works. Molly agrees with Oscar, then Oscar ties their co-constructed prediction to the schematic representation of FRET provided in the problem. Molly then connects their reasoning to the observation that the cytoplasm is green in color, which corresponds to a ratio value greater than one. Group B’s reasoning in this unit is transactive in nature, because they are exchanging reasoning. They are displaying complete reasoning by 1) sharing what they know about Ran-GTP concentrations from the prompt (premises) and 2) providing backing for their claims in the form of color observations (data) and ratio relationships (evidence). Their reasoning is also correct (level 7: transactive, complete, correct reasoning).

In the second reasoning unit shown in Figure 4 , Group B follows a similar structure for their reasoning displayed in the first reasoning unit, but this time for what is happening in the nucleus. Molly shares her conclusion about YFP and CFP intensity in the nucleus. Oscar elaborates on her conclusion by adding in his prior knowledge about the location and concentration of Ran-GTP in the nucleus, and Adam brings in the observation that the nucleus should then be very dark in color. Group B’s reasoning is transactive in nature, because they are acting on and building upon one another’s ideas to co-construct shared reasoning. As Group B discusses this problem, they consistently provide backing for their claims in the form of data (color observations) or evidence (ratio relationships) and premises, including what they know about Ran-GTP concentrations and relevant pieces of information shared in the problem prompt. For this reason, their reasoning is complete. Additionally, their reasoning is also correct (level 7: transactive, complete, correct reasoning).

Both reasoning units in Figure 4 were coded as transactive, complete, and correct, which represents the highest-quality reasoning code (level 7: transactive, complete, correct reasoning). As is evident in the example discourse, students in our study often did not start with the data in their discussion of this problem. Rather, students started by sharing their conclusions and what they knew about the problem before drilling down to the data that support their conclusions. In both reasoning units within this example of discourse, Group B took this conclusion-first approach to answering problem 4a.
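As an illustration of how the three coding dimensions combine, the sketch below shows one binary weighting that reproduces the level numbers cited in this section (0, 4, and 7). This mapping is inferred from those examples for illustration only; the authoritative level definitions are those of the reasoning coding scheme in Supplemental Table 2.

```python
# Illustrative (inferred) mapping from the three reasoning dimensions to a
# quality level. It reproduces the levels cited in the text (0, 4, and 7);
# the authoritative definitions are in Supplemental Table 2.
def reasoning_level(transactive: bool, complete: bool, correct: bool) -> int:
    return 4 * int(transactive) + 2 * int(complete) + int(correct)

# Level 0: nontransactive, incomplete, mixed (Figure 3, first unit)
assert reasoning_level(False, False, False) == 0
# Level 4: transactive, incomplete, mixed (Figure 3, second unit)
assert reasoning_level(True, False, False) == 4
# Level 7: transactive, complete, correct (Figure 4, both units)
assert reasoning_level(True, True, True) == 7
```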

Metacognitive Utterances Associated with Higher-Quality Reasoning

To address our second research question (Which metacognitive utterances are associated with small groups sharing higher-quality reasoning in an upper-division biology classroom?), we investigated the overlap and interplay between our two coding schemes. We were particularly interested in the metacognitive utterances situated within higher-quality reasoning units (level 7: transactive, complete, correct reasoning). Four categories emerged from this analysis. The metacognitive utterances that were associated with higher-quality reasoning units included 1) evaluative questioning, 2) requesting and receiving evaluations, 3) requesting and receiving explanations, or 4) elaborating on another’s ideas (Table 3). Some higher-quality reasoning units contained more than one of these metacognitive utterance categories (Supplemental Figure 2). In the following subsections, we highlight illustrative examples of each metacognitive utterance category in transactive, complete, and correct (level 7) reasoning units.

Table 3. Categories of metacognitive utterances associated with higher-quality reasoning
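To make the overlap analysis described above concrete, the minimal sketch below selects metacognitive utterances whose transcript lines fall inside reasoning units coded at level 7. All segment boundaries and codes here are hypothetical placeholders, not data from our transcripts.

```python
# Minimal sketch of the dual-coding overlap step: find metacognitive
# utterances situated within higher-quality (level 7) reasoning units.
# Line numbers and codes are hypothetical.
utterances = [
    {"line": 3, "type": "evaluation of others"},
    {"line": 8, "type": "question to monitor understanding"},
    {"line": 15, "type": "request for information"},
]
reasoning_units = [
    {"start": 1, "end": 10, "level": 7},
    {"start": 11, "end": 20, "level": 4},
]

in_higher_quality_units = [
    u for u in utterances
    if any(r["level"] == 7 and r["start"] <= u["line"] <= r["end"]
           for r in reasoning_units)
]
print(in_higher_quality_units)
```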

Evaluative Questioning.

In this study, a key distinguishing feature of higher-quality reasoning was evaluative questioning. We define evaluative questioning as an appraisal of an approach or thinking formed as a question. For example, Michelle’s question, “Does it answer the question why?,” is considered evaluative questioning, because it is an appraisal of the group’s solution to a problem and is posed as a question to her group members. In contrast, Bella’s question, “Can you explain that?,” seeks an explanation from her group mate Catherine but is not an evaluation of Catherine’s answer or approach. In essence, evaluative questioning occurred in our data set when metacognitive utterances that were questions included within them an evaluation, or when a question to monitor understanding or a request for information overlapped with an evaluation of others ( Table 3 ). Evaluative questioning was found only in level 6 and level 7 reasoning units.

Evaluative questioning took two forms. First, evaluative questioning occurred when students questioned whether or not the constructed reasoning answered the question asked in the problem set. An example of this type of evaluative questioning can be seen in Supplemental Figure 3, when Oscar asked, “But does that have to do with the gradient?,” in line 3. Another example of this type of evaluative question came from Michelle when she asked, “Does it answer the question why?” (Table 1). Her group was trying to answer a problem that asked why Ran-GDP is concentrated in the cytoplasm. Before Michelle’s question, Bella and Catherine had come up with a solution consisting of the correct rule that Ran-GTP has a high affinity for importin and exportin. This solution, while composed of correct backing, did not completely address the problem asked or link the correct backing to a claim. With her evaluative question, Michelle raised the concern that their solution might not fully answer the problem. Her question prompted Catherine to reflect and acknowledge that their solution did not answer why Ran-GDP is concentrated in the cytoplasm. This led the group to discuss the role of regulatory proteins (e.g., GAPs and GEFs). Ultimately, the group established consensus around the correct idea that Ran-GDP is concentrated in the cytoplasm because RanGAP, which activates the GTPase activity of Ran-GTP, is found in the cytoplasm. Michelle’s evaluative question redirected and refocused her group on the problem that was posed.

The second form of evaluative questioning occurred when students would challenge a part of one peer’s reasoning with alternative reasoning. An example of this type of evaluative questioning can be seen in Figure 5 . In this example, Oscar provided his reasoning about how the Ran-GTP binding domain of importin beta will behave as a control in a FRET experiment that his group was discussing, which led Adam to ask an evaluative question in line 8 ( Figure 5 ). Adam’s evaluative question focused on Oscar’s premise that half of Ran-GTP will bind to importin beta by suggesting an alternative. Adam asked, “Wouldn’t you expect almost all of it to bind, because it looks like the same as this one?” In essence, Adam’s evaluative question challenges Oscar’s reasoning with alternative reasoning ( Figure 5 ). An additional example of this type of evaluative question came from Adam when he asked, “Wait, why would you not get both? Don’t both of these fluoresce if it’s not bound?” ( Figure 4 , line 3). Before this question, Molly shared an explanation. Adam then asked for clarification and challenged her explanation by offering an alternative interpretation of what might be happening in the experimental figure. Adam’s evaluative question pushed Molly to explain her reasoning in greater detail for the group.

Figure 5. Evaluative questioning in student discourse that challenges reasoning. A higher-quality (level 7) reasoning unit from Group B, outlined in purple. Adam’s evaluative question is in bolded font. Parts of the discourse and analysis shown in gray text are provided as context for the reader to emphasize what occurs after the evaluative question (black text).

Requesting and Receiving Evaluations.

Another distinguishing feature of higher-quality reasoning in this study was requesting and receiving evaluations. Requesting and receiving evaluations occurred when a closed-ended question to monitor understanding was followed by an evaluation of others or a correction of another student ( Table 3 ). An example of requesting and receiving an evaluation is seen when Adam asked for confirmation on an idea and was corrected by Oscar in lines 7 and 8 of Figure 6 . Adam shared his reasoning posed as a question: “So, wouldn’t it be like the GDP grabs it in here and then sends it through here and then it’s converted to GTP in here?” By doing so he sought confirmation on his idea from his group members. His question to monitor his own understanding opened the floor for feedback from his group members and gave Oscar the opportunity to correct Adam’s thinking. This resulted in Adam accepting Oscar’s correction.

Figure 6. Requesting and receiving evaluations in student discourse. A higher-quality (level 7) reasoning unit from Group B, outlined in purple. Adam’s request is in bolded font in line 7 and Oscar’s evaluation is in bold in line 8. Parts of the discourse and analysis shown in gray text are provided as context for the reader.

In higher-quality reasoning units, not every question to monitor understanding was met with an evaluation or a correction. In other words, asking a question to monitor understanding did not always guarantee a response such as that shown in Figure 6 . Some questions to monitor understanding that sought an evaluation or confirmation were not addressed verbally by the group. We acknowledge the possibility of students receiving simple nonverbal answers (like the shake or nod of a head) from group members that cannot be detected via audio recordings. Alternatively, this may suggest that, when students asked for feedback in this way, they were sometimes ignored. We speculate that some questions to monitor understanding went unmet because the group might have thought the individual was talking to themselves. Additionally, questions to monitor understanding were also found in lower-quality reasoning units. However, in lower-quality reasoning units, questions to monitor understanding were met with simple one-word confirmations or with another question rather than more elaborate evaluations or corrections. How the group responds to requests for evaluation seems to be important.

Requesting and Receiving Explanations.

Another distinguishing feature of higher-quality reasoning in this study was requesting and receiving explanations. Similar to requesting and receiving an evaluation, a request is made by one student and then met by one or more members of the group, but the nature of the request is slightly different. Requests that sought confirmation resulted in evaluations, whereas requests that were open-ended questions resulted in explanations. The nature of the request dictated the response from the group.

Requesting and receiving explanations occurred when an open-ended request for information was met with an explanation composed of more than a single-word answer ( Table 3 ). For example, consider the reasoning unit in Figure 7 that begins with a request for information from Bella, “Can you explain that? I’m very confused” ( Figure 7 , line 1). Bella directly asked for an explanation from her group members and received one. Interestingly, every request for information was met with a response in our study. No request for information went unmet. This suggests that, when students in our study asked for help this way, they were never ignored. While requesting and receiving explanations were more common in higher-quality reasoning units, requesting and receiving explanations were occasionally found in lower-quality reasoning units. However, in lower-quality reasoning units, requests for information were met with explanations that often included incorrect or mixed ideas that were never corrected. We underscore that how these requests are met appears to be important.

Figure 7. Requesting and receiving explanations in student discourse. A higher-quality (level 7) reasoning unit from Group A, outlined in purple. Bella’s request is in bolded font in line 1, and Catherine’s explanation is in bold in line 2. Parts of the discourse and analysis shown in gray text are provided as context for the reader.

Elaborating on Another’s Reasoning.

Elaborating on another student’s reasoning was another category for metacognitive utterances associated with higher-quality reasoning units. Elaborating on another’s reasoning occurred when students would self-explain a group member’s reasoning or elaborate on a group member’s reasoning beyond a simple “yeah” or “okay” ( Table 3 ). These self-explanations and elaborations were unprompted statements to monitor understanding. To illustrate this category of elaborating on another’s reasoning, Figure 8 shows an excerpt of discourse between Michelle and Bella as they formed a conclusion about a data figure on the occupancy of histone acetylation for a region of a yeast chromosome. In the first part of this excerpt (lines 1–13), Michelle and Bella co-constructed their prior knowledge about the role of acetylation and methylation of nucleosomes in regard to transcription. Bella then restated the question in the problem set (“Okay, so what can you conclude?”), and Michelle responded with her reasoning by providing evidence about a relationship between one gene in the figure and the level of acetylation (“At [gene 1], we’ve got a lot of acetylation”) and a claim (“so that means that that’s a gene that’s being regularly transcribed”). Bella then elaborated on Michelle’s reasoning by providing data (“Oh, yeah, because at [gene 1] is when it shoots up”) and clarifying the evidence Michelle stated by bringing in the idea that the increase in acetylation is occurring at the start of the gene (“So, at the advent of [gene 1], we see dramatic increase in acetylation”). Michelle then provided unprompted confirmation of Bella’s idea by defining the start of the gene as the promoter region based on what she remembers from class (“Yeah, because she said the little arrow thing means promoter, so we’ve got a promoter right there. So, right there, it’s getting going”).

Figure 8. Elaborating on another’s reasoning in student discourse. A higher-quality (level 7) reasoning unit from Group A, outlined in purple. Bella’s and Michelle’s elaborations are in bolded font in lines 16 and 17. Parts of the discourse and analysis shown in gray text are provided as context for the reader.

As seen in Figure 8, some unprompted elaboration found in the higher-quality reasoning units was composed of “yes, and/because/so” statements (lines 16 and 17). However, “yes, and/because/so” statements were not always indicative of higher-quality reasoning. These elaborative statements to monitor understanding, while more common in higher-quality reasoning units, were also found in lower-quality reasoning units. In some instances of lower-quality reasoning, these elaborative cues masked disagreement. Take, for instance, Group A’s use of elaborative cues in Figure 3 (lines 6 and 9). Catherine’s use of “yeah, and” statements may have been perceived as agreement by her group members even when, based on the content of the conversation, it was clear to the research team that the group was not in complete agreement about their reasoning (Figure 3). The appearance of agreement and elaboration could be misleading for students working in small groups in real time.

In summary, metacognitive utterances that 1) stimulated reflection about the solution or presented alternative reasoning, 2) provided evaluations when requested, 3) provided explanations when requested, or 4) elaborated on another’s reasoning in an unprompted manner emerged as critical aspects of higher-quality reasoning in the data analyzed for this study.

Discussion

Analysis of discourse during small-group problem solving in an upper-division biology course revealed seven types of metacognitive utterances (Table 1) and four categories of metacognitive utterances that were associated with higher-quality reasoning in our study (Table 3). These findings have not been described in this context before and begin to define the social metacognition that occurs in the life sciences. We situate our findings from this unique context among broader findings on social metacognition, outline implications for life science instructors based on our data, and suggest future directions for research on social metacognition in the life sciences.

Social Metacognition and Reasoning in the Life Sciences

Our findings build on prior research on social metacognition from mathematics, physics, and the learning sciences and contribute the first exploration of social metacognition in the life sciences. Prior work conceptualized metacognitive utterances during small-group problem solving in secondary mathematics courses broadly as “new ideas,” or occurrences when new information was recognized or alternative approaches were shared, and “assessments,” or occurrences when the execution, appropriateness, or accuracy of a strategy, solution, or knowledge was appraised (Goos et al., 2002). Other researchers have used the individual metacognitive regulation skills of planning, monitoring, and evaluating to categorize metacognitive utterances from small group work during content analysis (Kim and Lim, 2018). Our rich descriptions of seven types of metacognitive utterances in an upper-division biology course move the field forward by 1) exploring social metacognition in a new context and 2) conceptualizing metacognitive utterances in social settings beyond new ideas and assessments (Goos et al., 2002; Van De Bogart et al., 2017) and the three broad individual metacognitive regulation skills of planning, monitoring, and evaluating (Lippmann Kung and Linder, 2007; Siegel, 2012; De Backer et al., 2015; Kim and Lim, 2018). We propose alignment of the metacognitive utterance types and categories we found in our data set to the individual metacognitive regulation skills of planning, monitoring, and evaluating in order to begin to bridge individual metacognitive theory to the social metacognition framework (Tables 1 and 3). “How are individual and social metacognition related?” remains an open question in the field. More research is needed to determine additional criteria that make metacognition social and not individual.

Seven types of metacognitive utterances emerged in our data ( Table 1 ). For the metacognitive utterances that were questions, the nature of the question determined the type of response it received. For example, open-ended questions like requests for information elicited more elaborate explanations from group members. In contrast, a closed request for feedback, like a question to monitor understanding (“Is my idea right?”), often elicited single-word responses (“Yeah.”). Although this is the first study of social metacognition in the life sciences, this finding aligns with prior research on group work. In a study on student discourse in a life science course, open questions using the words “how” and “why” led to more conceptual explanations from peers ( Repice et al. , 2016 ). In another study, peer learning assistants’ use of open-ended prompting questions and reasoning requests during clicker discussions encouraged students to share their thinking and elicited student reasoning ( Knight et al. , 2015 ).

The four categories of metacognitive utterances that were associated with higher-quality reasoning in our study (Table 3) extend the ways that a student’s thinking can become the subject of discussion (Goos et al., 2002). Our categories of evaluative questioning and requesting and receiving an explanation build on what Goos et al. (2002) called a “partner’s challenge,” when one student (Student A) asks another student (Student B) to operate on the second student’s (Student B’s) thinking in order to clarify their meaning. Evaluative questioning extends this idea of a partner’s challenge to also include when students question whether or not their co-constructed reasoning or solution answered the question asked in the problem set. Both types of evaluative questioning that we identified in our study and requesting an explanation elicited further reasoning from group members. Our category of requesting and receiving evaluations also builds on what Goos et al. (2002) called an “invitation,” when one student (Student A) asks another student (Student B) to operate on the first student’s (Student A’s) thinking in order to receive feedback. Requesting and receiving evaluations extends this idea of invitation and suggests that invitations are particularly powerful when met. Our category of elaborating on another’s thinking aligns with what Goos et al. (2002) called “spontaneously, partner-initiated other-monitoring,” in which one student (Student A) operates on another student’s (Student B’s) thinking in an unprompted manner in order to provide unrequested feedback. Unique to our investigation of higher-quality reasoning, we found that spontaneous other-monitoring often appeared as statements of agreement or elaboration, that is, “yes, and/because/so,” rather than corrections. In fact, we did not see evidence of unprompted corrections of another student in our analysis of higher-quality reasoning. Every correction in our data set was invited or requested by a group member (Figure 6).

In research on reasoning and argumentation, the moments when students disagree appear to be critically important (Kuhn, 1991). In fact, some reasoning and argumentation frameworks rank discussions with counterclaims, disagreements, and rebuttals as more sophisticated (Osborne et al., 2004). Disagreements were present in some but not all of our higher-quality reasoning units. A few disagreements appeared overtly as direct corrections of another student (“No, …”), but more often disagreements and counterclaims were present in our data set in the form of subtle evaluations of the group’s solution through evaluative questioning (see Adam’s and Oscar’s questions in Figures 4 and 5 and Supplemental Figure 3). The phrasing of critiques and counterclaims as questions might be a way to be polite, soften the blow, or save face in a group setting, because outright disagreement with one’s peers can be seen as socially undesirable. Group members might feel more open to discussion or comfortable with the possibility of having an incorrect idea when an alternative idea is presented as a question. On the other hand, phrasing critiques and counterclaims tentatively might cause some contributions to be overlooked if not stated directly or assertively (see the exchange in Figure 3), especially depending on the group dynamic (i.e., if there is a more dominating group member present).

Implications for Instructors

We found that metacognitive utterances that 1) stimulated reflection about the answer or presented alternative reasoning, 2) provided evaluations when requested, 3) provided explanations when requested, or 4) elaborated on another’s reasoning in an unprompted manner were associated with higher-quality reasoning. Students are likely to need structured guidance on how to be socially metacognitive ( Chiu and Kuo, 2009 ; Stanton et al. , 2021 ) and how to reason ( Knight et al. , 2013 ; Paine and Knight, 2020 ). Our rich, qualitative work begins to provide the foundational knowledge needed to develop this guidance.

Scripts or prompts for social metacognition could be provided to students working in small groups ( Miller and Hadwin, 2015 ; Kim and Lim, 2018 ). Other researchers, particularly in the realm of computer supported collaborative learning, have identified and used scripting tools to structure and sequence collaborative online interactions like small-group problem solving ( Miller and Hadwin, 2015 ; Kim and Lim, 2018 ). Scripting for social metacognition in a life science classroom could involve providing prompts for students to use during group work and then modeling when and why to use these prompts. Based on our data, we suggest several possible prompts in Table 4 . Incorporating these prompts during small group work could encourage students to practice aspects of social metacognition that were associated with higher-quality reasoning in our study. The effectiveness of these prompts in promoting social metacognition and reasoning is unknown and will be tested by our lab in a future study.

Table 4. Prompts that may promote social metacognition during small-group problem solving

Another way to possibly facilitate social metacognition during small-group problem solving is through the use and modification of group roles. Other researchers suggest that students’ natural role choices are not necessarily optimal and that consideration of group roles can improve group discussions (Paine and Knight, 2020). In our study, students took on a defined group role as either recorder, presenter, or manager, which are common group roles for a POGIL-style classroom (Moog et al., 2006). To facilitate social metacognition, these group roles could be expanded and tested in the following ways. First, we do not suggest expanding the recorder role, because it was already the most demanding group role in our study. Second, the presenter role could be expanded to include the role of prompter. The role of the prompter would be to encourage the group’s use of scripting prompts throughout group work (Table 4). The prompter could also be tasked with ensuring that any questions asked by a member of the group are answered, because we found that requests and invitations were particularly powerful when met. Finally, the manager role could be expanded to include the role of moderator. The role of the moderator would be to encourage active listening.

A particularly interesting form of active listening to consider for the expanded role of moderator is apophatic listening ( Dobson, 2014 ; Samuelsson and Ness, 2019 ). Apophatic listening is a process in which learners start by being quiet to give space for a speaker to share their ideas. Learners use this silence to temporarily suspend their expectations and reflect on the speaker’s ideas before actively participating in the conversation. After actively listening to the speaker, learners then ask follow-up questions and interpret what the speaker shared. The listener then shares their alternative understanding with the speaker, and the listener and speaker work together to collectively create a shared and mutual understanding ( Samuelsson and Ness, 2019 ). Adding the role of moderator to encourage apophatic listening could ensure all group members are heard. In our study, apophatic listening may have been more present in Group B’s discourse because of the greater proportion of silence and turn-taking during group work compared with Group A’s overlapping talk ( Figure 2 ).

Structured guidance on how to reason is also needed in college life science classrooms ( Paine and Knight, 2020 ). Another study of upper-division biology students showed that instructional cueing for reasoning led to higher-quality reasoning during clicker question discussions ( Knight et al. , 2013 ). In alignment with these previous findings from a different population of upper-division biology students ( Knight et al. , 2013 ), the majority of the reasoning units analyzed in our work were higher in quality (Supplemental Table 2). A combination of factors, including the prior educational experiences of the studied population, the nature of the task itself ( Zagallo et al. , 2016 ), and the course expectations around sharing reasoning, may have been sufficient to elicit higher-quality reasoning in our study. Notably, students in our study were not explicitly taught to reason using the evidence-based reasoning framework ( Brown et al. , 2010 ), and the most common approach to reasoning in our data set was a conclusion-first or claim-first approach ( Figure 4 ). It might be helpful to teach students to reason during small-group problem solving with a data-first approach (Supplemental Figure 1) so their claims flow logically from backing (data, evidence, rules).

Future Directions for Research

Our initial investigation of social metacognition during small-group problem solving in a life science context reveals a hypothesis for future testing as well as interesting areas for further research. Based on our results, we hypothesize that certain categories of metacognitive utterances shared by students lead to higher-quality reasoning from their groups. In other words, reasoning is improved when students employ specific categories of metacognitive utterances during group work. If this is true, then we predict that training students to be socially metacognitive through the use of specific prompts will lead to higher-quality reasoning and group performance. For example, we predict that incorporating evaluative questioning and requesting and receiving explanations will transition students to higher-quality, more transactive, complete, and correct reasoning. We are testing this hypothesis using scripting interventions that structure and sequence group collaboration as others have done in the past (Miller and Hadwin, 2015). Alternatively, the inverse may be true, and higher-quality reasoning may lead to certain types of metacognitive utterances being shared. Our foundational work uncovered a possible association between metacognitive utterances and higher-quality reasoning, but the strength and directionality of the relationship need to be further investigated. Additionally, our findings should be validated in other contexts within the life sciences. For example, the social metacognitive utterances we identified in our work may vary from those students use in an organismal biology class or in courses that use group work in a synchronous online format (Zheng et al., 2019).

Other areas for future research on social metacognition and reasoning quality in the life sciences include the role of silence, cultural variations in collaboration, and group diversity. We found that Group B was more successful in their problem solving and spent more of their on-task time in silence compared with Group A (Figure 2). This silence was often, but not always, between problems in the problem set, suggesting it could represent group members silently reading, writing, or thinking. Capturing video data in future work can help us understand what is occurring during those silent pauses. The idea of silent time during group work may be counterintuitive to the notion that effective collaboration involves constant conversation. In fact, our findings suggest it may be beneficial for all group members to have silent time to process and reflect on their own thoughts and the thoughts of others. This finding could have a direct impact on tools intended to measure active learning using sound (Owens et al., 2017). However, it is important to note that this result may simply capture group differences in conversational style or cultural variations in the way collaboration is viewed (Alcalá et al., 2018). For instance, the overlapping talk that left minimal silences in Group A’s dialogue may reflect the group’s enthusiasm rather than rude interruption.

Cultural aspects of collaboration and the impact of group diversity on social metacognition and reasoning quality should be considered in future work (Carter and Phillips, 2017; Alcalá et al., 2018). For example, how does a more surface-level homogeneous group (e.g., a group of all women) compare to a more surface-level diverse group (e.g., a group of men and women) in terms of social metacognition and reasoning quality? In other research, dissenting group members who were in the social majority found surface-level diverse groups to be more accepting than surface-level homogeneous groups (Phillips and Loyd, 2006). The two groups in our study were mostly surface-level homogeneous, which may have impacted how students perceived and shared potential disagreements. While there was high surface-level homogeneity among group members in our study, we are unable to comment on the deep, cognitive-level diversity (i.e., differences in knowledge, skills, and experiences) in the groups aside from the frequencies of metacognitive utterances by participant (Table 2). Our work does not reveal the impact that personality, prior knowledge, group role, and preference for group work may have on an individual’s frequency of metacognitive utterance use. Individual and group differences will be important to consider when investigating social metacognition and reasoning quality in future studies.

Limitations

Our study was designed to qualitatively explore social metacognition and reasoning during small-group problem solving in an upper-division biology course through discourse analysis. Our sample size, while small, is in line with sample sizes traditional for foundational discourse analysis ( Cameron, 2001 ; Rogers, 2004 ). This type of analysis is time and labor intensive. For example, on average, each transcript we analyzed contained approximately 700 coded segments. We do not aim to make generalizations from our study. Rather, our goal is to present a thick description of the metacognitive utterances and reasoning from two groups over the course of two consecutive breakout sessions in an upper-division biology course. Deep and rich analysis of this type is warranted to expand our understanding of emerging areas of research in the life sciences, like social metacognition.

Social metacognition is context dependent and discipline specific. Our analysis and results from an upper-division biology course may not reflect the nature of social metacognition in other contexts, like an introductory biology course that uses team-based learning. Not all utterances that students made were metacognitive in nature. For example, the question “Alright, so what should I write?” and statements like “ChIP is genome-wide” were not coded as being metacognitive in nature. These examples have been taken out of context, and we caution readers that the context around these utterances, as in all discourse analysis, was important and critical in our coding decisions. For this reason, it is important to present excerpts of discourse in the results, when possible, rather than single quotes.

We recognize that language is just one system that people use to create meaning. We relied on audio recordings in our analysis of discourse during small-group problem solving. In future studies, we can use video recordings to investigate how participants use gestures, objects, and technology. This would allow us to explore other nonverbal systems that interact with language during small-group problem solving. Additionally, the evidence-based reasoning framework we used breaks reasoning into its component parts and focuses on structure without focusing on the holistic nature of the reasoning or whether it is scientifically accurate ( Brown et al. , 2010 ; Furtak et al. , 2010 ). This is a limitation of the reasoning framework we used, and one that we attempted to address by accounting for correctness in our reasoning coding scheme.

Social metacognition is important for undergraduates to fully benefit from increasing opportunities for small-group work in life science courses. This work begins to define metacognitive utterances shared by undergraduate students during small-group problem solving in life science courses and suggests a relationship between metacognitive utterances and higher-quality reasoning, which is a valued outcome in the life sciences. Our data suggest it may be important to provide life science undergraduates with guidance on how to be socially metacognitive through the use of scripted prompts and modified group roles during small-group work.

Supplementary Material

Acknowledgments.

We are grateful to members of the Biology Education Research Group at UGA for their feedback on our work. We thank Dr. Paula Lemons and Dr. Mariel Pfeifer for their feedback on an earlier version of this article and the reviewers for their critical insights. We also acknowledge Julio Cordero for his help with preliminary data analysis. This material is based on work supported by the National Science Foundation under grant no. 1942318. Any opinions, findings, and conclusions or recommendations expressed in this material are those of the authors and do not necessarily reflect the views of the NSF.


'.concat(e,"

\n ').concat(n,'\n

\n ').concat(t,'\n

This page will automatically redirect in 5 seconds...

Social stress, problem-solving deficits contribute to suicide risk for teen girls, research suggests

by American Psychological Association, May 25, 2023 (via Medical Xpress)

IMAGES

  1. PPT

  2. Models of Metacognitive Strategies in Solving Words Physics Problems

  3. Cognitive and metacognitive model of mathematical problem solving

  4. metacognition cycle

  5. Metacognition Regulation Cycle Example

  6. 20 metacognitive questions to engage your science learners

VIDEO

  1. Metacognition.wmv

  2. Science and Technology Cannot Be the Answer (David Bohm)

  3. Metacognition

  4. Webinar Effective metacognition

  5. Metacognition

  6. Joëlle Proust: "la métacognition" -- Centre de sciences cognitives/UniNE

COMMENTS

  1. 20 metacognitive questions to engage your science learners

    Metacognition is more than 'thinking about thinking' or 'learning about learning'. It has three key stages: planning, monitoring and evaluating. Firstly, the planning stage involves finding the problem and choosing the right skills and knowledge in order to solve it. Secondly, the monitoring stage is a chance to reflect on and change ...

  2. Metacognition

    The terms used in the science of learning literature for the processes associated with metacognition are cognitive knowledge and cognitive regulation. ... J.H. (1976). Metacognitive Aspects of Problem Solving. In L.B. Resnick (Ed.), The Nature of Intelligence (pp. 231-236). Hillsdale, NJ: Erlbaum. Hacker, D.J. (1998). Chapter 1. Definitions and ...

  3. PDF Metacognitive Skills and Problem-Solving

    solution, its accuracy is checked. Therefore, problem-solving requires using cognitive skills to know what and how to do and controlling the process. Such practices refer to metacognition and increase problem-solving achievement by enabling students to represent the problem mathematically and try various strategies on it (Davidson & Sternberg ...

  4. Metacognition

    Metacognition is, put simply, thinking about one's thinking. More precisely, it refers to the processes used to plan, monitor, and assess one's understanding and performance. Metacognition includes a critical awareness of a) one's thinking and learning and b) oneself as a thinker and learner. Initially studied for its development in young ...

  5. Helping Students Ask Questions and Define Problems

    Question Formulation Technique: QFT is a structured approach to asking questions. It improves communication skills, critical thinking, and problem-solving skills as well as metacognition. It also increases student autonomy and ownership of learning. Need to Know Questions: These questions are used to guide students' inquiry and drive their ...

  6. What Is Metacognition? How Does It Help Us Think?

    Metacognition is the practice of being aware of one's own thinking. Some scholars refer to it as "thinking about thinking." Fogarty and Pete give a great everyday example of metacognition ...

  7. Metacognition

    A metacognitive thinker knows when and how he learns best, and employs strategies to overcome barriers to learning. As students learn to regulate and monitor their thought processes and understanding, they learn to adapt to new learning challenges. Expert problem solvers first seek to develop an understanding of problems by thinking in terms of ...

  8. The Effect of Metacognitive Instruction on Problem Solving Skills in

    Pugalee D. K. Writing, Mathematics, and Metacognition: Looking for Connections Through Students' Work in Mathematical Problem Solving. School Science and Mathematics. 2001; 101 (5) ... doing guided metacognitive problem solving exercises in small groups and presentation by the representative of the group and correction of the errors.

  9. (PDF) Metacognitive Skills and Problem-Solving

    The results showed that metacognitive skills have a significant effect on students' problem-solving success. The study found that students with high metacognitive skills tend to solve the ...

  10. Metacognition & Problem Solving

    Successful learners use metacognition to facilitate their problem solving. This is one of the key findings of the National Academy of Sciences' synthesis of decades of research on the science of learning explained in How People Learn: Brain, Mind, Experience, and School. Below we explain metacognition and provide the vocabulary to teach it.

  11. The cycle of problem posing, problem solving, and metacognition

    Concerning the steps of problem-solving in science courses, it is observed that four steps consisting of understanding the problem, identifying the problem, carrying out a plan for the solution ...

  12. Learning to Think Mathematically: Problem Solving, Metacognition, and

    Cognitive Science, 2, 155-192. ... The role of metacognition in mathematical problem solving: A study of two grade seven classes. Final report to the National Science Foundation of NSF project MDR 85-50346. Lucas, J. (1974). The teaching of heuristic problem-solving strategies in elementary calculus.

  13. Fostering Metacognition to Support Student Learning and Performance

    Abstract. Metacognition is awareness and control of thinking for learning. Strong metacognitive skills have the power to impact student learning and performance. While metacognition can develop over time with practice, many students struggle to meaningfully engage in metacognitive processes. In an evidence-based teaching guide associated with ...

  14. Full article: Exploring medical students' metacognitive and regulatory

    To develop an instrument that captures metacognitive and regulatory competence in diagnostic problem solving, we began with an existing inventory, the Inventory of Metacognitive Self-Regulation (IMSR) [Citation 30]. The original IMSR was domain-general and designed to capture metacognitive regulatory skills in science and mathematics problem ...

  15. Metacognition

    Metacognition refers to a range of processes and strategies used to assess and monitor knowledge. It includes the "feeling of knowing" that accompanies problem solving, the ability to distinguish ideas about which we are confident from those which we doubt (Tarricone, 2011). Piaget's study of children's intelligence included a method ...

  16. Metacognitive Skill

    Problem solving and metacognition. Barbara Blummer, Jeffrey M. Kenton, in Improving Student Information Search, 2014. Problem solving and metacognition. Brown (1977) recognized the benefits of metacognitive skills for problem solving. She identified these skills as "predicting, checking, monitoring, reality testing and coordination and control of deliberate attempts to learn or solve ...

  17. TEAL Center Fact Sheet No. 4: Metacognitive Processes

    Rather than viewing reading, writing, science, social studies, and math only as subjects or content to be taught, instructors can see them as opportunities for learners to reflect on their learning processes. ... Metacognitive aspects of problem solving. In L. B. Resnick (Ed.), The nature of intelligence (pp. 231-236). Hillsdale, NJ: Lawrence ...

  18. [2305.10601] Tree of Thoughts: Deliberate Problem Solving with Large

    Computer Science > Computation and Language. arXiv:2305.10601 (cs) ... Our experiments show that ToT significantly enhances language models' problem-solving abilities on three novel tasks requiring non-trivial planning or search: Game of 24, Creative Writing, and Mini Crosswords. For instance, in Game of 24, while GPT-4 with chain-of-thought ... (A toy sketch of this style of search appears after this list.)

  19. Self-Regulated Strategy Development for Algebra Problem Solving

    These practices include explicit instruction, metacognition, use of visual representations, ongoing formative assessment and feedback, multiple examples, and self-regulation. We describe the promising and practical potential of SRSD and STAR, a research-based, metacognitive problem-solving strategy, for instruction in algebraic problem-solving.

  20. Moderating Role of Creative Mindset in the Effect of Metacognitive

    Metacognitive experience, measured by processing fluency, contributes to divergent thinking performance; however, whether it exhibits varying effects on insight problem-solving remains unknown. Additionally, as individuals' interpretation of metacognitive experience is influenced by their creative mindset, whether creative mindset plays a role in the relationship between metacognitive ...

  21. The influence of metacognition in mathematical problem solving

    Abstract. This paper is a review of ten papers about the relation of metacognition and mathematical problem solving. So, the aim of this paper is to analyze the influence of metacognition in mathematical problem solving among low-, average-, and high-performing students. Metacognition is an important factor of mathematical problem solving.

  22. Cognitive, metacognitive, and motivational aspects of problem solving

    This article examines the role of cognitive, metacognitive, and motivational skills in problem solving. Cognitive skills include instructional objectives, components in a learning hierarchy, and components in information processing. Metacognitive skills include strategies for reading comprehension, writing, and mathematics. Motivational skills include motivation based on interest, self ...

  23. "Oh, that makes sense": Social Metacognition in Small-Group Problem Solving

    Social metacognition researchers have focused on identifying the metacognitive "utterances," or words, phrases, statements, or questions, students use during small-group problem solving. Metacognitive utterances are identified through discourse analysis, which is the investigation of socially situated language (Cameron, 2001; Rogers, 2004 ...

  24. Wind energy has a massive waste problem. New technologies may be ...

    The process, which the company has been working on in partnership with Aarhus University, the Danish Technological Institute and US-based epoxy company Olin, uses a liquid chemical solution to ...

  25. Social stress, problem-solving deficits contribute to suicide risk for

    The research, "Social Problem-Solving and Suicidal Behavior in Adolescent Girls: A Prospective Examination of Proximal and Distal Social Stress-Related Risk Factors," was published online May 25 ...
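
Item 18 above (Tree of Thoughts, arXiv:2305.10601) describes an algorithmic idea: a language model proposes several candidate intermediate "thoughts" at each step, the most promising ones are kept and expanded, and the rest are discarded, so problem solving becomes a deliberate search over a tree of partial solutions. The sketch below is only a toy illustration of that idea, not the authors' implementation: propose_thoughts and score_thought are hypothetical stand-ins for the language-model calls the paper uses, and a simple breadth-first, keep-the-best-k loop takes the place of its search procedures.

    # Toy Tree-of-Thoughts-style search (illustration only).
    # propose_thoughts and score_thought are hypothetical stand-ins for LLM calls.
    from typing import List

    def propose_thoughts(state: str) -> List[str]:
        """Propose candidate next steps for a partial solution (an LLM call in practice)."""
        return [state + step for step in ("A", "B", "C")]

    def score_thought(state: str) -> float:
        """Rate how promising a partial solution looks (an LLM 'value' call in practice)."""
        return state.count("A") / max(len(state), 1)

    def tree_of_thoughts(root: str, depth: int = 3, beam: int = 2) -> str:
        """Breadth-first search over thoughts, keeping the `beam` best states per level."""
        frontier = [root]
        for _ in range(depth):
            candidates = [t for s in frontier for t in propose_thoughts(s)]
            frontier = sorted(candidates, key=score_thought, reverse=True)[:beam]
        return max(frontier, key=score_thought)

    if __name__ == "__main__":
        print(tree_of_thoughts(""))  # prints the highest-scoring leaf under the toy scorer

Replacing the stand-ins with model calls, and the simple scorer with the kind of value judgment the abstract alludes to, recovers the overall structure described for tasks such as Game of 24; the authors' actual proposal and evaluation strategies are detailed in the paper itself.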