Problem solving followed by instruction

This section of the guide focuses on process-oriented guided inquiry learning (POGIL), contrasting cases, and productive failure. We included these pedagogies because they are well-defined examples of problem solving followed by instruction and are either widely used in undergraduate science education or have a strong literature base. Readers may also be interested in exploring problem-based learning, and we provide two summaries as a starting point.

Process-oriented guided-inquiry learning (POGIL)
  • In the POGIL approach, students work in collaborative groups of 3-4 during class time, facilitated by an instructor.
  • POGIL comprises two components: guided-inquiry learning and development of process skills. The additional focus on process skills makes POGIL different from most collaborative approaches.
  • The POGIL materials (activities) are written to provide a structured guided-inquiry opportunity. This is accomplished by carefully structuring the material to follow a 3-phase learning cycle (based on Lawson et al., 1989, which is modified from Karplus’ original cycle) consisting of exploration, concept invention, and application. Content and concepts are introduced to the students via the learning cycle.
  • The process skills that POGIL develops are communication, assessment (particularly self-assessment), teamwork, management, information processing, critical thinking, and problem solving.
  • In the pure POGIL model, a class session contains little or no traditional lecture. However, elements of the POGIL approach can be incorporated into a more typical lecture format. Students work on activities in small groups and report out to the whole class at various stages during the learning cycle to ensure understanding by all students.
  • The use of rotating roles in the group encourages interdependence among group members and thereby helps keep group members moving together in solving the problems. The four typical roles are manager, recorder, presenter, and reflector, although different instructors/courses may use different roles.
  • POGIL is used in many undergraduate STEM disciplines, including anatomy/physiology, bioscience, chemistry, computer science, engineering, and a number of health sciences courses, and in a variety of institutions. It has been used at all levels of undergraduate education. Studies have evaluated performance outcomes such as course grades, standardized and course exam performance, and drop/fail/withdrawal rates. Results showed a positive impact on performance and content knowledge; however, although many studies had comparison groups, most did not account for prior student preparation and student identity. For more details, see the review by Rodriguez et al., 2020, below.
  • Social constructivism is the theoretical framework underlying POGIL.
    • The basis of constructivism is that students construct meaning (e.g., develop concepts and models) through active involvement with the material and by making sense of their experiences.
    • Social constructivism assumes that students’ understanding and sense-making are developed jointly in collaboration with other students.
  • The principles of guided inquiry and cooperative learning are also essential elements of POGIL.
    • Guided inquiry: Inquiry-based approaches start with a driving question or issue; by studying this question or issue, students construct new knowledge and understanding. The POGIL approach most closely follows Staver and Bay’s “Structured Inquiry.” In the POGIL approach, the instructor selects (or writes) an activity organized around a central question. The activity focuses on the central question and, via the 3-phase learning cycle, uses guiding questions that help students uncover the concepts and terms related to the central question.
    • Cooperative learning: The literature shows the biggest gains in student learning typically come from using cooperative learning, an approach based on social constructivism. In cooperative learning, students work in small groups that are structured to encourage interdependence. Key characteristics of a cooperative-learning environment are: each student is given a task or responsibility to help the group succeed; there is individual and group accountability for learning; students discuss with each other to assist all members in their learning; and social/professional skills are taught in addition to the material. See more in the Group Work guide.
Moog R. (2014). “Process oriented guided inquiry” In M. A. McDaniel, R. F. Frey, S. M. Fitzpatrick, and H. L. Roediger (Eds.), Integrating cognitive science with innovative teaching in STEM disciplines [E-reader version], 147-166. doi: 10.7936/K75Q4T1X. This chapter describes a POGIL classroom in detail and compares it to a more traditional classroom. It delves into key features of the POGIL approach such as group member roles, materials following the learning cycle, development of process skills, and the instructor’s role as facilitator. The chapter ends with some early studies on the impact of POGIL on performance, all in chemistry. The POGIL approach is based on the theory of social constructivism, as well as the principles of guided inquiry, cooperative learning, and process-skill development. Core characteristics are: 1) students work collaboratively in small groups of 3-4, 2) activities are designed to use the learning cycle and include process skills, 3) groups work during class time with a facilitator present, and 4) the dominant mode of instruction is student centered, not instructor centered (i.e., use of lecture). The assignment of roles to group members, such as manager, recorder, presenter, and reflector, helps create interdependence in solving the activity. To provide a structured guided-inquiry approach, student learning activities use a learning cycle with three phases: exploration, in which students work with a model (e.g., data, figures, tables); concept invention, in which students use the model to develop a concept or terms; and application, in which students apply the concept or terms to new situations. A key component of POGIL that often separates it from other collaborative and constructivist approaches is the deliberate focus on process skills, such as oral and written communication, self-assessment, teamwork, critical thinking, and analysis. Instructors should note that this chapter is an excellent practical resource providing the key features and philosophy of the POGIL approach, a detailed and vivid description of a POGIL classroom, a description of the materials in terms of the learning cycle, and a summary of the process skills that POGIL targets. The chapter also covers implementation logistics that are often not found in research papers.
Rodriguez, J. M. G., Hunter, K. H., Scharlott, L. J., & Becker, N. M. (2020). A review of research on process oriented guided inquiry learning: Implications for research and practice. Journal of Chemical Education, 97(10), 3506-3520. This review describes the key features and current understanding of POGIL and proposes areas for future research. The guiding questions of the review explore 1) the impact of POGIL instruction, 2) the theoretical frameworks on learning used to guide POGIL research, 3) the impact of POGIL instruction on process skills, and 4) the features of POGIL implementation investigated by prior research. The review examines 87 papers (1999-2019) that report studies in a broad range of disciplines, including anatomy/physiology, bioscience, chemistry, computer science, engineering, and health sciences. Most studies examined the effect of POGIL on student performance as measured by exam performance, drop/fail/withdrawal (DFW) rates, or course grades. Results showed a positive impact on content knowledge, an increase in the percentage of A grades earned, and a decrease in DFW rates. Many studies had comparison groups, but most did not account for prior student preparation or student identity. Studies examining affective outcomes (primarily student attitudes) used self-report surveys and course evaluations and found positive perceptions of the POGIL approach and the benefits of working in a group. Some studies also found that while students may recognize the benefits of POGIL, they still wanted more lecture to help organize ideas. The review identified a much smaller number of studies on group discourse, the impact of process skills, and implementation features, suggesting the need for more studies on these topics. In addition, the review suggests that research on the effect of instructor facilitation should be incorporated into the professional development of instructors. Instructors should note that this review provides an excellent summary of research on POGIL impacts. It also identifies gaps in our knowledge about why POGIL positively affects student performance, how POGIL affects process skills and how to assess these skills, and the effect of different POGIL implementations on student outcomes.
Vincent-Ruz, P., Meyer, T., Roe, S. G., & Schunn, C. D. (2020). Short-term and long-term effects of POGIL in a large-enrollment general chemistry course. Journal of Chemical Education, 97(5), 1228-1238. This research article describes the findings of a quasi-experimental study involving seven large sections of General Chemistry 1 lecture with multiple instructors, four using POGIL (N = 809 first-year students) and three using traditional (N = 543 first-year students) teaching approaches. The students were followed to General Chemistry 2 (N = 941), which was taught using traditional lecture. Academic outcomes (course grades in both courses and retention to General Chemistry 2) and attitudinal outcomes (fascination, chemistry competency beliefs, and chemistry identity) were examined while accounting for demographics (race/ethnicity, first generation) and preparation (AP and ACT/SAT scores). The POGIL implementation was a hybrid model (50% POGIL, 50% lecture format); students remained in the same teams, which were created informally on the first day, and trained undergraduate and graduate TAs helped facilitate the teams. All POGIL sections used the same POGIL activities. Compared with non-POGIL sections, POGIL students in General Chemistry 1 were more likely to score in the high-grade range (As) and less likely to receive unsatisfactory grades (C-DFWs), and they were more likely to enroll in General Chemistry 2. In the traditional General Chemistry 2 course, students with prior POGIL General Chemistry 1 experience were again less likely to receive unsatisfactory grades (C-DFWs). Among the attitudinal outcomes, assessed only in General Chemistry 2, there was no difference in student fascination, but there were small advantages in chemistry competency beliefs and chemistry identity for students with General Chemistry 1 POGIL experience. Instructors should note that this quasi-experimental study in general chemistry revealed an effect of POGIL on course performance, retention in the series, and a carry-over effect into the second semester, after controlling for academic preparation. The paper also includes an excellent description of the POGIL framework, especially the learning cycle, and one approach to implementing POGIL in a large course with multiple instructors.
Walker, L., & Warfa, A. R. M. (2017). Process oriented guided inquiry learning (POGIL®) marginally effects student achievement measures but substantially increases the odds of passing a course. PLoS One, 12(10), e0186203. This meta-analysis includes twenty-one papers (26 studies) comparing the use of POGIL to standard lecture courses, involving 7876 students across a variety of disciplines (e.g., chemistry, biochemistry, physiology, and nursing). The studies included 24 college-level and 2 high-school studies, with over half of the studies in chemistry, and compared academic performance (exam or course grades) and course pass/fail odds ratios in POGIL versus standard lecture approaches, although most studies did not account for differences in preparation or demographics. The POGIL implementations in the studies varied from using only the POGIL approach to a hybrid model in which only some of class time was POGIL. Most of the studies had class sizes of 100 students or fewer (80%); 50% had fewer than 50 students. A limitation of these POGIL studies is that most did not adequately describe the fidelity of the implementation; for example, the studies did not discuss the POGIL training or experience of the instructors or whether the materials were endorsed by The POGIL Project (and therefore followed the learning cycle). The authors found that students in POGIL courses had higher academic performance (either exam or course scores) with a small effect size of g = 0.29; on average, students in the POGIL group performed 0.3 standard deviations above the comparison group. In addition, the odds of passing for students in POGIL courses were two times greater than for students in traditional courses, corresponding to roughly a 38% reduction in the risk of failure. Instructors should note that this meta-analysis revealed a robust effect of POGIL on academic performance (exam and course grades) and pass rate, with a larger effect on the pass rate, in spite of differences in implementation and discipline.
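To see how this meta-analysis's two headline statistics relate, the sketch below spells out the definition of the standardized effect size and one way a pass odds ratio of about 2 can correspond to roughly a 38% reduction in the risk of failure. The 60% baseline pass rate used here is an assumed value chosen for illustration only; the summary above reports only the pooled effect size and odds ratio.

```latex
% Worked illustration. The 60% baseline pass rate is an assumption for
% illustration; it is not a figure reported in the meta-analysis summary.

% Hedges' g: the standardized mean difference between groups.
\[
  g = \frac{\bar{x}_{\text{POGIL}} - \bar{x}_{\text{lecture}}}{s_{\text{pooled}}} \approx 0.29
\]

% Converting a pass odds ratio of about 2 into a failure-risk reduction,
% assuming the lecture comparison group has a 60% pass rate:
\[
  \text{odds}_{\text{lecture}} = \frac{0.60}{0.40} = 1.5, \qquad
  \text{odds}_{\text{POGIL}} = 2 \times 1.5 = 3.0, \qquad
  p_{\text{POGIL}} = \frac{3.0}{1 + 3.0} = 0.75
\]

% Relative risk of failing, POGIL vs. lecture:
\[
  \frac{1 - 0.75}{1 - 0.60} = \frac{0.25}{0.40} = 0.625,
  \qquad \text{a reduction of } 1 - 0.625 = 37.5\% \approx 38\%.
\]
```

Because the risk reduction implied by a fixed odds ratio depends on the baseline pass rate, this calculation is illustrative rather than exact.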
Brown, P. J. (2010). Process-oriented guided-inquiry learning in an introductory anatomy and physiology course with a diverse student population. Advances in Physiology Education, 34(3), 150-155. This research article describes a study involving second-semester Anatomy & Physiology at a small private college. The study compared performance (i.e., final-exam performance, course grade, and A/B vs. D/F rates) and attitudes (i.e., student course evaluations) between one semester using a traditional lecture approach and three sequential semesters using a POGIL approach, all taught by the same instructor. The semester exams and the final exam were the same for all four semesters, although the academic preparation of the students was not accounted for. The class sizes were small, ranging from 17 to 31 students. The POGIL implementation was a hybrid model (50% POGIL). The students were placed in groups of 3-4 and given explicit roles; the instructor facilitated the groups using instructor-developed activities that implemented the 3-phase learning cycle. The author found that students in the POGIL sections had higher final-exam performance (mean of 88% vs. 68%), higher percentages of A/B course grades, and lower percentages of D/F course grades. There were no significant differences in the quantitative course-evaluation questions; however, the qualitative responses indicated students perceived a learning benefit from using POGIL. Instructors should note that this study in a second-semester Anatomy & Physiology course (comprising juniors and seniors from numerous majors) comparing traditional lecture to the POGIL approach revealed an effect of POGIL on final-exam scores and A/B vs. D/F rates. This paper contains a detailed description of the POGIL method, the learning cycle within the materials, the use and descriptions of the group roles, how the instructor facilitated the POGIL activities, and how one might introduce POGIL to students to create buy-in. Hence, this paper is useful to instructors wanting a clear description of how POGIL may be implemented and facilitated in a small classroom.
Moon, A., Stanford, C., Cole, R., & Towns, M. (2016). The nature of students’ chemical reasoning employed in scientific argumentation in physical chemistry. Chemistry Education Research and Practice, 17(2), 353-364. This paper describes the argumentation discourse upper-level students use while solving thermodynamic problems in two physical chemistry courses using the POGIL approach. Student discourse in whole-class videos was analyzed using the Toulmin Argument Pattern, which organizes arguments into a claim (or conclusion), data or evidence supporting the claim, and an explanation connecting the data/evidence to the claim. The argument discourse was further coded using the Chemical Thinking Learning Progression framework to determine the quality of reasoning students used in their arguments. This framework comprises four components: descriptive, relational, linear, and multi-component. The study examined how much of each type of reasoning the students used and whether the amounts varied with question type. The authors found that the most common type of reasoning was relational (mathematical relationships without conceptual understanding), followed by linear (stepwise description of mathematical operations or a conceptual process) and descriptive (repeating superficial information from the prompt), with multi-component (multiple variables described using conceptual understanding) reasoning being least frequent. No matter what type of prompt was used, descriptive reasoning (the least complex type) was not widely used. Students were more likely to use relational reasoning for conceptual prompts and linear reasoning for prompts requesting a derivation or value calculation. Linear and multi-component reasoning gave evidence of students verbalizing their understanding, whereas relational reasoning gave little indication of conceptual understanding. Instructors should note that different types of problems prompt different levels of reasoning from students working and constructing arguments together. The authors suggest writing prompts that scaffold students toward more complex reasoning. In addition, instructors can increase the level of reasoning for most problems by modeling more complex reasoning and should explicitly teach linear and multi-component reasoning.
Moon, A., Stanford, C., Cole, R., & Towns, M. (2017). Decentering: A characteristic of effective student–student discourse in inquiry-oriented physical chemistry classrooms. Journal of Chemical Education, 94(7), 829-836. This paper describes the discourse upper-level students use within their small groups while solving thermodynamic problems in two physical chemistry courses using the POGIL approach. The complexity of the discussion and the inclusion of multiple voices during problem solving can support deeper levels of questioning and understanding. One technique that can increase the amount of questioning and the use of multiple perspectives is decentering, a process in which one person uses others’ perspectives to help explain a given concept. In this study, numerous small groups were examined to show how students used decentering during their argumentation discourse or missed opportunities to decenter the discussion. The authors found three features of rebuttals that indicate decentering: 1) a student challenges the initial claim by asking either why something (e.g., a variable) was missing from the claim or why something (e.g., a term or assumption) was used in the claim; 2) a student modifies the original claim by incorporating a new or expanded idea; 3) a student rephrases the initial claim and explains how that claim differs from their own. Examples of argumentation discussions without decentering are: 1) a student substitutes in their own “correct” claim without any explicit explanation of how their claim differs from the original claim; 2) two students have different claims and there is no discussion about how the two differ or which is correct, if either. Instructors should note that the technique of decentering can encourage students to discuss the reasoning behind their solutions and thereby increase the complexity of small-group discussions. Instructors can encourage decentering by asking for multiple solutions and their reasoning during whole-class discussions, and by modeling decentering.
The POGIL Project has a strong national and international community. Its website describes the POGIL methodology; the POGIL community and how to join it; resources (free and for sale) such as role cards, process-skills definitions, an implementation guide, and curriculum materials; and events to attend such as webinars, workshops, and conferences.
Contrasting cases
  • Contrasting cases are problems or scenarios that differ in key features. Comparing the cases can help learners identify deep features of the problem type and can serve as an important step in developing conceptual understanding about solving problems in the domain under study.
  • Contrasting cases have been explored as a teaching tool in several ways.
    • They have been used both before and after direct instruction. Generally, contrasting cases are found to be more beneficial when used prior to direct instruction that explains the principle linking the cases.
    • They have been used to
      • prompt invention of a general solution, requiring inductive thinking.
      • prompt identification of case similarities and differences that reveal deep features. When used to help students identify deep features of problems, they have been employed in two ways: with prompts for students to make comparisons, and with expert-generated notations that highlight important similarities and differences. Used with expert-generated notations, contrasting cases can be viewed as a form of worked example.
    • They have been used in both individual and cooperative learning environments.
  • These studies have led to several conclusions:
    • When asking students to compare cases to identify deep features, it is more beneficial to ask them to identify similarities than differences.
    • When asking students to invent a general solution, the case set should include at least two positive cases—that is, two cases that show the phenomenon under study—and one that does not. The positive cases should show the phenomenon to different degrees. For example, if the phenomenon is production of red pigment, one case might have a red flower; a second case a pink flower; a third case a white flower.
    • More guidance leads to greater benefits from contrasting cases. This guidance can occur through several mechanisms including pretests that may help focus student attention and foster interest; examples of the type of thinking that is required; or expert-generated notations that draw attention to key features.
Schwartz, D. L., Chase, C. C., Oppezzo, M. A., & Chin, D. B. (2011). Practicing versus inventing with contrasting cases: The effects of telling first on learning and transfer. Journal of Educational Psychology, 103(4), 759-775. doi:10.1037/a0025140. This article describes an experimental study of inventing with contrasting cases compared to tell-and-practice instruction. Inventing with contrasting cases involves asking students to determine the underlying structure of problems that show contrasting instances of the same variable. In tell-and-practice instruction, students are told the important structural feature up front and given practice solving problems using that structural feature. The researchers hypothesized that inventing with contrasting cases would lead to superior transfer because grappling with the problems to determine the underlying structure may better serve students in applying that structure. The study involved 8th graders who were learning about ratios as a fundamental explanation of physics phenomena. Students were randomly assigned to inventing with contrasting cases or tell-and-practice, and researchers assessed learning and transfer using measures of deep structure, superficial structure, transfer, and word problems. Students who invented with contrasting cases significantly outperformed tell-and-practice students on deep-structure and transfer measures but not on superficial-structure or word-problem measures. Researchers also found that students’ detection of deep structure in the measures of learning predicted performance on the transfer measures, regardless of instruction type. Researchers reasoned that students who found the deep structure in the problem set during the initial lesson were more likely to find deep structure on the transfer task and that inventing with contrasting cases better facilitated students to find deep structure. Instructors should note that this paper focuses on a method of problem solving prior to instruction that is distinct from productive failure. Both inventing with contrasting cases and productive failure ask students to solve problems before they have been told the underlying principles, but inventing with contrasting cases asks students to actually determine the underlying principle. Instructors should also note that this paper provides discussion of the possible mechanisms by which inventing with contrasting cases improves learning relative to tell-and-practice and the potential variability in impact of inventing with contrasting cases depending on what is to be learned and the prior knowledge of the learners.
Alfieri, L., Nokes-Malach, T.J., and Schunn, C.D. (2013). Learning through case comparisons: A meta-analytic review. Educational Psychologist, 48, 87-113. In this meta-analysis, the authors examine the effectiveness of case comparison as a learning method, considering 15 potential variables that they postulated could moderate the impact of the general case comparison approach. Specifically, they looked at the results of 336 comparisons reported in 57 experiments, considering the impacts of process variables (e.g., general prompts vs. guided questioning; identification of similarities, differences, or both; rich vs. minimal cases), context variables (e.g., conceptual, procedural, or perceptual content; experiment performed in a classroom or a lab setting; age of learners; science, math, or other domain), and measurement variables (e.g., immediate vs. delayed testing; near vs. far transfer) as well as the interactions among these variables. Under a random effects model, the 57 experiments demonstrated a medium effect size for case comparison, indicating that this approach is, in general, moderately effective. There were several variables that were found to consistently moderate outcomes. Specifically, the authors found that explaining the principle linking the cases after case comparison was more beneficial than explaining it before or not explaining it. They also found that prompting identification of similarities was more beneficial than identification of differences. Finally, they found that perceptual content (i.e., altering how stimuli are considered or organized) benefited more from case comparison than procedural content, and that greater impacts were observed with immediate testing than with delayed testing. Based on these results, instructors should note that case comparison is a reliably effective method for teaching undergraduates science and math content. Prompting students to identify similarities between cases and then explaining the principle after this comparison will tend to increase the benefits of this approach.
Roelle, J. and Berthold, K. (2015). Effects of comparing cases on learning from subsequent explanations. Cognition and Instruction, 33, 199-225. The authors use cognitive load theory and flow theory to suggest potential mechanisms for benefits of contrasting cases used prior to direct instruction. They hypothesized that comparing cases fosters learning from subsequent instruction; reduces extraneous cognitive load; increases the feeling of smooth automatic running (“flow”); and reduces learning time and enhances learning efficiency. Further, they hypothesized that providing comparisons of cases would lead to greater subsequent learning than asking students to generate comparisons. To test these hypotheses, the authors presented undergraduates (n = 75) either (a) contrasting cases plus comparison prompts, (b) contrasting cases plus provided comparisons (i.e., model answers to the comparison prompts), or (c) no preparation intervention. All groups then engaged with computer-based instruction in management theory. [Note that the comparison prompts asked students to identify differences rather than similarities, which Alfieri, Nokes-Malach, and Schunn (2013) found to be a less effective prompt.] The researchers used Likert scale items to measure extraneous cognitive load and students’ perception of flow and open-ended items to measure conceptual knowledge. They found that learners who encountered contrasting cases before direct instruction learned more, with a higher degree of efficiency, and perceived lower extraneous load and higher flow than learners without this preparation. The researchers therefore suggest that contrasting cases serve a focusing function. Further, they found that the learners in the provided comparisons condition acquired more conceptual knowledge from the explanations than the learners in the prompted comparisons condition but did not exhibit differences in extraneous load or perception of flow. This result indicates that providing high amounts of instructional guidance while learners compare contrasting cases may have an added value. Based on these results, instructors should note that providing contrasting cases prior to instruction can reduce students’ extraneous cognitive load, increase their sense of flow, and increase learning. Asking students to consider comparisons or to generate their own can both provide benefits, although greater guidance may provide enhanced benefit.
Shemwell, J.T., Chase, C.C., and Schwartz, D.L. (2015). Seeking the general explanation: A test of inductive activities for learning and transfer. Journal of Research in Science Teaching, 52(1), 58-83. This article compares two instructional approaches, one that promotes inductive reasoning and another that promotes hypothetico-deductive reasoning. The authors used three contrasting cases that exhibited key features of Faraday’s law: two cases where a phenomenon occurs, but to different degrees (i.e., current flowing through a coil), and one case where the phenomenon does not occur (i.e., no current through the coil). In Experiment 1, the authors compared two approaches using these cases with undergraduate engineering students. To promote deductive thinking, students were asked to work through the cases individually in a series of predict-observe-explain tasks. To promote inductive thinking, students were first presented with an example in which a general explanation about an unrelated phenomenon was derived and then were presented with the three cases and asked to provide an explanation. The authors analyzed explanations students wrote while examining the cases and students’ ability to solve novel problems after the learning phase, finding that students in the inductive thinking condition were much more likely to identify deep structure in the cases and to solve novel problems than students in the deductive thinking condition. Further, the authors found that identification of the deep structure predicted posttest performance. In Experiment 2, the authors repeated the experiment but added conditions to test potential confounds. They found that students in the inductive condition exhibited the highest scores on tests of novel problems and that, regardless of condition, identification of deep structure in the cases was predictive of this performance. Prompting students to identify a general explanation and including a negative case (where the phenomenon under study does not occur) increased learning from cases. Based on these results, instructors should choose contrasting cases that exhibit the presence and absence of the phenomenon under study and should prompt students to identify a general explanation from the cases.
Kuo, E. and Wieman, C.E. (2016). Toward instructional design principles: Inducing Faraday’s law with contrasting cases. Physical Review Physics Education Research, 12, 010128. This study compares two instructional approaches for teaching undergraduate physics students Faraday’s law: 1) having students explain a set of examples illustrating the concept (the contrasting-cases approach) and 2) having students connect the concept to a previously learned concept. All students (n = 334) received one lecture on Faraday’s law; approximately one half then used the build-on-prior-concepts approach and the other half the contrasting-cases approach. To help students build on prior concepts, they were prompted to apply their knowledge of a related concept to make predictions about different iterations of an experimental setup designed to lead to recognition of Faraday’s law. In the contrasting-cases condition, students were presented with six cases simultaneously and prompted to write a general explanation. Students in both groups then received an explanation of Faraday’s law and were given a posttest. There was no significant difference in posttest performance, but students in the contrasting-cases condition reported greater enjoyment as well as higher performance on a final-exam question on related material learned after the study, suggesting that the contrasting-cases approach may have prepared them for future learning. The investigation was repeated in Study 2 with small modifications. Students in the contrasting-cases condition had higher scores on the posttest and on questions about subsequent material on the final exam, again suggesting that the contrasting-cases approach had prepared students for later learning. Students in the contrasting-cases condition again reported higher enjoyment. Interestingly, students in the contrasting-cases condition who skipped the preparatory lecture outperformed students who attended, suggesting that the value of the contrasting cases is maximized when students identify the key features for themselves. Based on these results, instructors should note that sets of contrasting cases that reveal key features of a concept can help students learn the concept and can help prepare them to learn related concepts in the future.
Newman, P.M., and DeCaro, M.S. (2019). Learning by exploring: How much guidance is optimal? Learning and Instruction, 62, 49-63. This manuscript considers the level of guidance that helps students learn from exploration activities before direct instruction. In Experiment 1, undergraduates in a lab setting took a pretest and then explored a statistical concept in one of three ways. In the invention condition (no guidance), they were asked to invent a method to determine consistency within a dataset; in the worked examples condition (full guidance), they were shown how to calculate standard deviation, with explanations for each step; in the completion problems condition (partial guidance), they were given the solutions with some blanks. After these exploration activities, students received direct instruction in the form of a text passage with practice problems, followed by a posttest. Worked examples led to significantly higher posttest scores than invention, with completion problems having an intermediate outcome. Students in the worked examples condition also reported lower cognitive load and smaller perceived knowledge gaps, but no difference in interest or enjoyment. In Experiment 2, the authors compared worked examples and invention conditions both before and after instruction in an undergraduate statistics course. Notably, no pretest was included in this experiment. The explore-before-instruction conditions resulted in higher posttest scores than the instruction-before-exploration conditions, but there was no difference in posttest scores between the worked examples and invention conditions. The authors hypothesized that the different results in Experiments 1 and 2 were due to the presence of a pretest in Experiment 1. They tested this hypothesis in Experiment 3, a lab experiment with four conditions: pretest-worked examples; pretest-invention; no pretest-worked examples; no pretest-invention. In all conditions, exploration was followed by direct instruction and a posttest. The authors found that a pretest had a divergent effect, increasing student learning in the worked examples conditions but not in the invention conditions. Based on these results, instructors should note that contrasting cases with comparisons highlighted—which are essentially worked examples—can serve as an effective exploration activity prior to direct instruction. Further, a pretest can enhance the benefits of students’ exploring using worked examples.
Glogger-Frey, I., Fleischer, C., Gruny, L., Kappich, J., and Renkl, A. (2015). Inventing a solution and studying a worked solution prepare differently for learning from direct instruction. Learning and Instruction, 39, 72-87. This study examines two ways to use contrasting cases before direct instruction: asking students to compare solved cases (using worked examples) or to invent a solution. The authors performed two experiments asking whether these two approaches resulted in different preparedness to learn from subsequent instruction. In the first, student teachers (N = 42) studied learning strategies in high school students’ learning journals. Participants in the invention condition were prompted to contrast and rate four journals before generating evaluation criteria. Participants in the worked solution group studied the same four journals, but with contrasts, ratings, and an evaluation scheme provided. Participants reported their perception of their knowledge gaps, curiosity, and interest before working individually through computer-based instruction on the topic and completing a posttest. Students in the invention condition reported greater knowledge gaps and curiosity before the direct instruction phase but exhibited lower posttest performance, which appeared to be mediated by less learning time on the most relevant aspects of the content. Experiment 2 aimed to conceptually replicate Experiment 1 but with less motivated students (8th graders, N = 40), the use of a cooperative learning environment, and a different learning domain (density and ratios). The authors found that students in the invention condition reported lower self-efficacy, higher extraneous cognitive load, and higher knowledge gaps than students in the worked solution condition. There was no difference in performance on questions that required near transfer, but the invention condition resulted in lower scores on questions that required far transfer. This effect appeared to be mediated by students’ self-efficacy and perception of knowledge gaps; that is, invention resulted in lower self-efficacy and perception of more knowledge gaps, and this negatively impacted students’ learning from subsequent instruction. The differences between the two experiments are notable: student teachers’ awareness of knowledge gaps correlated with their curiosity, while this relationship was not observed for 8th graders. Based on these results, instructors should note that guiding and focusing learners on correct solutions to a problem with contrasting cases can prepare them for learning from direct instruction, although the mechanism for this effect may differ depending on the maturity and motivational state of the learners.
Productive failure
  • Productive failure theory posits that under certain conditions, students’ engagement in solving problems that are beyond their skill sets and abilities can be productive for learning, even though failure may initially occur.
  • Productive failure derives from theoretical underpinnings that point to the importance of errors and failure during learning, linking to the idea of desirable difficulties.
  • Failure can be beneficial to learning because it can activate prior knowledge and reveal both gaps in and limits of one’s knowledge. Failure can also increase learners’ agency and motivation. Thus, even though the process of failing may add to learners’ cognitive load, it may also set learners up to learn more or better in the future.
  • Research on productive failure has documented its advantage for learning conceptual knowledge and the ability to transfer one’s knowledge to new problems.
  • Researchers have argued that productive failure (short-term failure in exchange for long-term learning) is superior to instruction followed by problem-solving approaches because those approaches might promote unproductive success (short-term success in exchange for long-term failure) (Kapur, 2016).
  • The productive failure approach involves two key phases.
    • Phase 1: Prior to any explicit instruction, students solve complex problems that are beyond their capabilities.
    • Phase 2: After problem solving, instructors provide explicit instruction that reveals normative conceptual knowledge and problem-solving procedures.
  • Current literature suggests that productive failure is most effective when (1) the problem-solving phase uses contrasting cases, i.e., problems that differ from each other in a way that pertains to the underlying principle of the problem, and/or (2) the explicit instruction phase considers and builds on the solutions students generate during the problem-solving phase.
  • Most productive failure research involves a problem-solving phase without guidance. Students are left to solve the problem on their own with no direction from the instructor. Yet it is an open question whether this lack of guidance during problem solving is essential.
Kapur, M. (2008). Productive failure. Cognition and Instruction, 26(3), 379–424. This landmark paper by Manu Kapur was the first to explain the rationale for productive failure and test its hypothesis: under certain conditions, students’ engagement in solving ill-structured problems that are beyond their skill sets and abilities can be productive for learning, even though failure may initially occur. Prior to this paper, researchers had substantiated that structured problem solving can be helpful to learning (see the instruction followed by problem solving paper summaries), but this paper substantiated that ill-structured problem solving can also be beneficial for learning. This study tested the productive failure hypothesis with N = 309 eleventh-grade science students learning Newtonian kinematics. Students in seven schools were randomly assigned to groups who solved either well-structured or ill-structured problems. In terms of group dynamics, the ill-structured condition showed significantly greater complexity in group interactions, but the well-structured condition showed significantly greater convergence in arriving at a shared understanding of the problem. In terms of performance during the problem-solving lesson, students in the well-structured condition outperformed those in the ill-structured condition, and convergence was a significant predictor of performance. In terms of post-test performance, students in the ill-structured condition outperformed those in the well-structured condition on both measures: well-structured problem solving and ill-structured problem solving. These data support the productive failure hypothesis, because students in the ill-structured condition demonstrated stronger learning than those in the well-structured condition, despite initial failure. Kapur suggested that the complexity and divergence of the ill-structured groups’ interactions, which were facilitated by the nature of the task, helped these students create better problem representations to solve the post-test problems. Instructors who are interested in implementing productive failure should know that this paper thoroughly lays out the rationale and supporting literature for the approach and the details of implementation.
Kapur, M. (2011) A further study of productive failure in mathematical problem solving: unpacking the design components. Instructional Science, 39, 561-579. This article describes the findings of a quasi-experimental study investigating the impact of failure during problem solving for 7th grade mathematics students (N=109). Productive failure derives from the premise that failure during problem solving encourages critical learning processes and sets up students to benefit from subsequent instruction. Kapur compared productive failure to two other instructional approaches: facilitated complex problem solving and lecture and practice. Students in the productive failure group (n = 36 in one class) participated in multiple rounds of collaborative and individual complex problem solving without facilitation by the teacher. Subsequently, the teacher using productive failure led a consolidation lecture that involved consideration of students’ representations and solution methods. Students engaging in facilitated complex problem solving (n = 34 in a second class) participated in identical instruction with one key difference: the teacher facilitated complex problem solving, providing feedback and questions that prompted explanation and elaboration. Students engaging in lecture and practice (n = 39 in a third class) participated in an interactive lecture that presented the pertinent concepts and a teacher-led discussion of their solutions to well-structured problems. The three instructional methods led to different outcomes. First, during the complex problem-solving phase, students in the productive failure group produced significantly more diverse representations than the students in the facilitated complex problem-solving group, who produced more successful solutions. Second, on the posttest, students from the productive failure condition scored the highest followed by those from the facilitated complex problem-solving condition and then the lecture and practice condition. This order held for well-structured, higher-order application, and graphical representation items, and the findings were statistically significant. This study supports the hypothesis that failure during problem solving can lead to greater learning outcomes, even though short-term problem solving is unfacilitated and unsuccessful. Instructors should note that this article provides seminal evidence for the potential of short-term failure during instruction to produce long-term learning. The article also summarizes key literature on the limits of working memory, the role of failure, and the role of prior knowledge during learning.
Loibl, K., & Rummel, N. (2014). Knowing what you don’t know makes failure productive. Learning and Instruction, 34, 74-85. This paper sheds light on outstanding questions about the impact of productive failure on student learning. In particular, the researchers examined two factors that were previously confounded in productive failure research: the timing of the problem-solving phase and the form of the instruction phase. Timing deals with whether students solve complex problems before or after explicit instruction (PS-I vs. I-PS). Form of instruction deals with whether explicit instruction includes contrasts between students’ noncanonical solutions and canonical solutions (Icon vs. I). The researchers conducted two experiments with 10th grade mathematics students learning the concept of variance; they used multiple measures of student learning, including immediate and delayed post-tests of procedural and conceptual knowledge. This summary focuses on findings from the immediate post-tests of conceptual knowledge. The first experiment compared I-PS, Icon-PS, and PS-Icon and confirmed the superiority of PS-Icon compared to I-PS for learning conceptual knowledge. However, the experiment also showed the importance of the form of instruction, because PS-Icon and Icon-PS were statistically equivalent for learning conceptual knowledge. The second experiment compared four conditions: I-PS, Icon-PS, PS-Icon, and PS-I. This experiment showed a large effect size for form of instruction, favoring instruction based on students’ solutions, and a medium effect size for timing of instruction, favoring problem solving prior to instruction. The experiment also showed a significant interaction such that the impact of instructional form was more pronounced in the problem solving-first conditions. The means of student performance on conceptual knowledge questions showed the following pattern: PS-Icon > Icon-PS > PS-I > I-PS, with the first two comparisons reaching statistical significance and the third comparison showing only marginal significance. The researchers also examined the extent to which each of the four instructional conditions helped students identify their knowledge gaps and raised their curiosity for learning. They found that the two problem solving-first conditions were better at helping students identify knowledge gaps and creating curiosity for learning. Overall, these findings tease apart the components of productive failure and suggest that while timing and form are both important, instructional form may be more important. The authors also propose a mechanism for their findings. Problem solving before instruction activates prior knowledge and reveals broad knowledge gaps. Instruction involving contrasts reveals more specific knowledge gaps between students’ common intuitive ideas and canonical ideas. Using these two approaches together (i.e., PS-Icon) may create maximum benefit. Instructors should note that this article illustrates the complexity of instructional decision-making, even within an evidence-based pedagogy like productive failure. The paper provides a detailed roadmap for how to conduct a problem solving-first approach and how to use student solutions during explicit instruction.
Jacobson, M. J., Markauskaite, L., Portolese, A., Kapur, M., Lai, P. K., & Roberts, G. (2017). Designs for learning about climate change as a complex system. Learning and Instruction, 52, 1-14. doi:10.1016/j.learninstruc.2017.03.007. This study investigates the impact of productive failure on student learning about climate change and complex systems. The researchers compared the effects of productive failure and direct instruction on ninth-grade students’ declarative and explanatory knowledge and transferrable problem-solving skill. In productive failure instruction, students (N = 56) first solved two complex problems. The instructor then explained the relevant concepts, followed by time for students to solve one more problem. In direct instruction, the teacher first lectured on the relevant concepts, followed by time for students (N = 54) to solve the same three problems from the productive failure condition. Researchers found no differences between the productive failure and direct instruction conditions on measures of declarative knowledge. However, productive failure students showed significantly better learning for explanatory knowledge, near transfer, and far transfer. Researchers also noted qualitative observations of the productive failure and direct instruction approaches. Productive failure students frequently discussed the problems with each other during the problem-solving phase and asked questions during the teacher explanation, while direct instruction students passively listened during the lecture and spent much of the problem-solving phase searching through their notes. The authors note that their results are consistent with the theoretical mechanism of productive failure. That is, productive failure causes students to activate prior knowledge and discover knowledge gaps, which enables the construction of more elaborate schemas. They suggest that direct instruction students did not activate prior knowledge and thus had limited opportunities for the construction of elaborated schemas. Instructors may benefit from reading this study because of its relevance to college biology teaching: (1) the direct instruction implementation is similar to traditional biology classrooms, allowing instructors to see what would be required for them to implement productive failure in place of direct instruction; (2) the paper focuses not only on evidence-based pedagogy but also on particular conceptual challenges in learning about complex systems; (3) the paper offers helpful descriptions of proposed theoretical mechanisms for both productive failure and direct instruction.
Chowrira, S. G., Smith, K. M., Dubois, P. J., & Roll, I. (2019). DIY productive failure: boosting performance in a large undergraduate biology course. NPJ Science of Learning, 4(1), 1-8. This article describes the findings of a quasi-experimental study involving first-year cell biology undergraduate students. The researchers compared the exam performance of students who participated in productive failure (N = 295) vs. active learning (N = 279) for two course topics. The participants were enrolled in two separate sections with two different instructors. In both sections, all other topics were taught using active learning. The productive failure implementation involved two phases: in the Exploration phase, the students worked in groups of 2-4 for 25 minutes on a challenging activity with questions at higher Bloom’s levels. Next, the instructor led a Consolidation phase in which student responses were collected via clicker and used during the instructor’s explanation of an expert approach to the activity. The active learning implementation also involved two phases: in the Walkthrough phase, the instructor introduced the topic and worked through an example problem; students answered clicker questions and engaged in group discussions. In the problem-solving phase, students worked through problems in groups, receiving formative feedback as they worked. In both implementations, students took a pre-class quiz on assigned reading. Productive failure students significantly outperformed active learning students (by approximately 5 percentage points) on study items on the midterm exam while controlling for several variables, including prior exam performance, gender, and score on exam items besides the study items. The effect appeared to be concentrated among low-performing students (i.e., as measured by midterm exam 1). No differences were found between groups on the final exam. Instructors should note that this is one of few research articles investigating the impact of productive failure on undergraduate learning and one of only a handful of studies that compare productive failure with active learning pedagogies. The authors also discuss possible mechanisms for the difference they observe, suggesting that the productive failure condition may have given students better opportunities to understand the deep structure of problems.
Loibl, K., Roll, I., & Rummel, N. (2017). Towards a theory of when and how problem solving followed by instruction supports learning. Educational Psychology Review, 29(4), 693-715. This paper aims to specify the conditions under which problem solving followed by explicit instruction improves student learning. The authors aimed to pinpoint the design features that best promote learning among different problem solving-first approaches and to identify the cognitive mechanisms that account for the learning outcomes. They reviewed twenty papers that compared a problem solving followed by explicit instruction condition with an explicit instruction followed by problem solving condition. These twenty papers included studies of productive failure, invention activities, initial problem solving, and problem solving plus direct or explicit instruction. The authors documented the learning outcomes and effect sizes from each study, including measures of procedural knowledge, conceptual knowledge, and transfer. They also documented whether the studies included either of two design features: contrasting cases (i.e., problems that contrast two datasets that differ in one deep feature at a time) and explicit instruction that considers students’ solutions prior to explanation of the canonical solution. The authors found that PS-I approaches can be beneficial primarily for the acquisition of conceptual knowledge and transfer. They also found that this potential benefit depends on the design features: the benefits of PS-I primarily occur when it involves contrasting cases in the problem-solving phase or builds on students’ solutions in the explicit instruction phase. The authors propose that the learning benefits of these approaches can be explained by three cognitive mechanisms: activation of prior knowledge, awareness of knowledge gaps, and recognition of deep problem features. Instructors may benefit from this paper because it presents an analysis of the key papers in the literature on problem solving-first instruction, offers insights into the particular design decisions needed when using problem solving-first approaches, and focuses attention on the importance of problem solving first for learning outcomes including conceptual knowledge and transfer.
Problem-based learning
Allen, D., and Tanner, K. (2003). Approaches to cell biology teaching: Learning content in context—problem-based learning. Cell Biology Education 2, 73-81. This essay describes problem-based learning, including the stages of the learning cycle and the roles students and instructors play, and provides references to sample problems. The authors describe strategies for implementing problem-based learning, including in high-enrollment classes, suggestions for assessment, and links to resources. It is a valuable place to get started for instructors wanting to investigate this pedagogical approach.
Anderson, W.L., Mitchell, S.M., and Osgood, M.P. (2005). Comparison of student performance in cooperative learning and traditional lecture-based biochemistry classes. Biochemistry and Molecular Biology Education, 33, 387-393. [Author abstract] Student performance in two different introductory biochemistry curricula are compared based on standardized testing of student content knowledge, problem-solving skills, and student opinions about the courses. One curriculum was used in four traditional, lecture-based classes (n = 381 students), whereas the second curriculum was used in two cooperative learning classes (n = 39 students). Students in the cooperative learning classes not only performed at a level above their peers in standardized testing of content knowledge and in critical thinking and problem-solving tasks (p < 0.05), but they also were more positive about their learning experience. The testing data are in contrast to much of the medical school literature on the performance of students in problem-based learning (PBL) curricula, which shows little effect of the curricular format on student exam scores. The reason for the improvement is undoubtedly multifactorial. We argue that the enhancement of student performance in this study is related to: 1) the use of peer educational assistants, 2) an authentic PBL format, and 3) the application of a multicontextual learning environment in the curricular design. Though educationally successful, the cooperative learning classes as described in this study were too resource intensive to continue; however, we are exploring incorporation of some of the “high context” aspects of the small-group interactions into our current lecture-based course with the addition of on-line PBL cases.


Cite this guide: Frey RF, Brame CJ, Fink A, and Lemons PP. (2022) Evidence Based Teaching Guide: Problem Solving. CBE—Life Sciences Education. Retrieved from https://lse.ascb.org/evidence-based-teaching-guides/problem-solving/