There has been some controversy in educational research between promoters of minimal guidance and constructivist approaches to learning and a growing group of educationalists who support guided instruction. This blog post compares two studies that reach diametrically opposite conclusions. The first is a review of the research on minimal guidance and on guided instruction by Kirschner et al. (2006); the second is a study by Muller et al. (2008), probably known to some of you as Veritasium on YouTube, on raising cognitive load with linear multimedia to promote conceptual change.
Kirschner et al. conclude that the ‘only’ activities that are effective for introducing new information to students are worked examples and guided instruction, and that problem-solving tasks are detrimental to learners’ progress and mastery of a subject. They argue that problem-solving activities place too great a strain on students’ cognitive load, making it impossible for learners to retain new information. According to Kirschner et al., these barriers are ingrained in the human cognitive architecture. Miller (1994) showed that the capacity of working memory is restricted to about seven elements when introducing “new, yet to be learned information” (Kirschner et al., 2006, p. 77), and that it is also limited in duration (about 30 seconds).
However, Kirschner et al. (2006) also point out that “the limitations of working memory only apply to new, yet to be learned information that has not been stored in long-term memory” and that “when dealing with previously learned information stored in long-term memory, these limitations disappear” (p. 77). The pedagogical conclusion the review draws from these two observations, i.e. that guided instruction (e.g. worked examples and highly structured worksheets) is preferable to problem-solving and other constructivist activities, seems to ignore that learners’ starting points when new topics are introduced, particularly in physics, are filled with pre-conceptions (Leach and Scott, 2003; Lee and Luft, 2008; Palmer, 2005; Scaife, 2008) that often take the form of “common sense science” (Kibble, 2006, p. 228). It seems reasonable, therefore, to argue that students often have a large amount of information (however correct or incorrect) stored in their long-term memory about the ‘new’ information they are being presented. Introducing ‘new’ topics as if they were genuinely new information might actually hinder learning: if the material is delivered only through guided instruction methods that do little to challenge students’ misconceptions, it will probably not generate cognitive conflict (Palmer, 2005, p. 1869; Scaife, 2008, p. 92), and hence fail to convince students that they need to change their thinking to form more accurate models and conceptions. The likely result is lower retention and understanding, as well as a recurrence of previously held, incorrect conceptions as soon as the scaffolding provided by guided instruction is removed.
Another issue with the studies reviewed by Kirschner et al. (2006) is that the measure of progress used seems to have been linear exam-style questions. In fact, Kirschner et al. (2006) write that “the worked-example effect, which is based on cognitive load theory, occurs when learners required to solve problems perform worse on subsequent test problems than learners who study the equivalent worked examples.” So the ‘learning’ being discussed and ‘measured’ here is performance on exam-type questions. Clearly, practising with a set of questions that require identical procedures in slightly different contexts will lead to better performance in those tests. But, faced with a genuinely new situation, would these learners be able to show understanding and mastery at the same level?
Isn’t this why students who are ‘taught’ and ‘learn’ to the exam are often unable to make links between different areas of science and physics in particular?
In addition, if the measure is exam-style questions, it is worth noting that only 20% of questions in an exam paper in England are AO3: analyse information and ideas to interpret and evaluate, make judgements and draw conclusions, and develop and improve experimental procedures. In other words, only a very limited number of exam-style questions address higher-order thinking. It was also unclear when students in the studies reviewed by Kirschner et al. (2006) were tested, or whether their retention was measured at a later stage.
On the other hand, Muller et al. (2008) conducted a study with over 270 students, using a control group in which students watched videos showing a correct explanation of physics concepts without presenting or discussing misconceptions in that area. These students performed worse in post-tests than the study group and generally retained their misconceptions. They actually “became more confident in their preconceptions, believing that the multimedia supported their views” (Muller et al., 2008). This again seems to support the observation that these students interpreted and processed the information they were presented with through the alternative conceptions stored in their long-term memory, conceptions generated through years of experience of the world around them and leading to “common sense” (Kibble, 2006) science conceptions.
Interestingly, students who watched videos that presented and discussed misconceptions before a more accurate explanation was given performed better in post-tests and generally showed conceptual change in their thinking. Muller et al. (2008) attributed this to “students investing more mental effort”, so that they were “more likely to recognize discrepancies between their extant knowledge and correct scientific conceptions”. It could be argued that Muller et al.’s approach generated a range of cognitive conflicts in learners, which led to the process needed for conceptual change to occur, as described by Driver (1989): “Students first need to be dissatisfied with their existing conception, then for this to be replaced a new conception has to be available that can be understood by the learner and which fits with their experience and is useful in the longer term in interpreting and predicting events”.
In our opinion, the research literature shows little justification for adopting a highly guided instruction approach. A much more effective strategy is to identify and discuss students’ misconceptions through classroom dialogue and experiential learning, and to develop conceptual change through cognitive conflict and through activities that lead students to experiment with new concepts and ideas that better fit observed phenomena and are more likely to generate a lasting change in learners’ long-term memory.
Kibble, B., 2006. Understanding forces: what’s the problem? Phys. Educ. 41, 228–231. https://doi.org/10.1088/0031-9120/41/3/003
Driver, R., 1989. Students’ conceptions and the learning of science. Int. J. Sci. Educ. 11, 481–490. https://doi.org/10.1080/0950069890110501
Kirschner, P.A., Sweller, J., Clark, R.E., 2006. Why Minimal Guidance During Instruction Does Not Work: An Analysis of the Failure of Constructivist, Discovery, Problem-Based, Experiential, and Inquiry-Based Teaching. Educ. Psychol. 41, 75–86. https://doi.org/10.1207/s15326985ep4102_1
Leach, J., Scott, P., 2003. Designing and Evaluating Science Teaching Sequences: An Approach Drawing upon the Concept of Learning Demand and a Social Constructivist Perspective on Learning. Stud. Sci. Educ. 38, 115–42.
Lee, E., Luft, J.A., 2008. Experienced Secondary Science Teachers’ Representation of Pedagogical Content Knowledge. Int. J. Sci. Educ. 30, 1343–1363. https://doi.org/10.1080/09500690802187058
Miller, G.A., 1994. The Magical Number Seven, Plus or Minus Two: Some Limits on Our Capacity for Processing Information. Psychol. Rev. 101, 343–352.
Muller, D.A., Sharma, M.D., Reimann, P., 2008. Raising cognitive load with linear multimedia to promote conceptual change. Sci. Educ. 92, 278–296.
Palmer, D., 2005. A Motivational View of Constructivist-Informed Teaching. Int. J. Sci. Educ. 27, 1853–1881.
Scaife, J., 2008. Science Learning, Science Teaching, in: Science and Education. Routledge, Oxon, pp. 581–593.