Some Clarity about Confusion

I have seen a number of articles and videos over the last wee while that have really helped to clarify my thinking about the difference between how learners feel about their learning experiences and how effectively they are actually learning.

Interleaved Practice

Research seems to suggest that if you ask learners to do a mixture of problem types after a teaching episode, rather than lots of practice of the one type of problem they have just been taught, they feel more confused and perform worse on the day. When tested a week later, however, they drastically outperform those who stuck to a single problem type during the original lesson (The Effects of Interleaved Practice, Taylor and Rohrer 2009).

Facing Misconceptions

Derek Muller showed learners a video about gravity and motion.  Learners judged the video to be clear and easy to understand, but it completely failed to correct their misconceptions, and their performance in a post-video test was virtually the same as in the pre-video test.  Another group of learners were shown a video which went on to state common misconceptions and explain why they were false.  The learners described this video as confusing, but they went on to double their performance in the test on the topic. You can watch his description of the experiment here on YouTube.

Effective Presentations

I can't find a link to the last bit of research, alas, so I am paraphrasing from memory.  An American who spent decades researching effective use of PowerPoint performed the following experiment.  He presented the same set of information to two groups using PowerPoint.  One group saw a bland, bare presentation, whilst the second group saw a flashy presentation with lots of animation and colour.  The second group enjoyed their presentation much more than the first, and felt more confident about the topic.  But when tested, the first group significantly outperformed the second.

And so...

How often do you take the phrase "I'm confused" in your classroom as a negative sign?  And how often do you use learner satisfaction or ability to perform on the day as barometers of the effectiveness of your lessons?  It appears that you shouldn't be doing either without some deeper assessment of the quality of the learning that is going on.


  1. A belter of a post...fascinating stuff Robert, thanks.

  2. Really interesting take. Thank you for sharing!

  3. if you ask learners to do a mixture of problems after a teaching episode rather than doing lots of practice of the type of problem they have been taught about, they feel more confused
    Anyone else seeing parallels with CfE? If you could examine Scotland's teachers on how education systems need to change, they might just do a whole lot better than they'd expect.

  4. Nice ideas to bring together. Useful to make the link to the 3D view in BtC5. Breadth of ideas, challenge and application in new situations. And try to assess all three all during and after the experience.

  5. For the benefit of any readers who have not been drinking from the "Curriculum for Excellence" kool aid: BtC5 is "Building the Curriculum 5: A framework for assessment" - part of the supporting documentation for the new Scottish curriculum.

    Yes Frank - IMHO, how we implement that rich assessment in a meaningful, sustainable, proportionate way is one of the biggest challenges/opportunities in CfE.

    David - I do now you mention it :-)

