
Research and Practice Summary

This reading will help you understand some of the theory behind this week’s topic. We will start by introducing some of the key concepts (these are in bold). You will also see some suggestions of how to put these concepts into practice. When using these concepts in your own practice you will need to take account of your pupils’ characteristics, the context of your classroom and the nature of the material that you are teaching.

Case study: part 1

Improving assessment in Year 4 science

Robbie recently taught a series of lessons about the digestive system to his Year 4 class and was really pleased with how things went. However, he was surprised when the end-of-topic test revealed that pupils had not learnt what he wanted them to learn. They had not understood core ideas about the digestive system.

Over the topic, pupils had done lots of writing and been really engaged in each lesson. Robbie had also covered all of the content to be learnt, and pupils had been really enthusiastic during class discussions and activities.

What could have misled Robbie about how well his pupils were learning? What could he do to identify these issues sooner and adapt his teaching to improve pupils’ learning?

In every lesson, teachers are gathering information about what their pupils know and can do and where misunderstandings or misconceptions lie. Teachers do this through the different forms of assessment that they use in the classroom. This diagnostic process is a vital part of formative assessment, in which teachers introduce, for example, effective classroom discussions that elicit evidence of learning.

Making a fair assessment of where pupils are in their learning is often complex, as teachers have to avoid being over-influenced by potentially misleading factors, such as pupils appearing busy. Rob Coe has summarised some of the classroom behaviours and practices that may interfere with accurate assessments of pupils’ learning, which he terms Poor Proxies for Learning. These are activities which are easily observed but don’t necessarily signify learning (bear in mind that they are often not ‘negative’: they may contribute to a positive environment in which desired learning is more likely to occur, but they are not in themselves indicators of learning).

  1. students are busy: lots of work is done (especially written work)
  2. students are engaged, interested, motivated
  3. students are getting attention: feedback, explanations
  4. classroom is ordered, calm, under control
  5. curriculum has been ‘covered’ (i.e. presented to students in some form)
  6. (at least some) students have supplied correct answers (whether or not they really understood them or could reproduce them independently)

Case study: part 2

Robbie was misled by his pupils completing lots of work and appearing engaged and enthusiastic.

The challenge here is that these factors, although often supportive of learning, can also be observed where learning is not taking place. Think, for example, about pupils who work hard to copy from a textbook or who write (or discuss) a lot without really engaging in ‘hard thinking’.

Pupils may sit quietly and be highly engaged in watching an enjoyable video in a lesson, but the content of the video and how the teacher uses it will determine what and how much pupils learn.

Sometimes, a pupil can produce a ‘correct’ response to a question or task without knowing that they have done so, or without knowing how they did it. In other words, their underlying understanding may not be secure. To test this underlying understanding, it is important that teachers regularly prompt their pupils to elaborate upon their answers. Elaboration means explaining and describing ideas in greater detail. It may also involve making connections between the ideas pupils are trying to learn and their own experiences, prior knowledge and understanding, and day-to-day life.

To prompt pupils to elaborate further, you could say:

  • ‘say it again in your own words’
  • ‘show me the steps you took to achieve that answer’
  • ‘prove to me that X is the wrong answer’
  • ‘give me two reasons why you think this is correct’
  • ‘explain it to someone who has a different answer’

You could also repeatedly ask ‘why?’ as a prompt to elaboration.

Case study: part 3

Robbie noticed that his pupils were ‘enthusiastic’ during class discussions and inferred that this meant they were learning. Instead of relying on this inaccurate inference, he could have asked pupils to elaborate on the points they were contributing, using some of the prompts above.

This would have given him more detailed data on pupils’ understanding, and he could then have adapted his teaching immediately to address any gaps in knowledge or pupil misconceptions.

Hinge questions are questions that teachers use to learn more about pupils’ understanding. They are a useful way to formatively assess pupils and pinpoint gaps in their knowledge. They are used at pivotal ‘hinge’ points in a lesson or sequence of lessons – often at the end of a phase of learning, when a teacher wants to check pupils’ conceptual understanding to determine whether, and in which direction, they are ready to move on. The teacher asks a hinge question to all pupils and uses their responses (perhaps on mini-whiteboards or using traffic-light cards) to determine the extent to which pupils have understood key concepts and whether any misconceptions persist. Hinge questions often use a multiple-choice format: the teacher can structure the options to reflect different levels of understanding, including common misunderstandings.

The pupils’ responses to the hinge question should provide the teacher with valuable evidence about what their pupils know, don’t know and need to do next. The teacher should be able to collect and interpret responses from all pupils in less than 30 seconds (e.g. by glancing at each mini-whiteboard or traffic-light card). Teachers can then adjust their teaching to respond to any misconceptions.

Case study: part 4

Developing a hinge question

This is Robbie’s first attempt at a hinge question.

Everything made of metal is an electrical conductor:

A) True

B) False

After speaking with an experienced colleague, he rewrites his question so that each answer provides more useful information about pupils’ understanding:

Put these objects into three groups based on whether they do, do not or might conduct electricity:

  • a metal ruler
  • a plastic ruler
  • a shiny ruler
  • a ruler
  • a wooden ruler
  • a long ruler

The second question is better because the answers pupils give, both correct and incorrect, reveal something about their understanding to the teacher.

For the first question, pupils have a 50% chance of simply guessing correctly. The ‘wrong’ answer also reveals limited information about why pupils are incorrect. The answers to the second question could identify specific misconceptions, such as believing that because most metals are shiny, other shiny objects will also conduct electricity.

Writing hinge questions takes careful preparation. To help you write a successful hinge question, consider these key characteristics as outlined by Dylan Wiliam:

  • it should not be easy to guess the correct answer
  • alternative answers should represent common misconceptions
  • more than one answer may be correct (AND, not OR)
  • pupils should not be able to arrive at the correct answer using incorrect reasoning
  • pupils should be able to respond to the question in no more than two minutes (ideally, less than a minute)

Misconceptions typically occur when a pupil incorrectly applies prior knowledge to a new context, perhaps due to weaknesses in prior knowledge or because new ideas are introduced too quickly. For example, they might mis-remember a spelling rule, they might confuse definitions of important concepts or they might mis-apply a mathematical procedure by consistently leaving out an important step. Misconceptions can also arise from mis-hearing or mis-reading.

Once ‘learned’, misconceptions can become embedded in a pupil’s thinking and assumed to be true. As a teacher’s subject knowledge deepens, they learn to anticipate where these misconceptions commonly arise in their subject or phase. You were introduced to the idea of misconceptions in Module 2.

To help you anticipate and overcome misconceptions, you should:

  • discuss with experienced colleagues what the common misconceptions are for an area of study and the steps they take to help pupils master the important concepts in this area while avoiding common misconceptions
  • use assessments at the beginning of new topics to check for prior knowledge and pre-existing misconceptions
  • structure tasks and questions to enable the identification of knowledge gaps and misconceptions (e.g. by using common misconceptions within multiple-choice questions and by targeting areas that you know pupils typically find particularly tricky)
  • prompt pupils to elaborate when responding to questioning to check that correct answers stem from secure understanding

Case study: part 5

What did Robbie do next? Applying good assessment practice

After learning more about good assessment, Robbie realised that he had been focusing too much on things that were not directly related to learning.

During the next topic, electricity, he made the following changes and was impressed with the improvements in pupils’ learning:

  • before teaching the topic, Robbie made sure that he understood common misconceptions about electricity by speaking to colleagues and researching them online. He then analysed his class’s responses to five carefully written diagnostic questions and used this to inform his planning. He started with what his pupils knew, not what he hoped they knew.
  • before teaching each lesson, Robbie thought carefully about how he would efficiently check the understanding of all pupils – not just the pupils who were keen to volunteer answers. He built in frequent activities throughout the topic that generated whole-class responses, including quick quizzes, multiple-choice questions and pupils showing their working on mini-whiteboards. He was careful to avoid relying on ‘poor proxies for learning’, instead targeting his planned assessments at the evidence that would tell him how his pupils were achieving the lesson objectives.
  • during each lesson, Robbie used the formative assessment techniques he had prepared, including forensic follow-up questions, to make his teaching more responsive to his pupils’ developing understanding. Where this identified widespread misconceptions, he spent more time addressing these issues with the whole class. Where particular pupils were struggling, he provided more targeted support, including dynamically re-grouping his pupils (e.g. pairing pupils who were less secure in their learning with peers who could effectively support them).

After this topic, Robbie created the following checklist to help him design questions to check pupils’ understanding in the future:

  • what does it look like when pupils understand this concept accurately and completely?
  • what common misconceptions or knowledge gaps do pupils experience in relation to this concept?
  • what question will prompt pupils to show me their understanding, misconceptions or knowledge gaps relating to this concept?
  • how can I capture each misconception or knowledge gap as an option in the list of possible answers?
  • is it possible to reach the correct answer with a process of incorrect reasoning? (In which case, the question needs further work)
  • what actions will I need to take to address each misconception or knowledge gap, if these are evident in pupils’ responses?

By following his new approach, Robbie is now much more aware of how his pupils are learning and is less distracted by potentially misleading factors. This means he is better able to adapt his teaching to his pupils’ needs and capabilities.