Opinion – Knowledge is the Road to Joy

The work of E D Hirsch and many others has been cited as pivotal in the recent interest – particularly in the UK and the US – in ‘knowledge-based curriculums’. That’s great, says Will Orr-Ewing – as long as we don’t forget joy.

A knowledge-based approach is on the march in UK schools. For any traditionalist who was working in the early 2000s – when a knowledge-based approach would have been dismissed as boring, reactionary and (thanks to Google) redundant – this must feel like an unexpected victory. It is a mark of how far we have come from the days of the 2007 National Curriculum and the RSA Open Minds Curriculum that the majority of the UK’s most prominent schools and educationalists now publicly favour a knowledge-based (or knowledge-rich) approach and the education minister can proudly call himself a ‘Hirschian’.

With the battle won (in theory if not quite yet in practice) and the victors sweeping the battlefield, finishing off dead and wounded progressives, many educationalists are now moving on from philosophy to implementation. Before they do, it is worth pausing to stake a philosophical claim that might determine the forms this implementation takes. This claim, neglected in debates over the last decade but treasured by older thinkers, is that knowledge – whatever its other educational benefits – brings joy. That knowledge gained is not just a means to other ends but is its own reward, and that this is one of its most important features and benefits. It is understandable that, in the fierce heat of contemporary squabbles, heads and educationalists prefer to talk up the more empirical benefits of a knowledge approach; but, by doing so, they leave the implementation of a knowledge-based approach open to those who would happily squander its joy for its effectiveness. In order to illustrate the way that a knowledge approach is currently advocated, it is necessary to summarise the arguments of its defenders very briefly. There are three main strands, all interrelated and often invoked as one.

1. Knowledge = access. Children need a secure knowledge base to access, firstly, texts of increasing complexity (cf. E D Hirsch, Daniel Willingham, Doug Lemov et al.) and, secondly, higher-order skills such as creativity, interdisciplinary thinking, critical thinking etc. (cf. Dylan Wiliam, Daisy Christodoulou, David Didau, Joe Kirby et al.). Here is a representative quote from Carl Hendrick: ‘The extent to which we can think critically about something is directly related to how much we “know” about that specific domain and “knowing” means changes in long-term memory.’ This contention is sometimes summarised as ‘the Matthew effect’ based on the passage from Matthew’s Gospel: ‘For all those who have, more will be given, and they will have an abundance; but from those who have nothing, even what they have will be taken away.’

2. Knowledge = success. Because higher-order skills, including exam skills, cannot be accessed without knowledge, the best way to prepare for long-term exam success is via a knowledge-rich curriculum. The work of schools such as Michaela and those in the Inspiration Trust exemplify this approach. Christine Counsell, Director of Education for the latter, says: ‘I feel quite passionate about the broad curriculum in key stage 3 serving attainment in GCSE.’

3. Knowledge = power. Building on the two positions above, if schools do not teach knowledge, only those children from more privileged backgrounds whose parents pass on their own knowledge (even if obliviously) will be able to read well, access higher-order skills and achieve exam success. This is the social justice case for a knowledge approach advanced by all of the above, as well as the likes of the West London Free School. See also Michael Young’s concept of ‘powerful knowledge’.

These arguments, prosecuted on Twitter, blogs and at conferences, have generally and rightly won out – remarkably so, given the headwinds of a progressive teaching establishment. And yet, although such arguments are often labelled ‘traditional’, they feel rather too bound within late modernity’s norms and values. As the summaries above show, knowledge is almost exclusively presented as a means rather than an end. The search for empirical benefits, able to justify approaches in only instrumentalist terms, has missed the marrow at the heart of knowledge and so risks erecting an educational project as thin and dreary as the orthodoxy it correctly seeks to replace.

Perhaps we need older perspectives – from an Aristotle or a C S Lewis or anyone who might be said to defend a liberal education in the old sense of that phrase – to remind us of just how much we are selling knowledge short. This older view of what knowledge can do is perhaps best encapsulated in the writing of Charlotte Mason, who saw herself both as the inheritor of this ‘liberal education’ tradition and as being charged with spreading its fruits to children of every background in late Victorian and Edwardian England. Here is what a knowledge-based approach meant to her:

‘We launch children upon too arid and confined a life. Personal delight and joy in living is a chief object of education … It is for their own sakes that children should get knowledge. The power to take a generous view of men and their motives, to see where the greatness of a given character lies, to have one’s judgment of a present event illustrated and corrected by historic and literary parallels … these are admirable assets within the power of every one according to the measure of his mind; and these are not the only gains which knowledge affords. The person who can live upon his own intellectual resources and never know a dull hour (though anxious and sad hours will come) is indeed enviable in these days of intellectual inanition, when we depend upon spectacular entertainments pour passer le temps.’

In her writing and in her schools, knowledge was never presented as a means to something else.

She talked of a child’s ‘knowledge-hunger’, an appetite of the mind akin to the appetite of the body for food. Knowledge was inherently ‘delightful’, ‘enlivening’, ‘vitalising’, helping children to see a world that pulsated with meaning. It required no further justification. Beyond the philosophical differences, she also contrasts with today’s defenders of knowledge in the implementation of her vision. There are many interesting ways in which the approaches diverge (and, naturally, converge) but the three summaries below will stand as illustrations:

1. Role of the teacher. It seems fair to say that those who promote knowledge today tend to favour a more prominent role for the teacher than the ‘guide on the side’ proposed by progressives. Many knowledge-rich schools make much of their teachers’ subject knowledge, for instance. Mason would not have had a problem with this per se, but she worried that a charismatic teacher could come between a child and knowledge. There is an interesting piece by one of her followers on her views on Vygotsky’s ‘scaffolding’, which shows her dislike of the way teachers would often unwittingly come between children and ‘the mountain’ (or what she elsewhere called ‘the feast’) of knowledge through excessive talking. Teachers of course have their role to play in elucidating meaning, but their role was one of ‘masterly inactivity’ – something unlikely to find favour with contemporary knowledge advocates, who tend to prefer direct instruction and other ‘sage on the stage’ roles for the teacher, sometimes going as far as prescribing scripts for teachers.

2. Books vs textbooks. Because Mason feared that teachers often came between children and knowledge, her lessons were rooted in reading. She condemned the way that educationalists ‘wrote down’ to children in ‘dry as dust’ textbooks, diluting the delightful aspects of knowledge, and would have disapproved of the generally pro-textbook stance of knowledge’s defenders today, not to mention the printable worksheets, précis and simplified versions that remain so common in classrooms.
She placed her trust not in all books but in certain well-chosen books, especially those with lively narratives and the right expressions, which expertly conveyed meaning from the mind of the author to the mind of the child. The teacher’s role is to elucidate the meaning in the books but not to be the main purveyor of the knowledge itself.

3. Knowledge demonstrated vs teaching to the test. Today’s defenders of knowledge seem to see the UK’s examination system as a worthy demonstration of their pupils’ knowledge, boasting of high attainment at GCSE or, in the case of private schools, of places won at top senior schools or universities. Mason, on the other hand, worried that any teaching to the test, any academic marks or prizes, winnowed the innate desire within children for knowledge for its own sake. She favoured a method called narration, whereby children told back (either in writing or out loud) what they had heard or read. Now that schools can boast of their pupils’ knowledge via social media, YouTube etc., where are the demonstrations of that joyful knowledge that Mason would surely have used were she still alive today? (Her equivalent was to publish a list of substantive nouns and proper nouns written in a typical exam in her schools – e.g. Africa, Alsace-Lorraine, Antigonus, Abdomen, Antennae, Aphis, Antwerp, Alder, etc.) The closest thing to it is Michaela’s moving videos of their children chanting great poetry – but where are the others?

By aligning a knowledge approach with textbooks, charismatic teaching and excellent examination prep, amongst many other implementations, there is a danger that today’s defenders of knowledge are dampening exactly that aspect of knowledge that makes it so genuinely ‘rich’, ‘powerful’ and delightful. It is time to reclaim joy as the rightful aim of a knowledge-based approach (could it even be hoped that a knowledge approach implemented on Mason’s grounds could go some way to pushing back at the awful incidence of childhood unhappiness we see about us?) and time to experiment with other methods that protect and uphold this worthy goal for a great and liberal education.

Graham Nuthall: Educational research at its best

Professor Emeritus Graham Nuthall, an educational researcher from New Zealand, is credited with one of the longest series of studies of teaching and learning in the classroom ever carried out. A pioneer in his field, he focused his research on the classroom and on what impact certain factors – teaching, for example – had on the outcomes of learners. Perhaps his most famous work is The Hidden Lives of Learners, which is increasingly seen as a seminal text for understanding learning.

Jan Tishauser, programme manager for researchED Netherlands, explores his contribution to the education debate, and why his work is extraordinarily relevant today.

The outcomes of the research that Graham Nuthall conducted into the classroom experience of learners are little known, notwithstanding their far-reaching implications for our classroom practice. He demonstrated the need for formative assessment and discovered which factors influence learning most. He also pinpointed metacognition’s role in learning outcomes.

Nuthall started recording classroom conversations as a student and kept doing so throughout his career, from 1960 until 2000. In some ways his research was an expedition into unknown territory. His first question was: what actually happens during a lesson? His final research question was: what is the role of ability in learning?

Taking off

It all started in 1960, when Nuthall (at that time a young student) obtained permission from a number of experienced teachers to record their lessons with a number of students. At this time, he had not yet developed a sound design for his research. He was simply driven by curiosity, wondering what actually happens in a lesson. He worked under the assumption that one needs to observe experienced teachers to spot good teaching.

On the surface, his initial results show a seemingly spontaneous interaction between teachers and students; but beneath this surface, his analysis showed set patterns of communication and predictable structures and rules for social interaction. Nuthall replicated his research in the US and Japan; these rituals were identical everywhere. But the purpose of these rituals was not clear at that time. He concluded that ‘like language, teaching has its own underlying grammatical rules’.

Learning that experience makes no difference

In the period between 1968 and 1974, Nuthall and his PhD students started to work with an experimental design. Together with a group of teachers, they scripted a series of lessons about the black-backed gull. They wanted to know whether a teacher’s experience or training influenced the learning of students. They analysed differences between three groups of teachers: experienced teachers, inexperienced teacher trainees and teacher trainees who were trained to analyse their lessons using micro-teaching and recording. The results were rather unexpected: experience and training made no difference; instead it was only the type of feedback the teachers gave and their style of questioning students that mattered.

Dead end

Nuthall and his PhD students thought they were on to something and continued to work with scripted lessons. They worked with experienced teachers, made recordings and administered pre- and post-tests, trying to find the factors that had a positive effect on learning outcomes.

Finally they came up with results: the way teachers gave feedback, questioned students and activated students made a difference. This might not seem so amazing to us now, but in 1974 these were promising results. One problem that their intensive monitoring of classroom interactions brought to the surface was the enormous complexity of classroom reality. To supplement their findings, they would have had to do hundreds of intensive follow-up studies, which would most likely have produced an endless, useless list of dos and don’ts. It could have led to a ‘robotification’ of the teacher, while their own research had shown them that this is impossible and undesirable:

‘I realized I was following a path that satisfied the cultural rituals of the research community, but would be of little value to teachers, and probably do them harm.’ Nuthall hit a dead end. He describes this period as ‘roaming in the desert’.

A focus on student learning

Then Adrienne Alton-Lee, an experienced teacher, started working on a PhD in 1978. Her research question focused on the students. What causes a student to learn the course material? In her classroom practice she was unable to predict when a given student would have learned the material and when they would not. Alton-Lee dissected the course material in great detail, down to what she called ‘concepts’ and ‘items’, using a rolodex system. For example, a simple series of lessons on climate could contain as many as 500 items.

A ‘concept’ could be: Antarctica is the driest continent. Examples of ‘items’:

  • There is little precipitation.
  • There is more precipitation in the Sahara.
  • Because of the low temperatures the snow never melts.

Every 15 seconds, all student communication and every action was recorded: what they did, and what they said to themselves and to others. All the material a student encountered was logged, and everything a student made or wrote was photographed. This led to a dissertation published in a leading journal.

Replication crisis

Because Alton-Lee had followed a mere three students, Nuthall decided he needed replication studies. He designed three follow-up studies in order to replicate her findings. Technological advancements made it possible to gather even more information. Linking the students’ learning experiences, the course material and the outcomes seemed to work. Together, they collected a mountain of information.

They identified four simultaneous processes going on:

1. The invisible thinking of the student

2. The self-talk

3. The social interaction between peers (mostly invisible to the teacher)

4. The teacher-led public discussion

The self-talk and interaction between peers is well hidden. This was illustrated by the fact that, even though each student had a dedicated observer, the observers still missed 40% of the talk captured on tape. Nuthall concluded that peers’ opinions carried more weight and were more readily believed than the teacher’s, including opinions about the course material.

The study also concluded that:

  • When you start a lesson, half of what you are about to teach is already known.
  • Every student holds a different piece of the puzzle.
  • Almost every student learns something different in your lesson.
  • In practice, they learn more from each other than from the teacher – including misconceptions – which is obviously not always a good thing.

The often-chaotic nature of the classroom explains the function of the rituals that Nuthall found in his first study. The rituals allow the teacher to focus on the class as a whole; the teacher simply doesn’t have the resources to follow individual students. Part of the ritual is the ‘nodding and smiling’ of the students who draw the attention of the teacher. Students also make sure to appear to focus on their work whenever the teacher is in their vicinity. ‘Appear’ is the key word here.


Ultimately, Nuthall decided to precisely map out the learning process of one student in relation to one topic. He analysed the interaction of ‘John’ in regards to the topic ‘The migration to New York’. That’s when some light was finally shed on a recurring pattern.

His analysis of John’s learning experience made it possible to define learning in the following terms: it is a positive change in what we know or can do; it takes place by means of a sequence of events and learning experiences; each experience builds on the previous one, and every change in the order of the learning experiences will lead to a different outcome. The learning activities of a student consist of understanding and making sense of the learning experiences. A student understands, learns and remembers a concept if they have encountered all the underlying information three times.

They built on this insight and did one replication study after another with increasing numbers of students, classes and topics. And they could predict with 85% certainty which student would correctly answer which question on a test.

If ability doesn’t matter, what does?

What stands out most in Nuthall’s research is that only the ‘three times’ rule has predictive value. Ability or intelligence or similar properties do not. Yet the ‘better’ students learn more. Nuthall dedicated his last research period to solving this conundrum. These students had more prior knowledge and they profited more from the lessons. The secret seems to be that they make sure to get more out of the lessons. They possess better metacognitive skills; they understand what it takes to get results.

The Hidden Lives of Learners

At the end of his life, Nuthall hastily wrote The Hidden Lives of Learners, drawing these conclusions for the classroom based on his research:

  • Standardised tests appear to offer certainty, but are no more reliable than interviews held with students.
  • Learning activities should be designed to take into account how memory works.
  • The subject matter should be repeated in different ways.
  • Follow the individual learning experience.
  • Less is more: we should confine the curriculum to the big questions. Teachers need the time to design rich learning experiences, conduct pre-tests and get to know the social processes in the class. Learners need the time and the space to really master the content.

Nuthall’s diligent research efforts gave us lasting insights into the fundamentals of learning and teaching. We should take his research into account both in our current teaching practice and in our curriculum design. For me, the two fundamentals are that learning takes time and that it is not necessarily related to ability. The latter is really a finding that should encourage us all to set high goals for ourselves and our students.


Nuthall, G. and Alton-Lee, A. (1993) ‘Predicting learning from student experience of teaching: a theory of student knowledge construction in classrooms’, American Educational Research Journal 30 (4) pp. 799–840.

Nuthall, G. (1999) ‘The way students learn: acquiring knowledge from an integrated science and social studies unit’, Elementary School Journal 99 (4) pp. 303–341.

Nuthall, G. (2004) ‘Relating classroom teaching to student learning: a critical analysis of why research has failed to bridge the theory-practice gap’, Harvard Educational Review 74 (3) pp. 273–306.

Nuthall, G. (2007) The hidden lives of learners. Wellington: NZCER Press.

Nuthall, G. (2012a) ‘Understanding what students learn’ in Kaur, B. (ed.) Understanding teaching and learning. Rotterdam: Sense, pp. 1–40.

Nuthall, G. (2012b) ‘The acquisition of conceptual knowledge in the classroom: a case study’ in Kaur, B. (ed.) Understanding teaching and learning. Rotterdam: Sense, pp. 97–134.

Wright, C. J. and Nuthall, G. (1970) ‘Relationships between teacher behaviours and pupil achievement in three experimental elementary science lessons’, American Educational Research Journal 7 (4) pp. 477–491.

The light is winning


At the recent researchED in Haninge, Sweden, researchED magazine’s editor Tom Bennett closed the conference with a speech that tried to understand where we had got to in evidence-informed education, and what the landscape now looked like. This is a transcript of that speech.

The sleep of reason produces monsters – at least it does in education, where we see teaching full of myths, snake oil and poorly evidenced practices and strategies. Why have we succumbed so much to learning styles and worse, and why have we found ourselves basing our vital practice on gut feelings, hunches and intuition? I think it’s because misconceptions creep into the spaces where:

• we don’t know much about the topic,

• we like the answers junk science provides, or

• we’re too busy to find out the facts.

How did we get here? Let’s reframe that question. Where did you acquire your ideas about teaching, learning, pedagogy etc? Chances are your answer revolves around the following: teacher training; memories of your own school experience; your mentor; your early class experiences.

Up to a point, that’s fine. Teaching is to a great extent a craft. But craft without structured evidence to interrogate its biases and misconceptions can lead to what I call ‘folk teaching’, where we reproduce the mistakes of our predecessors as easily as we do their successes.

So what? Because folk teaching alone leaves us at the mercy of snake oil, fads, fashions, ideology and bias. We can think of an ocean of cargo cult voodoo that often dominated educational discourse in the past: Shift Happens; TED talks; the Great Interactive Whiteboard Con; most links you see shared on Facebook. We recall the training days hosted by inexpert experts; the books by charismatic gurus; the oft-quoted rentagobs who fill TV, radio and print and seem to know so much about classrooms despite never having worked in one. Know-nothings elevated by other know-nothings.

In this landscape, discussions about teaching become a battle of prejudices – Pokémon debates where we simply hurl one unprovable claim against another until someone blinks.

A new hope?

My naive ambition in 2013 when I began researchED was simple: we should lean on evidence where it exists; we should try to become more research-literate as a profession; and crucially we should ask for evidence at every turn. That was as far as I had gotten, strategy-wise. But surprisingly, amazingly, researchED took off, despite its lack of blueprint or funding. It was a movement that wanted to happen, and we started to respond to demand by hosting events across the UK and, quickly, around the world. Since then we have been to 14 countries, 5 continents, and seen 17,000 unique visitors to our events. researchED has 30,000 followers on Twitter (not counting the local accounts), and we have been graced with 1000 speakers (none of whom are paid). We pay no salaries (least of all to myself) and entirely self-fund each event. It is a humbling testimony to what can be achieved for next to nothing if love and altruism and mutual benefit are all you seek. And it reminds me of the best in people – always.

The dangers of research

But it is important to always retain a sense of caution alongside the enthusiasm. The sleep of reason produces monsters, even with good intentions. There have been some reasonable responses and criticisms of this new age of evidence enquiry:

Evidence in the wild

Bad research – the ‘not even wrong’ categories like learning styles – isn’t the only problem. What happens to evidence in the wild is crucial. One thing this has taught me is that high-quality research is, by itself, not enough. If it doesn’t reach the classroom in a useful state then it may as well not have happened. And often good research gets lost in translation. I call this the Magic Mirror. Sometimes research goes through the mirror and schools turn it into something else. Research translation is as important as research generation. Poor old assessment for learning drops into the Black Box and gets mangled into levelled homework and termly tests, weird mutant versions of what it was meant to be. And some research is simply misunderstood: project-based learning, homework, collaborative learning all have utility in the right contexts. But how many teachers know the nuance of their evidence bases? Homework, for example, has variable utility depending on circumstances. Grasping the when and the how of ‘what works’ is essential, otherwise we oversimplify.

A brave new world that hath such teachers in it

I think researchED is a symptom of a new age of evidence interest. Perhaps also a catalyst – one of many that now exist, from the Deans for Impact1 to the Learning Scientists2 to the Five from Five3 programme and many more. This is indicative of an appetite that was always there. We now host more conferences, visit more countries every year. We have more first-timers, both attendees and speakers. Like the can of worms opened, the worms cannot now go back in the can. This car has no reverse gear. Successful innovations, once perceived, cannot be unseen.

Policy makers

I once asked former UK Prime Minister Tony Blair what research he relied on when making education decisions. He replied that there ‘wasn’t any useful evidence at the time’. This attitude still dominates among the biggest lever-pullers. At a policy level, we still see multiple factors driving decisions away from evidence bases:

• Budgets
• Policy/ministerial churn
• Lack of insider representation
• Reliance on personal experiences

But the more the profession talks the language of evidence, the more they will have to listen to it. And I have always believed that we should reward policy-makers when they participate in evidence-driven discussions. That’s why I’m proud we try to engage rather than barrack our political representatives. And why every year we invite ministers of every party to our party.


Leadership is still the biggest lever in driving evidence adoption. One evidence-literate school leader cascades far more than one teacher. Some schools are now embracing the ‘research lead’ role, and devoting staff resources to this area. There is a moral and a practical duty for leadership to attend to evidence, because an era of dwindling resources demands better, more efficient decisions – less waste, more impact, from training to workload to tech. Let us abandon the days when we tried to buy our way out of our problems, as if a chequebook were a magic lamp. And I suspect that raising budgets is, by itself, insufficient: the most important thing is to be judicious in spending the money we have.


In the absence of a coherent, evidence-informed system it is necessary for teachers to drive their own research articulacy. It is necessary. Teachers should not be pseudo-researchers, but they should become research-literate: sharing, disseminating and interpreting high-quality research, and helping us to develop a herd immunity, where enough of us are learned enough to recognise zombie learning and junk pedagogy when it rises – as it always does – from the grave.

Embrace ambiguity

We have one more duty to observe. Teachers must become active participants in the research ecosystem rather than passive recipients. But teaching is driven by practice, and the data is subtler than we suspect. We frequently seek definite answers where none exist. Research often unpacks ambiguity, and we need to embrace nuance, uncertainty and probability rather than dress high-quality research up as eternal and immutable fact. We should avoid universals and certainty – and always remember that context is king. Otherwise we perpetuate dogma and become that which we seek to surpass.

The gatekeepers

One thing I didn’t expect – but should have – is that the existing system objects to its own reinvention. Whenever power shifts, former custodians of power seek to preserve privilege; and this new age of evidence adoption has frequently been dismissed by some academics, some education faculties, commercial interests, some teaching bodies. But the habit of command dies slowly. Education has relied on arguments from authority for decades. Evidence challenges their dominance like mystics challenge the Church. I have faith that evidence and truth will win, but it will not be because it was easy. Arguments must be made; evidence bases must be made transparent.

Evidence doesn’t obliterate professionalism – it liberates it

We enter a new age of evidence. Once seen it cannot be unseen, and science cannot be uninvented, although ideas can change. Fears that evidence makes us slaves to research are no more rational than the fear that understanding how to cook makes you a worse chef. It empowers. If you object to where evidence takes us, then find better evidence. Otherwise, ask yourself if your opinion is dogma, or if something more animates your objections.

Caveat emptor. In a complex field we need interpreters and brokers of research, but we must also take care not to create a new priesthood – the neo-shamans of evidence, who act as irrefutable guardians of divine truth. The OECD, for example, in some ways has become the new international inspectorate, blessing or banishing entire countries on the basis of their data. Is this healthy? I don’t think so. Beware also the New Generation of Consultants selling ‘Snake Oil 2.0’ who have updated their absurdities by simply stapling the phrase ‘evidence-based’ onto their bags of magic beans. And don’t think I’m ignoring the danger of researchED succumbing to this, like mortal ring bearers corrupted by Sauron. This is why we curate events to include challenge and debate, like the grit in the oyster that helps to make the pearl.

The future

We begin to see new models of professional groupings emerge – digital collaborations, conference communities that no longer require permission to exist. Self-propelled, self-sustaining, self-regulating, they exist only as long as people want to go. These fluid, accessible, dynamic, virtual colleges are needed until they are no longer needed because the profession will have reinvented itself. We’re not there yet. Which is why we commit to cheap, accessible events that are democratic, inclusive and most of all, directed at discovering what works – and when, and why, and how.

My ambition is that we begin to drive this voluntary professional development, and then that cascades back into schools and starts conversations which set off sparks in classrooms – ones that catch fire and burn down dogma. And also that initial teacher training increasingly makes evidence its foundation (where it does not do so already), platforming the best of what we know rather than perpetuating the best of what we prefer. For new teachers to be given skills to discern good evidence from bad. And for that to eventually bleed into leadership; and from there, into the structures that govern us.

I’m reminded of the story about the eternal battle between darkness and light in the sky. A pessimist could look up and think that darkness was nearly everywhere. But the optimist doesn’t see that. The optimist knows that once, there was only darkness.

If you ask me, the light’s winning.

This transcript was first published on Tom’s blog, The Behaviour Guru.




From the editor

The relationship between education policy and education evidence has never been easy. The realpolitik of education is pulled hither and thither by many horses, and research bases are only one of several influences. In 2010 the CfBT report Instinct or Reason: How education policy is made asked every surviving post-war UK minister what the principal reasons behind their policy decisions in education were. The answers were sobering, if unsurprising:

  • Urgency – a sense that ‘something must be done’
  • Ideology – the values and beliefs of policymakers
  • International exemplars
  • Cost
  • Electoral popularity
  • Pressure groups
  • Personal experience
  • Research evidence

Notice where research sits: at the dusty bottom of the list.

There are many reasons why this is perfectly understandable, of course. Parties are elected to deliver a manifesto, which is composed to reflect the values and ideologies they seek to represent. Evidence that confounds or contradicts these platforms can be seen as an obstacle rather than an ally to the policy process.

But there is cause for hope. The growing international appetite for evidence-informed education we see at researchED events and beyond is fuelling a renewed demand for evidence-informed policy to drive that agenda.

Change in policy can be slow; ministerial churn can be fast. In this issue, I speak to Nick Gibb, the UK Schools Minister, a politician who, probably more than most in the UK, has spearheaded a drive towards evidence-informed education, particularly in the field of phonics and literacy, but also more broadly in pedagogy. This interest at a ministerial level in the affairs of what happens in the classroom has not been met with open arms, and Gibb has attracted criticism for walking into what was once described as the ‘secret garden’ of education.

It is easy for politicians and policy-makers to look to education as the engine of their reform programmes. The Jesuit philosophy of catching them young is attractive: you have a reasonably compliant cohort of tomorrow’s scientists and sailors who, crucially, can’t yet vote. Society-building and vocational imperatives are also big drivers of policy behaviour. But where does the ambitious politico turn for expertise and answers? Why, the experts. But which ones? In a field as contested as education, it is understandable if politicians recruit advisors who flatter rather than inform.

Which is why evidence-informed education has never been needed more. Education strategies must be as evidence-informed as possible, from the classroom to the Oval Office. It is entirely right that democracies should define the goals of education; it is imperative that once that will has been conceived, evidence should be the backbone of how we seek to realise it.

Which is why at researchED we engage with everyone involved in the education ecosystem, from teaching assistants to cabinet ministers, with the ambition that informed and careful conversations will save us from the dogma and superstition that have characterised our extraordinary and turbulent profession. I hope you enjoy our second issue of researchED magazine, and find something to challenge, inspire and enthuse you in your practice.

Thanks for reading.


Give me your answer do: An interview with Daisy Christodoulou

Education’s fastest talker tells us about mythbusting, why assessment drives everything else, and the seven myths of edutech

Daisy Christodoulou is the author of Seven Myths about Education and Making Good Progress?: The Future of Assessment for Learning, as well as the influential blog, The Wing to Heaven. She is currently the Director of Education at No More Marking, a provider of online comparative judgement. She works closely with schools on developing new approaches to assessment. Before that she was Head of Assessment at Ark Schools, a network of 35 academy schools. She has taught English in two London comprehensives and has been part of UK government commissions on the future of teacher training and assessment.


What’s your background?

I did Teach First and trained as an English teacher, in a school in London for three years, then another secondary school. I was working in a school that went into special measures. It was challenging. And I learned that a large amount of the advice out there for us – or what was being mandated for teachers – didn’t reflect reality.

Like what?

We were getting a lot of Ofsted scrutiny. I write about this in Seven Myths. The kind of information we were getting about how you succeed for Ofsted – lots of that advice wasn’t based in reality and didn’t have any evidence backing it up.

For example?

The biggest thing I came back to in Seven Myths was an example of a best practice lesson for an English teacher on Romeo and Juliet: teaching students by getting them to make puppets. One criticism Seven Myths gets is that this is a ‘straw man’. But these aren’t straw men – it’s all based on Ofsted reports from that era. If only I’d made this up, if only this had been a figment of my imagination and not best practice. The problem with that – and it’s not just a knee-jerk reaction, ‘all puppets are stupid’ – is that when you look at the evidence, you remember what you think about. And what you think about is how you made the puppets. You won’t be thinking about Romeo and Juliet, you’ll be thinking about puppet mechanics. It’s not that I’m averse to making puppets. If that’s your aim, great. But as an English teacher, teaching Romeo and Juliet, that advice to make puppets wasn’t very helpful.

Why do you hate puppets so much? I think we need to unpack this a bit more.


The reason why facts do matter isn’t an ideological argument. It’s an evidence-based argument.

So you were an English teacher in challenging schools. Fast forward, you’ve written an international sensation of a book. What happened in between? What caused the awakening?

Part of it was a nagging feeling that something wasn’t right. All the examples in the book are backed up – they’re referenced from Ofsted inspections or consultants or ITT. There were other things that I put in the book that were also pretty bonkers. You would hear consultants talk about ‘talkless teaching’ – there was this point where if you were actually intervening or talking or teaching, you must be doing something wrong. It was a nagging feeling that it was wrong. It didn’t make sense. What you’re inclined to do is think, ‘Well, all of these people are saying the same thing. It can’t be them; it must be me.’ The awakening led to me reading more, and researching more, and realising that the evidence suggested maybe my nagging feelings had something to them.

What kind of things were you reading?

Willingham, obviously. That was a lightbulb moment. And the first real insight I had was reading Hirsch, and his Cultural Literacy. The thing about that book is that it’s – as Willingham says – a book about cognitive science, yet all the heat and the light is generated by the list of facts at the end. I then read a bit by Herbert Simon – who is enormously interesting, one of the great polymaths of the 20th century – and his work on chess players, how they think and learn. And he was incredibly insightful. And then realising that there was this research out there by a Nobel Prize winner that completely contradicted so much of what I was hearing in teacher training.

And that inspired you to write?

It did. I got so frustrated hearing what I was hearing. It’s hard to imagine now but back in 2009, 2010, these ideas were things that people just took for granted – ‘You can just google it.’ It was just so frustrating. Everyone saying these things. And there was all this evidence out there by serious people saying, ‘No, this is not the case. It’s not how we learn, you can’t rely on Google, you can’t access memory through the cloud.’ And that was how Seven Myths came about. They were just the seven things I got most annoyed by.

Can you summarise the main ideas?

The über myth is that facts don’t matter or knowledge doesn’t matter. It’s been around a long time, at least as far back as Rousseau. The modern conceptions around thinking skills, and so on, seem very new, but they are actually a rehashing of ideas that are over 100 years old in some cases. And the reason why facts do matter isn’t an ideological argument. It’s an evidence-based argument. We need facts in long-term memory in order to think, because we have working memory and long-term memory, and our working memory is very limited, while long-term memory is the seat of all intellectual skill. Working memory can only hold four to seven items of information at any one time, so whenever you solve a problem, it can very quickly become overwhelmed. Take very young children: give them a multiple-step maths problem, and if they’re not secure on their maths facts and processes, by the time they get to the end, they’ve forgotten the beginning. That’s not because they’re stupid. We’ve all got a working memory issue.

So, the idea is to get as many facts or chunks of facts into long-term memory as possible, and free up that precious space in working memory. That’s the value of, for example, maths facts. It’s also necessary if you want to read fluently: you don’t want to have to sound out every word or stop to look up every word in the dictionary. If you have to do all that – as you’ll know from learning a foreign language – then you quickly get overwhelmed. But when you can read fluently, it’s a smooth process, and you can read for hours without getting tired and enjoy the act of it. If you stop and start, it’s not a pleasant process and you can’t enjoy the meaning.

But surely nobody is against teaching facts?

(Laughs) That’s why the structure of the book is designed to try and show you that some people actually are against teaching facts. That’s why the structure of each chapter is ‘What does the research say?’, ‘What are people saying today in theory?’ and ‘What are people recommending in practice?’ I structured it like that because a lot of the rhetoric in education is frustrating.

You’ll get some who’ll spend a chapter saying why facts are bad and projects are great. I’m not against teaching facts. It’s very easy to spend a long time dismissing facts, rubbishing facts, and then saying ‘But of course we’re not against teaching facts.’ So what I wanted to do was to try to move beyond an argument about words and to actually look at practice. What is the actual lesson advice you are expected to follow? The moment you start to dig into that, you realise that all the types of lessons and practice that people were recommending contradicted what the evidence said. And lots of lesson types that fitted the evidence were being dismissed as worst practice.

The best example of this is direct instruction. DI has an enormous research base behind it, huge amounts of evidence. Yet whenever you tried to deploy DI-style tactics in a lesson, people would react with horror. That was the kind of thing you saw in the literature; the advice teachers were getting was to avoid that kind of approach.

Where was this advice coming from?

The whole point was that I was trying to find reputable examples of people in authority who were recommending this. And that’s why I often go back to Ofsted. It’s not because I think Ofsted were the only ones responsible. There were a lot of people doing this. The issue with Ofsted is that everyone accepts their authority and they have a very big record of their reports. But it wasn’t just them. The whole general world view reflected it. Ofsted weren’t saying things that were controversial to the wider world. They weren’t criticised for this. They were criticised for other things. I should say that I think Ofsted have gone through a big reform process and have changed a lot of this.

I asked online what people thought the impact had been on them. There was a deluge of support from people talking about the immensity of your influence. Were you surprised?

Yes! It felt quite niche. I remember going through all the Ofsted reports and thinking, ‘This is just a moment in time. In one country, in one system. Who’s going to be interested?’ I thought it would be quite ephemeral, and it might date because of the reports and the era it was in. But I’m most pleased that people are still reading it – and that, though it was controversial to begin with, as time has gone on and people have thought about it, they seem to have warmed to it. It wasn’t intended to be an ideological polemic. It was meant to be about the evidence: ‘Here is the state of how we learn.’

If you were publishing it for the first time today, would you change anything?

No, I think it’s fine as it is. Although the thing I realised needed expanding very quickly was assessment. There’s a section in Seven Myths – very short – where I’m critical of teacher assessments. It’s just a couple of lines, and there were clearly a lot of people who seized upon that and thought, ‘Oh, she just wants teaching to the test.’ What happened was that people associated a knowledge-based approach with teaching to the test or a massive exam focus. I realised – that was just a couple of sentences – I didn’t talk about exams very much at all. And they are such a massive part of our modern education system that I realised we have got to address that. Because there are massive problems with the way some teach to the test, and there are legitimate critiques of the exam factory model of schooling that I have a lot of sympathy for. And I’d always been aware of that. I didn’t address it enough in the book. You can’t address education without this discussion: the role of exams.

Seven Myths became very well known, especially in the UK. How did you get from that to assessment?

When I read the responses to Seven Myths, it felt like the most interesting arguments were about exams – how does this fit in with them? The second thing: I was working with schools on how to make some of my ideas a reality, and what I realised very quickly was that you can’t do anything about curriculum – especially in English schools – unless you do something about assessment.


Look at GCSEs. I was working on this when levels were abolished. Even at primary, if you try to introduce a new curriculum approach, people instantly say, ‘How can I level this?’ So for example, say you want to bring in a direct instruction approach. How do I give a level at the end of it? If your new curriculum doesn’t match up with the way you currently assess, you have a problem. And that was the issue I kept running into. Look at DI programmes like Expressive Writing. That doesn’t fit very well with the old UK national curriculum approach. So what do you do? Tweak it? Or do you bring the levels in? Change the assessment? To what?

So when you started to look into assessments, where did that lead you?

The big thing I struggled with was this idea that you can just separate formative and summative assessment. When I started teaching, what you were seeing was lots of assessments that you would do six times a year, and the problem is that you were assessing big, complex tasks. And these big, complex tasks, like essays, are essentially projects, even when they appear in an assessment. One of my arguments is that projects are not a good way to learn. But if you are assessing kids with a big, complex task every six weeks, you don’t have the time to break that task down into smaller chunks. And the big argument in Seven Myths is that we need to decompose the skill. As a practical example, think of how, as an English teacher, you try to judge a piece of writing.

A great book published a year ago, The Writing Revolution, is really good on this. The problem it identifies is that we don’t teach writing, and that is exactly the issue I find: we were assessing writing a lot, but at what point do we sit students down and say, ‘Here are the nuts and bolts of writing’? When you break it down, this is what you need. This wasn’t compatible with a levelled or even a graded approach, because when you grade or level you want to assess a large piece of writing, whereas when you teach it you want to break it down. The analogy I use in Making Good Progress is that when you run a marathon, 26.2 miles is the end goal. But nobody, unless you’re already an elite marathon runner, begins by running 26.2 miles. Nobody runs 26.2 miles in every training session. And nobody thinks that the way you make progress towards your end goal is by running marathons. So people do all kinds of other tasks. They go to the gym. They do cross-training, swimming, shorter runs, speed work. And all of those tasks build towards the complex goal.

So that’s how I got so involved in assessment: by realising that if you wanted to focus on a knowledge-based curriculum, the only way you could properly do it was within the framework of the assessment you were working with.

Which leads us neatly to comparative judgement.

As an English teacher, the biggest thing is that assessing writing is really hard. The minute students are writing in an extended way, those pieces are extremely hard to mark reliably. And not only that, but they start to have a negative impact on teaching and learning, because what you end up with is marking to the rubric. And the rubric might say something like ‘uses vocabulary originally…’. There’s a list of things that define good writing. The problem is that those sentences end up becoming the lesson objective. This means you’re not teaching at the nuts-and-bolts level; you’re teaching at this generic level. You start saying things to students like ‘You need to infer more insightfully.’ Hang on, how helpful is that? The whole point of feedback is to give people something they can do next. The rubric isn’t designed to be helpful like that! But it’s not even that useful for markers, because two different markers can interpret the same line in different ways.

So what comparative judgement tries to do is to help with reliability, efficiency and validity. The first two are quick wins. You get much better agreement and you’ll get there much quicker. And that’s amazing. There’s another benefit: it lets you move away from the rubric. So when you look at two pieces of writing beside each other and you ask, ‘Which is the better piece?’, you just go on your gut instinct on your knowledge of what good writing is. And the power is that you move away from teaching to the rubric.

How do people criticise this?

I think people find it odd at first when you move away from the mark scheme, when you say use your gut instinct. They’re quick to ask, ‘How do I know my gut instinct is right? And even if it is, what about everyone else’s?’ The way you get around those issues is that comparative judgement generates an enormously sophisticated statistical model. You have data on every judgement and every judge, so you can tell if a judge is an outlier – and that’s quite rare. You can see whether they’re in line with the group or not. The initial criticism is that ‘this just feels hopelessly subjective’. But we can prove it isn’t, because we can show afterwards that the agreement and consistency between judges in the room is greater than the process with a rubric. It feels subjective, but it isn’t; and marking with a rubric feels objective…but it isn’t.
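For readers curious about the statistics behind this, pairwise ‘which is better?’ judgements are commonly turned into scores with a Bradley-Terry model, in which the probability that script i beats script j is p_i / (p_i + p_j). The sketch below is purely illustrative – the function name and the simple iterative fit are my own, and it is not a description of No More Marking’s actual implementation.

```python
from collections import defaultdict

def bradley_terry(judgements, iters=100):
    """Fit a simple Bradley-Terry model to pairwise judgements.

    judgements: list of (winner, loser) script IDs from
    'which is better?' decisions. Returns a dict of relative
    quality scores that sum to 1 (higher means better).
    """
    wins = defaultdict(int)    # total wins per script
    pairs = defaultdict(int)   # comparisons per unordered pair
    scripts = set()
    for winner, loser in judgements:
        wins[winner] += 1
        pairs[frozenset((winner, loser))] += 1
        scripts.update((winner, loser))

    # Start every script at equal strength, then apply the standard
    # iterative update: p_i = W_i / sum_j [ n_ij / (p_i + p_j) ].
    p = dict.fromkeys(scripts, 1.0)
    for _ in range(iters):
        new_p = {}
        for i in scripts:
            denom = sum(
                n / (p[i] + p[j])
                for pair, n in pairs.items() if i in pair
                for j in pair if j != i
            )
            new_p[i] = wins[i] / denom if denom else p[i]
        total = sum(new_p.values())
        # Normalise, flooring at a tiny value so divisions stay safe.
        p = {s: max(v / total, 1e-9) for s, v in new_p.items()}
    return p
```

With enough judgements, scripts that consistently win comparisons get higher scores; and because every individual decision is recorded against the fitted model, a judge whose choices persistently disagree with the consensus shows up as an outlier.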

What’s next?

I’m still very involved in assessment, but I really want to do some writing on education technology. Comparative judgement is quite a tech-driven approach, so I’ve been thinking about it. And what I find fascinating is that there are some really amazing, innovative examples of tech use, but there are also a lot of gimmicks. Ed-tech at its worst can feel like education from years ago: ‘Kids don’t need to know stuff, they can just google it.’ That is like a mantra in ed-tech. It’s early stages, but I want to find out which approaches in technology work with the mind and are going to help learning, and which ones aren’t there yet. It might, in some ways, be similar to Seven Myths, because it’ll be looking at different approaches to technology and asking which ones work with the grain of how our minds work and which ones don’t.

Seven Myths About Education (2014) is available to buy from Routledge. Making Good Progress? (2017) is available from Oxford University Press.

The grateful ped(agogue)

Why giving thanks may be a gift that gives to the giver

From the philosophers Epictetus and Confucius to our own parents and teachers, wise thinkers have always encouraged us to count our blessings. Joe Kirby puts this sage advice to the test, and explains why it’s great to be grateful.

The secret to happiness? Gratitude – or so the Greek philosopher Epictetus said in Rome, some 2000 years ago. In Ancient China, Confucius said it was ‘better to light one small candle of gratitude than to curse the darkness’. Buddhists put it even more succinctly: ‘grateful heart – peaceful mind’. For centuries, great thinkers around the world have taught this simple idea: ‘Want to be happy? Be grateful!’

Let’s put this ancient wisdom to the test of modern science and see what psychologists have learned. What actually happens when people express what they’re grateful for?


Two decades of psychological research have found that after practising gratitude, people say they feel happier. In two studies, people wrote nine weekly gratitude journal entries, or daily entries for two weeks.1 Both groups reported better wellbeing, optimism and social connectedness than control groups. These studies were replicated with a third group.2 In another study, people kept a daily gratitude journal for a week, and reported lasting increases in happiness, even six months later.3 A 2006 study found that practising gratitude raised and sustained positive mood.4 But this was only with adults. What about teenagers and children?

A 2006 study of 221 young teenagers asked them to list five things they felt thankful for daily for two weeks. This enhanced their optimism and life satisfaction and decreased negative emotion, including after a three-week follow-up.5 A 2009 study found that children with lower positive emotion levels especially benefit from gratitude interventions.6 Two more studies replicated the findings: writing gratitude letters increased participants’ happiness and life satisfaction.7,8 After ten years of clinical trials, the world’s leading scientific expert on the topic, Robert Emmons, concluded that gratitude makes a measurable, positive impact on happiness.9

Other researchers found that people reported that gratitude improved relationships.10,11,12 Further studies also found that expressing gratitude increases people’s patience.13,14

One complication comes out of this research: one study suggested that weekly appreciative writing outperformed daily writing.15 Perhaps writing too frequently loses freshness and meaning?

A recent trial, just published this year, involved students seeking counselling for depression and anxiety, with clinically low levels of mental health. They were divided into three groups: one wrote gratitude letters, one group wrote their deepest thoughts about negative experiences, and one did not do any writing. What did they find? Those expressing gratitude reported significantly better mental health four weeks afterwards – and even larger effects 12 weeks afterwards.16 Perhaps Confucius was right.

Three applications in schools

How might we apply these research insights in schools?

1. Termly postcards to teachers

Once a term in forms, tutors can give students gratitude postcards to write to teachers who have made a difference in their lives. It is easy for students to forget how much teachers do for them. It makes children feel happy to notice and acknowledge those who support them, and it makes teachers feel happy to be thoughtfully appreciated. Teachers can model this by writing appreciative postcards to one pupil each day: over a year, each teacher will have written some 200 cards, and across a school’s staff that adds up to some 10,000 acts of encouragement. Students like showing these to their parents to make them feel proud. Some display them on their fridges at home. Some students I know even keep and frame the postcards they earn over the years!

2. Termly postcards to families

In forms, tutors can ask students to write gratitude postcards to their own parents, siblings or families at the end of term. It is hard for children and teenagers to remember how much the adults and family members in their lives do for them, and how sad they’d be if they lost them. Students and parents feel much more positively about the school when they see how much their family relationships matter to teachers.

3. Thanks to end lessons and form

Every day, teachers and students make great efforts. Leaving a lesson creates an opportunity for students and teachers to say ‘Thank you!’ and show they appreciate each other. If both say ‘thank you’ politely as they part, it creates a very upbeat atmosphere around the school. Combine this with a mantra – ‘It’s great to be grateful!’ – to encourage students who are appreciative. Assemblies on the benefits of gratitude can help children understand why it’s helpful to really notice the good things in our lives.

Applying the research of gratitude is a promising way of helping children, teachers and families feel happy about school.


1. Emmons, R. and McCullough, M. (2003) ‘Counting blessings versus burdens: an experimental investigation of gratitude and subjective well-being in daily life’, Journal of Personality and Social Psychology 84 (2) pp. 377–389.

2. Ibid.

3. Seligman, M. E., Steen T. A., Park N. and Peterson, C. (2005) ‘Positive psychology progress: empirical validation of interventions’, American Psychologist 60 (5) pp. 410–421.

4. Sheldon, K. M. and Lyubomirsky, S. (2006) ‘How to increase and sustain positive emotion: the effects of expressing gratitude’, The Journal of Positive Psychology 1 (2) pp. 73–82.

5. Froh, J. J., Sefick, W. J. and Emmons, R. A. (2008) ‘Counting blessings in early adolescents: an experimental study of gratitude and subjective well-being’, Journal of School Psychology 46 (2) pp. 213–233.

6. Froh, J. J., Kashdan, T. B., Ozimkowski, K. M. and Miller, N. (2009) ‘Who benefits the most from a gratitude intervention in children and adolescents?’, The Journal of Positive Psychology 4 (5) pp. 408–422.

7. Toepfer, S. M. and Walker, K. (2009) ‘Letters of gratitude: improving well-being through expressive writing’, Journal of Writing Research 1 (3) pp. 181–198.

8. Toepfer, S. M., Cichy, K. and Peters, P. (2012) ‘Letters of gratitude: further evidence for author benefits’, Journal of Happiness Studies 13 (10) pp. 187–201.

9. Emmons, R. A. (2013) Gratitude works! San Francisco, CA: Jossey-Bass.

10. Bartlett, M. Y. and DeSteno, D. (2006) ‘Gratitude and prosocial behavior: helping when it costs you’, Psychological Science 17 (4) pp. 319–325.

11. Lambert, N. M., Clark, M. S., Durtschi, J., Fincham, F. D. and Graham, S. M. (2010) ‘Benefits of expressing gratitude: expressing gratitude to a partner changes one’s view of the relationship’, Psychological Science 21 (4) pp. 574–580.

12. Grant, A. M. and Gino, F. (2010) ‘A little thanks goes a long way: explaining why gratitude expressions motivate prosocial behavior’, Journal of Personality and Social Psychology 98 (6) pp. 946–955.

13. DeSteno, D., Li, Y., Dickens, L. and Lerner, J. S. (2014) ‘Gratitude: a tool for reducing economic impatience’, Psychological Science 25 (6) pp. 1262–1267.

14. Dickens, L. and DeSteno, D. (2016) ‘The grateful are patient: heightened daily gratitude is associated with attenuated temporal discounting’, Emotion 16 (4) pp. 421–425.

15. See n. 3 above.

16. Wong, Y. J., Owen, J., Gabana, N. T., Brown, J. W., McInnis, S., Toth, P. and Gilman, L. (2018) ‘Does gratitude writing improve the mental health of psychotherapy clients? Evidence from a randomized controlled trial’, Psychotherapy Research 28 (2) pp. 192–202.

Harder, better, faster, longer?

Rebecca Foster explains how to introduce ‘desirable’ difficulties into your teaching – and why learning shouldn’t be easy.

‘The mistake we pop stars fall into is stating the obvious. “War is bad. Starvation is bad. Don’t chop down the rainforest.” It’s boring. It’s much better to hide it, to fold the meaning into some sort of metaphor or maze, if you like, and for the listener to have to journey to find it.’


The fetishisation of ease is ubiquitous: you only need to look down at your smartphone to see how advances in technology have converged to squeeze a multitude of processes into one hand-held device for your convenience – a camera, easy access to cat videos and social media all in one place! We don’t even have to get up from our sofas to change the TV channel or rely on a map to get us from A to B anymore. But at what cost this ease? In making life as easy as possible, what are we losing? Aren’t some difficulties in fact desirable?

These are questions we ought to be asking of our classroom practice too. When we make learning easy in the classroom, what is the cost? The work of Bjork and other researchers suggests that practices that ‘appear optimal during instruction’,1 such as massing study sessions and blocking practice, ‘can fail to support long-term retention and transfer of knowledge’. Whereas introducing certain difficulties that ‘slow the apparent rate of learning’, such as reducing feedback to the learner and interleaving practice on separate topics or tasks, ‘remarkably’ has the opposite effect.

Bjork asks the question why, ‘if the research picture is so clear’, are ‘massed practice, excessive feedback, fixed conditions of training, and limited opportunities for retrieval practice – among other nonproductive manipulations – such common features of real-world training programs?’2 One answer, in school contexts, might be a type of ‘operant conditioning’ teachers are exposed to. Several school systems serve to reinforce practices that encourage the teacher to increase the performance rate of their students to satisfy a demand for ‘rapid progress’. For example, frequent data-trawls encourage teachers to teach in a way that will maximise the short-term performance of their students. If I have to enter data on a student six times a year, and especially if that data is used to judge my performance as a teacher or inform the pay I’m entitled to, am I not motivated to do what’s necessary to push students over short-term hurdles? That is notwithstanding the perfectly admirable desire, as a teacher, to see my students perform well.

As teachers we may also be led to favour practices that increase performance at the acquisition stage of learning because many of the ‘desirable’ difficulties Bjork suggests will produce ‘the best retention performance’3 result in ‘poorer performance’ at the point of learning new information. It is deeply counterintuitive for a teacher to degrade the performance of students in the classroom. It’s a bit like confiscating everybody’s satnavs: probably not a great idea if their timely arrival on a certain day is important; but if you want people to get better at finding their way in the longer term then it’s a sensible strategy.

While short-term performance goals are understandable, our sights as teachers need to stretch far beyond the end of the lesson, unit or course of study. With supportive whole-school structures, teachers can be freed up to introduce desirable difficulties that may impede short-term performance but have long-term positive impact.

I’ve been leading the English department at my current school for two years and have introduced a range of ‘desirable difficulties’ that have been a challenge for both teachers and students. However, the effectiveness of the learning taking place in the English lessons in my department is revealed by the level of retention demonstrated by our students over time.

Distributing practice

One of the biggest changes I introduced was a move away from massed practice, or traditional term-long units of study. In the past, students might study a novel for a term and then move on to study creative writing, followed by four other units – each conveniently one term long. I can only assume that the rationale for the length of the units was that this is how the year is broken up: an end-of-unit assessment would fall just before a data drop, with all of the work leading up to it building the knowledge and skills necessary to perform well in that assessment. However, when a topic was returned to a year or more later, students’ long-term recall and performance were hindered by this approach.

Now, at KS3, we have two key units that are studied for roughly half of the year: a novel and a Shakespeare play. These are interleaved with studying poetry, fiction writing, non-fiction writing and analysis of both fiction and non-fiction. In practice this means that no two English lessons within a single week are on the same topic. Whilst this was a real challenge for teachers at first, our students haven’t been in the least bit fazed, and we’ve seen the impact this model has had on the development of our students’ knowledge and skills.

Of course, were I working in a school that demanded an assessment every six weeks, I may find myself in hot water; but thankfully, I work in a school that only requires one data entry a year at KS3 and two or three at KS4.

Using tests as learning events

Lots of evidence points to the idea that recalling information is more effective for long-term retention than a further study event, and that retrieval also serves the purpose of providing feedback to students about their current knowledge of a given topic. In my department, we have introduced a range of tests as learning events, including retrieval practice starters and knowledge tests. One of the most effective things we’ve introduced, after reading Battle Hymn of the Tiger Teachers: The Michaela Way, is self-quizzing homework. Students are required to test how much they can recall from their knowledge organisers and then, in a different coloured pen, fill in any gaps or make corrections. Not only is this weekly, structured activity improving students’ learning of key knowledge, but it’s also providing regular feedback to both teacher and learner about what they know or don’t know. Furthermore, it has the added benefit of not needing to be marked – a difficulty that is certainly not desirable!

Learning styles – the greatest trick the devil ever trained

It wasn’t so long ago that trainee teachers in the UK were taught, almost entirely uncritically, to use learning modalities (learning styles) like VAK as an allegedly ‘evidence-informed’ way to help students learn. How wrong they were. Jennifer Beattie, a teacher from East London, takes a trip down memory lane and recalls how common it was even in her career – and still could be if we’re not careful.

Recently, I was involved in a discussion on edu-Twitter with teachers who were reflecting on their training. A significant number of them were critical of the fact that certain aspects of pedagogy that they’d been trained in had not stood the test of time. Being professionals, we recognise how training evolves and practices change. What trainees are being told to do today could well not exist in a few years’ time. The concept of VAK learning styles (visual, auditory and kinaesthetic), however, somehow still continues to spark debate, despite us all knowing that making your teaching resources visual, auditory and kinaesthetic would be as helpful to pupil progress as it would be to make your resources about Love Island or Fortnite. I understand why the idea still exists. It’s a comfortable way of attempting to deal with an uncomfortable truth: not all pupils learn and make progress at the same rate.

Yet, I have to admit that I believed in learning styles whilst training – and still for a large part of my early teaching career. I recognise that my ITT experience is simply reflective of what Ofsted (the UK school inspectorate) and the DfES (the then Department for Education and Skills) wanted at the time, and my course tutors were simply channelling that into us. That time was 2007: the progressive era of, notably, ‘The One-off Outstanding Lesson’, mini plenaries, student-led ‘discovery learning’, Brain Gym and P4C (Philosophy for Children).

With the aim of reminding myself why I was such a devout believer in VAK back then, I dusted off my QTS Standards folders and books. I found one, entitled Learning and Teaching in Secondary Schools. In it, there were six pages devoted to learning styles and ‘multiple intelligences’. Of these six pages, nine lines were given over to ‘Learning styles: a critique’, where the writer admits that it is, actually, very difficult to define learning in such different ways. This isn’t developed further in the book.

What I find most incredible in these pages is that they mention a possible ‘mismatch’ between a student’s ‘preferred learning style’ and the tasks they face from the teacher. It’s outrageous to tell new entrants to the profession that a possible reason why a pupil isn’t learning is because the teacher hasn’t engaged with the student’s preferred learning style. I can only imagine the sheer number of PGCE student hours wasted, trying to make that elusive, ‘engaging’ resource which will appeal to all sorts of learners. I know this because I did it.

When I think back to the time taken up with trying to make things like the ‘passé composé’ kinaesthetic (‘Right, let’s MOVE the pronouns and auxiliary verbs that I’ve spent hours laminating for you all, shall we, class?’), I reflect that I could have actually been learning ways to explain it better and give pupils adequate, robust practice. No wonder I am exasperated with having been caught in the nonsense of it all.

Furthermore, in my professional standards portfolio, much of the evidence I gathered to prove I’d met a particular standard comprised lesson plans with VAK ideas and resources. When I was a trainee, the lesson plan pro forma had a box specifically for planning and detailing the VAK resources to be used. But were trainees explicitly told to include VAK learning styles in order to gain Qualified Teacher Status? In the 2007–08 Standards, there was a real emphasis on ‘personalising learning’. Trainees were told to plan their lessons to engage with all pupils’ individual learning styles and preferences. This turned into tutors expecting to see VAK on every trainee lesson plan. Even the training book mentioned earlier issued a stark ‘warning’ about it:

‘In order to progress towards meeting the Standards for the Award of Qualified Teacher Status (QTS) it is important that beginning teachers are aware of the different learning styles that might exist in their classes and what might be some characteristics of individual learning preferences.’

I have been asked why I am now so critical of VAK, when I wasn’t ten years ago. Well, for one thing, experience. Experience as a teacher has shown me that telling pupils the rule about the past tense in French gets you better results than making a game of it. Experience has shown me that telling the pupils what a word means gets you a quicker result than making a ‘card sort’ game. I didn’t have this experience ten years ago: there wasn’t much research debunking it; and when someone tells you that you have to include it in your lesson plans and observed lessons to meet the standards, in all likelihood you’re going to do it!

So, while this was a brief, nostalgic look back at what it was like to be fully submerged in the VAK pseudoscience of 2007, it is important that, as teachers, we don’t allow it back in. I still see a lot of new entrants to the profession worry about why some pupils aren’t ‘getting it’ and some of the advice dispensed encourages them to try matching their teaching and learning activities to their students’ different styles of learning. We cannot allow more trainee and NQT hours to be spent trying to create ‘perfect’ lesson resources. The best resource, for any lesson, is the teacher.

Education, literature and the paradox of ‘the whole child’

Professor Robert Davis of the University of Glasgow writes a poignant reflection on the Plowden report, which defined the era of child-centred education for the generation for which it was written – and for decades to come.

2017 was the 50th anniversary of the Plowden Report (Children and their Primary Schools), a landmark document in the history of 20th-century progressivism, which announced major reforms in curriculum and pedagogy across the schools of the United Kingdom and which echoed powerful modernising impulses elsewhere in the developed world. The elusive search for the origins of ‘progressive education’ has led some historians to question its entire viability as a concept for capturing an undeniably broad and piecemeal diversity of 20th-century educational innovations. Nevertheless, wherever we trace its roots, it seems clear that a number of key concepts steadily became dominant in educational thought on both sides of the Atlantic between 1920 and 1960 (to the undoubted reproach of the didactic models of learning and teaching that had monopolised schools since the coming of state-sponsored mass education to the industrial nations in the closing decades of the 19th century). Paramount among these supposedly ‘new’ ideas was the discourse of ‘child-centredness’, and the language of the ‘whole child’ – each among the first phrases, incidentally, to excite the scepticism of philosophers of education such as R.S. Peters and Robert Dearden of the Institute of Education in London in the first issues of the Journal of Philosophy of Education in the middle and late 1960s.

Like ‘progressive education’, the terms ‘child-centredness’ and the ‘whole child’ already possessed, by the 1960s, a complex pedigree. Rousseau’s direct influence in the late 18th century on (most significantly) Johann Pestalozzi had succeeded in embedding the concepts by the 1820s very explicitly in radical philosophies of, particularly, infant education across Europe and into parts of North America. By the end of that decade, Robert Owen and Friedrich Froebel were each campaigning vigorously in Britain and Germany on behalf of the revolutionary ‘kindergarten’ or nursery movement, where learning and teaching for very young children would be centred upon play and led by the interests and inclinations of the child rather than (in Froebel’s model especially) the direction of the teacher. Owen’s British experiments were destined to end in defeat at the hands of the traditionalist opposition of church and state, while Froebel’s spectacularly successful kindergarten networks nevertheless saw the language of child-centredness carefully cordoned into the specialist pre-5 environment where his thinking and reputation took root, with consequently very little impact on the expanding compulsory sectors.

Nevertheless, it is safe to say that more inclusive notions of ‘child-centredness’ and the ‘whole child’ sustained a kind of subterranean afterlife throughout the later 19th century in radical educational circles in Britain. Such ideas resurfaced in a series of immensely important government enquiries chaired by W.H. Hadow in 1926, 1931 and 1933 that heavily criticised the Victorian approaches to learning and teaching – then still prevalent in UK primary and secondary schools. The ’31 document (which approvingly referenced Owen and New Lanark) declared:

We desire to see the child as an active agent in his early schooling, making … an active participation in its process, through his own experiences and his own activities, and relating his growing knowledge at all points to the world in which he lives.

Although these ideas were to be eclipsed by more pressing domestic and international anxieties as the 1930s unfolded, they survived as a subversive memory – a hope, indeed – in British educational thought until a more welcoming climate emerged with the onset of the Swinging Sixties. This period heralded the rise of a new metropolitan youth culture and the election of Harold Wilson’s Labour governments in 1964 and 1966 on a platform that included far-reaching educational reform. Bridget Plowden was actually commissioned to conduct her investigations into English schools by the outgoing Conservative Government in 1963; but under the direction of the new Labour Education Secretary, the socialist intellectual Anthony Crosland, the egalitarian mission of the enquiry was very significantly radicalised. Crosland and his advisors had in turn been deeply influenced by the central, supposedly scientific justification for the doctrine of child-centredness provided between 1930 and 1960 by Jean Piaget’s model of developmentalism.

Contemporary ‘neo-traditionalists’ mock Piagetian theory for what they see as its poor empirical evidence base, but Peters, Dearden and others discerned at the time a deeper problem. On the one hand, the new mid-20th-century progressivist discipline of ‘educational psychology’ was advocating an optimistic, unfettered view of the child’s predisposition for learning perfectly aligned with Plowden’s reformed pedagogy. But on the other, the work of some of the most influential psychologists and anthropologists of the time was describing a quite different child secreted at the heart of modern society: an anxious, troubled, aggressive creature trapped in the gothic Freudian-Kleinian struggles of the family romance, or self-centredly and unempathetically striving for dominance over rivals in the pursuit of its appetites and an obviously unappeasable desire for security. It was for this reason that the earlier Hull House experiments of John Dewey in Chicago had eventually repudiated the dominant American Froebelian conception of the kindergarten as a reproduction of the domestic emotional ambience of the family, in favour of the rigorous cosmopolitan practices of the ‘peer group’ and the ‘school community’ supposedly so critical to the fortunes of an essentially immigrant society. If the family is intrinsically psychodynamically maladaptive, Dewey had argued, effective education could not possibly proceed from the imitation of its affective life or its understanding of the child. The 1960s were also, we should recall, the era of Philippe Ariès’s Centuries of Childhood, which in bowdlerised form had found its way into the textbooks of many caring-profession diploma and degree programmes – instructing intending nurses, doctors, teachers and social workers that childhood and the nuclear family were contingent, bourgeois ideological constructions of the very bureaucracies they were training to serve.
The extraordinarily popular Scottish psychiatrist Ronnie (‘R.D.’) Laing, a media hero of many ’60s ‘liberation’ movements, turned most vitriolically on the family and its supporting institutions, denouncing them as the cradle of injustice, oppression and patriarchy, producing only damaged children and frustrated adults, and against which schizophrenia was a perfectly valid emancipatory protest.

Even Piaget himself became part of this same malaise through the use in his writings of a concept for the description of early childhood which he later came to regret: egocentrism. Now for Piaget, the term was confined to the description of purely epistemological processes, not affective or moral states. But in the psychoanalytic climate of the period, it is unsurprising that it was swiftly mobilised for estranging and othering children, culminating in the notorious observation in the best-selling mid-century teacher training manual by Hughes and Hughes, Learning and Teaching, that ‘it is well known that young children are, as a general rule, determined little egotists’. A host of popular and influential figures – led by high-profile academics such as Bowlby, Winnicott and Gesell – compounded this problem by foregrounding a developing child characterised by innate aggression, violent fantasies of control and group destructiveness. There were variants within this literature, across gender, age-band and social class especially, but the trends remained consistent; and such was the prestige of these authorities that their ideas routinely migrated into formal guidance for schools, teachers and even parents.

These difficulties were of course cultural as well as educational, and their cultural dimensions have been so far largely neglected in the critical assessment of the coming of Plowden progressivism. But Plowden both reflected and stimulated a new climate in teacher education in which the study of, for example, children’s literature was earnestly cultivated for both aspiring teachers and their pupils as a potent antidote to the previous supposedly failed models of instructional literacy. This was also pivotal, of course, to the success of any effort to export child-centredness beyond the pre-literate, pre-compulsory confines of the nursery into the later stages of childhood. Hence the education of the ‘whole child’ championed by Plowden in England, and by the so-called 1965 Primary Memorandum in Scotland, would abandon in schools the force-fed language training and decontextualised literary comprehension extracts of the old system in favour of the ‘real books’ and the appreciation of valuable works of literature to which children and young people might be instinctively attracted when shared appropriately with them by their suitably well-read and sincerely ‘child-centred’ teachers. This is a principle that has of course remained absolutely central to mainstream literacy teaching in most democratic education systems for the past 50 years, and the examination of Plowden advanced in this analysis does not seek to overturn it. But just as the 1960s psychological messages to beginning teachers from their formal programmes of study (as well as their surrounding culture) were paradoxical ones, so also the otherwise salutary advocacy to them of high-quality children’s literature was also singularly ambivalent.

Some of the finest books for children and young people that accompanied the Plowden Report off the printing presses of 1967 and 1968 dealt candidly with experiences of childhood and youth which – reflective no doubt of the volatile, contradictory tensions in that same surrounding society – were rarely celebrated for the presentation of ‘whole’ children or of benign, ‘child-centred’ environments. Leon Garfield’s Carnegie-honoured and hugely popular Smith (1967) described a deprived Regency pauper childhood of exploitation and treachery, where childhood is neither special nor valued and where the pursuit of a defining trust (a cornerstone assumption of progressivism) between adults and children is as elusive as the literacy which – when eventually acquired – simultaneously empowers and mortally imperils the central character. In the same vein, the Carnegie Medal Winner of 1967 – and certainly one of the best and most influential children’s books of the last 50 years – Alan Garner’s The Owl Service, presented a dark vision of childhood forever in thrall to the sins and repetition-compulsions of the adult generation, condemned interminably to repeat the same cycle of errors and betrayals across the epochs regardless of environment or circumstance.

The Owl Service also audaciously probed further into the cultural territory in which Plowden’s optimistic account of childhood, and adult-child relations, had pitched its claims. As well as highlighting an almost genetic taint passed across the generations, and destined to pollute indelibly the faltering communications between adults and children, The Owl Service engaged with the experience of ‘youth’ – just at a time, indeed, when this fugitive cultural category was beginning to overtake ‘childhood’ as the primary focus of 1960s educational solicitude and artistic preoccupation. Garner daringly highlights single-parent and blended families stamped by class, regional, linguistic and postcolonial ethnic divisions. The novel also famously unleashes intense sibling and sexual rivalry into the narrative, in forms darkly reminiscent of the forces claimed by the influential analytical psychologists of the time to be pervasive and determinant in the lives of children and young people. There is, of course, a moment of redemption in The Owl Service: a terminal renunciation by one of the central characters, the priggish Roger, which finally rescues the doomed Alison from the vindictive clutches of the past. But it comes at immense cost, with the socially and ethnically excluded Gwyn left both unreconciled and in full possession of the ineradicable knowledge of his family’s myriad ancestral crimes.

Even those children’s books of ’67–’68 which focused directly on the experience of school, or of simply becoming educated – popular both in wider society and in the expanding network of teacher training institutions – rarely presented these settings in benevolent, ‘child-centred’ terms. Barry Hines’s 1968 A Kestrel for a Knave – memorably adapted as the Ken Loach film Kes (and thereafter often taught in schools too) – described sombre northern English schools marked by casual violence, bullying, extreme physical punishment, routine humiliation and the pervasive alienation of pupils and teachers. Even the teacher with a heart in the novel, Mr Farthing, can only seriously identify with the central character Billy around the nurture of the kestrel – the injured bird with which the boy has bonded standing for the brief moments of flight from his bleak domestic and educational existence. Hines’s contribution in Kes stood with a group of important writers for children reminding the ’60s generation, and the large teacher-influx within it, that many working-class schools in Britain operated in ways far removed from Plowden’s principles, serving children and young people whose lives, learning and identities were far from ‘whole’ or integrated.

The pursuit of such ‘wholeness of being’ marks another text hugely popular with late-’60s readerships and which in the decades since has only accrued increased esteem and recognition. The late Ursula Le Guin’s 1968 A Wizard of Earthsea was a gift to the grammar-school Tolkien generation, flush with the countercultural values that were sustaining the environmental movement, hippiedom, the anti-Vietnam protests and the idealism of the Summer of Love. Earthsea was instantly celebrated for its restrained ecocentrism, its laid-back Zen-style wisdom of naming and knowing and its invocation of alternative styles of archipelagic working and being closer to nature and other living things. Insofar as Earthsea is an intrinsically educational text – concerned with the training and instruction of the boy-mage prodigy Ged at an elite wizard school – all of its conditions at first seem ideal for a child-centred, holistic conception of learning and personal discovery of precisely the type envisaged by Plowden and its related literature. Yet, as we know, Ged’s education takes an unexpectedly malevolent turn, when from his unquenchable curiosity and juvenile individualism (qualities unstintingly celebrated in progressivist literature) he inadvertently unleashes the destructive havoc of a shadow creature – havoc which he, maimed and incapacitated, must spend the rest of the novel seeking to undo. Earthsea, thereafter, becomes a kind of bildungsroman – a journey of the traumatised Ged into the realms of Earthsea beyond the confines of even this most inclusive, holistic society, where he can begin his education again in an entirely altered and humbled state of mind. We might go so far as to say that Ged needs to become a decentred learner, whose brokenness and injury take the focus away from him and on to the setting and the personalities whose needs he must learn to serve with his impaired magical talents.
This shift in perspective is most fully underlined at the climax of the story, where reader and protagonist each discover that the abomination Ged must seek to recapture and subdue is the abject, refractory elements of his own self, sharing his name and his identity:

Aloud and clearly, breaking that old silence, Ged spoke the shadow’s name,

And in the same moment the shadow spoke without lips or tongue, saying the same word: ‘Ged’. And the two voices were one voice.

Aloud and clearly – I emphatically do not invoke Earthsea or any other of these novels as a casual repudiation of Plowden or any other investment in child-centredness, yesterday or today. I wish only to suggest here that the social, cultural and literary ambiguities of 50 years ago, like those of the present time, require that we think through – again and again – the emblematic educational slogans of every era in which we practise our professions, recognising that the resources of art and literature can assist us immeasurably with the task of understanding the inevitable incompleteness and vulnerability of ourselves and of the children and young people in the classrooms before us.

Myth-Busting: Gardner’s multiple intelligences

Every issue, Dr Pedro De Bruyckere takes aim at a common educational theory and summarises the evidence for and against it. This time, it’s Gardner’s multiple intelligences in the hot seat.

There is some truth in every lie: multiple intelligences

In the last issue of researchED magazine, I discussed the grains of truth inside the learning styles theory and I’d like to follow that with something that is often mistakenly used as a kind of learning styles theory: the multiple intelligences theory by Howard Gardner.

What does it state? That we should look to more than just IQ in education. Gardner thought it too narrow to see ‘intelligence’ as one single thing. So he added different modalities of intelligence, such as:

  • musical-rhythmic
  • visual-spatial
  • verbal-linguistic
  • logical-mathematical
  • bodily-kinaesthetic
  • interpersonal
  • intrapersonal
  • naturalistic

This list has been adapted a few times; somebody even suggested adding gastronomic intelligence.

In an interview with Kathy Checkley in 1997,1 Gardner explained that this theory shouldn’t be used as a learning style approach:

A myth that irritates me is that people place my intelligences on the same footing as learning styles. Learning styles say something about how people approach everything they do. If you are good at planning, people expect you to have a plan for everything you do. My own research and observations lead me to suspect that this is a wrong assumption.

But there are more issues than this. In my book,2 we’ve already debunked this theory; but little did we know that Howard Gardner would drop a tiny bombshell a bit later in a kind of memoir looking back at his academic life.

I want to share with you three telling quotes by the man himself. One of our criticisms was that the word ‘intelligence’ is a bad choice as it suggests a predictive power – which Gardner’s theory does not have. Now Gardner explains:3

I termed the resulting categories ‘intelligences’ rather than talents. In so doing, I challenged those psychologists who believed that they owned the word ‘intelligence’ and had a monopoly on its definition and measurement. If I had written about human talents, rather than intelligences, I probably would not have been asked to contribute to this volume.

OK… but it gets worse. Did he test his theory?

Nor, indeed, have I carried out experiments designed to test the theory. This has led some critics to declare that my theory is not empirical. That charge is baloney! The theory is not experimental in the traditional sense (as was my earlier work with brain-damaged patients); but it is strictly empirical, drawing on hundreds of findings from half-a-dozen fields of science.

Oh, but should his theory be used today? Well, again, Gardner himself:

At the same time, I readily admit that the theory is no longer current. Several fields of knowledge have advanced significantly since the early 1980s. Any reinvigoration of the theory would require a survey similar to the one that colleagues and I carried out thirty-five years ago. Whether or not I ever carry out such an update, I encourage others to do so.

And that is because I am no longer wedded to the particular list of intelligences that I initially developed. 

Myth-busting multiple intelligences this time requires only that we use the original author himself. Now for the truth inside the myth. Even in our book, we don’t want to call this theory a complete myth, but instead label it as ‘nuanced’. Why? Well, the basic idea behind this theory is that people are different, and maybe you’ve noticed – they really are. People have different interests, different abilities, different moods, etc.

So, for example, taking into account the differences pupils have in their prior knowledge can be very productive for their learning: when pupils have less prior knowledge, a more teacher-directed approach could be warranted.4


1. Checkley, K. (1997) ‘The first seven…and the eighth: a conversation with Howard Gardner’, Educational Leadership 55 (1) pp. 8–13.

2. De Bruyckere, P., Kirschner, P. A. and Hulshof, C. D. (2015) Urban myths about learning and education. Cambridge, MA: Academic Press.

3. Gardner, H. (2016) ‘Multiple intelligences: prelude, theory, and aftermath’ in Sternberg, R. J., Fiske, S. T. and Foss, D. J. (eds) Scientists making a difference. Cambridge: Cambridge University Press, pp. 167–170.

4. For example: Yates, G. C. and Hattie, J. (2013) Visible learning and the science of how we learn. London: Routledge.

See also: Ritchie, S. (2015) Intelligence: all that matters. London: Hodder & Stoughton.