Writing and cognitive load theory

Cognitive load theory has been described as one of the most important developments in modern psychology for educators to be familiar with. Natalie Wexler looks at the implications of this theory for the way we teach writing, and what it means in the classroom.

It’s been said that reading is the most difficult thing we ask students to do. In fact, that description applies more accurately to writing, which has received far less attention from both cognitive scientists and educators. Because it requires students to express themselves and not merely to receive and process information, writing imposes the greater cognitive load.

It’s clear that reading places a heavy burden on short-term or working memory – the aspect of cognition that could also be called ‘consciousness’, and which can hold only a limited number of things for a limited amount of time. When it comes to decoding, those things include the correspondences between letters and sounds; for reading comprehension, they expand to include knowledge and vocabulary relating to the topic (1). The key to successful reading is to have as many of these factors as possible stored in long-term memory – which has a virtually infinite capacity – so they don’t take up precious space in working memory and overload it.

With writing, background knowledge is even more crucial. It may be difficult to read about a subject that’s unfamiliar, but it’s virtually impossible to write about one coherently. At the same time, knowledge of the topic is only one of many factors vying for space in working memory. Even when producing a single sentence, inexperienced writers may be juggling things like letter formation, spelling, word choice, and sentence structure. When asked to write at length, they need to cope with the challenges of adhering to a topic, creating smooth transitions, avoiding repetition, and ensuring that the overall organization of the piece is coherent. All of this is in addition to absorbing the information that forms the basis for their writing, deciding what to say about it, and anticipating what a reader will need to know.

In some situations, the key to easing cognitive load is to provide what are known as ‘worked examples’. Rather than asking learners who are unfamiliar with a topic to acquire knowledge through solving problems themselves, the theory goes, teachers should have them study problems that have already been solved. In the context of math, for example, research has shown that students who study worked examples of algebra problems perform better than those who solve problems on their own, when tested later on their ability to solve similar problems. The reason appears to be that problem solving imposes such a heavy cognitive load on novice learners that they have little capacity left for transferring the strategies they’ve used into long-term memory (2).

It’s been suggested that the worked-example effect can be applied to writing as well: if teachers explicitly teach sentence structures and vocabulary, provide exemplars that illustrate these things, and lead discussions on the subject, students should be able to study the exemplars and reproduce those features in their own writing (3). But many American teachers already use a version of worked examples when trying to teach writing: they show students ‘mentor texts’ to use as models (4). Considering that a mere 25% of American students test at the proficient level in writing (5), it’s fairly clear that that approach is not having the desired effect.

Showing students exemplar sentences rather than entire texts is definitely a step in the right direction, because it focuses students’ attention on a manageable unit. But the problem, as Greg Ashman has put it, is that there’s a difference between ‘knowing that’ and ‘knowing how’. Students may know, for example, that a sentence is ‘a set of words containing a subject and a predicate and expressing a complete thought’. Showing students examples of complete sentences and contrasting them with sentence fragments may make the concept more concrete. But many students will nevertheless fail to know how to write complete sentences and continue to use sentence fragments in their own writing. A basic problem is that the massive cognitive load that inexperienced writers face makes it difficult for them to remember to put their conceptual knowledge into practice (6).

How do we get students to know how to write well? That question is crucial, and not just because we want students to acquire writing skills. When the cognitive load is modulated, writing is perhaps the most effective way to build and deepen students’ knowledge and develop their analytical abilities. To be sure, students need some knowledge of a topic to begin writing. But once they start to write, they need to recall information they have recently learned, determine which points are important and connect them to one another, and put all of this into their own words. If students have the cognitive capacity to engage in these steps, the effect is powerful – akin to the ‘testing effect’ (the boost in retention that comes from being quizzed on recently learned material) and the similar ‘protégé effect’ (which results from explaining a topic to another person) (7,8).


Because of the complexity of the writing process, students need more than direct instruction and worked examples to become competent writers. They need ‘deliberate practice’: repeated efforts to perform aspects of a complex task in a logical sequence, with a more experienced practitioner providing prompt and targeted feedback (9). And for many students, including many at upper grade levels, this kind of practice needs to begin at the sentence level – partly because sentences are the building blocks of all good writing, and partly because sentence-level tasks lighten the cognitive load. That’s not to say that constructing a sentence is an inherently simple task. It all depends on the content. For example, there’s nothing simple about completing this sentence: ‘Immanuel Kant believed that space and time are subjective forms of human sensibility, but _________.’

Deliberate practice in writing also needs to extend beyond English class to the rest of the curriculum. Not only does that provide teachers of history, science, math, and other subjects with a powerful tool to enhance their instruction, it also gives students more opportunities to practice the writing strategies. Eventually, many of those strategies will become lodged in long-term memory, becoming so automatic that students don’t even realize they’re using them.

When students are ready to embark on lengthier writing, where the cognitive load is even greater, they need to learn to construct clear, linear outlines that enable them to organize their thoughts, avoid repetition, and stay on track. Juggling those tasks in working memory while writing can be overwhelming even for many experienced writers. Once students have used an outline to create a draft, they can use their pre-existing knowledge of sentence-level strategies to vary their sentence structure and create smooth transitions.

While this approach to writing is still rare and unorthodox, it is gaining traction largely thanks to a US-based organization called The Writing Revolution, of which I am board chair, and a book that explains the method – also called The Writing Revolution – of which I am the co-author with Dr Judith C. Hochman. A veteran educator, Dr Hochman has developed a series of writing strategies that are designed to be taught explicitly and practiced repeatedly in a variety of contexts, with prompt feedback from a teacher. Although originally created for learning-disabled students, the method has been shown to be effective with students of all abilities, including those still learning English.

What does the method look like in practice? Let’s return to the example of students who use sentence fragments rather than complete sentences. In addition to showing students examples of fragments and complete sentences side by side, the Hochman Method has students practice distinguishing between the two – and turning the fragments into complete sentences. For older or more sophisticated students, the terms ‘subject’, ‘verb’, and ‘predicate’ might be used, but it’s sufficient to simply ask questions in functional terms. For example, if a fragment says, ‘ate a great meal,’ the teacher might ask the class, ‘Does that tell us who ate a great meal? How can we make these words into a sentence?’(10)

To derive the maximum benefit from this activity, the examples should be embedded in whatever content students are learning. A math teacher who has taught rational numbers could review – and simultaneously build writing skills – by giving students the following fragments and asking them to transform the phrases into sentences, with proper punctuation and capitalization:

  • can be expressed as a fraction or a ratio
  • rational numbers

Their responses might be:

  • A rational number is a number that can be expressed as a fraction or a ratio.
  • Rational numbers can be ordered on a number line.

Eventually, through the repeated process of identifying and correcting fragments, students will develop an understanding of how to create a complete sentence and apply that knowledge to their own writing.

Students don’t need to learn the names of grammatical structures and parts of speech for their own sake. But certain terms are useful as a shorthand for strategies that will enhance writing and lessen cognitive load. For example, the method has students learn the word ‘appositive’ – that is, a phrase that renames a noun – because it provides them with an effective strategy for varying sentence structure and expanding their responses. Once students have grasped the concept, they can be asked to provide appositives for sentences grounded in the content of the curriculum. A biology teacher might give students the sentence, ‘Natural selection, __________, results in species with favorable traits.’ A student might supply the appositive, ‘a process of evolution’.

When students have moved on to lengthier writing, they’re advised that appositives can be used to create good topic sentences – and they’ll understand what to do. Ultimately, that information will be stored in their long-term memory, along with the knowledge of other possible sentence types and structures, to be drawn on when beginning a paragraph or an essay. Rather than having their working memory occupied with searching for a way to begin – or, if they’re revising an essay, to vary their sentences – they’ll be able to devote more cognitive capacity to what they want to say.

Those of us who are already competent writers have vastly underestimated the difficulties faced by many (if not most) students in reaching that point. In years past, the assumption was that teaching rules of grammar and parts of speech was sufficient. After studies determined that approach had no positive impact on student writing, and in some cases had a negative one (11), another school of thought took hold. Its proponents assumed students would basically pick up the conventions of written language if they just read enough mentor texts and engaged in enough writing (12). Given the generally dismal results, it’s time for a new approach, supported by research: explicit instruction, mentor texts or ‘worked examples’, and the deliberate practice that will enable students to transform their conceptual knowledge into knowing how to write. Not only will schools produce better writers, but easing the cognitive load imposed by writing will lead to better thinking as well.

Natalie Wexler
Author, The Knowledge Gap: The Hidden Cause of America’s Broken Education System–and How to Fix It (forthcoming from Avery, August 2019)
Co-author with Judith C. Hochman of The Writing Revolution: A Guide to Advancing Thinking Through Writing in All Subjects and Grades (Jossey-Bass 2017)

References
1. Willingham, D. (2017) The reading mind: a cognitive approach to understanding how the mind reads. San Francisco, CA: Jossey-Bass, pp. 116–18.
2. Ashman, G. (2018) The truth about teaching: an evidence-informed guide for new teachers. London: SAGE Publications, pp. 42–43.
3. Needham, T. (2019) ‘Cognitive load theory in the classroom’, researchED 3, pp. 31–33.
4. Alber, R. (2014) ‘Using mentor texts to motivate and support student writers’, Edutopia [Website]. Available at: www.edut.to/2IB3ZPI.
5. The Nation’s Report Card (2011) Writing 2011. US Department of Education, Institute of Education Sciences. Washington, DC: United States Government Publishing Office. Available at: www.bit.ly/2I9LGm1.
6. Ashman, G. (2018) p. 122.
7. Roediger, H. and Karpicke, J. (2006) ‘Test-enhanced learning: taking memory tests improves long-term retention’, Psychological Science 17 (3) pp. 249–255.
8. Boser, U. (2017) Learn better: mastering the skills for success in life, business and school. New York, NY: Random House.
9. Ericsson, K. A. and Pool, R. (2016) Peak: secrets from the new science of expertise. New York, NY: Houghton Mifflin.
10. This example and others are taken from Hochman, J. C. and Wexler, N. (2017) The writing revolution: a guide to advancing thinking through writing in all subjects and grades. San Francisco, CA: Jossey-Bass.
11. Graham, S. and Perin, D. (2007) Writing next: effective strategies to improve writing of adolescents in middle and high schools. Washington, DC: Alliance for Excellent Education.
12. Calkins, L. M. (1986) The art of teaching writing. Portsmouth, NH: Heinemann.

Evidence-based school leadership

A veteran of speaking about evidence, Gary Jones flags up some concerns he has about the difficulty of leading a school in a way that is evidence-informed but also meaningful and, crucially, has an impact that matters.

The first researchED event I attended was the London national conference in September 2014. Without doubt, this was some of the most inspiring and influential professional development I had experienced in the 30 years I had been involved in education. It was inspiring because I was taking part in an event with over 1000 teachers who had given up a Saturday morning to speak and listen about something they cared about – namely, improving teaching and learning through the appropriate use of research evidence. It was influential in that it got me thinking, reading and writing about evidence-based school leadership and management.

researchED London 2014 got me thinking about evidence-based school leadership and management for two reasons. First, the vast majority of the sessions at the event had a focus on teaching and learning, and little attention seemed to be paid to the role of research and other sources of evidence in the decision-making of senior leaders in schools. Second, that summer I had by chance read an article by Adrian Furnham (1) which introduced me to the discipline of evidence-based management, and I was intrigued as to whether there was a possible synthesis with evidence-based education. This contributed to me writing a book – Evidence-based School Leadership and Management: a practical guide – and 220 blogposts (www.garyrjones.com/blog).


So having written around 300,000 words on all things evidence-based, I would like to make the following observations about the current state of evidence-based practice within schools. First, the ‘evidence-based movement’ is not going away any time soon. We have 22 schools in the Research Schools Network; an increasing number of schools appointing research leads; hundreds if not thousands of educational bloggers contributing to discussions about how to improve education; social media and EduTwitter providing a forum for the articulation of views; over 20 researchED conferences scheduled for 2019; the Education Endowment Foundation (EEF) spending over £4m in 2017–18 to fund the delivery of 17 projects, involving 3620 schools and other educational settings and reaching approximately 310,000 children and young people (2); and finally, we have Ofsted using research evidence to inform their inspection framework (3).

Nevertheless, despite all this time, effort and commitment being put into research and evidence-based practice, there is still much to do to ensure evidence-based practice contributes to improved outcomes for pupils. First, we need to have an honest conversation about teachers’ research literacy and their subsequent abilities to make research-informed changes in their practice. Research undertaken by the National Foundation for Educational Research (NFER) and the EEF suggests that teachers have a weak and variable knowledge of the evidence base relating to teaching and learning, and a particularly weak understanding of research requiring scientific or specialist knowledge (4). Second, there is a distinction between the rhetoric and the reality of evidence-based practice within schools. Research undertaken for the Department for Education identified a number of schools where headteachers and senior leaders ‘talked a good game’ about evidence-informed teaching within their schools, whereas the reality was that research and evidence were not embedded within the day-to-day practice of the school (5). Third, it’s important to be aware that there is a major debate taking place amongst educational researchers about randomised controlled trials, effect sizes and meta-analyses. Indeed, as Professor Rob Coe states: ‘Ultimately, the best evidence we currently have may well be wrong; it is certainly likely to change.’(6)

And finally, if I were to offer any advice to teachers, school leaders and governors/trustees who are interested in evidence-based practice, it would be the following. Becoming an evidence-based practitioner is hard work. It doesn’t happen by just reading the latest EEF guidance document, John Hattie’s Visible Learning or by spending one Saturday morning a year at a researchED conference. It requires a career-long moral commitment to challenging both your own and others’ practice, critically examining ‘what works’ to ensure whatever actions you take bring about improvements in pupil outcomes.

Dr Gary Jones is the author of Evidence-Based School Leadership and Management: a practical guide. Prior to his recent work – in blogging, speaking and writing about evidence-based practice – Gary worked in the further education sector and has over 30 years of experience in education as a teacher and senior leader. Gary is currently engaged by the University of Portsmouth as a researcher on projects looking at area-based reform and increasing social mobility.
@DrGaryJones
www.garyrjones.com/blog

Further reading:

Brown, C. (2015) Leading the use of research & evidence in schools. London: UCL IOE Press.

Barends, E. and Rousseau, D. (2018) Evidence-based management: how to use evidence to make better organizational decisions. London: Kogan Page.

Cain, T. (2019) Becoming a research-informed school. London: Routledge.

Jones, G. (2018) Evidence-based school leadership and management: a practical guide. London: SAGE Publishing.

Kvernbekk, T. (2016) Evidence-based practice in education: functions of evidence and causal presuppositions. London: Routledge.


References
1. Furnham, A. (2014) ‘On your head: a magic bullet for motivating staff?’, The Sunday Times, 13 July.
2. Education Endowment Foundation (2018) EEF annual report 2018. London: EEF. Available at: www.bit.ly/2Iw9ajY
3. Ofsted (2019) Education inspection framework: overview of research. London: The Stationery Office. Available at: www.bit.ly/31hVQbN
4. Nelson, J., Mehta, P., Sharples, J. and Davey, C. (2017) Measuring teachers’ research engagement: findings from a pilot study: report and executive summary. London: Education Endowment Foundation/NFER.
5. Coldwell, M., Greany, T., Higgins, S., Brown, C., Maxwell, B., Stiell, B., Stoll, L., Willis, B. and Burns, H. (2017) Evidence-informed teaching: an evaluation of progress in England. Department for Education. London: The Stationery Office.
6. Coe, R. (2018) ‘What should we do about meta-analysis and effect size?’, CEMblog [Blog]. Available at: www.bit.ly/2ZcWm96

Ambitious rhetoric but the reality falls short

Education in Scotland is often trumpeted by some as exemplary, with other countries – such as Wales – using it as a template from which to build their own systems. But recent years have seen troubling levels of unhappiness in the education community about the reality of delivering the Curriculum for Excellence (CfE) – the once-lauded programme of educational reinvention that was supposed to revolutionise the learning of a nation’s children. Here, Walter Humes looks at some of the problems with the CfE, and what went wrong.

This article is based not only on my own analysis of the situation in Scotland, but also on the views of a small sample of people who are well placed to comment on the extent to which educational policy and practice are informed by research evidence. They include academics and researchers who have worked with government agencies, funding bodies and local authorities, a senior figure in a national organisation that regularly responds to consultations and policy proposals, and an experienced headteacher. To encourage frankness, respondents were guaranteed anonymity. Responsibility for the text that follows is, however, entirely mine.

Behind the official discourse

The word ‘evidence’ appears no fewer than 41 times in the document A Research Strategy for Scottish Education (1). The paper’s aims include a commitment to ‘learning from data and evidence’, ‘empowering practitioners to produce and use evidence and data’, and the ‘effective commissioning and dissemination of evidence on “what works”’. The reference to ‘what works’ suggests a rather narrow view of the function of educational research – it should be concerned with fundamental questions of meaning and value, not just practical recommendations – but the general thrust seems to indicate a positive attitude towards the use of evidence at classroom, school, local authority and national levels.

However, these aspirations need to be understood against a background of mounting criticism about the Scottish Government’s record in relation to research and the use of evidence in policy development. The OECD report of 2015 which reviewed the progress of Scotland’s flagship policy of Curriculum for Excellence, launched in 2004, said that it could not conduct a full evaluation of the reform programme because there was insufficient information available (2). It called for ‘a robust evidence-base on learning outcomes and progression’. A similar plea was made in a report by the International Council of Education Advisers (ICEA), appointed by the Scottish Government in 2016, partly in response to public disquiet about a perceived decline in standards. One of the recommendations in the ICEA report was that the government should work with universities and other providers ‘to further develop and implement the educational research strategy published in 2017. This will enhance the system’s capacity for independent research and evaluation, and build a Scottish empirical evidence base’(3).

It is significant that it has taken pressure from outside Scotland to produce a shift of attitude in relation to research. Until recently, the post-devolution period was marked by progressive disengagement of government officials from the research community (4). Many academics felt that researchers were regarded with suspicion by politicians, inspectors and local authority officers, especially if their work took a critical line. The notion that critique may lead to improved strategies was not welcome in the conformist culture of Scottish education.

Although the mutual mistrust has eased slightly, with some reports that government is more willing to listen, it should not be overstated. One researcher recounted conflicting experiences in relation to the influence of his work on policy. He said that a particular project ‘did influence aspects of government policy’ and offered two explanations: first, the agency funding the research played an important role ‘in brokering access to policy makers’; and secondly, the research was ‘timely’ in the sense that the topic being investigated was already ‘gaining some momentum in government’.

Another project fared less well. It had minimal impact partly because ‘multiple policies were being introduced and the civil servants had little time to engage fully with the issues’. Furthermore, there seemed to be limited capacity to synthesise the results of this project with other related studies which had been commissioned, and so the opportunity to better inform proposed policies was missed.

These examples illustrate that policy making is rarely an entirely rational process. It is often messy, time-constrained, and subject to chance and the interventions of powerful players. Furthermore, research that is consistent with the current direction of travel within official policy circles is more likely to make an impact than research which raises challenging questions. This casts doubt on the degree of objectivity with which research evidence is reviewed by officials.

Longitudinal surveys

Any education system requires reliable information on which to base decisions. Over the last ten years, Scotland has withdrawn from certain surveys that provided useful comparative data which enabled trends to be identified. These included two international studies: PIRLS (Progress in International Reading Literacy Study) and TIMSS (Trends in International Mathematics and Science Study). The ostensible reason was cost, but the decision was widely criticised as indicating a desire to conceal disappointing results. The Scottish Survey of Literacy and Numeracy (SSLN) was scrapped after the 2016 results indicated some downward trends, a pattern that was also shown in the findings of the 2015 PISA (Programme for International Student Assessment) report. Scotland does, however, continue to take part in the PISA programme. The introduction in 2017–18 of Scottish National Standardised Assessments (SNSAs) was controversial, and the robustness of the data they will generate has been questioned.


Scotland’s current position is at odds with its historical record in this regard. A persistent critic of the Scottish Government’s attitude to independent research has been Lindsay Paterson, Professor of Education Policy at Edinburgh University. He has pointed out that in the middle decades of the 20th century, Scotland was a pioneer in the use of statistical surveys of school pupils, through the work of the Scottish Council for Research in Education. Later, the Centre for Educational Sociology at Edinburgh University carried out school leaver studies, starting in 1962 and running to 2005. These enabled researchers to evaluate the effects of major educational reforms, such as the introduction of comprehensive education and the expansion of higher education. Paterson argues that the current dearth of good-quality survey evidence makes Scotland a ‘data desert’. His conclusion is bleak: ‘There is now no survey series with which to hold Scottish government to account, and not even an openness in government to methodological discussion of the kinds of evidence that would be needed. This closing of minds to science is the very antithesis of accountability.’(5)

Echoing the concerns of Paterson, Howieson and Croxford have reinforced the need for ‘system-wide, longitudinal data to enable a country to “know” its education and training system’ (6). One longitudinal study that does exist is Growing Up in Scotland, started in 2005, tracing the development of ‘nationally representative cohorts’ of children over time (www.growingupinscotland.org.uk). It has produced interesting findings, but it could not be used to evaluate Curriculum for Excellence, because there was no equivalent earlier study to enable meaningful comparisons to be made.

Local authorities and other organisations

Central government is not the only agency with an interest in research evidence. Local authorities routinely collect data on the attainment of schools in their area, including standardised assessments of literacy and numeracy. This information can be used in discussions with headteachers about areas of strength and weakness. A key priority in recent years has been the desire to raise attainment generally, but in particular to reduce the gap in attainment between pupils in socially advantaged areas and those in deprived communities. Some headteachers claim that the instrument used to measure this, the Scottish Index of Multiple Deprivation (SIMD), based on postcodes, is too crude: there are disadvantaged children living in ‘affluent’ areas, and not all children in ‘poor’ areas are deprived. This can be a particular problem in rural communities, where the social and economic profile may be resistant to classifications that work in inner cities. Similarly, Insight, an online benchmarking tool for secondary schools and local authorities designed to help improve outcomes, struggles to reveal reliable trends when pupil numbers are small. There is also a concern about the capacity of senior staff to interrogate data and to use it effectively to make improvements. Teachers at all levels would benefit from opportunities to interpret research findings, whether quantitative or qualitative – a provision that would require both time and support.

This last point connects with an observation made by a senior academic familiar with staff development approaches in a number of Scottish local authorities. She reported that John Hattie’s work (as set out in Visible Learning and Visible Learning for Teachers) was strongly promoted, presumably because it drew on a wide range of research evidence and offered clear guidance about high-impact teaching strategies. But the academic wondered how well some of those recommending Hattie’s ideas understood the nuances of his approach. A simplistic application of research evidence may have unintended negative consequences.

Education Scotland, the national advisory body on the curriculum, claims that it draws on research in framing policy advice, though its record in this regard is patchy. The Scottish Qualifications Authority, which runs the national examination system, does rather better, collecting and analysing data on exam entries and results for the qualifications it offers. In recent years, the General Teaching Council for Scotland has sought to encourage teachers to engage in various forms of professional enquiry designed not only to enhance personal development but also to benefit the school through sharing insights with senior management and colleagues. The extent to which this represents a new approach and a genuine opening-up of professional autonomy has been questioned (7).

Grassroots developments and their limitations

There are a few indications of more positive developments. After years of disengagement from the research community, there are now regular contacts between the Cabinet Secretary for Education (John Swinney) and University Deans of Education. For these to be effective, leaders in the academic community will need to be prepared to abandon their tendency to collude in the deferential culture of Scotland’s educational establishment. Critics (such as the present writer) claim that academics have sometimes been complicit in their own containment.

Perhaps a more encouraging development is taking place at grassroots level, where independent websites, personal blogs and social networking platforms enable teachers to share ideas, recommend reading and report on pedagogic innovations. In addition, increased numbers of practitioners are undertaking part-time study for postgraduate degrees. And judging from the success of last year’s well-attended researchED conference in Scotland, independent of the national agencies, there is a growing movement by teachers seeking to shape their own professional development and to pass on their insights to others. This event included interesting presentations on metacognition, memory research, the art and science of learning to read, the relation between family income and children’s developmental outcomes, and how teachers can best engage with research.

The old ‘top-down’ model, led by government officials and controlled by bureaucratic institutions, has not served Scotland particularly well. A development that suggests classroom teachers are exercising greater agency in identifying topics worth investigating is surely to be welcomed.

But will that be enough? Here we need to be realistic about the political context. All governments tend to take a short-term view of policy initiatives. They think in terms of the next election and want to be able to boast of having fulfilled at least some of their promises. Many educational problems are complex and long-term, resistant to simple answers and ‘quick fixes’. Research evidence may be welcome up to a point, but in the cut and thrust of elections more powerful imperatives may come into play. Presentation becomes more important than substance and the language of public relations takes over from the measured tones of research. As Ben Levin (a Canadian who has worked both as an academic and a government adviser) has written: ‘In the political world belief is everything … No amount of evidence will displace or replace politics.’(8)


References
1. Scottish Government (2017) A Research Strategy for Scottish Education. Edinburgh: Scottish Government.
2. Organisation for Economic Cooperation and Development (2015) Improving schools in Scotland: an OECD perspective. Paris: OECD.
3. International Council of Education Advisers (2018) Report 2016–18. Edinburgh: Scottish Government.
4. Humes, W. (2013) ‘Political control of educational research’, Scottish Educational Review 45 (2) pp. 18–28.
5. Paterson, L. (2018) Scottish education policy: why surveys matter. CES Briefing No. 66. Edinburgh: Centre for Educational Sociology.
6. Howieson, C. and Croxford, L. (2017) ‘To know ourselves? Research, data and policy-making in the Scottish education system’, Journal of Education and Work 30 (7) pp. 700–711.
7. Humes, W. (2014) ‘Professional update and practitioner enquiry: old wine in new bottles?’, Scottish Educational Review 46 (2) pp. 54–72.
8. Levin, B. (2010) ‘Governments and education reform: some lessons from the last 50 years’, Journal of Education Policy 25 (6) pp. 739–747.

Message from the EDitor

2018 was a milestone for us at researchED: five years since our first conference in Dulwich, London, and no one could have predicted where it would take us. In the last year alone, we’ve been to New Zealand, Pretoria, Toronto, the Netherlands, Sweden, Philadelphia, and dozens of other places in the UK and beyond. The national UK conference sold out at 1300 attendees, with a waiting list of 600 more. In 2019 we’re not slowing down, with all of those countries on our event list, plus many more cities and countries: Dubai, Cape Town, Vancouver, Geneva, and more to be announced. It seems teachers and educators around the world are waking up to evidence.

What has struck me most about this global conversation is how international the dilemmas are that educators face. Different cultures and nations lead to different contexts; but the human dimension is universal. This presents us with a terrific opportunity: to share our collective wisdom as a community of practice to drive the quality, standards, efficiency and morality of what we collectively do.

We live in interesting times, at an intersection of unprecedented communicative powers where conversations are dense, instantaneous and international. Where once a teacher’s voice reached the back of the room at best, now ‘around the world it flies in the twinkling of an eye’. If we can hook this new agora to structured evidence, experience, reason and wisdom, then there are prizes to be won for everyone. If we don’t succeed then we face more of the same for decades to come: more folk teaching, more inequity, more waste and the same outcomes for the same children.

But I have hope we can choose the former. Never before has the international education community been so animated by the need to root its craft in evidence. And that’s what researchED stands for. I hope you enjoy issue 3.

Tom Bennett

researchEDitor

Founder of researchED

To sleep, perchance to learn

Joe Kirby is a teacher and deputy head who writes extensively on translating research into the classroom. Here he looks at how understanding sleep can help us make gains in helping students to learn and achieve.

Nature’s blunder?

Sleep seems like a biological puzzle. It makes animals conspicuously vulnerable. Is the land of Nod a spectacular blunder on the part of evolution?

All animals sleep in some way, even jellyfish. Cheetahs, the fastest land creatures on earth, sleep for up to 18 hours a day. So do most newborn babies, with the fastest growing brains on earth. Sleep is even more vital than food: animals die of sleep deprivation before starvation. Sleep must serve some evolutionary purpose, but what?

Fifty years of research on the sleeping brain has revealed useful insights. Sleep restores our brain and body cells.(1,2) Sleep consolidates our memories and our learning.(3,4) Sleep plays a vital role in our emotions, moods, decisions, cognition, health and immune systems.(2,5,6) Sleep regulates our metabolism, appetite and gut microbiomes.(2,7) Thousands of studies show that sleep enhances every major organ and every biological function, according to world-leading experts on sleep.(2,6)

Sleep deprivation

Sleeplessness increases our stress hormones and worsens decision-making.(8) Underslept people are more moody, irritable, tense and anxious.(9) Sleep deprivation impairs attention and inhibits learning.(10,11,12) The Great British sleep survey suggested that sleep-deprived people are five times more likely to feel lonely and seven times more likely to experience feelings of helplessness.(13) Sleep deprivation is linked with obesity, diabetes, stroke, heart attack and cancer.(2,6) It causes enduring damage.(14)

Teenagers are now chronically sleep deprived, researchers are finding. Teenagers should sleep for nine to ten hours, but many sleep far less.(15) Poorer neighbourhoods tend to be noisier, making a good night’s sleep harder for our poorest students.(16) Sleep deprivation makes teens more hostile, creates learning difficulties, impairs academic performance, and has lasting detrimental cognitive effects.(17,18)

Sleep habits

Changing sleep behaviour patterns is hard, but sleep habits can be honed. A starting point is taking what we’ve learned from the science of sleep and sharing it with students. What have scientists discovered about how to get better sleep?

Most important is to stick to a sleep schedule. Going to bed and waking up at the same time each day (including weekends) helps, because the body finds it hard to adjust to shifting sleep patterns. Sleeping later at weekends cannot make up the lost sleep, and makes it harder to wake up on Monday morning.(2,6) Setting a bedtime alarm is also recommended by sleep experts.(2,6)

Science also tells us that caffeine and alcohol reduce sleep quality.(2,6) We should avoid drinking these things in the evenings.

Screens reduce sleep quality, too.(2,6,19) Three things we can do, then: plug our phone, tablet and laptop chargers outside our bedrooms; stop using screens an hour before bedtime; and get a non-digital alarm clock.

Possibilities in schools

How might schools share this research with teachers and students?

One possibility is a CPD session on sleep for teachers and tutors. As teachers, we could do with applying this research in our own everyday lives! That is particularly difficult for those of us with young children of our own: books like Go the F*** to Sleep testify to how hard parents find it to get children into healthy sleeping patterns.

Another option is an assembly on sleep from senior leaders to show why and how to improve sleep patterns. Or a parents’ assembly on sleep to share the advantages of a sleep schedule and the damage of sleep deprivation, screens, alcohol and caffeine drinks.

Or how about a simple sleep survey to identify students who admit to struggling with sleep deprivation? A final possibility is sleep nudges: messages sent to parents and even students to remind them to make changes to their sleep schedules, patterns and habits – perhaps sent to those who opt in to supportive reminders after self-identifying as struggling with sleep.

Roger Federer, who has won a men’s world-record 20 Grand Slam singles tennis championships, sleeps 11 hours a night. Perhaps, as an Irish proverb has it, sleep is better than medicine.


References

1. Killgore, W. D. (2010) ‘Effects of sleep deprivation on cognition’, Progress in Brain Research 185 (1) pp. 105–29.

2. Walker, M. (2017) Why we sleep: the new science of sleep and dreams. London: Penguin.

3. Ellenbogen, J. M., Payne, J. D. and Stickgold, R. (2006) ‘The role of sleep in declarative memory consolidation’, Current Opinion in Neurobiology 16 (6) pp. 712–22.

4. Fattinger, S., de Beukaelaar, T., Ruddy, K., Volk, C., Heyse, N., Herbst, J., Hanloser, R., Wenderoth, N., Huber, R. (2017) ‘Deep sleep maintains learning efficiency of the human brain’, Nature Communications 8: 15405.

5. Irwin, M., Mascovich, A., Gillin, J. C., Willoughby, R., Pike, J. and Smith, T. L. (1994) ‘Partial sleep deprivation reduces natural killer cell activity in humans’, Psychosomatic Medicine 56 (6) pp. 493–498.

6. Winter, W. C. (2017) The sleep solution. London: Penguin.

7. Spiegel, K., Tasali, E., Penev, P., Van Cauter, E. (2004) ‘Brief communication: sleep curtailment in healthy young men is associated with decreased leptin levels, elevated ghrelin levels and increased hunger and appetite’, Annals of Internal Medicine 141 (11) pp. 846–850.

8. Harrison, Y. and Horne, J. A. (2000) ‘The impact of sleep deprivation on decision making: a review’, Journal of Experimental Psychology: Applied 6 (3) pp. 236–249.

9. Minkel, J. D. (2010) ‘Affective consequences of sleep deprivation’ [PhD Thesis], Publicly Accessible Penn Dissertations 218.

10. Curcio, G., Ferrara, M. and De Gennaro, L. (2006) ‘Sleep loss, learning capacity and academic performance’, Sleep Medicine Reviews 10 (5) pp. 323–37.

11. Alhola, P. and Polo-Kantola, P. (2007) ‘Sleep deprivation: impact on cognitive performance’, Neuropsychiatric Disease and Treatment 3 (5) pp. 553–567.

12. Lo, J. C., Ong, J. L., Leong, R. L., Gooley, J. J. and Chee, M. W. (2016) ‘Cognitive performance, sleepiness, and mood in partially sleep deprived adolescents: the need for sleep study’, Sleep 39 (3) pp. 687–98.

13. Sleepio (2012) The Great British sleep survey: new data on the impact of poor sleep. Available at: www.greatbritishsleepsurvey.com.

14. Kurth, S., Dean, D. C., Achermann, P., O’Muircheartaigh, J., Huber, R., Deoni, S. C. L. and LeBourgeois, M. K. (2016) ‘Increased sleep depth in developing neural networks: new insights from sleep restriction in children’, Frontiers in Human Neuroscience 10 (1) p. 456.

15. Crowley, S. J., Wolfson, A. R., Tarokh, L., Carskadon, M. A. (2018) ‘An update on adolescent sleep: new evidence informing the perfect storm model’, Journal of Adolescence 67 (1) pp. 55–65.

16. Huffington, A. (2016) The sleep revolution: transforming your life, one night at a time. London: WH Allen.

17. Talbot, L. S., McGlinchey, E. L., Kaplan, K. A., Dahl, R. E. and Harvey, A. G. (2010) ‘Sleep deprivation in adolescents and adults: changes in affect’, Emotion 10 (6) pp. 831–841.

18. Short, M. and Louca, M. (2015) ‘Sleep deprivation leads to mood deficits in healthy adolescents’, Sleep Medicine 16 (8) pp. 987–93.

19. Chang, A. M., Aeschbach, D., Duffy, J. F. and Czeisler, C. A. (2015) ‘Evening use of light-emitting eReaders negatively affects sleep, circadian timing, and next-morning alertness’, Proceedings of the National Academy of Sciences 112 (4) pp. 1232–1237.

Attachment theory: what do teachers need to know?

Attachment theory is frequently cited as an important part of a teacher’s understanding of how to manage and understand behaviour in the classroom. Nick Rose unpacks some of the background to this area and looks at how it maps on to practice in a meaningful way.

The British psychologist John Bowlby is virtually synonymous with attachment theory. From his clinical work with ‘juvenile delinquents’ over the course of World War II, he began formulating ideas about how early and prolonged separation from parents and caregivers contributes to problems in children’s social and emotional development.

The core of his theory is that attachment is an evolutionary adaptation which is characterised by a child seeking proximity to a caregiver when that child perceives a threat or suffers discomfort. Given the intense needs of human infants, it is perhaps unsurprising that the formation of a ‘deep and enduring emotional bond that connects one person to another across time and space’ evolved to improve the chances of an infant’s survival.

Over the first year of life, an infant begins to develop attachments to parents or carers. As these attachments form, we tend to see characteristic behaviour in infant interactions with their attachment figure:

  • Stranger anxiety – the infant responds with fear or distress to arrival of a stranger.
  • Separation anxiety – when separated from parent or carer the infant shows distress; and upon that attachment figure’s return, a degree of proximity-seeking for comfort.
  • Social referencing – the infant looks at the parent or carer to see how they respond to something novel in the environment. The infant looks at the facial expressions of the parent or carer (e.g. smiling or fearful), which influence how they behave in an uncertain situation.

Attachment figures aren’t simply individuals who spend a lot of time with the infant, or the one who feeds the infant; they are typically the individuals who respond the most sensitively – for example, often playing and communicating with the infant. For many infants, the principal attachment figure is their mother, but fathers, grandparents or siblings may also fulfil this role. By about 18 months, most infants enjoy multiple attachments, though these may be somewhat hierarchical, with a primary attachment figure of particular importance. The behaviour relating to attachment develops over early childhood – for example, babies tend to cry because of fear or pain, whereas by about two years of age they may cry to beckon their caregiver (and cry louder or shout if that doesn’t work!).

Bowlby believed these early experiences of attachment formed an internal ‘working model’ which the child used to form relationships with secondary attachment figures – and later, friendships with peers and eventually romantic and parenting relationships in adult life.

Mary Ainsworth: types of attachment

There are individual differences in the behaviour related to attachment. Famous observational studies by Mary Ainsworth (who worked with John Bowlby during the 1950s) identified a range of attachment types in normally developing children:

  • Secure attachment: The majority of infants, across different cultures, tend to have an attachment style typified by strong stranger and separation anxiety along with enthusiastic proximity-seeking with the parent upon reunion.
  • Insecure-avoidant: Slightly more common in Western cultures, an insecure-avoidant attachment tends to be characterised by avoiding or ignoring the caregiver and showing little emotion (whilst experiencing inward anxiety) when the caregiver leaves the room, and displaying little enthusiasm when the caregiver returns.
  • Insecure-resistant: Perhaps more common in ‘collectivist cultures’, an insecure-resistant (sometimes also called insecure-ambivalent) attachment tends to be characterised as showing intense distress during separation, and being difficult to comfort when the caregiver returns. Infants with this attachment type may also show some rejection or resentment towards the caregiver after a separation.
  • Disorganised attachment: Added in the 1990s, infants with a disorganised attachment tend to show no consistent pattern in behaviour towards their caregiver. For example, they may show intense proximity-seeking behaviour one moment, then avoid or ignore the caregiver the next.

If you are interested in some of the history and the origins of attachment theory, the work of John Bowlby and Mary Ainsworth are good places to start. There’s a nice summary in Inge Bretherton’s 1992 article ‘The origins of attachment theory’.(1)

Many children may display behaviour suggesting an insecure attachment type which may make it harder to form peer friendships, and this likely underlies an association between insecure and disorganised attachment and higher levels of behaviour problems. However, it’s not certain that differences in attachment are specifically the cause of behaviour problems. For example, a meta-analysis by Fearon et al.(2) found that socioeconomic status accounted for a considerable portion of the variance in behaviour problems in childhood.

So, whilst there’s reasonable evidence to suggest that these individual differences in attachment correlate with differences in behaviour within school, it is very important to note that these differences are not ‘pathological’ in a clinical sense. Given that about 30–35% of representative populations have an ‘insecure’ attachment, NICE suggests that it is unhelpful to view insecure attachment as an ‘attachment problem’.

Reactive attachment disorder (RAD)

A popular misconception about attachment is a conflation between the ‘types of attachment’ that children possess and an ‘attachment disorder’. CoramBAAF, a leading charity working within adoption and fostering, suggests that even when used by those trained to do so, attachment classifications cannot be equated with a clinical diagnosis of disorder. While the insecure patterns may indicate a risk factor in a child’s development, they do not by themselves identify disorders.

The term ‘attachment disorder’ refers to a highly atypical set of behaviours indicative of children who experience extreme difficulty in forming close attachments. NICE suggests that the prevalence of attachment disorders in the general population is not well established, but is likely to be low. However, there are substantially higher rates among young children raised in institutional care or who have been exposed to abuse or neglect. The 2003 Office for National Statistics report for the Department of Health(3) estimated that somewhere between 2.5% and 20% of looked after children had an attachment disorder (depending on whether a broad or narrow definition was used).

There is a broad distinction between two classifications of RAD:

  • Inhibited attachment disorders: Characterised by significant difficulties with social interactions such as extreme detachment or withdrawal – usually attributed to early and severe abuse from ‘attachment figures’ such as parents.
  • Disinhibited attachment disorders: Characterised by diffuse attachments, as shown by indiscriminate familiarity and affection without the usual selectivity in choice of attachment figures – often attributed to frequent changes of caregiver in the early years.

Reactive attachment disorder is a psychiatric condition and is often accompanied by other psychiatric disorders. CoramBAAF advises caution, arguing that the lack of clarity about the use of attachment concepts in describing children’s relationship difficulties can create confusion. A diagnosis of an attachment disorder can be made only by a psychiatrist.

Unfortunately, there is also no widely applicable, evidence-based set of therapies for RAD. However, significant concern has been expressed about some therapies. One example is ‘holding therapy’, which involves holding a child in a position that prevents escape whilst engaging in an intense physical and emotional confrontation. CoramBAAF argues there is nothing in attachment theory to suggest that holding therapy is either justifiable or effective for the treatment of attachment disorders. Less controversial therapies involve counselling to address the issues affecting the carer’s relationship with the child, and teaching parenting skills to help develop attachment.

What should teachers be doing?

Here is why we can question the current excitement about attachment theory: there’s nothing a teacher can do that they shouldn’t already be doing.

Firstly, given the relationship between attachment disorders and abusive or neglectful relationships, perhaps some teachers are worried that they need to know about attachment disorder in order to fulfil their statutory safeguarding responsibilities. However, it’s important to note that whilst some children with RAD have suffered abuse or neglect, that doesn’t mean that problematic behaviour is evidence of such. The teacher isn’t in a position to either make the clinical judgement or investigate the cause of problematic behaviour they suspect may relate to a safeguarding concern. If a student is behaving in a way which concerns you, then report that concern to your designated member of SLT (as you would any safeguarding concern). Whether or not you might think a child has an insecure attachment or a disordered attachment isn’t really your professional call.

Secondly, it may be that some teachers feel they need to know more about attachment in order to support students with behaviour problems in school. However, the advice for working with RAD students isn’t really any different from good behaviour management generally. Teachers should not confuse their role in loco parentis with being the primary caregiver for a child. For example, the Center for Family Development is an attachment centre based in New York specializing in the treatment of adopted and foster families with trauma and attachment disorder. In their Overview of Reactive Attachment Disorder for Teachers they point out that, as a teacher, you are not the primary caregiver for a child you teach.

You cannot parent this child. You are the child’s teacher, not therapist, nor parent. Teachers are left behind each year, [it’s] normal. These children need to learn that lesson.(4)

They recommend approaching behaviour through explicit teaching of consequences: that there’s a consequence associated with good behaviour and there’s a consequence for poor behaviour.

Further suggestions include:

  • Create a structured environment with extremely consistent rules.
  • Be consistent and specific when giving praise or confronting poor behaviour.
  • Provide the child with choices, but choices provided by you, the teacher.
  • Maintain your professional boundaries (avoid attempting to create ‘friendship’ or ‘intimacy’ with the child).
  • Keep calm and avoid losing your temper; communicate directly, positively, and firmly.
  • When implementing consequences, remain unemotional and assume a tone that says, effectively, ‘That’s just the way business is done – nothing personal.’

In short, teachers should do the same things that they do when working with any student with challenging behaviour. Whether that behaviour is due to a problem with attachment isn’t really the point.

In summary

Whilst there’s a relationship between insecure attachment and behaviour problems in the classroom, teachers are not qualified to diagnose a student’s attachment type nor engage in any kind of therapy with that student. There is a condition called ‘reactive attachment disorder’ which has a higher incidence within looked-after students. Again, teachers are not qualified to make this psychiatric diagnosis.

There is an important difference between the professional role of a teacher and the role of a primary caregiver, and it’s vital that recent interest in attachment theory within the profession doesn’t blur that line. Where teachers are concerned that behaviour presented in the classroom might indicate abuse or neglect, they are already obliged by law to report these concerns (but not investigate them or try to involve themselves in resolving them).

In terms of managing the behaviour of students with attachment problems so that they can overcome the difficulties of their family background and experience success within school, the guidance suggests things like a structured environment, consistent rules, professional distance and focusing feedback on behaviour not the child – advice that forms the basis of good behaviour management regardless of the cause of problematic behaviour.

It may be the case that specific children with RAD will have different strategies which will help them achieve in school. However, that’s also the case for any student with SEND. Perhaps what is important for teachers is not specific training in attachment theory to help them diagnose attachments, but a clear understanding of their school’s SEND system and time to read, implement and work with SEND coordinators to ensure any specific strategies suggested by an educational psychologist or child psychiatrist are employed effectively.

This article first appeared on Nick’s blog, www.evidenceintopractice.wordpress.com


References

1. Bretherton, I. (1992) ‘The origins of attachment theory: John Bowlby and Mary Ainsworth’, Developmental Psychology 28 (5) pp. 759–775.

2. Fearon, R. P., Bakermans‐Kranenburg, M. J., Van IJzendoorn, M. H., Lapsley, A. M. and Roisman, G. I. (2010) ‘The significance of insecure attachment and disorganization in the development of children’s externalizing behavior: a meta‐analytic study’, Child Development 81 (2) pp. 435–456.

3. Meltzer, H., Gatward, R., Corbin, T., Goodman, R. and Ford, T. (2003) The mental health of young people looked after by local authorities in England. Office for National Statistics/Department of Health. London: The Stationery Office.

4. Center for Family Development (2007) An overview of reactive attachment disorder for teachers. Available at: www.bit.ly/2CL9t7D (Accessed 25 Jan 2019).

A message from our founder, Tom Bennett

We’ve managed to do so much with almost nothing. So far we have just about broken even with ticket sales (at deliberately affordable prices) and event-by-event sponsorship. Our ambition is to start to build a small core team who can run these days, and our website, so that we can grow, and offer more free resources and low-cost days to the education communities. Your donation would fund the time of this core team, plus help us to rebuild and maintain our website, which is crucial for sharing free resources from conference days.

We believe that we are on the edge of an evidence revolution in education. But it won’t happen by itself.

Please donate to our project, and help us help teachers, schools, and most of all students. And help us to keep our independence.

We’d be deeply grateful for any assistance you can give, because while all of our efforts are done for the greater good, it is often desperately hard for an entirely volunteer-driven organisation to be sustainable. Your donation can help us to continue to do the good work we do, and to build and grow so that we can do more in the future.

Thanks for reading!

Why all the fuss about randomised trials?

Hamish Chalmers is a teacher and a lecturer in applied linguistics at the University of Oxford. Here he demystifies the opportunities and challenges that randomised controlled trials – RCTs – offer education and the classroom. They are often seen as a gold standard in research, and understanding what sets them apart from other study designs is essential to appreciating their value.

In the past five years or so, randomised controlled trials (RCTs) have firmly entered the lexicon of educational research. They are fast becoming the preferred method by which to evaluate the effects of educational interventions in the UK. One-third of English state schools have taken part in RCTs funded by the Education Endowment Foundation, and RCTs are routinely referred to in order to guide policy decisions. But what is so special about RCTs that they are enjoying such privilege?

In 1957, pioneering experimental social scientist Donald Campbell laid down the fundamental principle of experimentation, saying that ‘The very minimum of useful scientific information involves at least one formal comparison and therefore at least two careful observations’(1). In education research this means that to understand the effects of a new teaching approach we need to compare what happens when pupils are taught using it with what happens when they are taught using an alternative approach.

It is impossible for one group of pupils to be taught simultaneously using more than one approach. Therefore, we need to create comparison groups that are approximations of each other. This has been attempted in several ways. For example, data from one group of pupils can be compared with data from another group (PISA rankings are a good example of this). Alternatively, pupil outcomes before a new intervention is introduced can be compared with outcomes afterwards (average reading attainment in the UK before and after the introduction of the phonics screener, for example). Or, groups of pupils can be matched on characteristics such as age and socioeconomic status, then each group taught using different approaches and their outcomes compared.

As any primary school pupil can tell you, a key requirement of any scientific experiment is that it is a fair test. One way of helping to make an educational experiment fair is to ensure that the groups of children being compared are as similar as possible. The designs described above fall short of this basic requirement. For example, PISA ranking relies on data from different children in different countries to assert the relative effectiveness of different approaches to teaching. Comparing attainment before and after an intervention does not account for changes in the children over time (the children at the beginning of the intervention are essentially different people by the end of it). Matched groups of children may be similar on characteristics we know about, but what about important things we don’t know about or haven’t measured?

A fair test requires that comparison groups have similar proportions of pupils who share characteristics that could affect the way they respond to the interventions being compared. That’s all well and good if you are confident that you can identify every conceivably influential characteristic of your pupils. But even if that were possible, would this result in a fair distribution of all influential characteristics? The only honest answer is ‘We can’t know.’ In addition to characteristics that we can identify, there are likely to be some that we can’t. How do we account for things like personal enthusiasm for a subject, relevant experience outside of school, and individual idiosyncrasies? These are all potentially important characteristics that we have no clear way to identify and quantify, and therefore no way to deliberately distribute equally across groups.

Differences among pupils emphasise the complexity of human beings, and this complexity is why random allocation to comparison groups is so powerful. Random allocation takes into account how messy human beings are and distributes the mess fairly. By deciding at the flip of a coin who goes in one group and who goes in the other, random allocation creates groups that differ only as a result of the play of chance. This is not the same as saying that the groups are ‘equal’ (they probably won’t be in some respects), but it does mean that they are not systematically different, and that any differences result from pure coincidence. As a result, we can be more confident than with other research designs that any differences in outcomes between comparison groups are due to differences in the interventions and not to non-random differences (biases) between the pupils in the comparison groups.
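The logic of coin-flip allocation can be illustrated with a short simulation (a hypothetical sketch, not from any study cited here; the ‘enthusiasm’ trait, the cohort size and the opt-in threshold are all invented for illustration). It contrasts random allocation with a self-selected grouping on a trait the researcher never measures:

```python
import random
import statistics

random.seed(42)  # fixed seed so the simulation is reproducible

# Hypothetical cohort: each pupil has an unmeasured trait
# (say, enthusiasm for the subject) drawn from a normal distribution.
pupils = [random.gauss(50, 10) for _ in range(1000)]

def randomise(pupils):
    """Allocate each pupil to a comparison group by the flip of a coin."""
    group_a, group_b = [], []
    for p in pupils:
        (group_a if random.random() < 0.5 else group_b).append(p)
    return group_a, group_b

def self_select(pupils):
    """Biased allocation: keener pupils tend to volunteer for the intervention."""
    group_a = [p for p in pupils if p > 52]
    group_b = [p for p in pupils if p <= 52]
    return group_a, group_b

rand_a, rand_b = randomise(pupils)
sel_a, sel_b = self_select(pupils)

# Gap in the hidden trait between the two groups under each design.
rand_gap = abs(statistics.mean(rand_a) - statistics.mean(rand_b))
sel_gap = abs(statistics.mean(sel_a) - statistics.mean(sel_b))

print(f"gap under randomisation: {rand_gap:.2f}")   # small: chance only
print(f"gap under self-selection: {sel_gap:.2f}")   # large: systematic bias
```

Under randomisation the groups differ on the hidden trait only by chance, so the gap is small; under self-selection the gap is built in before the intervention even starts, which is exactly the bias the text describes.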

Failing to properly account for systematic differences between comparison groups can massively influence how we interpret the results of educational research. Consider driver’s education, a popular way to try to reduce car crashes among young drivers. Data from non-randomised comparisons has been used to promote this intervention. Researchers looked at the rates of car crashes among youths who had taken these classes and youths who had not, and they found that the latter were more than twice as likely to have been involved in a car crash as the former (2). When driver’s education was evaluated in a series of RCTs, however, very little difference in accident rates was detected between drivers randomly allocated to attend the classes and drivers randomly allocated not to take them (3). So, which evidence do you trust more? The non-randomised studies did nothing to account for possible differences between people who took the classes and those who did not. The RCTs ensured that, even if not identical, the comparison groups differed only by chance.

As it turns out, there is a good explanation for why these two approaches came to conflicting conclusions. In a separate study, researchers found that people who take driver’s education courses tend to display psychological characteristics that are compatible with safer attitudes to road use. The drivers in each group in the non-randomised studies were systematically different from each other.

The difference in results in the driver’s education studies had a plausible explanation. However, we are not always able to unpick causal relationships so easily. Even so, teachers must still take decisions about their practice. In a study of an after-school programme designed to reduce anti-social behaviour in primary school children (4), non-randomised evaluations of the programme suggested that it helped. On the basis of that finding, schools were preparing to roll out the programme to all children. When it was evaluated in an RCT, however, researchers found that instances of anti-social behaviour increased in children who had taken part in the programme compared to their peers who had not. Unlike the driver’s education studies, there was little to explain why this was. Nonetheless, schools were faced with a choice over what to do. Should they trust the results of the non-randomised study, and roll out the programme to all children? Or should they trust the results of the RCT and cancel it? As with the driver’s education example, their choice was between a study in which they could not confidently say whether like was being compared with like, and one in which they knew that researchers had used the best method available for creating unbiased comparison groups. Logic prevailed and they chose to cancel the programme.

Random allocation to comparison groups is the only defining feature of an RCT, and it is the only feature that prevents allocation bias. This simple feature is why RCTs are the preferred method for assessing programme effectiveness. When faced with decisions about practice, all else being equal, teachers and policy makers must decide whether they trust the findings of these fair tests or the findings of studies for which no similar reassurance is possible.


References

1. Campbell, D. T. (1957) ‘Factors relevant to the validity of experiments in social settings’, Psychological Bulletin 54 (4) pp. 297–312, p. 298.

2. MacFarland, R. A. (1958) ‘Health and safety in transportation’, Public Health Reports 73 (8) pp. 663–680.

3. Vernick, J. S., Li, G., Ogaitis, S., MacKenzie, E. J., Baker, S. P. and Gielen, A. C. (1999) ‘Effects of high school driver education on motor vehicle crashes, violations, and licensure’, American Journal of Preventive Medicine 16 (1S) pp. 40–46.

4. O’Hare, L., Kerr, K., Biggart, A. and Connolly, P. (2012) Evaluation of the effectiveness of the childhood development initiative’s ‘Mate-Tricks’ pro-social behaviour after-school programme. Available online at: www.goo.gl/sVUtFJ (Accessed 10 July 2018).

An interview with…Professor Paul Kirschner

Paul A Kirschner is Distinguished University Professor at the Open University of the Netherlands and Visiting Professor of Education at the University of Oulu, Finland. He is an internationally recognised expert in the fields of educational psychology and instructional design. He is past President of the International Society for the Learning Sciences and former member of the Dutch Educational Council. He is also a member of the Scientific Technical Council of the Foundation for University Computing Facilities, chief editor of Journal of Computer Assisted Learning and associate editor of Computers in Human Behavior.

His seminal paper, ‘Why minimal guidance during instruction does not work: an analysis of the failure of constructivist, discovery, problem-based, experiential, and inquiry-based teaching’, was published in 2006, co-written with John Sweller and Richard E Clark. One of the most cited papers in education, it revolutionised the attitudes of many towards the effectiveness of enquiry versus direct instruction.

researchEDitor Tom Bennett spoke to him in the British Museum, London, where they discussed that paper, what led him to write it, and the fallout afterward.

TB: What was your own education like, and how did that then lead to the career where you are?

PK: Okay, my own education. I was a top student at elementary, junior high and high school. I used to get excellent in everything except conduct. It started out with good, and then it dropped to fair and then poor. ‘If Paul could only learn to keep his mouth shut…’ – this type of thing. That was until I was 12 years old.

But I was also a very good student. I mean I never got below an A in things like mathematics and stuff like that. Same thing in junior high school. I went to the best high school in the United States: the Bronx High School of Science. 10,000 children took an entrance test and 900 were chosen to do it. You couldn’t take the test unless they thought you were good enough to do it. I also got good grades there and then went on to university. I started as an electrical engineer at Syracuse University.

TB: Really?

PK: Yeah, but it was a university – a semi-Ivy League university – and while I was protesting the Vietnam War, they were protesting that they wanted to have girls in the dormitory wards.

TB: [laughs]

PK: My problem was ‘Can I afford to buy a Toyota?’ and their problem was ‘Should Dad buy me a Corvette?’ – something like that.

So I transferred to a different university: the State University of Stony Brook. Primarily known for being busted twice by the Suffolk County police department for marijuana, it had a very advanced engineering school. I got there after a year of getting As in Syracuse and went on to just struggling to get by at Stony Brook. So I decided there’s one of two things I could do: I could either really buckle down and really work hard – but I had no idea how to do that because I’ve never done that; or I could do something else. So I decided to just think of something else. What can I do? Psychology – that’s almost a science!

TB: [laughs] That’s quite a leap though.

PK: Yeah, so what I did was I took some more physics and chemistry courses and a few biology courses, and so when I graduated Stony Brook I ended up with a bachelor’s in psychology and a teaching certificate for chemistry, mathematics and general science in high school.

And I had no idea what I wanted to do. All I knew was that I didn’t want to go and get a master’s somewhere in the United States. So I left for a year to get my head together. I went away from my family, parents and friends…and never came back. That was 1973. And after I’d worked as a carpenter and a cook, and head of a restaurant in Amsterdam, and was planning on emigrating to New Zealand to become a teacher there, Catherine [his partner] decided at the last moment that she didn’t want to emigrate. She said, ‘Why don’t you go back to university?’

I went back to university – the City University of Amsterdam – and got a master’s in educational psychology. After that, first at an educational publisher and then at the Open University of the Netherlands, I went to work on my specialisation: text characteristics and learning processes. That’s the study of what you can do with text to try to ensure that people study in a way that facilitates their learning. And the rest, as they say, is history.

TB: Indeed. But why not teach?

PK: I realised one thing while teaching, namely that teaching was too frustrating for me. So I wanted to learn why the normal children that I was trying to reach – independent of how I explained things to them – weren’t learning. It was very frustrating for me because I myself was a very good student; I didn’t understand why they didn’t understand and couldn’t understand.

TB: Were you teaching at this point?

PK: Middle school. Yeah, and so that was the reason I thought I didn’t want to be a teacher. I don’t want to go back to university and the United States and get my master’s whatever and get my permanent certification as a teacher – it was just too frustrating for me. So after bumming around for a few years in Europe and the East (hippie time), I went back to school in the Netherlands – Amsterdam. My driving force was to understand how people study, how people learned and how you could make effective, efficient and enjoyable learning experiences for them.

And that’s what I’ve been doing since I started my university career in the Netherlands – 1976. We’re now 43 years later and I have been doing that exact same thing with different names for different jobs for those 43 years.

TB: What was it like going from natural/physical science to something like educational psychology?

PK: Well, there was a step in-between: I made the change at Stony Brook. You have to realise this was 1968 and the cognitive revolution had just started. And for me, the cognitive revolution began with Gagné’s third edition of The Conditions of Learning. Up to that moment, psychology was behaviouristic, and that was Stony Brook also. Dave Ausubel with meaningful verbal learning was (I think) 1966, so the seminal works in cognitive psychology hadn’t actually happened. Baddeley and Hitch was after that, so at that point in time I went over to behaviouristic psychology. And behaviouristic psychology is very, very ‘hard’ science. I mean, I even had my own lab rat.

I didn’t teach it anything, except ‘press this bar and food comes’ – kind of like how to open the refrigerator door. And I copied the lab manuals of the semester before mine because I didn’t like doing that to an animal. And when they would starve it in the holiday to see if it would learn better if it’s hungrier, I would come in every day and feed my rat and make sure that it wasn’t hungry, and do those types of things.

TB: I’m sure the rat was grateful.

PK: It was a very exact psychology at that point in time. The idea of a brain, and processing and learning like that, it was a stimulus response. It was based upon Skinner and the like. So it was a very ‘hard’ social science. When I started studying again, ten years later in the Netherlands, it had made the transition to cognitive psychology and at that point in time I was dealing with things like the use of adjunct questions from Ernie Rothkopf – as well as his work on mathemagenic activities – all of those types of things, so it was a re-introduction, a re-christening in the psychology, but then in the cognitive psychology. And I’ve been there ever since.

 

TB: I have to zoom in a little bit on your seminal work with John Sweller. How did that come about?

PK: Dick [Richard E] Clark’s story is different from my story. There’s an interview with him where he says how it happens. The way it happened – at least as far as I can remember – was like this. At an international conference, there were these people pontificating about constructivism and inquiry-based learning. John made a comment there. I had met him before that and had long discussions with him in the Netherlands. Afterwards I said to John something to the effect of ‘These people don’t understand what’s actually going on. It’s not that they’re unwilling, it’s just they don’t understand it.’ From that came the idea of writing this paper. The original title was ‘Inquiry learning isn’t’, which John thought was just a little too quizzical and whimsical for his taste.

TB: It’s more you?

PK: Yeah it’s definitely me. So we started working on that and talking to each other. And at a certain point, he said, ‘Well I know that Dick Clark has been doing some review work’ on things like that for Review of Educational Research and other journals. ‘He would be a good person to bounce it off.’ I also knew Dick, so I went up to him at a conference and asked: ‘You know, John and I are doing this and we would like to have you as a critical reader and bounce some things off of you.’ He said yes, and we sent him the first version of it and he came back with the question: ‘I’m a bit embarrassed to ask, but could I be the third author? This is an incredible thing that you are doing.’ For John and me the answer was a no-brainer: ‘Of course!’

So that’s how it happened. We first thought of going to Educational Researcher from the APA. We didn’t send it to them; we just asked if they wanted it. They were lukewarm. Eventually we chose Educational Psychologist, which ‘accepted’ it with major changes. Two of the reviewers gave very strong critical advice and they helped us a lot – it made the paper quite a lot better. The third reviewer was a diehard constructivist and nothing we could’ve done would’ve satisfied that person.

TB: That’s perhaps unsurprising…

PK: So I got in touch with the editor. This was around the time it went over from Lyn Corno to Gale Sinatra, so I got in touch with Gale and said, ‘This is the situation: we have two critical and constructive reviewers and we can meet their demands and it will become a great article; but if we want to meet the demands of the third, it will never happen. So if you’re going to make use of that third reviewer and treat that seriously, then tell me that now and save us the effort – we’ll go to another journal.’ It wasn’t meant as a threat; it was more a promise: ‘We’ll make use of the first two, make it a much better paper, and then go somewhere else.’ And she said, ‘No, no, do the paper.’ She really liked it and so we did that. It got accepted and the rest is history. One of the most cited papers – and when Daniel Willingham comes back on Twitter and says it’s one of the most important articles in the 21st century, it’s kind of something you’ve always dreamed about.

TB: That was my next question actually: what was your reaction to that kind of praise?

PK: There are certain papers in your life that you read, that I read, and then you say, ‘Okay, those are the papers.’ I mean, if you’re talking about levels of processing, it’s Craik and Lockhart, you know? That’s something you dream about: that you’re going to write such a paper. But you also know that it’s never going to happen in your lifetime, because there are very few that reach that status. But serendipitously, this came to be; and it became a paper that I’m incredibly proud of. And it’s just an incredible feeling.

TB: Were you surprised by its success?

PK: I knew it would…raise dust, make an impact and be controversial, because at that point in time everything you heard was inquiry, discovery and constructivist, new learning and all of those types of things. But I didn’t know it would be picked up by that many people. And I didn’t know that it would lead to debates at different conferences, and a book on constructivist versus instructivist learning by Sig Tobias and Tom Duffy. Those types of things, I had no idea at all.

TB: What have been the biggest criticisms of that paper?

PK: There were two: one was Deanna Kuhn, who said we didn’t understand children.

TB: Obviously!

PK: The second was that we in some way, shape or form had created a straw man that was easy to knock down – although the only thing we did was cite people and what they had actually written. And you can see it now, although the criticism has become less. The diehard constructivists have died out, maybe? But what you see in their places are apologists – inquiry learning people, discovery learning people, who then add a heavy dose of directive instruction, explicit instruction, and then somehow still call it inquiry-based learning. Where’s the discovery gone in discovery learning? If you read the review articles, they say inquiry only works if there’s enough explicit instruction – to which I say, ‘Well that’s called direct instruction.’ You explicitly teach children about something, teach them how to solve problems with what they’ve learnt, and then give them problems to solve after they have the knowledge and skills to do it. So, they still call themselves constructivists and/or adept at discovery, or inquiry, or experiential learning. But what they’re actually doing is making use of certain aspects of discovery, either after explicit instruction or with the aid of ‘just in time’ explicit instruction. It’s no longer discovery learning.

TB: Which raises another point: I’m fascinated by I guess what you might call the ecosystem educational research inhabits – why some things ‘land’ and some things don’t. You’ve written about the idea that constructivist learning goes away, comes back, goes away, comes back…

PK: Always with a different name. Actually Rich Mayer wrote a great article about this: ‘Should there be a three-strikes rule against pure discovery learning?’

I want to be very humble about it: John, Dick and I didn’t do anything earth shattering. I mean, what we did was talk about what good teaching is and put it in a theoretical framework that could be understood. And we took constructivist ways of teaching, constructivist pedagogies, and put them against the same framework and showed that it can’t work and why it can’t work. So what we are saying is that nothing more and nothing less than good instruction from good teachers works. We told them why that is the case from an information-processing and cognitive load point of view – our cognitive architecture. We said why that was the case and that’s possibly what makes it so strong, so robust, and so long-lasting, because we didn’t come up with a new fad or a new name for something – we just explained why and how good teaching works.

TB: But nobody has made a single significant or serious pushback against this paper?

PK: No! But that’s the author speaking here.

TB: So why are people still so resistant to this?

PK: Because it doesn’t fit in with their idea of explicit instruction. There are at least two or three reasons for it.

Firstly, people don’t understand what explicit instruction is. They think that you are talking about standing in front of the class and lecturing. So there are even teachers who actually do a lot of explicit teaching – and possibly do it well – who are pushing back against it because they have this strange idea of what it is. They’re creating a straw man that doesn’t exist, because nobody does that nowadays. Even in a lecture hall with 600 people, nobody does that. All they have to do is read Barak Rosenshine’s work on direct instruction and they might possibly see that they are doing direct instruction! But that’s the first reason.

The second reason is that it doesn’t fit the zeitgeist. It’s like the zeitgeist is a kind of laissez-faire approach: ‘Give that child room’, ‘The school/classroom is a prison’ – that type of thing. A romantic version of the child à la Rousseau.

In their idea, it’s kind of like we need to give our flowers room to grow and bloom. But as E D Hirsch stated, current science essentially demolishes the romantic tradition in educational thought which holds that education should develop naturally for the individual child. He states that while romanticism has produced great poetry, it has led to terrible educational ideas that have done a lot of harm to our Western nations.

And this zeitgeist problem is also seen with things like multitasking. It’s hip to think that people can do a lot of things because we see children and adolescents doing it. That’s what Marc Prensky did. He saw children multitasking but he never studied whether they were actually processing more things at once or whether they were doing it in a way that didn’t affect the outcomes. In other words: did they learn better? Did it lead to more mistakes or did it take more time to complete identical tasks?

The idea that we can multitask fits our view of the world and people believe it. But try saying the following to one of these believers: ‘Have you ever watched the news on television and had your partner walk in and ask you something? And you give them an answer, and then you’ve missed what has happened in the news. You heard it; you possibly even saw it. But you were thinking about something else.’

Or maybe explain it like this: ‘You’re having a discussion with a colleague at work and, while talking, you look at your computer screen to read an email that’s just arrived (the pop-up on the screen caught your eye). And at that point, your colleague asks you a question – and you have to excuse yourself because you were reading that email.

‘What you were reading wasn’t rocket science and what your colleague was speaking to you about probably wasn’t rocket science (unless you work at the European Space Agency). You definitely heard their voice (you didn’t all of a sudden become deaf) but you couldn’t process what they were saying because you were processing the text of the email. In both cases, you weren’t capable of semantically decoding one stimulus while you were at the same time semantically decoding a different one.’

At this point, the believer in multitasking will probably admit to having experienced this. But up to that point, they had the idea that they really could multitask.

The same is true for learning styles. How many times do we have to tell teachers that learning styles don’t exist? How many times do we need to present empirical research showing the contrary, only for them to still think they exist? 93% of British teachers still think that there are learning styles and that catering to them improves learning.

TB: What’s the main takeaway for a teacher from your paper with Dick and John?

PK: Knowledge and skills are necessary to do anything further. Without those, you can’t solve problems, you can’t creatively design anything, you can’t carry out a good argument and you can’t discuss things. (Although I know a lot of people who argue without having absolutely any knowledge!) I think the main takeaway is that our brains are limited in how much they can take up at one time and how they can process that effectively, efficiently and joyfully. And if you want a learner to do that then you need to design the learning experience in a way that is synchronous with our human cognitive architecture – how our brains function. Conversely, if you do anything that contradicts how your brain functions, it won’t work. But if you do things that fit, that synchronise with human cognitive architecture, then learning will happen more quickly with less effort – that’s my idea of efficient – more effectively, more deeply and more enjoyably. Learning isn’t always fun, of course, but following these principles leads to a greater feeling of achievement and success.

TB: Satisfaction.

PK: Satisfaction, yeah. But I like ‘effective, efficient and enjoyable’. And anything that you do as a teacher – and this is possibly a second major takeaway – should be aimed at reaching at least one of those three and never to the detriment of the other two. So, if you have thought of something that makes something more effective, but it’s incredibly less efficient, then it probably won’t work. If you can make it more efficient but they learn less, you shouldn’t do it. I’m an atheist, but that’s kind of my holy trinity: effectivity, efficiency and enjoyment.

TB: Would you change anything about the paper now?

PK: If I was re-writing it now I would possibly make more use of (or more substantial use of) things like David Geary’s work on biological primary and secondary learning. We didn’t put that in because the paper was written in 1995, but his real work was in 2002, 2006.

Maybe I’d also make a slight change to how we talk about cognitive load theory in it, because I’ve stopped using the three types of cognitive load – intrinsic, extraneous and germane – for a number of reasons. John is also moving in that direction at the moment. This is because there’s a certain amount of load that’s intrinsic to the task, which is based upon the complexity of the task, and a certain amount related to your own expertise – because as you become more expert, the complexity goes down.

And people have to realise that complexity is not the same thing as difficulty. You can have a very simple quantum mechanical problem, but for me it’s difficult because I don’t know quantum mechanics. It’s simple in terms of how many information elements there are and how much interaction there is between the elements. So that determines the intrinsic load. Then you have extraneous load, which is everything that deals with how you learn it: the techniques you use, the technology, all of the other things in the learning process.

And you can say, ‘Why have you got rid of germane load?’ Germane cognitive load is defined as ‘load caused by instruction that helps someone to learn’. And you can say extraneous load is ‘load that is caused by something in the environment, usually instruction, that hinders learning’. But the problem with that is that you can’t determine what is germane unless it is ex post facto or post hoc. I can only say what has helped learning if I determine the student has learned. It becomes a kind of circular way of reasoning. If someone learned from it then apparently the load was germane; but if someone didn’t learn from it, then apparently that was extraneous load.

I can measure the intrinsic load by looking at how many new information elements there are for this person and what the interaction is between them. Take playing scales on the piano. I don’t play the piano, so playing scales is hard enough for me. A scale contains eight notes but it goes in one direction, or it goes in the other direction. The hardness/softness doesn’t change and it keeps a steady tempo, so that’s a low-complexity task. On the other hand, imagine playing a melody with fewer notes but with a greater variation in the tempo, the hardness/softness and the order of the notes. The interaction between elements is so much greater. This task is quite a lot more complex than playing a scale. So it’s always a combination of the number of elements, the number of new elements, and the amount of interaction. You can measure that beforehand; you can see it and put it into a formula and say, ‘Okay, this is the intrinsic load of this task.’ And I can make a task more or less complex by adding or subtracting information elements or changing the level of interaction between the elements.
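The piano example can be sketched as a toy calculation. This is an invented illustration, not a formula from cognitive load research: the point it preserves is only that intrinsic load grows with the number of new elements and, especially, with the interactions between them, so a melody with fewer notes can still carry more load than a scale.

```python
def intrinsic_load(new_elements, interactions):
    """Illustrative score only: intrinsic load rises with the number of
    new information elements and the interactions between them."""
    return new_elements + interactions

# A scale: eight notes, one direction, steady tempo and volume,
# so the elements barely interact -> low complexity.
scale = intrinsic_load(new_elements=8, interactions=2)

# A short melody: fewer notes, but tempo, dynamics and note order
# all interact -> higher complexity despite fewer elements.
melody = intrinsic_load(new_elements=5, interactions=12)

print(scale < melody)  # the melody carries the higher intrinsic load
```

The additive weighting here is arbitrary; what matters is that both inputs can be counted before anyone attempts the task, which is exactly what Kirschner says cannot be done for germane or extraneous load.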

But I can’t do that with the other types of load – germane and extraneous – because they arise from the way the task is presented and the way you instruct in it. And if someone learns, then apparently the load that was created (and that was measured) was germane. If they didn’t learn, then apparently what you’ve caused was, in the old model, extraneous. And so it doesn’t make sense to keep talking about three types of cognitive load if I can only measure one. That was one of the major criticisms of the cognitive load model. As Slava Kalyuga noted, germane load is essentially indistinguishable from intrinsic load because it’s associated with task-related processes which are sources of intrinsic load, and therefore germane load as a concept is redundant. The dual intrinsic/extraneous framework is sufficient and non-redundant.

TB: What are your thoughts on David Geary’s biologically primary learning?

PK: It was incredibly insightful; but if you read it, it was so incredibly basic – you could kick yourself and say, ‘Why didn’t I write that?’

TB: [laughs] Yeah.

PK: There are certain things that are evolutionarily determined because if they weren’t there, the species would’ve died. For example: recognising someone’s face, communicating with a parent, having a sense of community and wanting to be with others, etc. Without these, a baby is doomed to die. So a child that doesn’t recognise its parent’s face won’t reach adulthood and won’t procreate. It’s incredibly basic that such a thing exists. This leads to things Geary discusses like folk psychology, folk biology and folk physics that are there because we need them to survive.

For example, in the wild, if a bush moved unexpectedly then we needed a flight reflex to get away. Because while it might have been a rabbit, it might actually have been a tiger. And if you didn’t have that reflex, you were probably consumed by the tiger. That’s biologically primary learning: you don’t have to teach a child that. Some people say that because we learn a first language that way, we can learn a second language like that. But they don’t understand – the second language is different: writing, reading, those things aren’t necessary for your primary survival. Those are secondary knowledge bases, and we need to teach those things more explicitly. And that’s something that’s usually the result of schools. It’s such an incredibly simple theory in the most positive sense of the word, as in, ‘Why didn’t I think of that and write it down?’ That’s how good it is.

TB: You’re only allowed one major breakthrough! [laughs]

PK: That’s the type of paper it is. It’s an incredible eye-opener, and it gets to the core of something – like John’s cognitive load theory. It brings together things like information processing from Baddeley and Hitch. You have sensory information, and you have long-term and short-term memory. And information held in the short-term memory is lost if you don’t repeat it after a certain period of time. And what happens if you read your slides to your audience while they also read them silently to themselves? You’re asking them to semantically process what they are reading and hearing at the same time. They just can’t do that! They’ll learn less, but you think they’re learning more because you’re saying it twice, in two different ways.

TB: A lot of teachers think they’re teaching with greater impact that way.

PK: But you’re not! If you had a picture that they were iconically interpreting in what is known as a visuospatial sketchpad, while on the other hand the words that they’re hearing are being semantically decoded in what’s called the phonological loop, then that’s dual coding from Allan Paivio. This, along with cognitive load theory, is one of the foundations of the multimedia principle in Rich Mayer’s cognitive theory of multimedia learning. Once again it’s one of those theories that is so robust, it can explain almost anything. It explains why certain things work for experts and not for novices: the expertise reversal effect. It’s at the roots, it’s foundational. It’s about Geary’s work and John’s work, and it’s incredibly foundational because they both deal with the essence of learning. Firstly: how our brain evolved and what that means for learning and education, and secondly, how does our cognitive architecture function and what does that mean for learning and education? What do you need that’s more fundamental than that?

TB: One last question: what are you working on now?

PK: Two things. I just published an article which I hope will have an impact, called ‘From cognitive load theory to collaborative cognitive load theory’ (1). It expands cognitive load theory to collaborative learning situations.

TB: That will grab a lot of people.

PK: I hope so. It came about from the fact that I used to have a chair in computer-supported collaborative learning, and one of the things that intrigued me was what I call transaction costs and transactive activities. That’s when you’re working with someone else on something, and you have to expend time, effort and energy on communicating about and coordinating what you’re doing with others. Those are intrinsic costs to the task of learning collaboratively. If the task itself isn’t complex enough – if the benefits of working together with others don’t exceed the transaction costs caused by working together – then people won’t work together.

That’s one of the many things that’s problematic for teachers using collaborative learning. They often don’t think about – or aren’t capable of – designing tasks that are complex enough to require collaboration. They can’t or don’t make tasks where the benefits of working together are greater than the costs caused by transactive activities. And you’ll see that they’ll say things like ‘you have to contribute five things to the discussion group’ because people don’t communicate and contribute enough.

The second is my little secret. All I’ll say is that it deals with modern assessment. I’m trying to find funding to do the research, but don’t want to alert any hijackers on the horizon.


References

1. Kirschner, P. A., Sweller, J., Kirschner, F. and Zambrano, J. (2018) ‘From cognitive load theory to collaborative cognitive load theory’, International Journal of Computer-Supported Collaborative Learning 13 (2) pp. 213–233.

Everyone’s a teacher of SEND

Karen Wespieser, Director of Operations at the Driver Youth Trust, talks about a change in the way we frame discussions of SEND

A small Twitter debate erupted following the 2018 researchED National Conference when someone pointed out: ‘110 workshops – SEND mentioned twice, dyslexia once and a session about reversing therapeutic-based practice. In a profession where 14 per cent of our students have SEND…’ But does professional development need to be explicitly about special educational needs and disabilities (SEND) in order to improve the teaching of this group of young people?

The latest data from the government’s annual survey of newly qualified teachers (NQTs) found that assessing the progress of SEND pupils was one of only three areas where fewer than 50% of NQTs gave a rating of 7–10 out of 10.1 The proportion of NQTs who reported that their initial teacher training (ITT) had prepared them well to teach SEND pupils wasn’t much higher: just over half (53%) felt prepared (ibid.). Whilst improving ITT is clearly an important issue that needs addressing, it isn’t one for researchED. However, if initial teacher education isn’t equipping the school workforce with this information, then surely professional development events like researchED could?

Yet, as with many SEND-related discussions, maybe this is actually an issue of labels. Whilst there may not be many workshops labelled SEND at researchED events, there are often plenty that address key ideas about how best to teach SEND students in the mainstream classroom.

Defining the label of SEND and then applying it to children and young people is a complex issue and can be arbitrary. In 2010 the number of pupils identified with SEND in the UK was five times the EU average. This led Ofsted to review how children were being identified and supported in schools. They concluded that ‘as many as half of all pupils identified for School Action [support] would not be identified as having special educational needs if schools focused on improving teaching and learning for all’.2

The Children and Families Act (2014), the catalyst for the largest reforms in decades, mandated a new system of identification. The Act describes someone as having a SEND when ‘they have a learning difficulty or disability which calls for special educational provision to be made for them’ (Section 20). It then defines ‘special educational provision’ as ‘provision that is additional to or different from that which would normally be provided for children or young people of the same age in a mainstream education setting’ (Section 21).

Such a definition is problematic, however, because what ‘learning difficulty’ and ‘additional’ or ‘different’ provision mean is open to subjective interpretation. As a result of these changes to the definition, the number of children and young people identified as having a SEND declined from over 1.5 million in 2010 to around 1.2 million in 2016.3 The figure has been rising again since 2017 and latest data shows it at nearly 1.3 million, or 14.6% of pupils.4

It is interesting to note, however, that while the proportion of children and young people identified as having a SEND declined between 2010 and 2016, the number of children who have an education, health and care plan (EHCP) remained consistent at 2.8%. As the figures began to increase in 2017, the proportion with EHCPs also rose and currently stands at 2.9%.

What is often missed in discussions about SEND is that the vast majority of children and young people with SEND will be in a mainstream school. Data from the Department for Education5 shows that of the 1,178,2356 SEND learners in state-funded compulsory education, 56% (650,455) are in state-funded primary schools and 34% (399,800) are in state-funded secondary schools. Far fewer of these learners are educated in special schools (only 10% – 114,755) or in pupil referral units (1% – 13,315), although the incidence of SEND in these settings is substantially higher.

Whilst many papers and commentators focus on children and young people who have EHCPs or attend special schools, the vast majority of SEND children and young people receive their education in a mainstream school. Therefore, all teachers need to ensure their professional development includes how best to teach this cohort.

For this reason, using a specific label to identify where SEND professional development is taking place is a potential distraction. It risks an ‘us and them’ mentality and, despite the statistics above, faced with a choice, many teachers may still not recognise a gap in their knowledge. But does this matter?

Good teaching is essential for all pupils, and all teachers are teachers of SEND. We therefore need to find a balance; whilst the NQT data above highlights a need for more specialist training on various learning difficulties to develop teaching skills further, we also need to ensure all CPD builds in inclusive elements and refers to children with SEND so it is not ‘bolted on’.

Some of the best evidence we currently have has grown from educational psychologists and neuroscientists whose research was first picked up by teachers working with young people with special educational needs. For example, Professor John Sweller’s research on cognitive load theory and Professor Allan Paivio’s work on dual coding are both stalwarts of researchED presentations, and both, I would argue, provide useful tools for teaching children with SEND.

So whilst I would not necessarily argue that there need to be more SEND-focused sessions, I do believe that there could be more emphasis on SEND in the questions that are asked of the research and practice that is shared.

For example, School Standards Minister Nick Gibb’s researchED speech7 at the 2018 national conference included celebratory remarks about early literacy and the 87% who reach the expected standard in the Year 1 phonics screening check. He did not mention the worrying discrepancies between regions and local authorities, where a child with an EHCP in Inner London is 50% more likely to reach the expected standard in the phonics screening check than a child with an EHCP in the North West, East Midlands or West Midlands.8

If we are all teachers of SEND, we may not need our own conferences or conference stream, but we do all need to be asking these questions.

Recommendations for further reading

Sweller J., Ayres P. and Kalyuga, S. (2011) Cognitive load theory. Berlin: Springer-Verlag.

Paivio, A. (1986) Mental representations: a dual coding approach. Oxford: Oxford University Press.

Weinstein, Y., Sumeracki, M. and Caviglioli, O. (2018) Understanding how we learn: a visual guide. London: Routledge.


References

1. Ginnis, S., Pestell, G., Mason, E. and Knibbs, S. (2018) Newly qualified teachers: annual survey 2017. Department for Education. London: The Stationery Office.

2. Ofsted (2010) The special educational needs and disability review. Manchester: Ofsted.

3. Department for Education (2016) Special educational needs in England: January 2016. London: The Stationery Office.

4. Department for Education (2018) Special educational needs in England: January 2018. London: The Stationery Office.

5. Ibid.

6. This is a smaller number than the total number of SEND pupils as it excludes nursery and independent school pupils.

7. Gibb, N. (2018) ‘School standards minister at researchED’ [Speech], researchED National Conference 2018. Harris Secondary Academy, St Johns Wood, London. 8th September.

8. Selfridge, R. (2018) Effective specialist support and ringfenced funding is needed to support those who struggle to learn to read. London: Driver Youth Trust.