Writing and cognitive load theory

Cognitive load theory has been described as one of the most important discussions in modern psychology that educators need to be familiar with. Natalie Wexler looks at what the implications of this theory are for the way we teach writing, and what it means in the classroom.

It’s been said that reading is the most difficult thing we ask students to do. In fact, that description applies more accurately to writing, which has received far less attention from both cognitive scientists and educators. Because it requires students to express themselves and not merely to receive and process information, writing imposes the greater cognitive load.

It’s clear that reading places a heavy burden on short-term or working memory – the aspect of cognition that could also be called ‘consciousness’, and which can hold only a limited number of things for a limited amount of time. When it comes to decoding, those things include the correspondences between letters and sounds; for reading comprehension, they expand to include knowledge and vocabulary relating to the topic (1). The key to successful reading is to have as many of these factors as possible stored in long-term memory – which has a virtually infinite capacity – so they don’t take up precious space in working memory and overload it.

With writing, background knowledge is even more crucial. It may be difficult to read about a subject that’s unfamiliar, but it’s virtually impossible to write about one coherently. At the same time, knowledge of the topic is only one of many factors vying for space in working memory. Even when producing a single sentence, inexperienced writers may be juggling things like letter formation, spelling, word choice, and sentence structure. When asked to write at length, they need to cope with the challenges of adhering to a topic, creating smooth transitions, avoiding repetition, and ensuring that the overall organization of the piece is coherent. All of this is in addition to absorbing the information that forms the basis for their writing, deciding what to say about it, and anticipating what a reader will need to know.

In some situations, the key to easing cognitive load is to provide what are known as ‘worked examples’. Rather than asking learners who are unfamiliar with a topic to acquire knowledge through solving problems themselves, the theory goes, teachers should have them study problems that have already been solved. In the context of math, for example, research has shown that students who study worked examples of algebra problems perform better than those who solve problems on their own, when tested later on their ability to solve similar problems. The reason appears to be that problem solving imposes such a heavy cognitive load on novice learners that they have little capacity left for transferring the strategies they’ve used into long-term memory (2).

It’s been suggested that the worked-example effect can be applied to writing as well: if teachers explicitly teach sentence structures and vocabulary, provide exemplars that illustrate these things, and lead discussions on the subject, students should be able to study the exemplars and reproduce those features in their own writing (3). But many American teachers already use a version of worked examples when trying to teach writing: they show students ‘mentor texts’ to use as models (4). Considering that a mere 25% of American students test at the proficient level in writing (5), it’s fairly clear that that approach is not having the desired effect.

Showing students exemplar sentences rather than entire texts is definitely a step in the right direction, because it focuses students’ attention on a manageable unit. But the problem, as Greg Ashman has put it, is that there’s a difference between ‘knowing that’ and ‘knowing how’. Students may know, for example, that a sentence is ‘a set of words containing a subject and a predicate and expressing a complete thought’. Showing students examples of complete sentences and contrasting them with sentence fragments may make the concept more concrete. But many students will nevertheless fail to know how to write complete sentences and continue to use sentence fragments in their own writing. A basic problem is that the massive cognitive load that inexperienced writers face makes it difficult for them to remember to put their conceptual knowledge into practice (6).

How do we get students to know how to write well? That question is crucial, and not just because we want students to acquire writing skills. When the cognitive load is modulated, writing is perhaps the most effective way to build and deepen students’ knowledge and develop their analytical abilities. To be sure, students need some knowledge of a topic to begin writing. But once they start to write, they need to recall information they have recently learned, determine which points are important and connect them to one another, and put all of this into their own words. If students have the cognitive capacity to engage in these steps, the effect is powerful – akin to the ‘testing effect’ (the boost in retention that comes from being quizzed on recently learned material) and the similar ‘protégé effect’ (which results from explaining a topic to another person) (7,8).


Because of the complexity of the writing process, students need more than direct instruction and worked examples to become competent writers. They need ‘deliberate practice’: repeated efforts to perform aspects of a complex task in a logical sequence, with a more experienced practitioner providing prompt and targeted feedback (9). And for many students, including many at upper grade levels, this kind of practice needs to begin at the sentence level – partly because sentences are the building blocks of all good writing, and partly because sentence-level tasks lighten the cognitive load. That’s not to say that constructing a sentence is an inherently simple task. It all depends on the content. For example, there’s nothing simple about completing this sentence: ‘Immanuel Kant believed that space and time are subjective forms of human sensibility, but _________.’

Deliberate practice in writing also needs to extend beyond English class to the rest of the curriculum. Not only does that provide teachers of history, science, math, and other subjects with a powerful tool to enhance their instruction, it also gives students more opportunities to practice the writing strategies. Eventually, many of those strategies will become lodged in long-term memory, becoming so automatic that students don’t even realize they’re using them.

When students are ready to embark on lengthier writing, where the cognitive load is even greater, they need to learn to construct clear, linear outlines that enable them to organize their thoughts, avoid repetition, and stay on track. Juggling those tasks in working memory while writing can be overwhelming even for many experienced writers. Once students have used an outline to create a draft, they can use their pre-existing knowledge of sentence-level strategies to vary their sentence structure and create smooth transitions.

While this approach to writing is still rare and unorthodox, it is gaining traction largely thanks to a US-based organization called The Writing Revolution, of which I am board chair, and a book that explains the method – also called The Writing Revolution – of which I am the co-author with Dr Judith C. Hochman. A veteran educator, Dr Hochman has developed a series of writing strategies that are designed to be taught explicitly and practiced repeatedly in a variety of contexts, with prompt feedback from a teacher. Although originally created for learning-disabled students, the method has been shown to be effective with students of all abilities, including those still learning English.

What does the method look like in practice? Let’s return to the example of students who use sentence fragments rather than complete sentences. In addition to showing students examples of fragments and complete sentences side by side, the Hochman Method has students practice distinguishing between the two – and turning the fragments into complete sentences. For older or more sophisticated students, the terms ‘subject’, ‘verb’, and ‘predicate’ might be used, but it’s sufficient to simply ask questions in functional terms. For example, if a fragment says, ‘ate a great meal,’ the teacher might ask the class, ‘Does that tell us who ate a great meal? How can we make these words into a sentence?’(10)

To derive the maximum benefit from this activity, the examples should be embedded in whatever content students are learning. A math teacher who has taught rational numbers could review – and simultaneously build writing skills – by giving students the following fragments and asking them to transform the phrases into sentences, with proper punctuation and capitalization:

  • can be expressed as a fraction or a ratio
  • rational numbers

Their responses might be:

  • A rational number is a number that can be expressed as a fraction or a ratio.
  • Rational numbers can be ordered on a number line.

Eventually, through the repeated process of identifying and correcting fragments, students will develop an understanding of how to create a complete sentence and apply that knowledge to their own writing.

Students don’t need to learn the names of grammatical structures and parts of speech for their own sake. But certain terms are useful as a shorthand for strategies that will enhance writing and lessen cognitive load. For example, the method has students learn the word ‘appositive’ – that is, a phrase that renames a noun – because it provides them with an effective strategy for varying sentence structure and expanding their responses. Once students have grasped the concept, they can be asked to provide appositives for sentences grounded in the content of the curriculum. A biology teacher might give students the sentence, ‘Natural selection, __________, results in species with favorable traits.’ A student might supply the appositive, ‘a process of evolution’.

When students have moved on to lengthier writing, they’re advised that appositives can be used to create good topic sentences – and they’ll understand what to do. Ultimately, that information will be stored in their long-term memory, along with the knowledge of other possible sentence types and structures, to be drawn on when beginning a paragraph or an essay. Rather than having their working memory occupied with searching for a way to begin – or, if they’re revising an essay, to vary their sentences – they’ll be able to devote more cognitive capacity to what they want to say.

Those of us who are already competent writers have vastly underestimated the difficulties faced by many (if not most) students in reaching that point. In years past, the assumption was that teaching rules of grammar and parts of speech was sufficient. After studies determined that approach had no positive impact on student writing, and in some cases had a negative one (11), another school of thought took hold. Its proponents assumed students would basically pick up the conventions of written language if they just read enough mentor texts and engaged in enough writing (12). Given the generally dismal results, it’s time for a new approach, supported by research: explicit instruction, mentor texts or ‘worked examples’, and the deliberate practice that will enable students to transform their conceptual knowledge into knowing how to write. Not only will schools produce better writers, but easing the cognitive load imposed by writing will lead to better thinking as well.

Natalie Wexler
Author, The Knowledge Gap: The Hidden Cause of America’s Broken Education System–and How to Fix It (forthcoming from Avery, August 2019)
Co-author with Judith C. Hochman of The Writing Revolution: A Guide to Advancing Thinking Through Writing in All Subjects and Grades (Jossey-Bass 2017)

References
1. Willingham, D. (2017) The reading mind: a cognitive approach to understanding how the mind reads. San Francisco, CA: Jossey-Bass, pp. 116–18.
2. Ashman, G. (2018) The truth about teaching: an evidence-informed guide for new teachers. London: SAGE Publications, pp. 42–43.
3. Needham, T. (2019) ‘Cognitive load theory in the classroom’, researchED 3, pp. 31–33.
4. Alber, R. (2014) ‘Using mentor texts to motivate and support student writers’, Edutopia [Website]. Available at: www.edut.to/2IB3ZPI.
5. The Nation’s Report Card (2011) Writing 2011. US Department of Education, Institute of Education Sciences. Washington, DC: United States Government Publishing Office. Available at: www.bit.ly/2I9LGm1.
6. Ashman, G. (2018) p. 122.
7. Roediger, H. and Karpicke, J. (2006) ‘Test-enhanced learning: taking memory tests improves long-term retention’, Psychological Science 17 (3) pp. 249–255.
8. Boser, U. (2017) Learn better: mastering the skills for success in life, business and school. New York, NY: Random House.
9. Ericsson, K. A. and Pool, R. (2016) Peak: secrets from the new science of expertise. New York, NY: Houghton Mifflin.
10. This example and others are taken from Hochman, J. C. and Wexler, N. (2017) The writing revolution: a guide to advancing thinking through writing in all subjects and grades. San Francisco, CA: Jossey-Bass.
11. Graham, S. and Perin, D. (2007) Writing next: effective strategies to improve writing of adolescents in middle and high schools. Washington, DC: Alliance for Excellent Education.
12. Calkins, L. M. (1986) The art of teaching writing. Portsmouth, NH: Heinemann.

Evidence-based school leadership

A veteran speaker on evidence, Gary Jones flags up some concerns he has about the difficulty of leading a school in an evidence-informed way that is also meaningful and, crucially, has an impact that matters.

The first researchED event I attended was the London national conference in September 2014. Without doubt, this was some of the most inspiring and influential professional development I had experienced in the 30 years I had been involved in education. It was inspiring because I was taking part in an event with over 1000 teachers who had given up a Saturday morning to speak and listen about something they cared about – namely, improving teaching and learning through the appropriate use of research evidence. It was influential in that it got me thinking, reading and writing about evidence-based school leadership and management.

researchED London 2014 got me thinking about evidence-based school leadership and management for two reasons. First, the vast majority of the sessions at the event had a focus on teaching and learning and little attention seemed to be paid to the role of research and other sources of evidence in the decision-making of senior leaders in schools. Second, that summer I had by chance read an article by Adrian Furnham (1) which introduced me to the discipline of evidence-based management and I was intrigued as to whether there was a possible synthesis with evidence-based education. This contributed to me writing a book – Evidence-based School Leadership and Management: a practical guide – and 220 blogposts (www.garyrjones.com/blog).


So having written around 300,000 words on all things evidence-based, I would like to make the following observations about the current state of evidence-based practice within schools. First, the ‘evidence-based movement’ is not going away any time soon. We have 22 schools in the Research Schools Network; an increasing number of schools appointing research leads; hundreds if not thousands of educational bloggers contributing to discussions about how to improve education; social media and EduTwitter providing a forum for the articulation of views; over 20 researchED conferences scheduled for 2019; the Education Endowment Foundation (EEF) spending over £4m in 2017–18 to fund the delivery of 17 projects, involving 3620 schools and other educational settings reaching approximately 310,000 children and young people (2); and finally, we have Ofsted using research evidence to inform their inspection framework (3).

Nevertheless, despite all this time, effort and commitment being put into research and evidence-based practice, there is still much to do to ensure evidence-based practice contributes to improved outcomes for pupils. First, we need to have an honest conversation about teachers’ research literacy and their subsequent abilities to make research-informed changes in their practice. Research undertaken by the National Foundation for Educational Research and the EEF suggests that teachers have a weak and variable knowledge of the evidence base relating to teaching and learning, and have a particularly weak understanding of research requiring scientific or specialist knowledge (4). Second, there is a distinction between the rhetoric and the reality of evidence-based practice within schools. Research undertaken for the Department for Education identified a number of schools where headteachers and senior leaders ‘talked a good game’ about evidence-informed teaching within their schools, whereas the reality was that research and evidence were not embedded within the day-to-day practice of the school (5). Third, it’s important to be aware there is a major debate taking place amongst educational researchers about randomised controlled trials, effect sizes and meta-analyses. Indeed, as Professor Rob Coe states: ‘Ultimately, the best evidence we currently have may well be wrong; it is certainly likely to change.’(6)

And finally, if I were to offer any advice to teachers, school leaders and governors/trustees who are interested in evidence-based practice, it would be the following. Becoming an evidence-based practitioner is hard work. It doesn’t happen by just reading the latest EEF guidance document, John Hattie’s Visible Learning or by spending one Saturday morning a year at a researchED conference. It requires a career-long moral commitment to challenging both your own and others’ practice, critically examining ‘what works’ to ensure whatever actions you take bring about improvements in pupil outcomes.

Dr Gary Jones is the author of Evidence-Based School Leadership and Management: a practical guide. Prior to his recent work – in blogging, speaking and writing about evidence-based practice – Gary worked in the further education sector and has over 30 years of experience in education as a teacher and senior leader. Gary is currently engaged by the University of Portsmouth as a researcher on projects looking at area-based reform and increasing social mobility.
@DrGaryJones
www.garyrjones.com/blog

Further reading:

Brown, C. (2015) Leading the use of research & evidence in schools. London: UCL IOE Press.

Barends, E. and Rousseau, D. (2018) Evidence-based management: how to use evidence to make better organizational decisions. London: Kogan Page.

Cain, T. (2019) Becoming a research-informed school. London: Routledge.

Jones, G. (2018) Evidence-based school leadership and management: a practical guide. London: SAGE Publishing.

Kvernbekk, T. (2016) Evidence-based practice in education: functions of evidence and causal presuppositions. London: Routledge.


References
1. Furnham, A. (2014) ‘On your head: a magic bullet for motivating staff?’, The Sunday Times, 13 July.
2. Education Endowment Foundation (2018) EEF annual report 2018. London: EEF. Available at: www.bit.ly/2Iw9ajY
3. Ofsted (2019) Education inspection framework: overview of research. London: The Stationery Office. Available at: www.bit.ly/31hVQbN
4. Nelson, J., Mehta, P., Sharples, J. and Davey, C. (2017) Measuring teachers’ research engagement: findings from a pilot study: report and executive summary. London: Education Endowment Foundation/ NFER.
5. Coldwell, M., Greany, T., Higgins, S., Brown, C., Maxwell, B., Stiell, B., Stoll, L., Willis, B. and Burns, H. (2017) Evidence-informed teaching: an evaluation of progress in England. Department for Education. London: The Stationery Office.
6. Coe, R. (2018) ‘What should we do about meta-analysis and effect size?’, CEMblog [Blog]. Available at: www.bit.ly/2ZcWm96

Ambitious rhetoric but the reality falls short

Education in Scotland is often trumpeted by some as exemplary, with other countries – such as Wales – using it as a template from which to build their own systems. But recent years have seen troubling levels of unhappiness in the education community about the reality of delivering the Curriculum for Excellence (CfE) – the once-lauded programme of educational reinvention that was supposed to revolutionise the learning of a nation’s children. Here, Walter Humes looks at some of the problems with the CfE, and what went wrong.

This article is based not only on my own analysis of the situation in Scotland, but also on the views of a small sample of people who are well placed to comment on the extent to which educational policy and practice are informed by research evidence. They include academics and researchers who have worked with government agencies, funding bodies and local authorities, a senior figure in a national organisation that regularly responds to consultations and policy proposals, and an experienced headteacher. To encourage frankness, respondents were guaranteed anonymity. Responsibility for the text that follows is, however, entirely mine.

Behind the official discourse

The word ‘evidence’ appears no fewer than 41 times in the document A Research Strategy for Scottish Education (1). The paper’s aims include a commitment to ‘learning from data and evidence’, ‘empowering practitioners to produce and use evidence and data’, and the ‘effective commissioning and dissemination of evidence on “what works”’. The reference to ‘what works’ suggests a rather narrow view of the function of educational research – it should be concerned with fundamental questions of meaning and value, not just practical recommendations – but the general thrust seems to indicate a positive attitude towards the use of evidence at classroom, school, local authority and national levels.

However, these aspirations need to be understood against a background of mounting criticism about the Scottish Government’s record in relation to research and the use of evidence in policy development. The OECD report of 2015 which reviewed the progress of Scotland’s flagship policy of Curriculum for Excellence, launched in 2004, said that it could not conduct a full evaluation of the reform programme because there was insufficient information available (2). It called for ‘a robust evidence-base on learning outcomes and progression’. A similar plea was made in a report by the International Council of Education Advisers (ICEA), appointed by the Scottish Government in 2016, partly in response to public disquiet about a perceived decline in standards. One of the recommendations in the ICEA report was that the government should work with universities and other providers ‘to further develop and implement the educational research strategy published in 2017. This will enhance the system’s capacity for independent research and evaluation, and build a Scottish empirical evidence base’(3).

It is significant that it has taken pressure from outside Scotland to produce a shift of attitude in relation to research. Until recently, the post-devolution period was marked by progressive disengagement of government officials from the research community (4). Many academics felt that researchers were regarded with suspicion by politicians, inspectors and local authority officers, especially if their work took a critical line. The notion that critique may lead to improved strategies was not welcome in the conformist culture of Scottish education.

Although the mutual mistrust has eased slightly, with some reports that government is more willing to listen, it should not be overstated. One researcher recounted conflicting experiences in relation to the influence of his work on policy. He said that a particular project ‘did influence aspects of government policy’ and offered two explanations: first, the agency funding the research played an important role ‘in brokering access to policy makers’; and secondly, the research was ‘timely’ in the sense that the topic being investigated was already ‘gaining some momentum in government’.

Another project fared less well. It had minimal impact partly because ‘multiple policies were being introduced and the civil servants had little time to engage fully with the issues’. Furthermore, there seemed to be limited capacity to synthesise the results of this project with other related studies which had been commissioned, and so the opportunity to better inform proposed policies was missed.

These examples illustrate that policy making is rarely an entirely rational process. It is often messy, time- constrained, and subject to chance and the interventions of powerful players. Furthermore, research that is consistent with the current direction of travel within official policy circles is more likely to make an impact than research which raises challenging questions. This casts doubt on the degree of objectivity with which research evidence is reviewed by officials.

Longitudinal surveys

Any education system requires reliable information on which to base decisions. Over the last ten years, Scotland has withdrawn from certain surveys that provided useful comparative data which enabled trends to be identified. These included two international studies: PIRLS (Progress in International Reading Literacy Study) and TIMSS (Trends in International Mathematics and Science Study). The ostensible reason was cost, but the decision was widely criticised as indicating a desire to conceal disappointing results. The Scottish Survey of Literacy and Numeracy (SSLN) was scrapped after the 2016 results indicated some downward trends, a pattern that was also shown in the findings of the 2015 PISA (Programme for International Student Assessment) report. Scotland does, however, continue to take part in the PISA programme. The introduction in 2017–18 of Scottish National Standardised Assessments (SNSAs) was controversial and the robustness of the data they will generate has been questioned.


Scotland’s current position is at odds with its historical record in this regard. A persistent critic of the Scottish Government’s attitude to independent research has been Lindsay Paterson, Professor of Education Policy at Edinburgh University. He has pointed out that in the middle decades of the 20th century, Scotland was a pioneer in the use of statistical surveys of school pupils, through the work of the Scottish Council for Research in Education. Later, the Centre for Educational Sociology at Edinburgh University carried out school leaver studies, starting in 1962 and running to 2005. These enabled researchers to evaluate the effects of major educational reforms, such as the introduction of comprehensive education and the expansion of higher education. Paterson argues that the current dearth of good-quality survey evidence makes Scotland a ‘data desert’. His conclusion is bleak: ‘There is now no survey series with which to hold Scottish government to account, and not even an openness in government to methodological discussion of the kinds of evidence that would be needed. This closing of minds to science is the very antithesis of accountability.’(5)

Echoing the concerns of Paterson, Howieson and Croxford have reinforced the need for ‘system-wide, longitudinal data to enable a country to “know” its education and training system’ (6). One longitudinal study that does exist is Growing Up in Scotland, started in 2005, tracing the development of ‘nationally representative cohorts’ of children over time (www.growingupinscotland.org.uk). It has produced interesting findings but it could not be used to evaluate Curriculum for Excellence, because there was no equivalent earlier study to enable meaningful comparisons to be made.

Local authorities and other organisations

Central government is not the only agency with an interest in research evidence. Local authorities routinely collect data on the attainment of schools in their area, including standardised assessments of literacy and numeracy. This information can be used in discussions with headteachers about areas of strength and weakness. A key priority in recent years has been the desire to raise attainment generally, but in particular to reduce the gap in attainment between pupils in socially advantaged areas and those in deprived communities. Some headteachers claim that the instrument used to measure this, the Scottish Index of Multiple Deprivation (SIMD), based on postcodes, is too crude: there are disadvantaged children living in ‘affluent’ areas and not all children in ‘poor’ areas are deprived. This can be a particular problem in rural communities where the social and economic profile may be resistant to classifications that work in inner cities. Similarly, Insight, an online benchmarking tool for secondary schools and local authorities designed to help improve outcomes, makes it difficult to detect reliable trends when pupil numbers are small. There is also a concern about the capacity of senior staff to interrogate data and to use it effectively to make improvements. Teachers at all levels would benefit from opportunities to interpret research findings, whether quantitative or qualitative – a provision that would require both time and support.

This last point connects with an observation made by a senior academic familiar with staff development approaches in a number of Scottish local authorities. She reported that John Hattie’s work (as set out in Visible Learning and Visible Learning for Teachers) was strongly promoted, presumably because it drew on a wide range of research evidence and offered clear guidance about high-impact teaching strategies. But the academic wondered how well some of those recommending Hattie’s ideas understood the nuances of his approach. A simplistic application of research evidence may have unintended negative consequences.

Education Scotland, the national advisory body on the curriculum, claims that it draws on research in framing policy advice, though its record in this regard is patchy. The Scottish Qualifications Authority, which runs the national examination system, does rather better, collecting and analysing data on exam entries and results for the qualifications it offers. In recent years, the General Teaching Council for Scotland has sought to encourage teachers to engage in various forms of professional enquiry designed not only to enhance personal development but also to benefit the school through sharing insights with senior management and colleagues. The extent to which this represents a new approach and a genuine opening-up of professional autonomy has been questioned (7).

Grassroots developments and their limitations

There are a few indications of more positive developments. After years of disengagement from the research community, there are now regular contacts between the Cabinet Secretary for Education (John Swinney) and University Deans of Education. For these to be effective, leaders in the academic community will need to be prepared to abandon their tendency to collude in the deferential culture of Scotland’s educational establishment. Critics (such as the present writer) claim that academics have sometimes been complicit in their own containment.

Perhaps a more encouraging development is taking place at grassroots level, where independent websites, personal blogs and social networking platforms enable teachers to share ideas, recommend reading and report on pedagogic innovations. In addition, increased numbers of practitioners are undertaking part-time study for postgraduate degrees. And judging from the success of last year’s well-attended researchED conference in Scotland, independent of the national agencies, there is a growing movement by teachers seeking to shape their own professional development and to pass on their insights to others. This event included interesting presentations on metacognition, memory research, the art and science of learning to read, the relation between family income and children’s developmental outcomes, and how teachers can best engage with research.

The old ‘top-down’ model, led by government officials and controlled by bureaucratic institutions, has not served Scotland particularly well. A development that suggests classroom teachers are exercising greater agency in identifying topics worth investigating is surely to be welcomed.

But will that be enough? Here we need to be realistic about the political context. All governments tend to take a short-term view of policy initiatives. They think in terms of the next election and want to be able to boast of having fulfilled at least some of their promises. Many educational problems are complex and long-term, resistant to simple answers and ‘quick fixes’. Research evidence may be welcome up to a point, but in the cut and thrust of elections more powerful imperatives may come into play. Presentation becomes more important than substance and the language of public relations takes over from the measured tones of research. As Ben Levin (a Canadian who has worked both as an academic and a government adviser) has written: ‘In the political world belief is everything … No amount of evidence will displace or replace politics.’(8).


References
1. Scottish Government (2017) A Research Strategy for Scottish Education. Edinburgh: Scottish Government.
2. Organisation for Economic Cooperation and Development (2015) Improving schools in Scotland: an OECD perspective. Paris: OECD.
3. International Council of Education Advisers (2018) Report 2016–18. Edinburgh: Scottish Government.
4. Humes, W. (2013) ‘Political control of educational research’, Scottish Educational Review 45 (2) pp. 18–28.
5. Paterson, L. (2018) Scottish education policy: why surveys matter. CES Briefing No. 66. Edinburgh: Centre for Educational Sociology.
6. Howieson, C. and Croxford, L. (2017) ‘To know ourselves? Research, data and policy-making in the Scottish education system’, Journal of Education and Work 30 (7) pp. 700–711.
7. Humes, W. (2014) ‘Professional update and practitioner enquiry: old wine in new bottles?’, Scottish Educational Review 46 (2) pp. 54–72.
8. Levin, B. (2010) ‘Governments and education reform: some lessons from the last 50 years’, Journal of Education Policy 25 (6) pp. 739–747.