Ambitious rhetoric but the reality falls short

Education in Scotland is often trumpeted by some as exemplary, with other countries – such as Wales – using it as a template from which to build their own systems. But recent years have seen troubling levels of unhappiness in the education community about the reality of delivering the Curriculum for Excellence (CfE) – the once-lauded programme of educational reinvention that was supposed to revolutionise the learning of a nation’s children. Here, Walter Humes looks at some of the problems with the CfE, and what went wrong.

This article is based not only on my own analysis of the situation in Scotland, but also on the views of a small sample of people who are well placed to comment on the extent to which educational policy and practice are informed by research evidence. They include academics and researchers who have worked with government agencies, funding bodies and local authorities, a senior figure in a national organisation that regularly responds to consultations and policy proposals, and an experienced headteacher. To encourage frankness, respondents were guaranteed anonymity. Responsibility for the text that follows is, however, entirely mine.

Behind the official discourse

The word ‘evidence’ appears no fewer than 41 times in the document A Research Strategy for Scottish Education (1). The paper’s aims include a commitment to ‘learning from data and evidence’, ‘empowering practitioners to produce and use evidence and data’, and the ‘effective commissioning and dissemination of evidence on “what works”’. The reference to ‘what works’ suggests a rather narrow view of the function of educational research – research should be concerned with fundamental questions of meaning and value, not just practical recommendations – but the general thrust seems to indicate a positive attitude towards the use of evidence at classroom, school, local authority and national levels.

However, these aspirations need to be understood against a background of mounting criticism about the Scottish Government’s record in relation to research and the use of evidence in policy development. The OECD report of 2015 which reviewed the progress of Scotland’s flagship policy of Curriculum for Excellence, launched in 2004, said that it could not conduct a full evaluation of the reform programme because there was insufficient information available (2). It called for ‘a robust evidence-base on learning outcomes and progression’. A similar plea was made in a report by the International Council of Education Advisers (ICEA), appointed by the Scottish Government in 2016, partly in response to public disquiet about a perceived decline in standards. One of the recommendations in the ICEA report was that the government should work with universities and other providers ‘to further develop and implement the educational research strategy published in 2017. This will enhance the system’s capacity for independent research and evaluation, and build a Scottish empirical evidence base’ (3).
It is significant that it has taken pressure from outside Scotland to produce a shift of attitude in relation to research. Until recently, the post-devolution period was marked by progressive disengagement of government officials from the research community (4). Many academics felt that researchers were regarded with suspicion by politicians, inspectors and local authority officers, especially if their work took a critical line. The notion that critique may lead to improved strategies was not welcome in the conformist culture of Scottish education.

Although the mutual mistrust has eased slightly, with some reports that government is more willing to listen, it should not be overstated. One researcher recounted conflicting experiences in relation to the influence of his work on policy. He said that a particular project ‘did influence aspects of government policy’ and offered two explanations: first, the agency funding the research played an important role ‘in brokering access to policy makers’; and secondly, the research was ‘timely’ in the sense that the topic being investigated was already ‘gaining some momentum in government’.

Another project fared less well. It had minimal impact partly because ‘multiple policies were being introduced and the civil servants had little time to engage fully with the issues’. Furthermore, there seemed to be limited capacity to synthesise the results of this project with other related studies which had been commissioned, and so the opportunity to better inform proposed policies was missed.

These examples illustrate that policy making is rarely an entirely rational process. It is often messy, time-constrained, and subject to chance and the interventions of powerful players. Furthermore, research that is consistent with the current direction of travel within official policy circles is more likely to make an impact than research which raises challenging questions. This casts doubt on the degree of objectivity with which research evidence is reviewed by officials.

Longitudinal surveys

Any education system requires reliable information on which to base decisions. Over the last ten years, Scotland has withdrawn from certain surveys that provided useful comparative data which enabled trends to be identified. These included two international studies: PIRLS (Progress in International Reading Literacy Study) and TIMSS (Trends in International Mathematics and Science Study). The ostensible reason was cost, but the decision was widely criticised as indicating a desire to conceal disappointing results. The Scottish Survey of Literacy and Numeracy (SSLN) was scrapped after the 2016 results indicated some downward trends, a pattern that was also shown in the findings of the 2015 PISA (Programme for International Student Assessment) report. Scotland does, however, continue to take part in the PISA programme. The introduction in 2017–18 of Scottish National Standardised Assessments (SNSAs) was controversial and the robustness of the data they will generate has been questioned.

Scotland’s current position is at odds with its historical record in this regard. A persistent critic of the Scottish Government’s attitude to independent research has been Lindsay Paterson, Professor of Education Policy at Edinburgh University. He has pointed out that in the middle decades of the 20th century, Scotland was a pioneer in the use of statistical surveys of school pupils, through the work of the Scottish Council for Research in Education. Later, the Centre for Educational Sociology at Edinburgh University carried out school leaver studies, starting in 1962 and running to 2005. These enabled researchers to evaluate the effects of major educational reforms, such as the introduction of comprehensive education and the expansion of higher education. Paterson argues that the current dearth of good-quality survey evidence makes Scotland a ‘data desert’. His conclusion is bleak: ‘There is now no survey series with which to hold Scottish government to account, and not even an openness in government to methodological discussion of the kinds of evidence that would be needed. This closing of minds to science is the very antithesis of accountability’ (5).

Echoing the concerns of Paterson, Howieson and Croxford have reinforced the need for ‘system-wide, longitudinal data to enable a country to “know” its education and training system’ (6). One longitudinal study that does exist is Growing Up in Scotland, started in 2005, tracing the development of ‘nationally representative cohorts’ of children over time (www.growingupinscotland.org.uk). It has produced interesting findings but it could not be used to evaluate Curriculum for Excellence, because there was no equivalent earlier study to enable meaningful comparisons to be made.

Local authorities and other organisations

Central government is not the only agency with an interest in research evidence. Local authorities routinely collect data on the attainment of schools in their area, including standardised assessments of literacy and numeracy. This information can be used in discussions with headteachers about areas of strength and weakness. A key priority in recent years has been the desire to raise attainment generally, but in particular to reduce the gap in attainment between pupils in socially advantaged areas and those in deprived communities. Some headteachers claim that the instrument used to measure this, the Scottish Index of Multiple Deprivation (SIMD), based on postcodes, is too crude: there are disadvantaged children living in ‘affluent’ areas and not all children in ‘poor’ areas are deprived. This can be a particular problem in rural communities where the social and economic profile may be resistant to classifications that work in inner cities. Similarly, Insight, an online benchmarking tool for secondary schools and local authorities designed to help improve outcomes, makes it difficult to detect reliable trends when pupil numbers are small. There is also a concern about the capacity of senior staff to interrogate data and to use it effectively to make improvements. Teachers at all levels would benefit from opportunities to interpret research findings, whether quantitative or qualitative – a provision that would require both time and support.

This last point connects with an observation made by a senior academic familiar with staff development approaches in a number of Scottish local authorities. She reported that John Hattie’s work (as set out in Visible Learning and Visible Learning for Teachers) was strongly promoted, presumably because it drew on a wide range of research evidence and offered clear guidance about high-impact teaching strategies. But the academic wondered how well some of those recommending Hattie’s ideas understood the nuances of his approach. A simplistic application of research evidence may have unintended negative consequences.
Education Scotland, the national advisory body on the curriculum, claims that it draws on research in framing policy advice, though its record in this regard is patchy. The Scottish Qualifications Authority, which runs the national examination system, does rather better, collecting and analysing data on exam entries and results for the qualifications it offers. In recent years, the General Teaching Council for Scotland has sought to encourage teachers to engage in various forms of professional enquiry designed not only to enhance personal development but also to benefit the school through sharing insights with senior management and colleagues. The extent to which this represents a new approach and a genuine opening-up of professional autonomy has been questioned (7).

Grassroots developments and their limitations

There are a few indications of more positive developments. After years of disengagement from the research community, there are now regular contacts between the Cabinet Secretary for Education (John Swinney) and University Deans of Education. For these to be effective, leaders in the academic community will need to be prepared to abandon their tendency to collude in the deferential culture of Scotland’s educational establishment. Critics (such as the present writer) claim that academics have sometimes been complicit in their own containment. Perhaps a more encouraging development is taking place at grassroots level, where independent websites, personal blogs and social networking platforms enable teachers to share ideas, recommend reading and report on pedagogic innovations. In addition, increased numbers of practitioners are undertaking part-time study for postgraduate degrees. And judging from the success of last year’s well-attended researchED conference in Scotland, independent of the national agencies, there is a growing movement by teachers seeking to shape their own professional development and to pass on their insights to others. This event included interesting presentations on metacognition, memory research, the art and science of learning to read, the relation between family income and children’s developmental outcomes, and how teachers can best engage with research. The old ‘top-down’ model, led by government officials and controlled by bureaucratic institutions, has not served Scotland particularly well. A development that suggests classroom teachers are exercising greater agency in identifying topics worth investigating is surely to be welcomed.

But will that be enough? Here we need to be realistic about the political context. All governments tend to take a short-term view of policy initiatives. They think in terms of the next election and want to be able to boast of having fulfilled at least some of their promises. Many educational problems are complex and long-term, resistant to simple answers and ‘quick fixes’. Research evidence may be welcome up to a point, but in the cut and thrust of elections more powerful imperatives may come into play. Presentation becomes more important than substance and the language of public relations takes over from the measured tones of research. As Ben Levin (a Canadian who has worked both as an academic and a government adviser) has written: ‘In the political world belief is everything … No amount of evidence will displace or replace politics.’ (8)


References
1. Scottish Government (2017) A Research Strategy for Scottish Education. Edinburgh: Scottish Government.
2. Organisation for Economic Cooperation and Development (2015) Improving schools in Scotland: an OECD perspective. Paris: OECD.
3. International Council of Education Advisers (2018) Report 2016–18. Edinburgh: Scottish Government.
4. Humes, W. (2013) ‘Political control of educational research’, Scottish Educational Review 45 (2) pp. 18–28.
5. Paterson, L. (2018) Scottish education policy: why surveys matter. CES Briefing No. 66. Edinburgh: Centre for Educational Sociology.
6. Howieson, C. and Croxford, L. (2017) ‘To know ourselves? Research, data and policy-making in the Scottish education system’, Journal of Education and Work 30 (7) pp. 700–711.
7. Humes, W. (2014) ‘Professional update and practitioner enquiry: old wine in new bottles?’, Scottish Educational Review 46 (2) pp. 54–72.
8. Levin, B. (2010) ‘Governments and education reform: some lessons from the last 50 years’, Journal of Education Policy 25 (6) pp. 739–747.