When Michael Gove, the Secretary of State for Levelling Up, Housing and Communities, was putting together his recent White Paper, he needed some education policies. Importantly, they needed to target left-behind areas. The document therefore includes a map of the country showing the average attainment of 16-year-olds in each local authority (LA). It is on this basis that areas were designated ‘Education Investment Areas’.
In some ways that makes sense: if we care about equity then of course we should be improving outcomes for children and young people in the places where they do least well. However, as Dr Sam Baars from The Centre for Education and Youth has explained, when policy makers select areas based on LA-level attainment data, they fail to understand the nature of a place, including what it’s like to grow up there and what stands in the way of educational success for the young people who live there.
This insight will come as no surprise to teachers and academics who know that ‘educational outcomes’ are not the same as ‘school outcomes’. In other words, a young person’s exam results partly reflect what happens in school, but other factors play a much bigger role. Therefore, whilst educational achievement makes a good first port of call for the policy maker interested in equitable outcomes, it can only be the first step in an investigation - otherwise the solutions that jump to mind all too often focus myopically on schools.
Combining longitudinal information on academic achievement, attendance, school exclusion, pupil context and the opportunities available (see part 3 of this blog series) provides a much fuller picture of different pupils’ situations and the areas policy makers should target in order to unlock impact. Better information is increasingly available, whether through novel sources of insight like TeacherTapp, ImpactED and the CfEY qualitative data-bank or joined-up, longitudinal datasets like LEO. A new report from the National Foundation for Educational Research, in association with the Edge Foundation, explores the potential for improved use of the LEO data and calls on policy makers to develop a benchmark tool to better contextualise and interpret destination data.
The charity Action for Children has recently published new research demonstrating this issue and calling for urgent action, noting that:
“It is widely recognised that a lack of data on the amount of early help, as well as the type of support, that’s offered is a long-term barrier to proving its effectiveness. Our research corroborates this. Only 40% of local authorities were able to provide the number of children receiving early help for all five years covered by this report. [The government should therefore] start collecting data on early help provision as part of the Child in Need Census Data collection on early help. [This] should be accompanied by a National Outcomes Framework for early help and existing statutory services.”
Similarly, being in school is (generally) a prerequisite for getting the most out of education (although the pandemic showed it wasn’t a sine qua non). Poor attendance can also provide a warning sign of issues beyond the school gates. Yet policy makers do not have timely or accurate data on who’s in school, as one policy maker explained:
“Attendance must be the most important possible thing but what data do we have in order to improve that? We have a spreadsheet that schools fill out if they remember to, with a 50% response rate and a census that inexplicably lags 12 months behind. How can you guide a system with that? You don't have the data you need and you don't have the codes needed to tell whether the children are actually in school, or just marked as ‘equivalent to being in school’.”
Poor data drove suboptimal education policy decisions at the start of the Covid pandemic. For example, the government vastly underestimated the extent to which ‘the digital divide’ would hinder learning. The mistake came about because it relied on figures from Ofcom which tracked households with at least one device - and a single device per household turned out to be insufficient to make a success of learning in lockdown. The government also underestimated the consequences of pupils’ lack of access to quiet spaces in which to learn. While the pandemic put issues like these under the spotlight, they are not new problems, and greater visibility of other blind spots like food poverty might also prompt concerted action.
Information that might be missing from formal spreadsheets sometimes resides elsewhere, for example within networks of social relationships in a community, among parents and with young people themselves. However, although this information is invaluable at a small, local scale, aggregating insights to drive action at a regional and national level is challenging. Initiatives like “The Big Ask” - in which half a million young people shared their views with policy makers - are helping, but we’re not there yet.
Given this, is it any surprise that the ‘big ticket’ announcements in the Levelling Up White Paper focused on generic school improvement policies, such as mechanisms for taking over coasting schools and opening a new network of selective post-16 colleges?
To its credit, the White Paper does go on to broaden its lens, referencing the need to “tackle wider issues that may be limiting school improvement” and the importance of “other factors beside time spent in the classroom”. There is also a commitment to ensuring that “place” remains central to the government’s approach on early years and families, as well as a reference to LAs where children’s social care and SEND services perform badly. However, if education policy makers are to deliver in these areas, they need to start paying attention to far more than just headline ‘[school] performance’ measures.
Traditional metrics might point to problems, but rarely to explanations. Instead, policy makers need to turn to joined-up datasets that include outcomes at different stages of education and interactions with a range of government-funded services, including social workers and youth clubs. It is also incumbent on those of us who use data to interrogate it more deeply; as one attendee at our international seminar pointed out, it can be tempting to look for tidy and consistent insights, but instead, we should open our eyes to the tensions and contradictions within the data and use those to open up new lines of enquiry. Only then can we build up a fuller, technicolour picture of young people’s lives - and in doing so, a more nuanced approach to improving outcomes.
Over the coming months, Loic Menzies and Cambridge Assessment Network will be convening a series of expert round tables and discussion seminars on the topic of equity in education. Our first event in February asked, 'What information do policy makers need to help them map the way to educational equity?’
This is the second in a four-part blog series setting out the key considerations to come from the event. Part one focused on the need to look at young people across the distribution. The aim of the series is to build a rich body of thinking, with participants playing a critical role in its development.
You can catch up with the latest event over on our YouTube channel, and make sure you're signed up to our mailing lists to find out how you can get involved in the next one.