
Beyond 5 A*-C: Chris Hall, Impact Box

02 Nov 2016

Results day is always eventful. In most schools and colleges, amongst the joy and occasional agony, a relatively mundane task is also taking place: results day is usually the best opportunity schools have to record where students go next. The amount of energy invested in this process can be darkly comic. I’ve witnessed some bizarre things done in the name of recording destinations, from a Head of Year trying to catch a Year 13 who’d scaled a fence on results day to avoid handing in his destinations slip, to a Sixth Form Administrator searching Facebook Freshers’ Week photos to work out which university a student had gone to.

The problem with all of this is that once the arduous process of recording destinations is complete, nothing really happens. The information is sent to the Local Authority. Eventually, after endless cross-referencing and bureaucratic validation, the data is sent back to schools in summarised form, giving a breakdown of the percentage of students on each pathway. This summary arrives two and a half years after the students leave. So when the school performance tables were published in January 2016, the exam performance of students in Summer 2015 was presented alongside the destinations of students who left in 2013. Putting that confusing juxtaposition to one side, the truth is that a metric with a time-lag of almost three years is useless for driving school improvement.


Does it matter if the headline destinations measure is rubbish? After all, it’s hardly unprecedented for a useless metric or two to sully the good name of the performance tables. What makes this metric’s limitations particularly frustrating is that the education system is in desperate need of a way to measure destinations. Ever since the performance tables were first published in 1991, the degree of scrutiny and accountability for exam results has been increasing. In the darkest moments, when we’ve almost despaired at the spiralling pressure on schools and students to get results, we’ve taken comfort by telling ourselves that these exams really matter, that good results are crucial to opening doors and increasing opportunities in later life. But the reality is that we are largely in the dark about whether those things actually happen. At the moment we’re like the coach of an Olympic sprinting team who obsessively monitors performance in training but doesn’t turn up to watch the actual race.

If we think education is valuable at least partly because of what it leads to, we need to get better at measuring that ultimate outcome. Over the last couple of months I’ve been working with Unifrog to develop a new way of doing just that. Two key principles have underpinned our approach.

First, we are keenly aware that the performance table approach we’ve become used to in UK education is unsatisfactory. As soon as a measure becomes a way of externally benchmarking schools, all kinds of tensions and perverse incentives arise. But when a metric is designed simply to help a school work out where its strengths and weaknesses lie, it can be incredibly powerful. We have tried to design a metric that is an internal tool for school improvement, rather than a statistic to massage for Ofsted.

Second, we’ve tried to focus relentlessly on how this will work in practice. One of the biggest challenges for schools and colleges at the moment is actually finding out from students where they are going, and recording it. That’s why we’ve embedded the metric in a platform that makes it straightforward both to contact students to find out where they’ve gone and to input the responses. Similarly, to be useful the metric needs to provide analysis on the fly: school leaders need to be able to review the data as soon as it is recorded, not two and a half years later.

The core of our solution is what we are calling ‘destination value-add’. By combining data from the National Pupil Database, universities and apprenticeship providers, we’ve devised a method of quantifying whether the university course, apprenticeship, or FE course that each student goes on to is in line with what would be expected given their GCSE results. We do this by comparing each student’s GCSE results with the average GCSE results of all the other students going to the same destination. Much like they can with exam results through something like RAISE Online, schools will be able to break down performance by different groups of students to identify trends. For example, schools would be able to see the average value-add score of male and female students side by side. They’d also be able to compare the performance of their students with both the national average and the average for similar schools.
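
To make the comparison concrete, here is a minimal sketch in Python of how such a score could be computed. Everything in it is assumed for illustration: the field names, the points scale, the sample data and the sign convention are invented, and the real metric draws on the National Pupil Database and provider data rather than a hand-typed lookup.

```python
from collections import defaultdict
from statistics import mean

# Illustrative school leavers: (student_id, avg_gcse_points, destination, group).
# These records, and the points scale, are invented for this sketch; they are
# not Unifrog's or the National Pupil Database's actual schema.
leavers = [
    ("s01", 6.8, "BSc Physics, Univ A", "female"),
    ("s02", 5.9, "BSc Physics, Univ A", "male"),
    ("s03", 7.5, "Plumbing apprenticeship", "male"),
    ("s04", 5.2, "BTEC Engineering, College B", "female"),
]

# Benchmark intake: average GCSE points of all students entering each
# destination nationally. In practice this would be derived from the
# National Pupil Database, universities and apprenticeship providers.
national_intake_avg = {
    "BSc Physics, Univ A": 6.4,
    "Plumbing apprenticeship": 4.8,
    "BTEC Engineering, College B": 5.5,
}

def destination_value_add(points, destination):
    """Gap between the destination's typical intake and the student's own
    GCSE points. Here a positive score means the student progressed to a
    destination whose entrants typically hold stronger GCSEs; this sign
    convention is an assumption for illustration."""
    return national_intake_avg[destination] - points

# Break the metric down by group, as the article describes for gender.
by_group = defaultdict(list)
for _, points, destination, group in leavers:
    by_group[group].append(destination_value_add(points, destination))

for group, scores in sorted(by_group.items()):
    print(f"{group}: mean value-add {mean(scores):+.2f}")
```

The expensive part in practice is building the benchmark lookup; once it exists, scoring a cohort and slicing it by any student characteristic is cheap, which is what makes the on-the-fly analysis described above feasible.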

Attaching a number to a destination is only ever going to tell part of the story for each student. I once worked with a student who got straight A*s at GCSE, took Maths, Physics, Chemistry and Biology at A-level, and then took up a position as an apprentice plumber. If he’d gone on to Medicine at a Russell Group university it might have looked better for the school, but he was passionate about becoming a plumber and it was the right thing for him to do. The point is that the value-add metric is only designed to be the starting point for considering a destination at an individual level. And because it’s not tied to any form of accountability, schools are free to put it to one side when it doesn’t tell the whole story for an individual student. As an overall picture of how a school is doing on destinations, however, being able to put a number on it is invaluable. It opens up a wealth of possibilities for tracking progress over time and identifying areas of strength and weakness.

Schools and colleges are always going to have to record destinations. Statutory duties aside, schools should know where students go, given they’ve spent years preparing them for the next step. But we need to do more to ensure that the information is put to good use. Imagine a Head of Sixth Form being able to show Governors in October not just where students have gone, but that destinations for low prior attainers have improved over time, even if more work is needed to get students heading into apprenticeships onto the more competitive ones. That type of information, available in real time, has the power to transform how schools think about destinations.

Chris Hall

Founder and director

Impact Box