“Grown-ups like numbers. When you tell them about a new friend, they never ask questions about what really matters. They never ask: ‘What does his voice sound like?’ ‘What games does he like best?’ ‘Does he collect butterflies?’ They ask: ‘How old is he?’ ‘How many brothers does he have?’ ‘How much does he weigh?’ ‘How much money does his father make?’ Only then do they think they know him.” (Antoine de Saint-Exupéry, The Little Prince, ch. 4)
Those working in education, from Early Years up to Higher Education, would agree that numbers are important. They help paint the picture of the ‘good’ or the ‘bad’ school, teacher or pupil; or, to speak the Ofsted/DfE lingo, those that are ‘outstanding’ (1) and those that ‘require improvement’ (3).
If we take a look at the National curriculum assessments at key stage 1 and phonics screening checks in England (2018), the performance of 6-year-olds is recorded with all sorts of graphs, charts and tables. Various comparisons then follow from these results, e.g. between local authorities, between children from different ethnic backgrounds, and so on. Some of these comparisons are indeed beneficial, in that they help us identify cases of educational disadvantage, but here lies the problem: disadvantage is defined only in terms of performance on this test. And this is the first issue with these numbers. Numbers narrow down the importance of education and of educational disadvantage to test results.
But education is much more than a test, schools are much more than a test centre, and educational disadvantage is much broader than a test result.
The second issue with performance data is that they fail to capture the lived experience of what they seek to measure, for example the lived experience of actually going through a KS1 phonics screening check or sitting GCSEs. Looking at performance data, one can only gather the outcomes of the test, but what about the whole testing process? What about the amount of teaching time that goes into preparing children for high-stakes tests? What about the amount of pressure that falls on the shoulders of teaching staff and families? What about the children and their lost play time?
In 2016, the Let Our Kids Be Kids campaign gained momentum as parents boycotted SATs for Year 2 and Year 6 pupils. Movements like this indicate the wider social implications of education and the strong links between education and the wider community, links which policy tends to disregard and which are certainly not captured in the numbers that record student performance.
Lastly, performance data are used in a very selective and specific way, not only in terms of what they capture but also in terms of what they compare. How much of this data takes into account pupils’ lives and living conditions outside school? Performance data do consider social factors, such as ethnicity, eligibility for Free School Meals (FSM) and gender. But so much still goes unnoticed. The Education in England: Annual Report 2019 by the Education Policy Institute declares that “The most persistently disadvantaged pupils are those who have been eligible for Free School Meals for at least 80 per cent of their time at school, indicating that they have lived in households with little or no employment income, not just temporarily, but long term”. And it is the second half of that statement that makes me wonder: how often do we hear about pupils’ household situations? Have we ever considered improving education by looking outside schools and into the wider social web?
Over the summer, the Department for Education announced the Explore Education Statistics dissemination platform, with a view to “build a reliable service which makes it easier for all users to find, access, navigate and understand education statistics” and to “empower producers to work more innovatively, provide more coherent and engaging information and save time and effort through automation”. The question is: are the grown-ups going to use their numbers any differently this time?