Friday, 2 January 2015

The Standards Puzzle

"Have standards improved over time?" is one of the most persistent questions in education policy. And understandably so - under the last Government, spending on education doubled, so it's reasonable to want to know whether that made any difference to the "output" of the education system.

Unfortunately our main measuring tools in education - national exams - are useless for answering the question. First, because they've changed so often; second, because they are used for school accountability, so schools have got better at teaching to the test (this isn't a criticism - it's an entirely rational response to a clear incentive).

Professor Rob Coe at the University of Durham has made the best attempt at using alternative "low stakes" test data to look at standards over time. In his inaugural lecture he set out his analysis of international tests like PISA as well as Durham's own tests, used by many schools. His conclusion: "The best I think we can say is that overall there probably has not been much change."

In the absence of any better evidence I'd have to agree with Professor Coe that this is what the data shows. And yet it feels counter-intuitive. I've worked in education for ten years and it certainly feels to me that schools have improved over that time. Likewise most of the more experienced teachers and headteachers I've discussed this issue with think things have got significantly better too.

Of course this could simply be cognitive biases at work. We all desperately want things to improve so we convince ourselves they have. But a recently published DfE report suggests another possible explanation.

The snappily titled "Longitudinal study of young people in England: cohort 2" (LSYPE2) will track 13,000 young people from year 9 to the age of 20. The first cohort was tracked from 2004 to 2010. Comparing the two cohorts will show how things have changed over the past ten years. This report looks at the first year's data from the new cohort and compares it with the first year's data from 2004.

And the trends are very clear. Ironically, given the recent obsession with "character building" amongst policymakers, there have been big improvements in a range of "non-cognitive" measures. Reported bullying has fallen; the percentage who claim to have tried alcohol has plummeted; aspiration has increased - higher percentages say they are likely to go to university; and relationships with parents seem to be a lot stronger too.

This is entirely consistent with other data showing massive falls in risky behaviours by young people over the last decade - including a huge fall in criminal behaviour - as well as big increases in participation in education post-16 and in higher education.

All of this would suggest I, and others, are not imagining it when we claim that schools are - on average - nicer places to be than ten years ago. And that pupils are making more progress, at least in the sense that more are going on to further and higher education.

But here's the puzzle: given the improvements in behaviour, the reduction in criminality, the falls in truancy, the increase in aspiration and the improvements in home lives - all of which are known to link to academic attainment - why haven't we seen a commensurate, observable rise in academic standards? Either academic standards have actually improved, but we just don't have the measurements available to identify it properly, or something is happening in schools that's preventing us from capitalising on these "non-cognitive" improvements to genuinely raise standards. So what's going on?

All thoughts welcome!