Saturday 28 June 2014

The London Schools Effect - what have we learned this week?



Perhaps the biggest question in education policy over the past few years is why the outcomes for London schools have been improving so much faster than in the rest of the country. I wrote about it here last year. Until now there's been little in the way of research into the question, but last week two reports came out - one by the IFS and one from CFBT - that seek to provide some answers.

They both agree that the change in GCSE results has been spectacular. There's plenty of data in both reports on this, but I found this graph from the IFS particularly powerful because it relates to a metric that isn't something schools are held accountable for - and so feels like authentic proof that something extraordinary has happened in London.

[Graph from the IFS report]
But what, exactly, has happened? Here the two reports seem to disagree. According to the IFS - whose analysis is purely quantitative - the main reasons are:
  • Changes in pupil and school characteristics - in particular, London and other inner-city areas have seen an increase in pupils from a range of ethnic backgrounds, partly as a result of immigration. The IFS analysis suggests this accounts for about half the improvement in London between 2002-2012.
  • Changes in "prior attainment" - the authors argue that once higher levels of attainment in key stage 2 (end of primary) tests are taken into account, the "London effect" in secondaries looks less impressive. Indeed, once prior attainment and changes in pupil/school characteristics have been controlled for, the gap between London and the rest of the country falls from 21 percentage points on the 5 A*-C GCSEs including English and Maths measure to just 5 percentage points. Moreover this gap is fairly stable between 2002-2012 - though it does increase by about 2 percentage points over the period.
  • There was a big increase in key stage 2 scores for disadvantaged pupils between 1999-2003, and that led to big increases in GCSE scores for these pupils between 2004-08 - but the GCSE improvement was actually the result of this earlier improvement in prior attainment. The authors hypothesise this may be due to the introduction of "national strategies" in primary literacy and numeracy in the late 90s - these were piloted in inner London authorities (as well as some other urban areas, e.g. Liverpool).
  • London secondaries do have a better record at getting disadvantaged pupils to stay in education post-16. After controlling for pupil/school characteristics they are around 10 percentage points more likely to stay in education.

The CFBT report does include quantitative analysis but is much more focused on qualitative research - specifically interviews with headteachers, academics, civil servants and other experts. This report argues the key reasons for London's improvement are:
  • Four key "improvement interventions" between 2002 and 2014 - the "London Challenge" (a Labour initiative that used data to focus attention on weaker schools and used better schools to support their improvement); Teach First; the introduction of sponsored academies; and improvements driven by local authorities.
  • They conclude that: "each of these interventions played a significant role in driving improvement. Evaluations of each of these interventions have overall been positive, although the absence of RCT evidence makes it impossible to identify the precise gains from each set of activities. The exact causal mix also varied from borough to borough because there were variations in the level of involvement in London Challenge, variations in the effectiveness of local authority activity, variations in the level of ‘academisation’ and variations in the level of input from Teach First."
  • The authors argue that there were cross-cutting themes running through these interventions and the wider improvement story: in particular, the better use of data; practitioner-led professional development; and, above all, leadership - both political and at school level.

At first glance it's hard to reconcile the positions taken in the two reports. The IFS focus on primary schools and, to a lesser extent, pupil characteristics, while CFBT focus on secondary policy changes. I think, though, they are two different bits of an extremely complicated jigsaw that hasn't been finished yet - and, because of the lack of evidence/data, never will be. Like the apocryphal blind men with the elephant, they're looking at different parts of the whole.

1) Both reports probably underestimate the importance of changes in pupil characteristics. CFBT completely dismiss this as a driver based on an inadequate analysis of ethnicity data. The IFS analysis is more comprehensive and so does pick up a significant effect but may still miss the true extent because of the limitations of available data on ethnicity. I think this may explain the extent of the "primary effect" in the IFS report. Essentially they're saying the big improvements in GCSE results are partially illusory because they were already built into those pupils' primary attainment. However, they are unable (because of a lack of data) to analyse whether those primary results were also partly illusory because those pupils started primary at a higher level.

There is a clue that this may be a factor in their analysis of Key Stage 1 data for more recent years. Controlling for prior attainment at KS1 reduces the "London effect" at Key Stage 2 by about half. But the authors are unable to do this analysis for the crucial 1999-2003 period when results really improved. They are also unable to look from the beginning of primary - because we don't have baseline assessments when pupils start school.

2) The IFS report probably underestimates the secondary effect. As Chris Cook has shown, the London secondary effect at least doubles if you exclude equivalents.

3) The CFBT report definitely underestimates the primary effect because it doesn't look for it - though there are some quotes from people who worked in local authorities during the crucial period who highlight their focus on literacy and numeracy during the late 90s.

So pupil characteristics, primary schools and secondary schools all seem to have played a role in boosting attainment in London. The CFBT report is convincing on some of the factors at play in secondaries; the IFS report is convincing that primaries also played some kind of a role. The big questions for me after digesting both reports:

  • Are there "London specific" pupil characteristics that wouldn't be apparent from the available data? E.g. are immigrants who go to London different to those who don't? Are some of the ethnicity effects stronger than identified because key groups (e.g. Polish) are hidden in larger categories?
  • Are there policy reasons why London primaries improved faster than those elsewhere in the crucial 1999-2003 period? I struggle to buy the idea that the national strategies were the key driver here, as they were rolled out nationally (albeit that the pilots were focused on inner London). But the quotes in the CFBT report suggest there might be something here around a general focus on literacy/numeracy. This is a key area for further research.
  • To what extent were the policy interventions (London Challenge, academies etc...) the main reasons for secondary improvement? Or was it more to do with the number of good school leaders during that period? One of the most interesting tables in the CFBT report - pasted below - shows that inner London is the only part of the country where headteacher recruitment has got easier in the last ten years. And the importance of leadership shines through in the interviews conducted for the CFBT report. Is it possible to more closely identify the relationship between individual leaders and school improvement? What can we learn from these leaders?

[Table from the CFBT report: ease of headteacher recruitment by region]

And of course the really big question - is any of this replicable in other areas? We're starting to see a raft of local improvement initiatives across the country - Wales Challenge, Somerset Challenge, North East Challenge and so on. It's really important that in these areas we do a better job of evaluating all the interventions put in place from the start, so that if we see big improvements we have a better understanding of the causes.


Further reading:

The IFS report

The CFBT report

Chris Cook's analysis

Loic Menzies - one of the CFBT authors - on the two reports

The London Challenge evaluation by Merryn Hutchings and others

Transforming Education For All: The Tower Hamlets Story by Chris Husbands et al