
Monday, 6 April 2015

The birth of a zombie statistic


Last week the "i" newspaper splashed on a startling statistic: "40% of teachers leave within one year". It has since been repeated in the Guardian, Times, Mail, Observer and probably hundreds of other places.* It was cited in this weeks' Any Questions. It's been tweeted by thousands of people.

The only problem is that it's entirely untrue. 9% of teachers leave in their first year (Table C2). It's been 9 or 10% every year for the last 20 years. This isn't particularly interesting; it isn't news; but it is true.

The 40% figure comes from ATL, who press-released the numbers to generate publicity for their annual conference. To be fair to ATL, they never claimed that 40% of teachers leave within the first year. Their claim was that 40% of those who achieve qualified teacher status (QTS) aren't teaching after a year - which includes people who qualified but then never went into a teaching job in the first place. They generated this number by adding the 9% who leave in the first year to a figure from another table showing that 10,800 people (roughly a third of the cohort) who achieved QTS in 2011 never started teaching (Table I2).

Even if this number were correct, all the newspaper reports would still be wrong, because they're making a claim about the numbers who start teaching and then leave within a year. However, the 10,800 number is also wrong because it is generated from pensions data, which omits certain groups of people. The correct data to use if you want to see how many people gain QTS and then don't start teaching is here in Table 5. This shows just 15% of those who gained QTS in 2012 were either not in a teaching job or had an "unknown" status six months after completion. It was 16% in 2011 (Table 1).
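For anyone who wants to check the arithmetic themselves, here's a rough sketch in Python of how a ~40% headline can be assembled from the figures above. ATL haven't published their exact method, so the weighting of the 9% below is my reconstruction - simply adding the two figures together gets you to roughly the same place.

```python
# Rough reconstruction of the "40%" claim from the figures cited in this post.
# ATL's exact method isn't published, so this is a best guess at the arithmetic.

never_start = 1 / 3        # ~10,800 of the 2011 QTS cohort, "roughly a third", never started (Table I2, pensions data)
leave_year_one = 0.09      # ~9% of those who do start leave within a year (Table C2)

not_teaching_after_a_year = never_start + leave_year_one * (1 - never_start)
print(f"QTS holders 'not teaching' a year on: ~{not_teaching_after_a_year:.0%}")  # ~39%, headlined as "40%"

# The destinations data gives the more reliable picture:
print("Not in a teaching post or 'unknown' six months after qualifying: 15% (2012, Table 5), 16% (2011, Table 1)")
```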

This matters because the 40% figure creates a false narrative about a profession in crisis. I agree with ATL that teacher workload is too high - often driven by nonsense compliance rules around marking and planning. I agree that it's a very stressful and tiring job and that many first-year teachers don't get the support they need. But the vast majority of those who start teaching do stay and succeed. Exaggerating the problem through dodgy statistics risks putting off new entrants to the profession - something we really can't afford at a time when an improving economy and changes to teacher training are creating serious recruitment issues.


*Massive credit to Schools Week for being the only publication to realise there was something dodgy about the statistic.


Saturday, 7 March 2015

What we should have put in the White Paper


One of the problems with the education debate in England is the tendency to focus on the merits of individual policies - “should we decouple A-levels?”; “are free schools working?” – rather than thinking strategically about what we’d like the system to look like and then using that template for making policy decisions.

My big regret about the 2010 White Paper is that it reads too much like a laundry list of policies rather than a set of design principles for system reform. The vision of a school-led system is explicit but there’s too little about what that means. Having a clearer set of design principles would have made it much easier to explain how various policies fitted into the overall picture and would have provided a firebreak against Ministers/No. 10 inserting their own random or contradictory policies into the mix.

So what would the core building blocks of a genuinely school-led system be? I think there are three key elements: school autonomy; accountability; and capacity-building.

Autonomy is important because it leads to: faster decision-making, as you don’t have to wait for a request to go up the chain; innovation, because not everyone is following the same model; and accuracy, because decisions are based on local information rather than information aggregated at the national or regional level.

Accountability is important because transparent information leads to: the ability to uphold minimum standards; schools being able to benchmark their performance against others and identify areas for improvement; parents being able to more accurately assess their options.

But autonomy and accountability aren’t enough. Accountability creates incentives to perform well (along, of course, with teachers’ typically high intrinsic motivation) and autonomy gives schools the agency to perform well, but neither gives them the capacity to perform well if they don’t know how. This is why my third building block is capacity-building. A school-led system needs the institutional infrastructure to broker support between strong and weaker schools without impeding their autonomy.

At the moment we have all the elements of this system but the balance is not yet right. Autonomy is impeded by an accountability system that is too punitive, and the infrastructure for capacity-building is under-resourced and patchy. The links between the accountability system and capacity-building are too weak, leaving struggling schools unclear about what they need to do to improve (though the introduction of Regional Schools Commissioners has mitigated this to some extent).

So what might a set of principles based on these elements, which would allow us to realign the system, look like?

Autonomy:

1) Schools should have authority over all their functions apart from those that require co-ordination between schools (e.g. exclusions; admissions; place planning).

2) Where functions need to be carried out above the school level they should – where possible – be done through collective agreement at the local level.

3) Schools should be funded consistently regardless of where they are in the country so they have the necessary resources to fulfil their functions.

Accountability:

1) Accountability should be based on outputs (e.g. test results; destination data) and not inputs (e.g. whether a particular form of pedagogy is being practised).

2) The consequences of accountability should be proportionate and in particular should not disadvantage schools with lower-attaining intakes.

3) Accountability systems should reward collaborative behaviour where it leads to improvements.

4) All data/information should be published (unless doing so would break data protection law).

Capacity-building:

1) Where schools are considered to be below a minimum standard there should be immediate intervention.

2) For all schools the accountability system should be linked to means of getting support for areas requiring improvement.

3) Support should be available to all schools regardless of where they are in the country.


I’ve come up with these suggestions by myself and in a hurry, so they’re unlikely to be right and certainly aren’t exhaustive. My aim is to illustrate the sort of discussion we should be having. Are these the right principles? If not, why not, and what should we have instead? If they are right, what would have to change in the system to ensure they were kept?

Next week I’ll explore this last question. What kind of policy changes would be necessary to make these principles a reality?

Thursday, 12 February 2015

What do Labour's spending plans mean for schools?



Last week I wrote this on the Conservatives' plans for school spending. My conclusion was that over the next Parliament their plans would mean a 10.5% cut to the amount schools get per pupil (depending on inflation).

Today it was Labour's turn to announce their plans. They pledged to increase the schools' budget in line with inflation and, unlike the Conservatives, they have extended this to include early years and 16-19 provision.

On the face of it this looks like a much better pledge for schools. And for early years and 16-19 institutions it is. However, Labour haven't pledged to increase budgets per pupil. So while the current budget for schools (5-16) will increase in line with inflation, it won't be increased to take account of the very substantial increase in pupil numbers over the next Parliament. I estimate this will mean a cut to the amount schools get per pupil of 9.5%, compared to the Conservatives' 10.5%.

Here's my working:

  • The DfE predicts there will be 566k more pupils in 2020 than in 2015. As these children have mostly been born already this is a fairly safe prediction. Average per pupil costs are £4.5k a year, but there will be larger proportionate increases in secondary and special school places, which are more expensive than primary. When this distribution is taken into account (see detailed working at the end of the post*) the total cost is £2.85 billion. The current schools budget is £41.6 billion, so those additional pupils represent an effective 7% cut.
  • On top of this there are upcoming increases to schools' contributions to teacher pensions and National Insurance that represent an effective 2.5% cut. These are explained in more detail in my blog on the Conservatives' plans.

This is actually much easier to calculate than the effect of the Conservative plans because it doesn't rely on assumptions about inflation. There will definitely be a cut in this range if pupil numbers rise as per projections. It also means that if inflation is lower than my assumptions the Conservative plans would actually be better for schools than Labour's (unless they have a sixth form). Equally, if inflation rises above my assumptions the Conservative cut grows whereas the Labour one stays the same.

Either way it now looks certain that schools will see a significant cut in their budgets over the next Parliament - though probably a smaller one than other public sector institutions outside the NHS will face. All headteachers and business managers will need to work through the implications for their schools. Austerity is here to stay.



* According to the DfE figures there will be 307k more secondary pupils; 245k more primary pupils; and 15k more pupils in special or alternative provision settings. The DfE doesn't publish per pupil rates for different stages but the average primary rate (before things like pupil premium and deprivation funding are added) was just under £3k. The KS3 rate was just over £4k and the KS4 rate was £4.6k. I have assumed the additional funding on top of this is roughly equivalent across schools (primaries get more pupil premium but secondaries get more of other types of additional funding). Given the per pupil average is £4.5k, I've calculated an additional primary place at £3.5k and an additional secondary place at £5.75k (to balance KS3 and KS4). I've assigned a figure of £15k per place to special schools - this is definitely an underestimate but I can't find a source for the per pupil amount and I want to be conservative in my assumptions. So:

307k secondary pupils x £5750
245k primary pupils x £3500
15k special/AP pupils x £15000

= £2.85 billion.

Because of the lack of data this is an estimate and may be out by half a percentage point or so either way.
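For anyone who wants to reproduce the working, here's a short Python sketch using only the figures quoted in this post. The per-place rates are the assumptions set out in the footnote above, and the pupil-growth effect comes out at just under 7% before rounding.

```python
# Sketch of the per-pupil cut implied by Labour's pledge, using the figures in this post.

SCHOOLS_BUDGET = 41.6e9  # current 5-16 schools budget (£)

# Extra pupils expected by 2020 and the per-place rates assumed in the footnote
extra_places = [
    (307_000, 5_750),     # secondary
    (245_000, 3_500),     # primary
    (15_000, 15_000),     # special / alternative provision
]

extra_cost = sum(pupils * rate for pupils, rate in extra_places)
print(f"Cost of extra pupils: £{extra_cost / 1e9:.2f}bn")        # ~£2.85bn

pupil_growth_cut = extra_cost / SCHOOLS_BUDGET                    # ~6.8%, rounded to 7% in the post
pensions_ni_cut = 0.025                                           # ~2.5% from pension and NI increases

print(f"Effective per-pupil cut: ~{pupil_growth_cut + pensions_ni_cut:.1%}")   # ~9.3-9.5% depending on rounding
```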

Monday, 2 February 2015

What do Conservative spending plans mean for schools?



Just after the Prime Minister's speech on education earlier today I tweeted:

"PM confirms Tory school spending plans for next Parliament - flat cash per pupil. Combined with NI/pension changes = at least 10% cut".

This got picked up by a few news outlets this afternoon as a source for how big the cut would be. So I thought I'd better explain my working and add a few other points too detailed for a tweet.

I derived the "10%" from two things.

First, over the next couple of years schools will need to pay around £350-400 million more into teacher pensions as employer contributions increase. On top of that there will be an additional £550-600 million in National Insurance for schools to pay as a result of changes to state pensions in 2016. These figures come from a very helpful paper produced by the Association of Colleges. The schools budget is £41.6 billion, so these changes represent a cut of around 2.5%.

Secondly, the "flat cash per pupil" settlement announced by the PM today means that from 2016 schools will not see their income rise in line with inflation. Based on the latest Bank of England estimates I've assumed inflation will run at: 0.5%; 1.5%; 2%; 2%; 2% over the next Parliament - 8% in total.

Add 8% to 2.5% and you get a 10.5% reduction in the amount schools receive per pupil. Obviously if inflation is lower the cut will be lower and if higher the cut will be higher.
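As a rough check, here's the same working as a short Python sketch. The pension and NI figures are midpoints of the ranges quoted above (my assumption), and the inflation rates are simply summed rather than compounded, mirroring the calculation in this post.

```python
# Sketch of the "at least 10%" figure, using the numbers quoted in this post.

SCHOOLS_BUDGET = 41.6e9                               # £41.6bn schools budget

# Extra pension and NI costs: midpoints of the £350-400m and £550-600m ranges (my assumption)
pension_ni_costs = 375e6 + 575e6
pension_ni_cut = pension_ni_costs / SCHOOLS_BUDGET    # ~2.3%, rounded to ~2.5% in the post

# Assumed inflation over the next Parliament, summed (not compounded), as in the post
inflation = [0.5, 1.5, 2.0, 2.0, 2.0]
inflation_cut = sum(inflation)                        # 8% in total

print(f"Pension/NI effect:        ~{pension_ni_cut:.1%}")
print(f"Flat-cash inflation loss: ~{inflation_cut:.0f}%")
print(f"Total per-pupil cut:      ~{pension_ni_cut * 100 + inflation_cut:.1f}%")   # ~10.3-10.5%
```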

Some additional points:

  • The vast majority of school spending is on staff and we can probably expect pay rises of 1% to continue (though schools have freedom over this so could choose to pay more or less). This means that, in practice, they won't feel the full 10.5% cut as their main area of expenditure will increase at levels below inflation.
  • On the other hand the protection announced today doesn't cover either the pupil premium or 16-19 budgets. These could be cut by more. 16-19 has been cut over this Parliament so schools with sixth forms are already hurting.
  • Moreover other budget cuts to welfare and social care have knock-on effects on schools that aren't accounted for. This is, naturally, especially true of schools in poorer areas.
  • It is not yet clear whether these cuts will hit all schools equally. The current Government plans to shift to a National Funding Formula after this election. This could mean some schools in currently overfunded areas lose more while others receive some protection. As schools currently hold surpluses of £4.5bn - which are not evenly distributed across the country - this would be sensible.

It's also worth noting that, given overall Conservative spending plans, this settlement will still leave schools much better off than other non-health departments like the Home Office. Moreover schools have had a relatively generous settlement this Parliament - seeing their revenue funding increase on average by 1% per pupil in real terms (not including 16-19 spending).

Labour and the Lib Dems plan to cut less over the next Parliament so have more leeway. Labour have not yet revealed their plans, and I'll update this blog when they do. The Lib Dems have promised to protect all education funding from 3-19 in real terms but have not yet said if this will be a "red line" in coalition negotiations. As all parties have accepted the NI/pensions changes even "real terms protection" will feel like a small cut.



Friday, 2 January 2015

The Standards Puzzle


"Have standards improved over time?" is one of the most persistent questions in education policy. And understandably so - under the last Government spending on education doubled, so it's reasonable to want to know if that's made any difference to the "output" of the education system.

Unfortunately our main measuring tools in education - national exams - are useless for answering the question. First, because they've changed so often; and second, because they are used for school accountability, so schools have got better at teaching to the test (this isn't a criticism - it's an entirely rational response to a clear incentive).

Professor Rob Coe at the University of Durham has made the best attempt at using alternative "low stakes" test data to look at standards over time. In his inaugural lecture he set out his analysis of international tests like PISA as well as Durham's own tests, used by many schools. His conclusion: "The best I think we can say is that overall there probably has not been much change."

In the absence of any better evidence I'd have to agree with Professor Coe that this is what the data shows. And yet it feels counter-intuitive. I've worked in education for ten years and it certainly feels to me that schools have improved over that time. Likewise most of the more experienced teachers and headteachers I've discussed this issue with think things have got significantly better too.

Of course this could simply be cognitive biases at work. We all desperately want things to improve so we convince ourselves they have. But a recently published DfE report suggests another possible explanation.

The snappily titled "Longitudinal study of young people in England: cohort 2" (LSYPE2) will track 13,000 young people from year 9 to the age of 20. The first cohort were tracked from 2004-2010. Comparing the two cohorts will show how things have changed over the past ten years. This report looks at the first year's data from the new cohort and compares it with the first year's data from 2004.

And the trends are very clear. Ironically, given the recent obsession with "character building" amongst policymakers, there have been big improvements in a range of "non-cognitive" measures. Reported bullying has fallen; the percentage who claim to have tried alcohol has plummeted; aspiration has increased - higher percentages say they are likely to go to university; and relationships with parents seem to be a lot stronger too.

This is entirely consistent with other data showing massive falls in risky behaviours by young people over the last decade - including a huge fall in criminal behaviour - as well as big increases in participation in education post-16 and in higher education.

All of this would suggest I, and others, are not imagining it when we claim that schools are - on average - nicer places to be than ten years ago. And that pupils are making more progress, at least in the sense that more are going on to further and higher education.

But here's the puzzle: given the improvements in behaviour; the reduction in criminality; the falls in truancy; the increase in aspiration; the improvements in home lives - all of which are known to link to academic attainment - why haven't we seen a commensurate, observable rise in academic standards? Either academic standards have actually improved but we just don't have the measurements available to identify it properly, or something is happening in schools that's preventing us capitalising on these "non-cognitive" improvements to genuinely improve standards. So what's going on?

All thoughts welcome!


Tuesday, 15 July 2014

Education after Gove


I wasn't expecting to be writing this post today. There has been a rumour going round for months that Michael Gove would be moved to an election role - though as Party Chair rather than Chief Whip. But in recent weeks all the noises were that he would be staying in post until the election.

Tim Montgomerie said earlier on twitter that: "I understand Osborne opposed Gove move but dire opinion polling presented by Lynton Crosby of MG's standing with teachers forced change." Another possibility is that he's simply got fed up with being blocked on any further policy by the Lib Dems. If holding the fort until the election is all that's left then he's happy for someone else to do it.

Whatever the reason, what does his departure mean for education? Here are a few initial thoughts:

1) There are unlikely to be any major policy reversals. No. 10 have very deliberately ensured the new Secretary of State Nicky Morgan is surrounded by Govites - Nick Boles, Nick Gibb and John Nash. Moreover Gove himself will still be in No. 10 and will be in the PM's daily meeting - he is still in a position to prevent anything he thinks would significantly undermine his legacy. What's more the election is only 10 months away - it's not a time for big U-turns.

2) I don't know Nicky Morgan and she doesn't have a track record in education but I'm sure, like all ministers, she will want some specific policies that are identified as hers. Briefing around her appointment suggests her thing will be early years and the Conservatives certainly see this as a key election battleground. Labour already have some big and expensive policies in this space.

3) Some officials will see the appointment of a new Secretary as an opportunity to tweak various policies they are worried about. The slew of upcoming exam changes is an obvious area where they may try to use her appointment to lengthen the timelines of reform. There is also an on-going review of ITT which may not now go as far as might have been expected.

4) There won't be much time for the new Secretary to learn her brief. Her first crisis might come as soon as next month. Ofqual recently wrote to schools saying they expected greater than normal turbulence in exam results this summer as a result of earlier reforms (including linearity and the end of vocational equivalences). Last time there was greater than normal turbulence - in English results in 2012 - there was a firestorm of complaints from schools that ended in a judicial review. Even if results week passes without incident, in September we have the launch of a new curriculum; new rules on assessment; the introduction of free school meals in key stage one; compulsory English and Maths post-16 for those without GCSE "C" grades; and about a dozen other things. While the Govian reform phase may be over, the implementation phase is at a critical moment.

5) Gove's enemies may be celebrating prematurely. Though policy is unlikely to change much it will be significantly harder to demonise Nicky Morgan than it has been to attack Gove. He was something of a unifying factor for the teacher unions - the last NUT strike was effectively an anti-Gove demonstration. They may find their campaigns lose some momentum now.

Now is not the time for a proper retrospective of Michael Gove's time at the DfE. But - as Andrew Old says - perhaps his greatest achievement has been to normalise comprehensive education for the Conservative party; to shift the argument from "saving" a few bright poor kids through grammar schools or assisted places to creating a genuinely world class system for all. In time I suspect that will be more widely recognised than it is now.


Saturday, 28 June 2014

The London Schools Effect - what have we learned this week?



Perhaps the biggest question in education policy over the past few years is why the outcomes for London schools have been improving so much faster than in the rest of the country. I wrote about it here last year. Until now there's been little in the way of research into the question but last week two reports came out - one by the IFS and one from CFBT - that seek to provide some answers.

They both agree that the change in GCSE results has been spectacular. There's plenty of data in both reports on this but I found this graph from the IFS particularly powerful because it relates to a metric that isn't something schools are held accountable to - and so feels like authentic proof that something extraordinary has happened in London.

[Graph from the IFS report]

But what, exactly, has happened? Here the two reports seem to disagree. According to the IFS - whose analysis is purely quantitative - the main reasons are:
  • Changes in pupil and school characteristics - in particular London and other inner-city areas have seen an increase in pupils from a range of ethnic backgrounds (partly) as a result of immigration. The IFS analysis suggests this accounts for about half the improvement in London between 2002-2012.
  • Changes in "prior attainment" - the authors argue that once higher levels of attainment in key stage 2 (end of primary) tests are taken into account then the "London effect" in secondaries looks less impressive. Indeed once prior attainment and changes in pupil/school characteristics have been controlled for the gap between London and the rest of the country falls from 21 percentage points in the  5 A*-C GCSE with English and Maths measure to just 5 percentage points. Moreover this gap is fairly stable between 2002-2012 - though it does increase a by about 2 percentage points over the period.
  • There was a big increase in key stage 2 results for disadvantaged pupils between 1999-2003, which fed through into big increases in GCSE scores for these pupils between 2004-08 - so the GCSE improvement largely reflects improved prior attainment. The authors hypothesise this may be due to the introduction of "national strategies" in primary literacy and numeracy in the late 90s - these were piloted in inner London authorities (as well as some other urban areas e.g. Liverpool).
  • London secondaries do have a better record at getting disadvantaged pupils to stay in education post-16. After controlling for pupil/school characteristics they are around 10 percentage points more likely to stay in education.

The CFBT report does include quantitative analysis but is much more focused on qualitative research - specifically interviews with headteachers, academics, civil servants and other experts. This report argues the key reasons for London's improvement are:
  • Four key "improvement interventions" between 2002 and 2014 - the "London Challenge" (a Labour initiative that used data to focus attention on weaker schools and used better schools to support their improvement); Teach First; the introduction of sponsored academies; and improvements driven by local authorities.
  • They conclude that: "each of these interventions played a significant role in driving improvement. Evaluations of each of these interventions have overall been positive, although the absence of RCT evidence makes it impossible to identify the precise gains from each set of activities. The exact causal mix also varied from borough to borough because there were variations in the level of involvement in London Challenge, variations in the effectiveness of local authority activity, variations in the level of ‘academisation’ and variations in the level of input from Teach First."
  • The authors argue that there were cross-cutting themes covering these interventions and the wider improvement story: in particular, the better use of data; practitioner-led professional development; and, above all, leadership - at both the political and the school level.

At first glance it's hard to reconcile the positions taken in the two reports. The IFS focus on primary, and to a lesser extent pupil characteristics, while CFBT focus on secondary policy changes. I think, though, they are two different bits of an extremely complicated jigsaw that hasn't been finished yet and - because of the lack of evidence/data - never will be. Like the apocryphal blind men with the elephant, they're looking at different parts of the whole.

1) Both reports probably underestimate the importance of changes in pupil characteristics. CFBT completely dismiss this as a driver based on an inadequate analysis of ethnicity data. The IFS analysis is more comprehensive and so does pick up a significant effect but may still miss the true extent because of the limitations of available data on ethnicity. I think this may explain the extent of the "primary effect" in the IFS report. Essentially they're saying the big improvements in GCSE results are partially illusory because they were already built into those pupils' primary attainment. However, they are unable (because of a lack of data) to analyse whether those primary results were also partly illusory because those pupils started primary at a higher level.

There is a clue that this may be a factor in their analysis of Key Stage 1 data for more recent years. Controlling for prior attainment at KS1 reduces the "London effect" at Key Stage 2 by about half. But the authors are unable to do this analysis for the crucial 1999-2003 period when results really improved. They are also unable to look from the beginning of primary - because we don't have baseline assessments when pupils start school.

2) The IFS report probably underestimates the secondary effect. As Chris Cook has shown, the London secondary effect at least doubles if you exclude equivalents.

3) The CFBT report definitely underestimates the primary effect because it doesn't look for it, though there are some quotes from people who worked in local authorities during the crucial period highlighting their focus on literacy and numeracy in the late 90s.

So pupil characteristics; primary schools and secondary schools all seem to have played a role in boosting attainment in London. The CFBT report is convincing on some of the factors at play in secondaries; the IFS report is convincing that primaries also played some kind of a role. The big questions for me after digesting both reports:

  • Are there "London specific" pupil characteristics that wouldn't be apparent from the available data. E.g. are immigrants who go to London different to those who don't? Are some of the ethnicity effects stronger than indentified because key groups (e.g. Polish) are hidden in larger categories?
  • Are there policy reasons why London primaries improved faster than those elsewhere in the crucial 1999-2003 period? I struggle to buy the idea that the national strategies were the key driver here as they were rolled out nationally (albeit that the pilots were focused on inner London). But the quotes in the CFBT report suggest there might be something here around a general focus on literacy/numeracy. This is a key area for further research.
  • To what extent were the policy interventions (London Challenge, academies etc...) the main reasons for secondary improvement? Or was it more to do with the number of good school leaders during that period? One of the most interesting tables in the CFBT report - pasted below - shows that inner London is the only part of the country where headteacher recruitment has got easier in the last ten years. And the importance of leadership shines through in the interviews conducted for the CFBT report. Is it possible to more closely identify the relationship between individual leaders and school improvement? What can we learn from these leaders?

[Table from the CFBT report: changes in headteacher recruitment difficulty by region]

And of course the really big question - is any of this replicable in other areas? We're starting to see a raft of local improvement initiatives across the country - Wales Challenge; Somerset Challenge; North East Challenge and so on. It's really important that in these areas we do a better job of evaluating all the interventions put in place from the start, so that if we see big improvements we have a better understanding of the causes.


Further reading:

The IFS report

The CFBT report

Chris Cook's analysis

Loic Menzies - one of the CFBT authors - on the two reports

The London Challenge evaluation by Merryn Hutchings and others

Transforming Education For All: The Tower Hamlets Story by Chris Husbands et al