Every now and then the Education Endowment Foundation (EEF) releases a series of reports. It is interesting to see the sheer difference in the media attention the individual reports generate, and also the buzz on social media. I collated all the studies in a Google spreadsheet:
I tabulated the name of the project, the project lead, the money involved (to me it is unclear whether this includes the cost of the evaluation), whether the project was completed or in progress (at the time of writing), the evaluator, the type of project, and the number of schools, and I added an indicator of whether there were signs of ‘significance testing’. In this post I want to summarise the recently released reports. I make no claim to cover all aspects of the studies.
Philosophy for Children (EEF page)
This was an effectiveness trial with 40 schools which evaluated Philosophy for Children (P4C), an “approach to teaching in which students participate in group dialogues focused on philosophical issues.” A key conclusion of the report was that there was a positive impact on KS2 attainment, equivalent to “2 months progress”. This project caused the most discussion, mainly because of some crucial aspects of the design. There was a guest post by Inglis, and this post.
Word and World Reading Programme (EEF page)
This was a pilot study of Core Knowledge materials (inspired by E.D. Hirsch) from The Curriculum Centre. I was surprised the blogosphere did not really pick up on this study. A suspicious mind might think this is because the results of this ‘Core Knowledge’ programme were quite underwhelming, which does not fit the preferences of ‘knowledge’ advocates. But of course, that is itself mere speculation 🙂 I will write a separate post on this.
Affordable Individual and Small Group Tuition: Primary (EEF page)
The key conclusion that stands out: “Due to the study’s design and problems recruiting schools to receive tuition or participate in the evaluation, this evaluation has not provided a secure estimate of the impact of the project on pupil outcomes.” And also: “Participating pupils made slightly less progress in both English and mathematics than those in the matched comparison group. However, this finding was not statistically significant, meaning that it could have occurred by chance.” Staff members were positive, and the recommendations are essentially to improve and try again.
Affordable Individual and Small Group Tuition: Secondary (EEF page)
“Due to the limitations of the study design and the absence of a high-quality comparison group, this evaluation has not provided a secure estimate of the impact of the project on academic outcomes.” And also: “Participating pupils achieved slightly higher mathematics GCSE scores than pupils in the comparison group, and lower English GCSE scores than pupils in the comparison group. However, it is not possible to attribute either change to the tuition provided.” Staff members were positive.
Graduate Coaching Programme (EEF page)
This trial showed a positive effect with moderate security: “The programme had a positive impact on pupils’ attainment in reading, spelling and grammar, equivalent to approximately five additional months’ progress. The evaluation did not seek to prove that the approach would work in all schools, but did identify strong evidence of promise.” The cost was quite high, and delivery varied considerably.
Peer Tutoring in Secondary Schools (EEF page)
This study concluded: “This evaluation does not provide any evidence that the Paired Reading programme had an impact on overall reading ability, sentence completion and passage comprehension of participating pupils.” The security of the findings was high and the cost relatively low.
Shared Maths (EEF page)
The evaluation of this £750k+ project concluded: “This evaluation does not provide any evidence that the Durham Shared Maths programme had an impact on attainment in maths, when used with Year 5 and 3 pupils.” The evidence strength was high and the cost low.
Talk for Writing (EEF page)
This project is “an approach to teaching writing that encompasses a three-stage pedagogy”. Teachers were enthusiastic about the implementation, which went quite smoothly. The evidence was mixed, although teachers reported that the approach had an impact (this seems to be a theme: teachers believing something has an impact while the evidence does not bear it out).
Some observations from these studies
What strikes me in most of these studies is that:
- Most studies report quite small or no effects.
- Most studies report that staff are positive about the interventions, which suggests that effectiveness and teachers’ perceptions are only weakly related.
- Effects are often worded positively even when they are small or non-significant (a few reports by one evaluator even make a case against Null Hypothesis Significance Testing, a position I understand but find strange given that the majority of reports rely on it); see the sketch after this list.
- Some reports mention ‘redeeming factors’ for non-effects, for example low costs. As with the Maths Masters study, ‘low cost’ seems to automatically make an intervention worthwhile, even when effects are very small or absent.
- Pilots mainly concluded that (i) the approach was feasible, (ii) the results were mixed, and (iii) the intervention needed further development before a full trial.
- There are many ‘arguments’ for further study along the lines of “more time is needed” or “larger samples are needed”, even when the initial studies spent significant amounts of money and had decent samples.
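
The third point is easy to see with a quick simulation. Below is a minimal sketch (in Python, with assumed illustrative numbers: a true effect of 0.1 standard deviations and 50 pupils per trial arm, none of which come from the actual reports) showing how a small real effect routinely produces a positive-looking point estimate that nevertheless fails a significance test.

```python
# A minimal sketch of the point above; all numbers are assumed for
# illustration and are not taken from any EEF report.
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
d, n, trials = 0.1, 50, 1000  # effect size, pupils per arm, simulated trials

significant = 0
for _ in range(trials):
    control = rng.normal(0.0, 1.0, n)  # standardised scores, comparison group
    treated = rng.normal(d, 1.0, n)    # same spread, shifted up by d SDs
    _, p = stats.ttest_ind(treated, control)
    significant += p < 0.05

print(f"Runs reaching p < 0.05: {significant / trials:.0%}")
```

With these numbers, only around one run in ten reaches significance, so a report can truthfully describe ‘slightly better’ outcomes while noting that the finding ‘could have occurred by chance’, which is exactly the wording seen in the tuition evaluations above.
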
Why is this notable? Well, for me mainly because EEF reports have been put forward as telling us ‘what works’. I would be the first to acknowledge that we need a range of qualitative and quantitative research, and that means, in my book, that there *should* be space for smaller-scale studies as well. However, this does not seem to be the premise of most of the studies conducted. If £2.4 million is spent on 8 projects, I would hope the conclusions would be a bit more informative than ‘we should try more and harder’. I think it would be good if the reports presented *only* the results and did not make recommendations.