
July 2015 round of EEF projects

Every now and then the Education Endowment Foundation (EEF) releases a series of reports. It’s interesting to see the sheer difference in the media attention they attract, and also the buzz on social media. I collated all the studies in a Google spreadsheet.

I tabulated the name of each project, the project lead, the money involved (to me it is unclear whether this includes the cost of evaluation), whether the project was completed or in progress (at the time of writing), the evaluator, the type of project, the number of schools, and an indicator of whether there were signs of ‘significance testing’. In this post I want to summarise the recently released reports. I make no claim to cover all aspects of the studies.
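For anyone who wants to recreate this kind of collation programmatically, here is a minimal sketch in Python/pandas. The column names mirror the fields listed above, but the rows are pure placeholders, not values from my spreadsheet.

```python
# A minimal sketch of the collated table in pandas. Column names mirror
# the fields tabulated above; the rows are illustrative placeholders,
# not figures from the actual spreadsheet.
import pandas as pd

projects = pd.DataFrame(
    [
        # project, lead, cost (GBP), status, evaluator, type, schools, sig. testing
        ("Project A", "Lead A", 750_000, "completed", "Evaluator A",
         "effectiveness trial", 40, True),
        ("Project B", "Lead B", 300_000, "in progress", "Evaluator B",
         "pilot", 12, False),
    ],
    columns=["project", "project_lead", "cost_gbp", "status",
             "evaluator", "project_type", "n_schools", "significance_testing"],
)

# Example query: completed projects that show signs of significance testing.
print(projects.query("status == 'completed' and significance_testing"))
```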

Philosophy for Children (EEF page)
This was an effectiveness trial with 40 schools which evaluated Philosophy for Children (P4C), an “approach to teaching in which students participate in group dialogues focused on philosophical issues.” A key conclusion in the report was that there was a positive impact on KS2 attainment, with “2 months progress”. This project caused the most discussion, mainly because of some crucial aspects of the design. There was a guest post by Inglis, and this post.

Word and World Reading Programme (EEF page)
This was a pilot study of Core Knowledge materials (inspired by E.D. Hirsch) from The Curriculum Centre. I was surprised the blogosphere did not really pick up on this study. A suspicious mind might think this is because the results of this ‘Core Knowledge’ programme were quite underwhelming, which does not fit the preferences of ‘knowledge’ advocates. But of course, that is just as suggestive 🙂 I will write a separate post on this.

Affordable Individual and Small Group Tuition: Primary (EEF page)
The key conclusions that stand out: “Due to the study’s design and problems recruiting schools to receive tuition or participate in the evaluation, this evaluation has not provided a secure estimate of the impact of the project on pupil outcomes.” and “Participating pupils made slightly less progress in both English and mathematics than those in the matched comparison group. However, this finding was not statistically significant, meaning that it could have occurred by chance.” Staff members were positive. The recommendations focus on how the programme could be improved.

Affordable Individual and Small Group Tuition: Secondary (EEF page)
“Due to the limitations of the study design and the absence of a high-quality comparison group, this evaluation has not provided a secure estimate of the impact of the project on academic outcomes.” and “Participating pupils achieved slightly higher mathematics GCSE scores than pupils in the comparison group, and lower English GCSE scores than pupils in the comparison group. However, it is not possible to attribute either change to the tuition provided.” Staff members were positive.

Graduate Coaching Programme (EEF page)
This trial showed a positive effect with moderate security: “The programme had a positive impact on pupils’ attainment in reading, spelling and grammar, equivalent to approximately five additional months’ progress. The evaluation did not seek to prove that the approach would work in all schools, but did identify strong evidence of promise.” The cost was quite high, and delivery varied considerably.

Peer Tutoring in Secondary Schools (EEF page)
This study concluded: “This evaluation does not provide any evidence that the Paired Reading programme had an impact on overall reading ability, sentence completion and passage comprehension of participating pupils.” The security of this finding is high and the cost relatively low.

Shared Maths (EEF page)
This evaluation (of a £750k+ project) concluded: “This evaluation does not provide any evidence that the Durham Shared Maths programme had an impact on attainment in maths, when used with Year 5 and Year 3 pupils.” The evidence strength was high and the cost low.

Talk for Writing (EEF page)
This project is “an approach to teaching writing that encompasses a three-stage pedagogy”. Teachers were enthusiastic and implementation went quite smoothly. The evidence was mixed, although teachers reported it had an impact (this seems to be a theme: teachers believing something has an impact while the evidence does not bear it out).

Some observations from these studies
What strikes me about most of these studies is that:

  • Most studies report quite small or no effects.
  • Most studies report that staff are positive about the interventions, which suggests that effectiveness and teachers’ perceptions are only weakly related.
  • Effects are often worded positively even when they are small or non-significant; see the toy simulation after this list. (A few reports by one evaluator even make a case against Null Hypothesis Significance Testing, which I understand but find strange given the approach of the majority of reports.)
  • Some reports mention ‘redeeming factors’ for null effects, for example low costs. As with the Maths Masters study, it seems that ‘low cost’ automatically makes an intervention worthwhile, even when effects are very small or absent.
  • Pilots mainly concluded that (i) yes, the approach was feasible, (ii) the results were mixed, and (iii) the intervention needed further development before a full trial.
  • There are many ‘arguments’ for further study along the lines of “more time is needed” or “larger samples are needed”, even when the initial studies spent significant amounts of money and had decent samples.
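To make the significance point concrete, here is a toy sketch in Python using made-up data (nothing from the EEF reports): two groups with a true standardised difference of d = 0.1, which is roughly the region that, as far as I can tell, gets translated into a month or two of ‘progress’. At a realistic sample size such an effect will usually fail a significance test, even though the point estimate is positive.

```python
# A toy illustration (not EEF data) of how a small effect can fail to
# reach significance: two groups with a true standardised difference of
# d = 0.1, sampled at a modest size per arm.
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
control = rng.normal(loc=0.0, scale=1.0, size=100)
treated = rng.normal(loc=0.1, scale=1.0, size=100)  # true d = 0.1

# Cohen's d from the samples (pooled standard deviation, equal n per arm).
pooled_sd = np.sqrt((control.var(ddof=1) + treated.var(ddof=1)) / 2)
d = (treated.mean() - control.mean()) / pooled_sd

t, p = stats.ttest_ind(treated, control, equal_var=False)  # Welch's t-test
print(f"Cohen's d = {d:.2f}, p = {p:.2f}")
# With n = 100 per arm, a true d of 0.1 will usually give p well above
# 0.05: a 'positive' point estimate the significance test cannot support.
```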

Why is this notable? Well, for me mainly because EEF reports have been presented as telling us ‘what works’. I would be the first to acknowledge that we need a range of qualitative and quantitative research, and that means, in my book, that there *should* be space for smaller-scale studies as well. However, this does not seem to be the premise of most of the studies conducted. If £2.4 million is spent on 8 projects, I would hope the conclusions would be a bit more informative than ‘we should try more and harder’. I think it would be good if the reports reported *only* the results and did not make recommendations.

 

By cbokhove


5 replies on “July 2015 round of EEF projects”

Agree completely Christian. Over at Schools Week we do cover EEF reports, and spend a lot of time thinking about how best to do coverage and whether there are, genuinely, lessons to take away – or not. It’s a tough balance.

How do you think newspapers can best report this sort of thing?

It is indeed a tough balance. I think newspapers need to be critical of press releases and certainly not simply trust what’s in them: read the underlying reports, and even there use statistical literacy to unpick what the reports are really saying. Then report on that. Not too much interpretation (maybe even none), or perhaps only when it is clearly an opinion piece.

But to be fair, I don’t really see newspapers as the biggest challenge here. There are other audiences to address:
1. To the EEF and other funders: I feel that if they ‘promise’ to say ‘what works’ then they need to be clearer about this, and not be too nuanced. However, if they said (rightly so) that such an approach does not fit the ‘messy’ nature of (social) science, then that would be fine as well. But then I would appreciate a less dogmatic view of what ‘research’ is, and would like them to appreciate the roles of different types and phases of research. Mind you, in my view it would be better if they funded many more smaller projects rather than a few large ones.
2. For researchers writing the reports I would say the same as to newspapers: report on the facts. If there is speculation about how the results might have turned out differently because of factors X, Y and Z, underpin this with some evidence that it is realistic to expect. A blanket ‘oh, it’s cheap anyway’ doesn’t cut it, in my opinion. Also, be more modest about conclusions.
3. From institutions involved in the interventions (sometimes the ones ‘selling the product’) I would expect a bit more restraint in their bold statements.

The problem is that we often aren’t given the research itself in advance, only press releases. So we either wait and report on it *after* everyone else, once the report is out (but who wants to read the story again the next day, even if it has slightly more nuance?), or we run with the press release and do the best we can. It’s a tough call.

I agree that other stakeholders have a part to play.
