TIMSS and PISA results: Seeing past the headlines

By Ben Durbin

Monday 7 November 2016

This blog post first appeared on the IEA blog.

Education research does not often make newspaper headlines. Even less often does it make headlines in multiple countries around the world. In just a few weeks' time we will see a rare exception.

In the space of a week, the latest set of results from both TIMSS (29th November) and PISA (6th December) – two international surveys that compare the performance of education systems around the world – will be published. The results are a closely guarded secret until then, but two things can be guaranteed: there will be newspaper headlines, and the headlines will be misleading.

I know there will be headlines, because I recently spent a week in Oslo at the General Assembly of the IEA – the organisation responsible for TIMSS. Not only did we have a preview of the results, but we also spent much of the week discussing how to ensure the results are reported widely and accurately in the media, and are understood and applied by policymakers and the wider sector (my blog from the What Works Global Summit last month discusses exactly this topic).

And I know the headlines will be misleading. Most notably, they will focus on the rankings of countries and will fail to grasp the notion of statistical significance. A country's ranking may change from year to year simply because the number of countries participating in the study has changed. Everyone knows the old joke about coming second in a beauty contest with only two contestants (rather a different achievement to coming second out of 50!).

And differences between countries, or changes over time, are not always indicative of fundamental differences or changes in performance. In football they say the league table never lies. But at any given point in time, some teams will be punching above or below their weight (to mix metaphors!) simply because of an unusually good or bad run of form. Small differences are sometimes just the result of these blips.
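To make that point concrete, here is a purely illustrative sketch (hypothetical countries and made-up numbers, nothing drawn from the real TIMSS or PISA data). Ten imaginary countries are given exactly the same underlying performance, with only a little measurement noise added, and the "league table" still reshuffles from one round to the next.

```python
import random

# Purely illustrative simulation: ten hypothetical countries share the
# same "true" score, but each round's measurement includes a small
# amount of random noise, so the rankings reshuffle anyway.
random.seed(2016)
countries = [f"Country {chr(65 + i)}" for i in range(10)]
true_score = 500  # identical underlying performance for every country

for round_number in (1, 2):
    observed = {c: true_score + random.gauss(0, 5) for c in countries}
    ranking = sorted(observed, key=observed.get, reverse=True)
    print(f"Round {round_number} top three:", ranking[:3])
```

None of the "winners" in this toy example is genuinely better than the others; the differences are pure noise, which is exactly what tests of statistical significance are designed to flag.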

NFER has been involved in the international studies since IEA was founded fifty years ago, so we're familiar with the good, the bad and the ugly of how they're reported. Over the coming weeks we will play our part in helping to explain the results and debunk the myths.

Despite these challenges, attending the IEA's General Assembly reinforced two ways in which the value of international studies will be realised long after the initial flurry of headlines.

Firstly, the data generated goes far beyond measuring overall performance. Student and school questionnaires provide a rich source of information on students' attitudes and experiences of school, as well as on teachers, professional development and school organisation. The TIMSS encyclopaedia (published last week) also contains a wide range of comparative data on curricula, teacher training, school starting age, and more.

By comparing countries and tracking changes over time, key education issues can be explored in ways that are not possible with any other data source. Indeed, NFER is currently undertaking a project using international data to provide an important new perspective on social mobility in England.

Secondly, the international studies promote cooperation and learning between educators around the world. You will rarely find such a coming together of people from education ministries and research institutions across such a diverse set of countries. The conversations and relationships formed through networks such as the IEA and the OECD (which is responsible for PISA) enable genuine dialogue and learning to take place – not just over-simplified policy borrowing from the top-performing nations.

NFER's mission is summed up in our strapline, 'Evidence for Excellence in Education'. This is a headline I'd be very happy to see later this year.