Magazine article Times Educational Supplement

Is Pisa Fundamentally Flawed?: Feature

Article excerpt

They are the world's most trusted education league tables. But academics say the Programme for International Student Assessment rankings are 'useless' and based on a 'profound conceptual error'. So should countries be basing expensive and often controversial reforms on them? William Stewart reports.

In less than five months, the most influential set of education test results the world has ever seen will be published. Leaders of the most powerful nations on Earth will be waiting anxiously to find out how they have fared in the latest Programme for International Student Assessment (Pisa).

In today's increasingly interconnected world, where knowledge is supplanting traditional industry as the key to future prosperity, education has become the main event in the "global race". And Pisa, the assessment carried out by the Organisation for Economic Cooperation and Development (OECD) every three years, has come to be seen as education's most recognised and trustworthy measure.

Politicians worldwide, such as England's education secretary Michael Gove, have based their case for sweeping, controversial reforms on the fact that their countries' Pisa rankings have "plummeted". Meanwhile, top-ranked success stories such as Finland have become international bywords for educational excellence, with other ambitious countries queuing up to see how they have managed it.

Pisa 2012 - due to be published on 3 December 2013 - will create more stars, cause even more angst and persuade more governments to spend further billions on whatever reforms the survey suggests have been most successful.

But what if there are "serious problems" with the Pisa data? What if the statistical techniques used to compile it are "utterly wrong" and based on a "profound conceptual error"? Suppose the whole idea of being able to accurately rank such diverse education systems is "meaningless", "madness"?

What if you learned that Pisa's comparisons are not based on a common test, but on different students answering different questions? And what if switching these questions around leads to huge variations in the all-important Pisa rankings, with the UK finishing anywhere between 14th and 30th and Denmark between fifth and 37th? What if these rankings - that so many reputations and billions of pounds depend on, that have so much impact on students and teachers around the world - are in fact "useless"?

This is the worrying reality of Pisa, according to several academics who are independently reaching some damning conclusions about the world's favourite education league tables. As far as they are concerned, the emperor has no clothes.

Perhaps just as worrying are the responses provided when TES put the academics' arguments to the OECD. On the key issue of whether different countries' education systems are correctly ranked by Pisa, and whether these rankings are reliable, the answer is less than reassuring.

The sample data used mean that there is "uncertainty", the OECD admits. As a result, "large variation in single (country) ranking positions is likely", it adds.

The organisation has always argued that Pisa provides much deeper and more nuanced analysis than mere rankings, offering insights into which education policies work best. But the truth is that, for many, the headline rankings are the start and finish of Pisa and that is where much of its influence lies.

On other important questions, such as whether there is a "fundamental, insoluble mathematical error" at the heart of the statistical model used for Pisa, the OECD has offered no response.

Concerns about Pisa have been raised publicly before. In England, Gove's repeated references to the country plunging down the Pisa maths, reading and science rankings between 2000 and 2009 have led to close scrutiny of the study's findings.

Last year, the education secretary received an official reprimand from the UK Statistics Authority for citing the "problematic" UK figures from 2000, which the OECD itself had already admitted were statistically invalid because not enough schools took part. …
