
PISA doesn’t define education quality, and knee-jerk policy proposals won’t fix whatever is broken

Since the OECD’s Programme for International Student Assessment (PISA) began in 2000, published results have sent commentators and politicians in some countries into meltdown. The release of the 2018 test results last week was no exception.

Out of 79 participating countries and economies in 2018, Australia came equal 11th (with countries including New Zealand, the UK and the US) in reading (in 2015 it was 12th); equal 13th (with countries including the US, the UK and Germany) in science (in 2015 it was 10th); and equal 24th (the same as the OECD average, New Zealand and Russia) in maths (in 2015 it was 20th).

The Daily Telegraph claimed Australian schools “are failing”. The Australian bemoaned that Australia had “plunged in global rankings”, and business leaders told Australian educators to “lift your game”.

According to education minister Dan Tehan, “alarm bells should be ringing”. Tehan said he would raise the issue with state and territory education ministers at the Education Council meeting in Alice Springs this week.

In particular, Tehan said he wanted to “take a chainsaw to the curriculum” and “put literacy and numeracy back to the heart” of it.

There are two striking features of the reactions to the PISA results. First, it seems broadly accepted that PISA defines educational quality, even though it tests only three subject areas and its methodology is questionable.

And second, the results inevitably produce a flurry of policy responses, none of which have been specifically tested as a means to improve our PISA scores.

There are serious problems with PISA scores

PISA has seemingly become the arbiter of education quality in Australia and around the world. When the results are released, the numbers are broadly accepted as accurate measures of the quality of the world’s education systems.

This ignores a chorus of concerns, growing for years, about serious methodological problems with PISA, which tests a stratified sample of 15-year-olds in just three curriculum areas.

These same concerns were raised by 100 educational researchers around the world in a public letter to the Director of PISA at the OECD in 2014.

Some of these concerns relate to the difficulty of keeping an international test neutral when it is translated into many different languages and administered across different cultures. Others come from educational statisticians and researchers who argue the validity and reliability of the tests themselves are at best dubious and at worst render the league tables “useless”.

There are also issues with the sampling process. Some participating countries only include certain parts of the country or exclude more schools and students from the tests than others. This clearly makes it more difficult to compare countries.

Mainland China is an example. When Shanghai finished top in each of the three domains in the 2012 PISA tests, Tom Loveless from the Brookings Institution in Washington argued the students tested weren’t representative of the students in the city.

He wrote Shanghai’s PISA-tested students didn’t include thousands from poor rural areas whose parents had moved to the city in search of work.

The OECD’s education director, Andreas Schleicher, admitted to the UK’s House of Commons Education Select Committee that, in fact, only 73% of students were represented in Shanghai’s sample of students in the 2012 test.

Loveless also analysed the most recent results, as reported in the Washington Post. He noted the four Chinese provinces participating in 2018 (Beijing, Shanghai, Jiangsu and Zhejiang) did significantly better than the four (Beijing, Shanghai, Jiangsu and Guangdong) that participated in 2015.

Loveless wrote “the typical change in a nation’s scores is about ten points. The differences between the 2015 and 2018 Chinese participants are at least six times that amount…”. He hypothesised this had something to do with the change in provinces selected for testing.

These and other problems with PISA’s methodology suggest it is foolhardy to accept the test results as precise readings of educational quality in any country, or for ranking countries.

Knee-jerk policy fixes will only take us backward

The other feature of the reaction to PISA results is the litany of policy proposals to fix the problems PISA has supposedly unearthed.

Minister Tehan confined his strategies to “stripping back” the “crowded” Australian curriculum and focusing on “the basics”, as well as fast tracking professionals from other fields into teaching.

But there is no obvious link between the PISA results and the strategies proposed and certainly no analysis of what information PISA offered to support them. Even if PISA data are taken seriously, we are obliged to investigate the reasons for educational outcomes before designing policies to address the problems.

And although PISA only tests three areas of the curriculum, the strategies proposed apply to the whole curriculum and, it seems, to the whole of schooling.

Minister Tehan should explain what part of the PISA data convinced him that the Australian curriculum should be pared back to “the basics”. And if that data exists, why should it apply to every other area of the curriculum apart from the three areas tested by PISA: maths, science and reading?

Or is PISA simply being used as a convenient vehicle for introducing favoured policy positions?

I am not suggesting some of the PISA data cannot usefully contribute to the ongoing effort to enhance the quality of Australian education. There is a strong case to be made for sharing educational ideas and practices with other countries.

But superficial, knee-jerk readings of international standardised test data are more likely to impede than advance quality improvement.

The Education Council could use its meeting this week to do two things when it reaches the agenda item on PISA.

First, it could set in train a process for achieving national agreement about an approach to educational accountability which goes beyond the simplistic reliance on standardised tests. This would start with the purposes of accountability, including supporting schools to enhance education quality, and aim to provide the community with rich information about educational progress.

Inevitably, new approaches would mean broadening our evidence options. These should be both qualitative and quantitative. They should also support teachers in their work rather than impose time-consuming form-filling. And they should be based on trust in the professional expertise of our teachers.

Second, the Education Council should institute a review of PISA which might consider the flaws in the testing regime, and ways to overcome these. This might help ensure policy, media commentary and research based on PISA results would acknowledge the limitations of the tests and be more tentative about using PISA as the sole arbiter of what constitutes quality in education.

These two steps would help break the destructive cycle that follows the release of PISA test results every three years: a barrage of criticism of educators and the work of schools, and a flurry of politically and ideologically pre-determined policy proposals.

After all, it is somewhat ironic that the policy makers who have for so long enforced a standardising educational policy regime simply double down on it when their own standardising measures deem such policies to have failed.

Australia needs a new education narrative, and work on it should start this week at the Education Council.


Alan Reid is the author of “Changing Australian Education: How policy is taking us in the wrong direction and what can be done about it”, out now from Allen and Unwin.

Alan Reid, Professor Emeritus of Education, University of South Australia. This article is republished from The Conversation under a Creative Commons license. Read the original article.

