Comparative policy analysis: An insightful use of PISA data

Jude Cosgrove, Chief Executive Officer of the Educational Research Centre, Drumcondra, Dublin

Thursday 20 May 2021

NFER has just published Using PISA 2018 data to inform policy: Learning from the Republic of Ireland, an in-depth analysis of policies of the four nations of the UK and the Republic of Ireland. The report uses PISA data as a backdrop to tease out the policy contexts which have enabled Irish students to do consistently well on the PISA reading assessment, particularly at the lower end of the achievement distribution. The report demonstrates the potential of large-scale international assessments to strategically inform policy when additional, reflective work is done to situate those ‘PISA numbers’ in their broader contexts.

Since the first cycle of PISA in 2000, the number of participating countries/jurisdictions has grown from 32 to 79 in PISA 2018, with 85 participants planning to take part in PISA 2022. OECD country membership has also grown from 30 to 37 in the last decade. Alongside this expansion, the complexity of the design of PISA has grown, moving from paper to online assessment, and with various enhancements to the scaling and analysis of data [1].

Evidently, the landscape of international large-scale assessments has changed drastically since 1958, when a small group of educational psychologists and sociologists met in Hamburg to consider undertaking a study of “measured outcomes and their determinants within and between systems of education” [2]. The meeting might be considered the point at which the International Association for the Evaluation of Educational Achievement (IEA) was founded. Also around this time (1961), the OECD was founded (arising from its predecessor, the OEEC, founded in 1948).

The notion that international studies could provide information on “optimal conditions for human development that could be used as a basis for educational policy” [3] must have been quite appealing. As the organisers of the first international study put it: “If custom and law define what is educationally allowable within a nation, then educational systems beyond one’s national boundaries suggest what is educationally possible.” [4]

At around this time (the early 1960s), a consensus was emerging that there was a need to focus on internationally valid measures of achievement as key outcomes which would provide countries with a comparative framework. This was later to become a major issue in considering international competitiveness in knowledge-based economies in an increasingly globalised context [5], and was arguably what drove the establishment of the PISA study.

Various authors [6] have described functions of comparative educational research which have potential relevance to the development of government policies on education:

  • Descriptive comparisons serving to identify aspects of a system that are at odds with others (‘mirroring’)
  • Benchmarking standards against which policymakers judge their education systems
  • Monitoring educational processes and outcomes over groups and time
  • Understanding differences between systems and groups to enable decisions about issues such as resource deployment and teaching and learning practices
  • Serving an ‘enlightenment function’ by revealing assumptions about what schools or systems try to achieve through an analysis of what they actually achieve and a discussion about what it is possible to achieve

Perhaps due to our natural human tendency to compare and rank, media and policymakers have traditionally focused on descriptive comparisons and benchmarking standards. However, we are now at a stage in the life cycle of large-scale assessments (PISA has completed seven cycles over two decades) where we can further exploit the monitoring, understanding-differences and, perhaps most importantly for policymakers, enlightenment functions of these studies. The difficulty here is that understanding differences, and using these studies to provide genuine and impactful policy insights, requires considerable additional analysis and reflection.

A significant strength of the Using PISA 2018 data to inform policy: Learning from the Republic of Ireland report is that it places a comparative policy analysis chronologically alongside the schooling of the PISA 2018 cohort, and succeeds in identifying contextual, attainment and policy-related differences between the five nations’ education systems. This accords substantive weight to the report, as well as providing a template for future comparative policy enquiry.

The report notes that two of the strengths of education policy in the Republic of Ireland are the gradual ‘build and join’ approach underpinning educational disadvantage policy and initiatives, and genuine efforts on the part of the Department of Education to understand and enable ownership among school communities.

The Irish education system is of course not without its weaknesses. Two aspects of the system which emerge as problematic are, firstly, the rather mediocre mathematics standards of students relative to reading literacy [7] and, secondly, the difficult task of upper secondary assessment reform [8]. These are two areas on which a follow-on study of these five nations may well shed some light, thereby making it Ireland’s turn to learn from its four neighbouring countries.

References

Department of Education (2016). Review of national and international reports on literacy and numeracy. Dublin: Author. www.education.ie/en/Schools-Colleges/Information/Literacy-and-Numeracy/Review-of-National-and-International-Reports-on-Literacy-and-Numeracy.pdf

Foshay, A.W., Thorndike, R.L., Hotyat, F., Pidgeon, D.A., & Walker, D.A. (1962). Educational achievements of thirteen-year-olds in twelve countries. Hamburg: UNESCO.

Husen, T.N., & Postlethwaite, T.N. (1996). A brief history of the International Association for the Evaluation of Educational Achievement (IEA). Assessment in Education, 3, 129-141.

Kellaghan, T. (1996). IEA studies and educational policy. Assessment in Education, 3, 143-160.

Kellaghan, T., & Greaney, V. (2001). The globalisation of assessment in the 20th century. Assessment in Education, 8(1), 87-102.

McKeown, C., Denner, S., McAteer, S., Shiel, G., & O’Keeffe, L. (2019). Learning for the future: The performance of 15-year-olds in Ireland on reading literacy, mathematics and science in PISA 2018. Dublin: Educational Research Centre. www.erc.ie/wp-content/uploads/2020/07/B23321-PISA-2018-National-Report-for-Ireland-Full-Report-Web-4.pdf

OECD (2014). PISA 2012 Technical Report. Paris: Author. www.oecd.org/pisa/pisaproducts/PISA-2012-technical-report-final.pdf

OECD (2020a). PISA 2018 Technical Report. Paris: Author. www.oecd.org/pisa/data/pisa2018technicalreport/

OECD (2020b). Education in Ireland: An OECD assessment of the senior cycle review. Paris: Author.

Perkins, R., & Clerkin, A. (2020). TIMSS 2019: Ireland’s results in mathematics and science. Dublin: Educational Research Centre. www.erc.ie/wp-content/uploads/2021/01/03-ERC-TIMSS-2019-Report_A4_Online.pdf

Plomp, T., Howie, S., & McGaw, B. (2003). International studies of educational achievement. In T. Kellaghan & D.L. Stufflebeam (Eds.), International handbook of educational evaluation (pp. 951-978). Dordrecht: Kluwer Academic.

Postlethwaite, T.N. (1999). International studies of educational achievement: Methodological issues. CERC studies in comparative education 6. Hong Kong: Comparative Education Research Centre, University of Hong Kong.

[1] e.g. compare OECD, 2020a, Chapter 9 and OECD, 2014, Chapter 9
[2] Husen & Postlethwaite, 1996, p. 129
[3] Kellaghan, 1996, pp. 143-144
[4] Foshay et al., 1962, p. 7
[5] Kellaghan & Greaney, 2001
[6] e.g., Kellaghan & Greaney, 2001; Plomp et al., 2003
[7] e.g. Perkins & Clerkin, 2020; McKeown et al., 2019; Department of Education, 2016
[8] OECD, 2020b