There has been much discussion of the policy impact of international large-scale assessments (ILSAs) on education systems. But do ILSAs ultimately influence students’ learning? In this blog post, Huiming Ding answers this question by drawing on findings from her study of the impact that the Programme for International Student Assessment (PISA) has at local, school and student levels.
The influence of ILSAs on a range of domestic education systems has been extensively discussed. For example, much has been written about the possible impact and consequences of the OECD’s PISA, such as encouraging policy borrowing and enabling the global governing of education systems. However, claims regarding policy impact do not necessarily mean that actual impact has occurred in practice. The literature suggests that knowledge learned from other education systems or from the PISA framework may be recontextualised and adapted by domestic factors to fit specific contexts. Indeed, some have argued that policies described as responses to PISA often do not have a close link with PISA data.
In light of PISA’s global reach and widely discussed policy impact, and its expansion to local and school levels with PISA for Schools, a number of key questions arise: How is PISA actually used in local contexts? Does PISA have an impact on local education policies and practices, including students’ learning? If so, how does that impact occur? Does this impact foster the development of students’ learning outcomes? Empirical investigation of these questions remains rare.
To address this gap, I studied PISA’s impact on students’ learning in a local context in China. I focused on Fangshan District of Beijing, which participated in the PISA 2009 China Trial, the PISA 2012 China Trial, and PISA 2015. My study employed a mixed-methods design. I explored the perceptions of local education policymakers and practitioners. I also conducted trend analyses and multilevel modelling of the local PISA data to understand local use and interpretation of these data, and to triangulate the perceptions of PISA’s impact on the development of student learning outcomes.
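For readers unfamiliar with multilevel modelling, the idea is to model students as nested within schools, so that school-level and student-level sources of variation in performance can be separated. The sketch below is purely illustrative: it uses synthetic data and the `statsmodels` library, not the actual local PISA data or the study’s own model specification, and the variable names (`score`, `ses`, `school`) are hypothetical.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(42)

# Synthetic data: 20 schools with 30 students each (illustrative only).
n_schools, n_students = 20, 30
school = np.repeat(np.arange(n_schools), n_students)

# A school-level random effect plus a student-level predictor ("ses").
school_effect = rng.normal(0, 10, n_schools)[school]
ses = rng.normal(0, 1, n_schools * n_students)
score = 500 + 25 * ses + school_effect + rng.normal(0, 40, n_schools * n_students)

df = pd.DataFrame({"score": score, "ses": ses, "school": school})

# Two-level model: random intercept per school, fixed slope for ses.
model = smf.mixedlm("score ~ ses", df, groups=df["school"])
result = model.fit()
print(result.summary())
```

A model like this lets an analyst say how much of the variation in scores sits between schools versus between students within schools, which is the kind of question the study’s multilevel analyses address.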
I found that PISA has had an explicit and substantial influence on education policies in this local area. Local education policymakers consider PISA a credible tool for benchmarking local students’ performance against international scales and for supplementing their understanding of the quality of local education. They also described its influence on a number of local policy initiatives. Many of these initiatives translated and promoted concepts featured in PISA (e.g. practical reading, applied mathematics, scientific enquiry, real-life contexts).
For example, a reading project to increase students’ exposure to different types of reading, especially practical reading, was mobilised as a direct response to local students’ reading performance in the PISA 2009 China Trial. Two further projects, focused on mathematics and science, were initiated to draw insights from the PISA framework for improving teaching and learning. These initiatives went beyond simply aligning with the concepts highlighted in PISA and aimed to meet the needs of local students. Rather than seeking best practice from elsewhere, these local policy initiatives involved targeted measures to address specific local issues in teaching and learning identified through PISA and other sources, such as domestic examinations and classroom observation. To this end, they are closely linked to teaching and learning practices in schools and engage classroom teachers in shaping and implementing them. Some initiatives even encourage parents’ involvement in school implementation, to build a home environment that better supports students’ relevant learning activities.
I also found that PISA has indirectly impacted the processes and outcomes of students’ learning through school enactment of these initiatives, although this impact varies across schools, classes and students. The impact of PISA can be seen in changes to students’ overall performance in class and in assessments such as PISA and domestic examinations. I identified that this impact is mediated by various factors at different levels of the context surrounding students’ learning. At the local policy level, the limited local capacity for analysing, interpreting and translating complex PISA data, as described by local education policymakers, restricts the breadth and depth of PISA’s impact on informing and framing policies. At the school level, factors such as school resources and policies mean that enactment of the local policy initiatives varies across schools. Moreover, teachers play a key role in mediating PISA’s impact on students. Those who are closely involved in local PISA-related work, and who recognise and accept the concepts featured in PISA that the local initiatives promote, are more likely to actively enact these initiatives and change their teaching practices. Since school implementation of some initiatives involves parents, school–home interactions also mediate PISA’s impact on students.
The formulation and enactment of the local initiatives, and even parents’ attitudes towards them, are greatly influenced by a range of factors in the broader national context. Recent reforms of national assessment and curriculum in China converge with some concepts of PISA (e.g. real-life contexts). However, domestic approaches to education (e.g. whole-book reading, fine traditional culture) are still valued and developed to meet the needs of Chinese society and economy. These approaches are carefully considered in framing the local PISA-motivated initiatives; they are a priority for schools in developing students and are valued by parents. Hence, local PISA-motivated initiatives that resonate with the national education reforms are more likely to have a strong impact on students’ learning.
My analyses of students’ performance trends in PISA confirm local education policymakers’ and practitioners’ perceptions of performance improvement, but also echo their reflections on limited local expertise in interpreting PISA data. Analyses of the explanatory factors behind this performance largely support their perceptions of the contributions that local PISA-motivated initiatives have made to the improvement. Some aspects of students’ learning have been fostered progressively through school enactment of the local PISA-motivated initiatives, as well as through recent reforms of national assessment and curriculum that converge with some PISA features. I also identified a paradox: local education policymakers have a strong desire to use PISA data efficiently and effectively to inform education policymaking, whilst also facing the challenge of conducting in-depth analysis and appropriately interpreting the complex data PISA provides.
The findings of my study add to our understanding of how PISA is used in local contexts and how it impacts students’ learning. My study also reveals the dynamics of such impact in local contexts, and underlines the importance of looking at PISA’s impact at local, school and student levels. I highlight the need for appropriate theoretical and technical support to be made available and accessible to local ILSA data users, to encourage and facilitate appropriate and efficient use of ILSA data in these contexts.
Huiming Ding has recently completed her PhD at the School of Education, University of Leeds. Her research interests focus on education assessment and language assessment, including using assessment data to understand and foster learning, and to inform policies and practices.