Why does PISA appear to be everyone’s solution?

By Camilla Addey, Humboldt University of Berlin

This blog draws on the recently published forum paper: Addey, C., Sellar, S., Steiner-Khamsi, G., Lingard, B. & Verger, A. (2017). The rise of international large-scale assessments and rationales for participation. Compare: A Journal of Comparative and International Education. DOI: 10.1080/03057925.2017.1301399


At a glance, it would seem that PISA is everyone’s educational solution. Indeed, one might even say that after many different education policy ideas and tools have been tried out, PISA has led to an apparent global educational consensus. This might help explain why approximately 80 participants have measured their education systems with the PISA metric and why the OECD forecasts 150 participants by 2030. But is PISA everyone’s solution? This blog responds to this question by highlighting the main points made in the abovementioned forum paper by Addey, Sellar, Steiner-Khamsi, Lingard and Verger.

Scratching the surface of how PISA is used by participants reveals that this apparent global educational consensus hides a polyvalent and polyglot PISA. In the forum paper, Addey and Sellar draw on qualitative research projects carried out in contexts as diverse as Mongolia, Laos, the UK, Australia, Ecuador, Paraguay and other European countries, to show that when governments decide to participate in International Large-Scale Assessments (ILSAs), they are not necessarily concerned with education. Although participation is often justified in terms of the dominant narrative of ‘better data for better policies’, the socio-political contexts where ILSAs are adopted transform the intended uses of ILSAs. Addey and Sellar argue that the localised meanings that ILSAs are given relate to seven broad rationales for participation:

1) Generating evidence for policy. ILSA data are seldom used as evidence for policy in ways promoted by organisations administering the tests – that is, by drawing lessons from the data – but instead are often used to legitimise reforms or inaction.

2) Technical capacity building and developing national large-scale assessments. ILSAs are valued as an opportunity to acquire innovative and sophisticated large-scale assessment methodologies. ILSA participation may also be closely linked to national large-scale assessments, which may be redeveloped in the light of PISA frameworks and/or legitimated through references to PISA. It is not surprising, therefore, that the OECD has made the alignment of national large-scale assessments with the PISA metric one of its long-term PISA objectives.

3) Receiving funding and aid. ILSA participation fees and implementation costs in low- and middle-income contexts are frequently covered by international donors or bilateral aid schemes. Donors often require ILSA data as a benchmark to monitor educational progress or to display commitment to accountability and transparency. Data from ILSAs are also used as evidence to obtain funds for education and are likely to become a key criterion guiding the allocation of funding.

4) Enhancing international relations. ILSAs are extensively used for international relation purposes. For example, they are used as part of a nation-building narrative; to make a statement about political or economic status; to align with the international community’s values; and to access political, economic or trade entities. Participation is also driven by pressure to participate as signatories of global commitments, or due to membership in international organisations that administer ILSAs.

5) Responding to or driving national political agendas. ILSAs are made to serve domestic politics. For example, pressures may come from ministries and institutions not directly related to education, or as a response to special interest lobbies, media pressure or public opinion. Scandalising comparatively ‘bad’ data can be a means for political parties to win support or put the spotlight on weak performance and discredit opposing political groups. The tactical use of ILSA data to respond to domestic political agendas by ‘glorifying’ education performance can also be observed in the form of the statistical elimination of educational problems (Steiner-Khamsi and Stolpe 2006).

6) Driving economic growth. ILSAs are used as instruments for measuring human capital and providing proxy indicators for economic competitiveness and attractiveness to the corporate world. Data from ILSAs, particularly PISA and PIAAC, are used to represent skill-pools and labour competitiveness. The OECD has positioned PISA as the instrument to measure how well education systems prepare students to compete for jobs globally and to identify policy solutions to train the most competitive workers.

7) Informing curriculum and pedagogy. The rationale of improving teaching and learning is rarely mentioned as a primary driver of ILSA participation. This may be a result of PISA becoming the most prominent ILSA. Whilst the IEA’s TIMSS and PIRLS measure how well students have acquired the curriculum that is taught, PISA measures the capacity to apply skills learnt over the first 15 years of life. There is thus only a loose relation between curriculum and PISA performance. Technical questions about how ILSAs can sensibly inform better teaching and learning practices are increasingly being voiced by those concerned that resources invested in ILSAs give something back to education at the classroom level.

In addition to these broad rationales, there are also cases in which the ‘push’ to participate is related to personal agendas. Further, Verger (2017) argues that two other rationales must also be considered: the presence of an education industry (which is actively lobbying for ILSA participation and the adoption of test-based evaluation systems at multiple scales) and new public management (since both NPM and ILSAs promote the evaluative role of the state in education, and a culture of management by results). Lingard suggests that we also need to understand why international organisations develop, administer and analyse ILSA data. He argues that, in the case of the OECD, it has an impact on policy through ‘the active role the OECD plays by framing the PISA stories reported in the media around the globe’ (2017: 15).

But how is it that PISA is used in such diverse ways?

Steiner-Khamsi claims the answer can be found in the elusiveness and polyvalence of PISA. These are the result of the exponential growth in the number of education systems participating in PISA, the frequency of PISA implementation, PISA’s detachment from national curricula, and its focus on cross-national twenty-first century skills. Steiner-Khamsi argues that PISA is also polyglot, as it speaks ‘the language of accountability used in (new) public management, the language of standards advocated by the private sector and non-state actors and, last but not least, the pedagogical language of effective learning spoken by educators (see Steiner-Khamsi, 2015)’ (2017: 13).

Steiner-Khamsi, Lingard and Verger add that to deepen our understanding of the diverse uses of PISA in different socio-political contexts, these rationales need to be understood through theories of the state. Verger (2017) suggests that a rationalistic approach would understand PISA participation as part of a linear policy process whereby policy-makers weigh up the benefits of participation with the aim of informing policy making and evaluation based on ‘what works’. From a neo-institutionalist perspective, governments take part in PISA as a response to legitimation pressures and to demonstrate to the international community that they are modern and responsible. From a critical or political economy approach, Verger suggests PISA participation is driven by economic and political interests. In support of this third approach, Verger concludes that ‘the way in which ideas are constructed, framed and mobilised is key to understanding why and how countries engage with global education policies and, without going any further, with ILSAs’ (2017: 16).

Empirical research and theoretical lenses show that in practice PISA is not everyone’s educational solution. Rather, PISA legitimates everyone’s solutions, in that PISA participation and PISA data are used to address multiple and diverse problems (as this blog has shown). As a high-level policy actor in Uruguay recently said, anything is legitimated as long as you start your sentence with ‘PISA says…’



To reference this blog: Addey, C. 2017. Why does PISA appear to be everyone’s solution? Laboratory of International Assessment Studies Blogs. Accessible at http://international-assessments.org/why-does-pisa-appear-to-be-everyones-solution/
