Discourse: Studies in the Cultural Politics of Education

Is PISA Worth its Cost? Some Challenges Facing Cost-Benefit Analysis of ILSAs

By Laura Engel (The George Washington University) and David Rutkowski (Indiana University)

International large-scale assessments (ILSAs) are a fairly recent phenomenon, developed in the mid-20th century and rapidly picking up steam since the mid-1990s. In the last decade, the ILSA industry has grown in terms of the different kinds of assessments, the subject areas assessed, and the number of educational systems involved. The number of economies participating in PISA has increased, for example, from 47 in the initial 2000 cycle to over 70 in more recent cycles, with OECD member states accounting for only about half. While much has been written about the uses, motivations, and limitations of ILSAs, there had been no accounting of their costs. Our recent paper, Pay to Play, aimed to address that gap and posed a straightforward question: What does a system (the US) pay to participate in PISA?

To answer the question of cost, there are two factors: (1) overhead (i.e., the annual assessed fee a system pays to the OECD, determined by the size of its economy) and (2) national implementation (i.e., what a system pays to implement the assessment). Finding this information for the US proved difficult, as we discussed on the FreshEd podcast: although cost information in the US is publicly available, that does not mean it is easily accessible. And for the vast majority of education systems in the world, we still do not know the costs, largely because they are so difficult to uncover.

As the world's largest economy, the US pays the highest PISA participation fee to the OECD, about $1 million a year. Although we do not know how much other OECD countries pay (that information is not public; the OECD indicates it differs greatly among members and that non-members pay on average around $50,000/year), we do know that new countries in PISA 2024 (which we assume are non-OECD countries) will pay approximately $58,000 per year in overhead, regardless of the size of their economy. Let's pause on this for a moment: regardless of the size of the economy, these countries will pay only about 6% of what the US pays. Arguably, as a large country, the US requires more resources from the OECD; however, because PISA draws a similarly sized sample in each country and is almost never a census, the OECD's workload is about the same for every participant. Implementation costs, however, are a different story. The cost of implementing PISA 2012 in the US, for example, was about $6.7 million. These costs cover things like drawing a sample, recruiting schools, administering the assessment, scoring open-ended items, and, in the US, providing payments to students and schools. Yep! American students are paid to take PISA.
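
For readers who want the arithmetic behind that 6% figure, here is a minimal sketch in Python using the approximate fee amounts cited above; the numbers are rounded public figures, not exact invoiced amounts.

# Rough arithmetic behind the "about 6%" overhead comparison.
# Both fee figures are the approximate amounts cited in the text.
us_overhead = 1_000_000            # approximate annual US assessed fee to the OECD (USD)
new_participant_overhead = 58_000  # approximate flat annual fee quoted for new PISA 2024 participants (USD)

ratio = new_participant_overhead / us_overhead
print(f"New participants pay roughly {ratio:.0%} of the US overhead fee")
# -> New participants pay roughly 6% of the US overhead fee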

To put these figures in context, the US has a robust testing industry; the 2011 annual cost of the National Assessment of Educational Progress (NAEP) was $129 million. A study of state contracts with testing vendors across 45 US states shows that states spend a combined $669 million per year on testing (with a projected total cost of $1.7 billion/year for all 50 states). In the state of Indiana, a three-year contract for the Indiana Learning Evaluation Assessment Readiness Network (ILEARN) costs $45 million, with tests administered in only two of those years. So, while the US overhead costs for PISA may seem high, the total cost of its participation appears lower than that of many national or state assessments. That said, we are hardly comparing apples to apples. NAEP, for example, assesses 10 subjects at three grade levels nationally, and four of those subjects are sampled at the state/district level. Similarly, Indiana's ILEARN assesses six subjects in multiple grades, with far more students taking the exam than take PISA. We could do some rough calculations of the actual cost per student by combining the implementation and overhead costs of each assessment, but such figures would also be misleading: the subject, the age of the students, the content covered, and the geographic location would all have to be taken into account when calculating and interpreting the results.
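
To make concrete why such a per-student comparison would mislead, here is a minimal sketch in Python. Only the total-cost figures come from the discussion above; the student counts are hypothetical placeholders, and the assessments differ in subjects, grades, and coverage.

# A hypothetical illustration of the rough cost-per-student calculation
# described above (and cautioned against). Student counts are invented
# placeholders; only the total-cost figures come from the text.

def cost_per_student(total_cost_usd: float, students_tested: int) -> float:
    """Naive per-student cost: total spend divided by the number of students assessed."""
    return total_cost_usd / students_tested

# PISA 2012 in the US: ~$6.7M implementation plus ~$1M annual overhead.
pisa_total = 6_700_000 + 1_000_000
pisa_students = 6_000              # hypothetical sample size, not an official figure

# ILEARN: ~$45M three-year contract, with tests administered in two of those years.
ilearn_total = 45_000_000
ilearn_students = 2 * 500_000      # hypothetical: two administrations of ~500,000 students each

print(f"PISA (US): ~${cost_per_student(pisa_total, pisa_students):,.0f} per student")
print(f"ILEARN:    ~${cost_per_student(ilearn_total, ilearn_students):,.0f} per student")
# The gap looks dramatic, but differences in subject coverage, student age,
# content, and geography make any such comparison easy to misread.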

When costs are calculated, the looming question is unavoidable: Are the costs worth it? And specific to our research, we were left with the question: Is PISA worth it for the US? This, of course, is a multifaceted question for any educational system, and it is only complicated by the sheer size and federal structure of the US. As many readers know, the US does not have a national curriculum or set of standards; each state is generally responsible for funding and managing its own education system. In fact, when we talk with state education policy makers and administrators about national PISA results, they are often unfamiliar with PISA or do not see its relevance to their current context. Even if they do see the relevance, they rarely draw on PISA evidence or results in any systematic way. And the truth is that, in the US, international large-scale assessments are a hard sell. In a federal system where education is the purview of the states, a nationally sampled test yields very little policy information relevant at the state level. Indeed, the high variation in NAEP scores across states is ample warning that there is too much variance within the US to validly interpret PISA results for local or state policy-making.

However, national PISA scores may be able to inform federal policy makers and administrators, as well as select educational researchers. Further, ILSAs can influence, and indeed have influenced, national debates in the US about the importance of education. A small number of states have even participated in previous cycles of PISA as separate entities, motivated by different purported benefits. Yet understanding and/or quantifying those benefits (as a typical cost-benefit analysis would require) is not for the faint of heart. In fact, quantifying the benefits of participating in an international assessment for such a large and diverse system would likely lead to inconclusive results. Findings would be largely subjective and dependent on the views of a variety of stakeholders, including the US Department of Education and national teachers' unions, all of which have little influence on what is taught in local schools.

What ultimately troubled us once we uncovered the actual cost of PISA in the US is that we were left with more questions than answers. For example, is PISA a bargain given the robust cross-national information that can be gleaned from it? Should the US be paying the most for PISA if it seems to inform so few policy makers? For systems that do not have advanced assessment mechanisms at the national or even sub-national level, do ILSAs like PISA become a viable and, in fact, cost-effective alternative? On the other hand, do the risks, especially those arising from the misuse of PISA, outweigh the benefits? Would the money be better spent elsewhere? Wrestling over the costs of assessments is, of course, not unique to PISA; policy makers and researchers regularly grapple with the benefits and costs of various educational assessments.

Although questions abound, uncovering costs is an important endeavor, and we encourage researchers in other countries to do the same. We learned a lot from the process and hope our work brought to light some important points. For example, few in the research and ILSA community seemed to know that the US pays students to take PISA. To us, this is a huge threat to the validity of comparing results between countries that pay their students and countries that do not. We also learned that the US had a difficult time recruiting schools, and that even doubling the payments to schools was not enough to entice them to participate. Finally, we gained insights into the process of implementing ILSAs in a national context and the stakeholders involved, revealing how the PISA architecture is built differently in each country.

 

This blog draws on a paper by Laura Engel and David Rutkowski. Here is the link to the journal paper.  

 

Laura Engel is an Associate Professor of International Education and International Affairs at the George Washington University. Her research focuses on global education policy trends in federal systems, including the internationalization of schooling and education policy uses of international assessments.

David Rutkowski is an Associate Professor of Education Policy at Indiana University. His research is focused on methodological and policy issues around standardized assessment.

 

To reference this blog: Engel, Laura and David Rutkowski. 2019. Is PISA Worth its Cost? Some Challenges Facing Cost-Benefit Analysis of ILSAs. Laboratory of International Assessment Studies blog series. Published on 28th March 2019. Accessed at https://bit.ly/2Wp3c9S 

 
