In this Featured Research Project blog post, David and Leslie Rutkowski discuss their Embracing Heterogeneity project, based at the University of Oslo. You can learn more about the project, its team members, partners, and work at https://embracingheterogeneity.com/
Over their 60-year history, modern international large-scale assessments (ILSAs) have become influential educational policy tools, with rapid growth in the number of participating countries, measured content areas, and testing platforms. As international studies grow, so too do differences across participants, reflecting a greater number and diversity of languages, cultures, geographies, and levels of economic development. Although it is reasonable to expect variation across cultures, apparent differences can also result from nothing more than the way constructs are measured, obscuring our understanding of what study participants know, think, and feel. To date, these measurement issues have been accounted for only in limited ways, and only in PISA. Other studies have been slow to acknowledge and account for cross-cultural measurement differences, which can have important impacts on achievement and other scale score estimates.
Generally, our Norwegian Research Council-funded project focuses on optimal designs and methods to account for cross-cultural measurement differences in international educational assessments and surveys. To address our overall research goals, we first developed a software package that can simulate data that adequately reflect the complexity of large-scale assessment data. Developing this software was challenging; however, our research team, and especially Dr. Tyler Matta, was able to rapidly develop it as a package for the freely available statistical software R. The R package, lsasim, is now available to the public at https://github.com/tmatta/lsasim and through the usual R installation channels.
This software allows us to create synthetic data in which we fully control the relevant study parameters, creating a lab-like setting. The lab qualities afforded through simulation let us develop and test improvements in assessment design and modeling, since the ‘true’ state of our simulated study subjects is known. Our innovations can then be validated against both simulated and existing empirical data to test the degree to which our solutions work in practice.
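The core idea of this lab-like setting can be sketched in a few lines. The following is an illustrative Python example, not the lsasim package itself: it simulates item responses from a standard two-parameter logistic (2PL) IRT model with known, made-up "true" person and item parameters. Because the generating parameters are known, simple sanity checks (such as harder items showing lower proportion-correct) become possible, which is exactly what makes simulated data useful for testing design and modeling innovations.

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical "true" state of the simulated study: known parameters.
n_persons, n_items = 2000, 20
theta = rng.normal(0.0, 1.0, size=n_persons)  # person abilities
a = rng.uniform(0.8, 1.6, size=n_items)       # item discriminations
b = rng.normal(0.0, 1.0, size=n_items)        # item difficulties

# 2PL IRT model: P(correct) = 1 / (1 + exp(-a * (theta - b)))
p = 1.0 / (1.0 + np.exp(-a[None, :] * (theta[:, None] - b[None, :])))
responses = (rng.uniform(size=(n_persons, n_items)) < p).astype(int)

# Since the generating parameters are known, we can verify that harder
# items (larger b) show lower observed proportion-correct.
prop_correct = responses.mean(axis=0)
print(np.corrcoef(b, prop_correct)[0, 1])  # strongly negative
```

In a real study this check would be replaced by full estimation runs, comparing recovered parameters against the known generating values; the one-liner correlation here simply illustrates why knowing the truth turns simulation into a laboratory.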
Although the project is less than a year old, our team has written a number of papers, both applied and methodological, that address the project's general goals. Selected titles include: Improving the comparability of international assessments: A look back and a way forward; The impact of collapsing categories on invariance investigations; Back to the drawing board: Can we compare background scales; and Bridging validity & evaluation to help understand ILSA utility, value, and meaning for various stakeholders.
In addition, as part of the project, our PhD student, Mr. Kondwani Mughogho, has commenced an exciting strand of research examining the quality, value, and comparability of the international HIV/AIDS scales used in the SACMEQ assessment. Through this research, Kondwani hopes to provide guidance on current methods and to inform intervention policy and practice on an extraordinarily important public health concern.
We hope the improvements to current methods and designs developed through this project will provide policy makers and the public with a more accurate picture of educational achievement and its correlates, and of how education systems compare internationally. In Norway, this is especially relevant given the prominence of international assessments in both public and policy debates around education. Further, we hope this productive area of research will assist the Norwegian government in meeting its goal of strengthening educational assessment research within the country. Finally, the international collaborative component of the project aims at knowledge sharing among some of the world's top educational assessment researchers.