
6 - Broadening the Bases of Methodological Rigor in Cross-Cultural Educational Assessment

from Part 2 - Individual Differences across Cultures

Published online by Cambridge University Press: 21 January 2021

Michael Bender (Universiteit van Tilburg, The Netherlands)
Byron G. Adams (Universiteit van Tilburg, The Netherlands)

Summary

In educational assessment, both cognitive and contextual variables are important for understanding how students can succeed in school and society. In cross-cultural comparisons, we strive to capture cultural similarities and differences in target constructs while minimizing cultural bias, so that conclusions are valid. We provide an overview of this endeavor by integrating three perspectives. We first introduce the use of quantitative methodological rigor to address bias in national and international educational assessment. We then describe qualitative and mixed-methods research that deepens our understanding of the root causes of bias. Lastly, we discuss the effect of language on learning and on assessments of minority students' achievement. We conclude that educational assessment research would benefit from further integrating these approaches and from taking full account of variations in contextual factors such as language proficiency.
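As an illustrative sketch only (not code from the chapter), one common quantitative tool for addressing item-level cultural bias is logistic-regression detection of uniform differential item functioning (DIF): an item response is modeled from a matching variable (total score), and if adding group membership significantly improves fit, the item shows uniform DIF. All data and effect sizes below are synthetic assumptions for demonstration.

```python
# Uniform DIF detection via logistic regression (illustrative, synthetic data).
import numpy as np
from scipy.stats import chi2

def fit_logit(X, y):
    """Fit a logistic regression by Newton-Raphson; return coefficients
    and the maximized log-likelihood."""
    X = np.column_stack([np.ones(len(y)), X])  # add intercept
    beta = np.zeros(X.shape[1])
    for _ in range(50):
        p = 1.0 / (1.0 + np.exp(-X @ beta))
        grad = X.T @ (y - p)
        hess = X.T @ (X * (p * (1 - p))[:, None])
        beta += np.linalg.solve(hess, grad)
    p = 1.0 / (1.0 + np.exp(-X @ beta))
    loglik = np.sum(y * np.log(p) + (1 - y) * np.log(1 - p))
    return beta, loglik

rng = np.random.default_rng(0)
n = 2000
group = rng.integers(0, 2, n).astype(float)  # 0 = mainstream, 1 = minority group
ability = rng.normal(0, 1, n)                # latent ability
total = ability + rng.normal(0, 0.3, n)      # observed total score (matching variable)

# Simulate an item that is harder for the minority group at equal ability:
# the built-in -0.8 logit shift is the uniform DIF the test should recover.
p_correct = 1.0 / (1.0 + np.exp(-(1.2 * ability - 0.8 * group)))
item = rng.binomial(1, p_correct).astype(float)

_, ll_base = fit_logit(total[:, None], item)                   # score only
_, ll_full = fit_logit(np.column_stack([total, group]), item)  # score + group

lr = 2.0 * (ll_full - ll_base)  # likelihood-ratio statistic, df = 1
p_value = chi2.sf(lr, df=1)
print(f"LR = {lr:.1f}, p = {p_value:.2g}")  # a small p flags uniform DIF
```

In practice such flagged items are then examined substantively (e.g., through cognitive interviewing) rather than discarded automatically, which is the kind of quantitative-qualitative integration the chapter advocates.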

Type: Chapter
Publisher: Cambridge University Press
Print publication year: 2021


