No full text
Unpublished conference/Abstract (Scientific congresses, symposiums and conference proceedings)
Validation and Psychometric Analysis of 32 cognitive item models spanning Grades 1 to 7 in the mathematical domain of numbers & operations
Michels, Michael Andreas; Hornung, Caroline; Gamo, Sylvie et al.
2022, Luxembourg Educational Research Association Conference 2022
 

Details



Keywords :
item development; Automatic Item Generation; cognitive models; mathematics; numbers & operations
Abstract :
[en] Today’s educational field has a pressing need for valid and psychometrically sound items to reliably track and model students’ learning processes. Educational large-scale assessments, formative classroom assessment and, lately, digital learning platforms require a constant stream of high-quality and unbiased items. However, the traditional development of test items ties up a significant amount of time from subject matter experts, pedagogues and psychometricians, and may no longer be suited to today’s demands. A remedy is sought in automatic item generation (AIG), which makes it possible to generate large numbers of items within a short period of time from cognitively sound item templates by means of algorithms (Gierl & Haladyna, 2013; Gierl et al., 2015).

The present study psychometrically analyses 35 cognitive item models that were developed by a team of national subject matter experts and psychometricians and then used to algorithmically produce items for the mathematical domain of numbers & operations for Grades 1, 3, 5, and 7 of the Luxembourgish school system. Each item model was administered in six experimentally varied versions to investigate the impact of a) the context in which the mathematical problem was presented, and b) problem characteristics that cognitive psychology has identified as influencing the problem-solving process. Based on samples from Grade 1 (n = 5963), Grade 3 (n = 5527), Grade 5 (n = 5291), and Grade 7 (n = 3018) collected within the annual Épreuves standardisées, this design allows evaluating whether the psychometric characteristics of the items produced per model a) are stable, b) can be predicted from problem characteristics, and c) are unbiased towards subgroups of students known to be disadvantaged in the Luxembourgish school system.

After item calibration using the 1-PL model, each cognitive model was analyzed in depth through descriptive comparisons of the resulting IRT parameters and through estimation of the manipulated problem characteristics’ impact on item difficulty using the linear logistic test model (LLTM; Fischer, 1973). Results are promising and show negligible effects of different problem contexts on item difficulty as well as reasonably stable effects of altered problem characteristics. Thus, the majority of the developed cognitive models could be used to generate a very large number of items (> 10,000,000) for the domain of numbers & operations with known psychometric properties, without the need for expensive field trials. We conclude by discussing lessons learned from predicting item difficulty per model and by highlighting differences between the Grades.

References:
Fischer, G. H. (1973). The linear logistic test model as an instrument in educational research. Acta Psychologica, 37, 359–374.
Gierl, M. J., & Haladyna, T. M. (Eds.). (2013). Automatic item generation: Theory and practice. New York, NY: Routledge.
Gierl, M. J., Lai, H., Hogan, J., & Matovinovic, D. (2015). A method for generating educational test items that are aligned to the Common Core State Standards. Journal of Applied Testing Technology, 16(1), 1–18.
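For readers less familiar with the two models named in the abstract, the following is a minimal sketch in standard notation; the concrete problem characteristics entering the weight matrix (the q_{ij} below) are specific to the study and are not documented in this record.

% 1-PL (Rasch) model used for item calibration: probability that student v
% with ability \theta_v solves item i with difficulty \beta_i
P(X_{vi} = 1 \mid \theta_v, \beta_i) = \frac{\exp(\theta_v - \beta_i)}{1 + \exp(\theta_v - \beta_i)}

% LLTM (Fischer, 1973): item difficulty decomposed into a weighted sum of
% basic parameters \eta_j (one per manipulated problem characteristic),
% where q_{ij} encodes the presence of characteristic j in item i and
% c is a normalization constant
\beta_i = \sum_{j} q_{ij}\,\eta_j + c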
Disciplines :
Theoretical & cognitive psychology
Author, co-author :
Michels, Michael Andreas ;  University of Luxembourg > Faculty of Humanities, Education and Social Sciences (FHSE) > LUCET
Hornung, Caroline ;  University of Luxembourg > Faculty of Language and Literature, Humanities, Arts and Education (FLSHASE) > Luxembourg Centre for Educational Testing (LUCET)
Gamo, Sylvie ;  University of Luxembourg > Faculty of Humanities, Education and Social Sciences (FHSE) > LUCET
Roeder, Michel
Gierl, Mark
Cardoso-Leite, Pedro ;  University of Luxembourg > Faculty of Humanities, Education and Social Sciences (FHSE) > Department of Behavioural and Cognitive Sciences (DBCS)
Fischbach, Antoine  ;  University of Luxembourg > Faculty of Humanities, Education and Social Sciences (FHSE) > Department of Education and Social Work (DESW)
Sonnleitner, Philipp  ;  University of Luxembourg > Faculty of Humanities, Education and Social Sciences (FHSE) > LUCET
External co-authors :
yes
Language :
English
Title :
Validation and Psychometric Analysis of 32 cognitive item models spanning Grades 1 to 7 in the mathematical domain of numbers & operations
Publication date :
November 2022
Event name :
Luxembourg Educational Research Association Conference 2022
Event place :
Esch-sur-Alzette, Luxembourg
Event date :
from 09-11-2022 to 10-11-2022
Audience :
International
FnR Project :
FNR13650128 - Fairness Of Latest Innovations In Item And Test Development In Mathematics, 2019 (01/09/2020-31/08/2023) - Philipp Sonnleitner
Available on ORBilu :
since 17 November 2022
