References of "Levy, Jessica 50008801"
Full Text
Peer Reviewed
What Primary Schools Are Doing Right: Educational Value-Added in Luxembourg
Emslander, Valentin UL; Levy, Jessica UL; Fischbach, Antoine UL

Poster (2022, November 10)

In such a diverse context as Luxembourg, educational inequalities can arise from diverse languages spoken at home, a migration background, or a family’s socioeconomic status. This diversity leads to different preconditions for learning math and languages (e.g., the language of instruction) and thus shapes the school careers of students (Hadjar & Backes, 2021). The aim of the project Systematic Identification of High Value-Added in Educational Contexts (SIVA) was to answer the questions of (1) what highly effective schools are doing “right” or differently and (2) what other schools can learn from them in alleviating inequalities. In collaboration with the Observatoire National de la Qualité Scolaire, we investigated, from multiple perspectives, the differences between schools with stable high value-added (VA) scores and those with stable medium or low VA scores. VA is a regression-based statistical method commonly used to estimate schools’ effectiveness fairly while taking diverse student backgrounds into account. First, we identified 16 schools with stable high, medium, or low VA scores over two years. Second, we collected data on their pedagogical strategies, student background, and school climate through questionnaires and classroom observations. Third, we matched our data to results from the Luxembourg School Monitoring Programme ÉpStan (LUCET, 2021). We selected the variables based on learning models focusing on aspects such as school organization or classroom management (e.g., Hattie, 2008; Helmke et al., 2008; Klieme et al., 2001). We further investigated specificities of the Luxembourgish school system that are not represented in international school learning models (such as the division into two-year learning cycles, the multilingual school setting, or the diverse student population). We will discuss the SIVA project, its goals, and its data collection, which yielded observations in 49 classrooms and questionnaires from over 500 second graders, their parents, and their teachers, as well as from school presidents and regional directors.

Literature
Hadjar, A., & Backes, S. (2021). Bildungsungleichheiten am Übergang in die Sekundarschule in Luxemburg. https://doi.org/10.48746/BB2021LU-DE-21A
Hattie, J. (2008). Visible Learning: A synthesis of over 800 meta-analyses relating to achievement. Routledge. https://doi.org/10.4324/9780203887332
Helmke, A., Rindermann, H., & Schrader, F.-W. (2008). Wirkfaktoren akademischer Leistungen in Schule und Hochschule [Determinants of academic achievement in school and university]. In M. Schneider & M. Hasselhorn (Eds.), Handbuch der pädagogischen Psychologie (Vol. 10, pp. 145–155). Hogrefe.
Klieme, E., Schümer, G., & Knoll, S. (2001). Mathematikunterricht in der Sekundarstufe I: “Aufgabenkultur” und Unterrichtsgestaltung. TIMSS - Impulse für Schule und Unterricht, 43–57.
LUCET. (2021). Épreuves Standardisées (ÉpStan). https://epstan.lu
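For orientation, the general logic behind a school VA score can be written as the school-level mean residual of a regression of current achievement on prior achievement and background covariates. The following is a generic, simplified formulation, not the exact SIVA specification:

\hat{y}_{ij} = \beta_0 + \beta_1\,\mathrm{prior}_{ij} + \boldsymbol{\beta}_2^{\top}\mathbf{x}_{ij},
\qquad
\mathrm{VA}_j = \frac{1}{n_j}\sum_{i=1}^{n_j}\bigl(y_{ij} - \hat{y}_{ij}\bigr),

where y_{ij} is the achievement of student i in school j, x_{ij} collects background covariates, and n_j is the number of students in school j. Schools with VA_j clearly above (or below) zero over consecutive cohorts are the candidates for the stable high (or low) VA groups described above.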

Full Text
Peer Reviewed
How sensitive are the evaluations of a school's effectiveness to the selection of covariates in the applied value-added model?
Levy, Jessica UL; Brunner, Martin; Keller, Ulrich UL et al

in Educational Assessment, Evaluation and Accountability (2022)

There is no final consensus regarding which covariates should be used (in addition to prior achievement) when estimating value-added (VA) scores to evaluate a school’s effectiveness. Therefore, we examined the sensitivity of evaluations of schools’ effectiveness in math and language achievement to covariate selection in the applied VA model. Four covariate sets were systematically combined, including prior achievement from the same or a different domain, sociodemographic and sociocultural background characteristics, and domain-specific achievement motivation. School VA scores were estimated using longitudinal data from the Luxembourg School Monitoring Programme with some 3,600 students attending 153 primary schools in Grades 1 and 3. VA scores varied considerably, despite high correlations between VA scores based on the different sets of covariates (.66 < r < 1.00). The explained variance and consistency of school VA scores substantially improved when including prior math and prior language achievement in VA models for math, and prior language achievement together with sociodemographic and sociocultural background characteristics in VA models for language. These findings suggest that prior achievement in the same subject, the most commonly used covariate to date, may be insufficient to control for between-school differences in student intake when estimating school VA scores. We thus recommend using VA models with caution and applying VA scores for informative purposes rather than as a means to base accountability decisions upon.
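To make the covariate-sensitivity check concrete, the sketch below estimates school VA scores under two covariate sets on simulated data and correlates them. Variable names and data are illustrative assumptions, not the study's actual estimation pipeline.

import numpy as np
import pandas as pd
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(0)
n_schools, n_per_school = 150, 25
school = np.repeat(np.arange(n_schools), n_per_school)
prior_math = rng.normal(size=school.size)
prior_lang = rng.normal(size=school.size)
ses = rng.normal(size=school.size)                        # socioeconomic background (simulated)
school_effect = rng.normal(scale=0.3, size=n_schools)[school]
math_g3 = (0.6 * prior_math + 0.2 * prior_lang + 0.3 * ses
           + school_effect + rng.normal(scale=0.7, size=school.size))

def school_va(X, y, school):
    # VA score = school-level mean residual of an OLS prediction
    resid = y - LinearRegression().fit(X, y).predict(X)
    return pd.Series(resid).groupby(school).mean()

va_prior_only = school_va(prior_math.reshape(-1, 1), math_g3, school)
va_full_set = school_va(np.column_stack([prior_math, prior_lang, ses]), math_g3, school)
print(va_prior_only.corr(va_full_set))                            # high overall agreement ...
print((va_prior_only.rank() - va_full_set.rank()).abs().max())    # ... yet single schools can shift rank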

Peer Reviewed
Are Value-Added Scores Stable Enough for High-Stakes Decisions?
Emslander, Valentin UL; Levy, Jessica UL; Scherer, Ronny et al

Scientific Conference (2022, March)

Theoretical Background: Can we quantify the effectiveness of a teacher or a school with a single number? Researchers in the field of value-added (VA) models may argue just that (e.g., Chetty et al., 2014; Kane et al., 2013). VA models are widely used for accountability purposes in education and quantify the value a teacher or a school adds to their students’ achievement. For this purpose, these models predict achievement over time and attempt to control for factors that cannot be influenced by schools or teachers (i.e., sociodemographic and sociocultural background). Following this logic, what is left must be due to teacher or school differences (see, e.g., Braun, 2005). To utilize VA models for high-stakes decision-making (e.g., teachers’ tenure, the allocation of funding), these models would need to be highly stable over time. School-level stability over time, however, has hardly been researched, and the existing findings are mixed, with some studies indicating high stability of school VA scores over time (Ferrão, 2012; Thomas et al., 2007) and others reporting a lack of stability (e.g., Gorard et al., 2013; Perry, 2016). Furthermore, as there is no consensus on which variables to use as independent or dependent variables in VA models (Everson, 2017; Levy et al., 2019), the stability of VA could vary between different outcome measures (e.g., language or mathematics). If VA models lack stability over time and across outcome measures, their use as the primary information for high-stakes decision-making is in question, and the inferences drawn from them could be compromised.

Questions: With these uncertainties in mind, we examine the stability of school VA scores over time and investigate the differences between language and mathematics achievement as outcome variables. Additionally, we demonstrate the real-life implications of (in)stable VA scores for single schools and point out an alternative, more constructive use of school VA models in educational research.

Method: To study the stability of VA scores at the school level over time and across outcomes, we drew on a sample of 146 primary schools, using representative longitudinal data from the standardized achievement tests of the Luxembourg School Monitoring Programme (LUCET, 2021). These schools included a heterogeneous and multilingual sample of 7,016 students. To determine the stability of VA scores in mathematics and in languages over time, we based our analysis on two longitudinal datasets (from 2015 to 2017 and from 2017 to 2019, respectively) and generated two VA scores per dataset, one for language and one for mathematics achievement. We further analyzed how many schools displayed stable VA scores in the respective outcomes over two years and compared the rank correlations of VA scores with language and with mathematics achievement as the outcome variable.

Results and Their Significance: Only 34–38% of the schools showed stable VA scores from Grade 1 to Grade 3, with moderate rank correlations of r = .37 for language and r = .34 for mathematics achievement. We therefore discourage using VA models as the only information for high-stakes educational decisions. Nonetheless, we argue that VA models could be employed to find genuinely effective teaching or school practices, especially in heterogeneous student populations such as Luxembourg’s, in which educational disparities are an important topic already in primary school (Hoffmann et al., 2018). Consequently, we contrast school climate and instructional quality, which might drive the differences between schools with stable high vs. low VA scores.

Literature
Braun, H. (2005). Using student progress to evaluate teachers: A primer on value-added models. Educational Testing Service.
Chetty, R., Friedman, J. N., & Rockoff, J. E. (2014). Measuring the impacts of teachers I: Evaluating bias in teacher value-added estimates. American Economic Review, 104(9), 2593–2632. https://doi.org/10.1257/aer.104.9.2593
Everson, K. C. (2017). Value-added modeling and educational accountability: Are we answering the real questions? Review of Educational Research, 87(1), 35–70. https://doi.org/10.3102/0034654316637199
Ferrão, M. E. (2012). On the stability of value added indicators. Quality & Quantity, 46(2), 627–637. https://doi.org/10.1007/s11135-010-9417-6
Gorard, S., Hordosy, R., & Siddiqui, N. (2013). How unstable are “school effects” assessed by a value-added technique? International Education Studies, 6(1), 1–9. https://doi.org/10.5539/ies.v6n1p1
Kane, T. J., McCaffrey, D. F., Miller, T., & Staiger, D. O. (2013). Have we identified effective teachers? Validating measures of effective teaching using random assignment. MET Project research paper. Bill & Melinda Gates Foundation. https://files.eric.ed.gov/fulltext/ED540959.pdf
Levy, J., Brunner, M., Keller, U., & Fischbach, A. (2019). Methodological issues in value-added modeling: An international review from 26 countries. Educational Assessment, Evaluation and Accountability, 31(3), 257–287. https://doi.org/10.1007/s11092-019-09303-w
LUCET. (2021). Épreuves Standardisées (ÉpStan). https://epstan.lu
Perry, T. (2016). English value-added measures: Examining the limitations of school performance measurement. British Educational Research Journal, 42(6), 1056–1080. https://doi.org/10.1002/berj.3247
Thomas, S., Peng, W. J., & Gray, J. (2007). Modelling patterns of improvement over time: Value added trends in English secondary school performance across ten cohorts. Oxford Review of Education, 33(3), 261–295. https://doi.org/10.1080/03054980701366116
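As a concrete illustration of the stability check, one can rank-correlate the school VA scores from the two longitudinal windows. The snippet below uses simulated numbers and is only a sketch of the analysis logic, not the study's code.

import numpy as np
from scipy.stats import spearmanr

rng = np.random.default_rng(1)
va_2015_2017 = rng.normal(size=146)                          # VA scores, first window (simulated)
va_2017_2019 = 0.4 * va_2015_2017 + rng.normal(size=146)     # partially stable second window
rho, p = spearmanr(va_2015_2017, va_2017_2019)
print(f"rank correlation across windows: {rho:.2f} (p = {p:.3f})")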

Full Text
Peer Reviewed
Résultats du monitoring scolaire national ÉpStan dans le contexte de la pandémie de COVID-19
Fischbach, Antoine UL; Colling, Joanne UL; Levy, Jessica UL et al

in LUCET; SCRIPT (Eds.) Rapport national sur l’éducation au Luxembourg 2021 (2021)

Peer Reviewed
Résultats du monitoring scolaire national ÉpStan dans le contexte de la pandémie de COVID-19 (Matériels supplémentaires)
Fischbach, Antoine UL; Colling, Joanne UL; Levy, Jessica UL et al

in LUCET; SCRIPT (Eds.) Rapport national sur l’éducation au Luxembourg 2021 (2021)

Peer Reviewed
Befunde aus dem nationalen Bildungsmonitoring ÉpStan vor dem Hintergrund der COVID-19-Pandemie (Supplement)
Fischbach, Antoine UL; Colling, Joanne UL; Levy, Jessica UL et al

in LUCET; SCRIPT (Eds.) Nationaler Bildungsbericht Luxemburg 2021 (2021)

Full Text
Peer Reviewed
Befunde aus dem nationalen Bildungsmonitoring ÉpStan vor dem Hintergrund der COVID-19-Pandemie
Fischbach, Antoine UL; Colling, Joanne UL; Levy, Jessica UL et al

in LUCET; SCRIPT (Eds.) Nationaler Bildungsbericht Luxemburg 2021 (2021)

Peer Reviewed
The Impact of the COVID-19 Pandemic on the Luxembourgish Education System: Differences between students based on background characteristics in elementary and secondary school
Fischbach, Antoine UL; Colling, Joanne UL; Levy, Jessica UL et al

Scientific Conference (2021, November)

Policy responses to the COVID-19 pandemic (e.g., school closure, home-schooling) have affected students at various stages of education all over the world and were found to increase inequalities in academic achievement (OECD, 2021). The present study is based on fully representative large-scale data from the Luxembourg School Monitoring Programme (Épreuves Standardisées; ÉpStan; LUCET, 2021). The ÉpStan assess key competencies of primary and secondary school students in different subjects (e.g., German, French, and math). To allow a fair performance comparison, students’ socio-economic and socio-cultural backgrounds (e.g., gender, migration, and language background) are systematically taken into consideration. The ÉpStan 2020 comprise data from approximately 25,000 students in five grades (elementary and secondary school), from 15,000 parents (elementary school), and comparative data from 160,000 students from previous cohorts, thus providing key empirical findings on the pandemic’s impact on the Luxembourgish education system. In the present contribution, we analyze (a) how the results of standardized achievement tests compare to those of previous cohorts, taking students’ socio-economic and socio-cultural background into consideration, and (b) how parents and students perceived home-schooling with regard to aspects such as coping, technical equipment, motivation, or contact with teachers. First results indicate that in Grades 1, 5, 7, and 9, standardized achievement scores were generally stable in comparison to previous years. However, in Grade 3, students’ competency scores in German listening comprehension (German being the primary language of instruction in elementary school) worsened substantially. Furthermore, third graders from socio-economically disadvantaged households and/or third graders who do not speak Luxembourgish or German at home did worse in German reading comprehension than their peers from socio-economically advantaged households and/or peers who speak Luxembourgish or German at home. Concerning the perception of home-schooling, students coped rather well with the situation, with German being a bit more challenging in primary school and math in secondary school. Findings concerning motivation and enjoyment of home-schooling were mixed: primary school students’ motivation was comparable to the regular school setting, but approximately half of the secondary school students were less motivated than in the regular school setting. Furthermore, all households seem to have been well equipped, with the situation being slightly more favorable in socio-economically advantaged households. For the majority of students, contact with teachers was frequent, with teachers having adapted their type of support to the needs of their students (e.g., more personal contact with students from socio-economically disadvantaged households). To conclude, no systematic negative trend has been identified in students’ achievement scores. Only German listening comprehension in Grade 3 has worsened substantially, and these skills should therefore be fostered as early as possible. Overall, students coped rather well with home-schooling without, however, particularly enjoying it. While students entering the pandemic with favorable background characteristics (e.g., higher socio-economic status, speaking a language of instruction at home) managed better both in terms of competencies and in their perception of home-schooling, students with less favorable background characteristics received more differentiated support. These findings underline that already existing inequalities in the Luxembourgish school system have in part been intensified by the pandemic.

References
LUCET. (2021). Épreuves Standardisées (ÉpStan). https://epstan.lu
OECD. (2021). The State of School Education: One Year into the COVID Pandemic. OECD Publishing. https://doi.org/10.1787/201dde84-en

Peer Reviewed
Stability of Primary School Value-Added Scores over Time: A Comparison Between Math and Language Achievement as Outcome Variables
Emslander, Valentin UL; Levy, Jessica UL; Scherer, Ronny et al

Scientific Conference (2021, November)

Value-added (VA) models are widely used for accountability purposes in education. Tracking a teacher’s or a school’s VA score over time often forms the basis for high-stakes decision-making and can determine whether teachers keep their jobs or whether schools receive certain funding. Despite this high-stakes application, the stability of VA scores over time has not yet been investigated for primary schools. Moreover, it is unclear whether different outcome measures (e.g., language and mathematics) may differ in their stability over time. In the present study, we aimed to clarify the stability of VA scores over time and investigate differences across outcome variables. Furthermore, we wanted to showcase the real-life implications of (in)stable VA scores for single schools, with a focus on an informative rather than an evaluative use of VA scores. The exploration of school VA scores in primary schools is especially relevant for heterogeneous student populations, for instance, in Luxembourg. Thus, we drew on representative longitudinal data from the standardized achievement tests of the Luxembourg School Monitoring Programme and examined the stability of school VA scores over two years in 146 schools (N = 7,016 students). The overall stability, as measured by correlation coefficients, was moderate, with r = .37 for VA scores in language and r = .34 for VA scores in mathematics from Grade 1 to Grade 3. Real-life implications for schools will be discussed.
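One way to express this stability for individual schools, complementary to the correlation coefficients reported above, is to sort schools into VA categories per assessment window and count how many keep their category. The snippet below is an illustrative sketch on simulated data; the tercile split is one possible choice, not necessarily the study's.

import numpy as np
import pandas as pd

rng = np.random.default_rng(2)
va = pd.DataFrame({"t1": rng.normal(size=146)})               # VA scores, first window (simulated)
va["t2"] = 0.4 * va["t1"] + rng.normal(size=146)              # partially stable second window
cat_t1 = pd.qcut(va["t1"], 3, labels=["low", "medium", "high"])
cat_t2 = pd.qcut(va["t2"], 3, labels=["low", "medium", "high"])
stable_share = (cat_t1 == cat_t2).mean()
print(f"schools keeping their VA category across windows: {stable_share:.0%}")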

Full Text
Peer Reviewed
Stability of Value-Added Models: Comparing Classical and Machine Learning Approaches
Emslander, Valentin UL; Levy, Jessica UL; Scherer, Ronny et al

Scientific Conference (2021, September)

Background: What is the value that teachers or schools add to the evolution of students’ performance? Value-added (VA) modeling aims to answer this question by quantifying the effect of pedagogical actions on students’ achievement, independent of students’ backgrounds (e.g., Braun, 2005). A plethora of VA models exist, and several outcome measures are in use to estimate VA scores, yet without consensus on the model specification (Everson, 2017; Levy et al., 2019). Furthermore, it is unclear whether the most frequently used VA models (i.e., multilevel, linear regression, and random forest models) and outcome measures (i.e., language and mathematics achievement) indicate a similar stability of VA scores over time.

Objectives: Drawing on data from a highly diverse and multilingual school setting, where leveling out the influence of students’ backgrounds is of special interest, we aim to (a) clarify the stability of school VA scores over time; (b) shed light on the sensitivity toward different statistical models and outcome variables; and (c) evaluate the practical implications of (in)stable VA scores for individual schools.

Method: Utilizing the representative, longitudinal data from the Luxembourg School Monitoring Programme (LUCET, 2021), we examined the stability of school VA scores. We drew on two longitudinal datasets of students who participated in the standardized achievement tests in Grade 1 in 2014 or 2016 and then again in Grade 3 two years later (i.e., in 2016 and 2018, respectively), with a total of 5,875 students in 146 schools. School VA scores were calculated using classical approaches (i.e., linear regression and multilevel models) and one of the most commonly used machine learning approaches in educational research (i.e., random forests).

Results and Discussion: The overall stability over time across the VA models was moderate, with multilevel models showing greater stability than linear regression models and random forests. Stability differed across outcome measures and was higher for VA models with language achievement as the outcome variable than for those with mathematics achievement. Practical implications for schools and teachers will be discussed.
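The multilevel variant can be sketched as a random-intercept model in which each school's estimated intercept serves as its VA score. The code below is a simplified illustration on simulated data with hypothetical variable names, not the study's exact specification.

import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(3)
n_schools, n_per_school = 146, 40
df = pd.DataFrame({
    "school": np.repeat(np.arange(n_schools), n_per_school),
    "grade1": rng.normal(size=n_schools * n_per_school),
})
true_effect = rng.normal(scale=0.3, size=n_schools)
df["grade3"] = 0.7 * df["grade1"] + true_effect[df["school"]] + rng.normal(scale=0.7, size=len(df))

# Random school intercept; its estimate per school is taken as the VA score.
fit = smf.mixedlm("grade3 ~ grade1", data=df, groups=df["school"]).fit()
va_scores = pd.Series({school: eff["Group"] for school, eff in fit.random_effects.items()})
print(va_scores.sort_values().head())   # schools with the lowest estimated VA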

Full Text
Tertium non datur: Various aspects of value-added (VA) models used as measures of educational effectiveness
Levy, Jessica UL

Doctoral thesis (2020)

Value-added (VA) models are used as measures of educational effectiveness that aim to capture the “value” added by teachers or schools to students’ achievement, independent of students’ backgrounds. Statistically speaking, teacher or school VA scores are calculated as the part of an outcome variable that cannot be explained by the covariates in the VA model (i.e., the residual). Teachers or schools are classified as effective (or ineffective) if they have a positive (or negative) effect on students’ achievement compared to a previously specified norm value. Although VA models have gained popularity in recent years, there is a lack of consensus concerning various aspects of VA scores. The present dissertation aims to shed light on these aspects, including the state of the art of VA research in the international literature, covariate choice, and model selection for the estimation of VA scores. In a first step, a systematic literature review was conducted, in which 370 studies from 26 countries were classified, focusing on methodological issues (Study 1 of the present dissertation). Results indicated no consensus concerning the applied statistical model type (the majority applied a linear regression, followed by multilevel models). Concerning covariate choice, most studies used prior achievement as a covariate, cognitive and/or motivational student data were hardly considered, and there was no consensus on the inclusion or exclusion of students’ background variables. Based on these findings, it was suggested that VA models are better suited to improve the quality of teaching than for accountability and decision-making purposes. Secondly, based on one of the open questions resulting from Study 1 (i.e., covariate choice), the aim of Study 2 was to systematically compare different covariate combinations in the estimation of school VA models. Based on longitudinal data from primary school students participating in the Luxembourg School Monitoring Programme in Grades 1 and 3, three covariate sets were found to be essential when calculating school VA scores with math or language achievement as the dependent variable: prior language achievement, prior math achievement, and students’ sociodemographic and sociocultural background. However, the evaluation of individual schools’ effectiveness varied widely depending on the covariate set that was chosen, casting further doubt on the use of VA scores for accountability purposes. Thirdly, the aim of Study 3 was to investigate statistical model selection, as Study 1 showed no consensus on which model types are most suitable for the estimation of VA scores, with the majority of studies applying linear regression or multilevel models. These classical linear models, along with nonlinear models and different types of machine learning models, were systematically compared to each other. Covariates were kept constant (based on the results from Study 2) across models. Multilevel models led to the most accurate prediction of students’ achievement. However, as school VA scores varied depending on specific model choices, and as these results can only be generalized to a Luxembourgish sample, it was suggested for future research that the model selection process be made transparent and include different specifications in order to obtain ranges of potential VA scores. In conclusion, all three studies imply that the application of VA models for decision-making and accountability should be critically discussed and that VA scores should not be used as the only measure for accountability or high-stakes decisions. In addition, it can be concluded that VA scores are more suitable for informative purposes. Thus, the findings from the present dissertation prepare the ground for future research, in which schools with stable high VA scores can be investigated further (both qualitatively and quantitatively) to study their pedagogical strategies and learn from them.
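To relate the two most common specifications mentioned above, a generic contrast between the regression-based and the multilevel formulation of a school VA score can be written as follows. This is a simplified, textbook-style sketch, not the dissertation's exact models:

Linear regression: y_{ij} = \beta_0 + \boldsymbol{\beta}^{\top}\mathbf{x}_{ij} + \varepsilon_{ij}, \qquad \mathrm{VA}_j^{\mathrm{reg}} = \frac{1}{n_j}\sum_{i=1}^{n_j}\hat{\varepsilon}_{ij}

Multilevel model: y_{ij} = \gamma_0 + \boldsymbol{\gamma}^{\top}\mathbf{x}_{ij} + u_j + \varepsilon_{ij}, \quad u_j \sim \mathcal{N}(0, \tau^2), \qquad \mathrm{VA}_j^{\mathrm{ml}} = \hat{u}_j

where x_{ij} contains the covariates (prior achievement and background variables), \hat{\varepsilon}_{ij} are the estimated residuals, and \hat{u}_j is the estimate of the school-specific random intercept, which shrinks estimates for small schools toward zero.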

Peer Reviewed
Tackling educational inequalities using school effectiveness measures
Levy, Jessica UL; Mussack, Dominic UL; Brunner, Martin et al

Scientific Conference (2020, November 11)

Full Text
Peer Reviewed
Contrasting Classical and Machine Learning Approaches in the Estimation of Value-Added Scores in Large-Scale Educational Data
Levy, Jessica UL; Mussack, Dominic UL; Brunner, Martin et al

in Frontiers in Psychology (2020), 11

There is no consensus on which statistical model estimates school value-added (VA) most accurately. To date, the two most common statistical models used for the calculation of VA scores are classical methods: linear regression and multilevel models. These models have the advantage of being relatively transparent and thus understandable for most researchers and practitioners. However, these statistical models are bound to certain assumptions (e.g., linearity) that might limit their prediction accuracy. Machine learning methods, which have yielded spectacular results in numerous fields, may be a valuable alternative to these classical models. Although big data is not new in general, it is relatively new in the realm of social sciences and education. New types of data require new data-analytical approaches. Such techniques have already evolved in fields with a long tradition in crunching big data (e.g., gene technology). The objective of the present paper is to apply these “imported” techniques to education data, more precisely VA scores, and to assess when and how they can extend or replace the classical psychometrics toolbox. The different models include linear and non-linear methods and extend classical models with the most commonly used machine learning methods (i.e., random forest, neural networks, support vector machines, and boosting). We used representative data of 3,026 students in 153 schools who took part in the standardized achievement tests of the Luxembourg School Monitoring Programme in Grades 1 and 3. Multilevel models outperformed classical linear and polynomial regressions, as well as different machine learning models. However, across all schools, school VA scores from the different model types correlated highly. Yet, the percentage of disagreements compared to multilevel models was not trivial, and real-life implications for individual schools may still be dramatic depending on the model type used. Implications of these results and possible ethical concerns regarding the use of machine learning methods for decision-making in education are discussed.
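To illustrate the kind of agreement check reported above, the snippet below correlates school VA scores from two model types and counts the schools whose sign-based classification (above vs. below expectation) flips. The numbers are simulated placeholders and the variable names are assumptions, not the paper's results.

import numpy as np
import pandas as pd

rng = np.random.default_rng(4)
va_multilevel = pd.Series(rng.normal(size=153))                           # placeholder VA scores
va_random_forest = 0.9 * va_multilevel + rng.normal(scale=0.3, size=153)  # alternative model type
print(f"correlation between model types: {va_multilevel.corr(va_random_forest):.2f}")
flipped = (np.sign(va_multilevel) != np.sign(va_random_forest)).mean()
print(f"schools classified differently (positive vs. negative VA): {flipped:.0%}")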

Peer Reviewed
Similarities and differences of value-added scores from models with different covariates: A cluster analysis
Levy, Jessica UL; Brunner, Martin; Keller, Ulrich UL et al

Scientific Conference (2019, November 06)

Peer Reviewed
Value-added models: To what extent do estimates of school effectiveness depend on the selection of covariates?
Levy, Jessica UL; Brunner, Martin; Keller, Ulrich UL et al

Scientific Conference (2019, September)
