References of "Krkovic, Katarina 50002141"
Peer Reviewed
ATC21S and OECD PISA. Comparative approaches to the assessment of collaborative problem solving in Germany and Australia
Griffin, P.; Greiff, Samuel UL; Care, E. et al

Scientific Conference (2015, September)

Peer Reviewed
Moving towards the assessment of collaborative problem solving skills with a tangible user interface
Ras, Eric; Krkovic, Katarina UL; Greiff, Samuel UL et al

in Turkish Online Journal of Educational Technology (2014), 13(4), 95-104

The research on the assessment of collaborative problem solving (ColPS), as one crucial 21st Century Skill, is still in its infancy. Using Tangible User Interfaces (TUI) for this purpose has only been marginally investigated in technology-based assessment. Our first empirical studies focused on lightweight performance measurements, usability, user experience, and gesture analysis to increase our understanding of how people interact with TUI in an assessment context. In this paper, we propose a research agenda for assessing the ColPS of individuals using the microworlds methodology implemented on TUIs. In a first example item, we use so-called MicroDYN items, which are independent microworld scenarios that rely on structural linear equations as the underlying model. As the MicroDYN approach has been thoroughly investigated empirically for the assessment of complex problem solving of individuals, it offers a good basis for a reliable and valid assessment. We describe how this approach was applied to create an assessment item for a collaborative setting. The item described in this paper implements a simplified model of a MicroDYN item related to climate change, drawing on knowledge from previous studies. The focus of the item's construction therefore lies on meeting the requirements for a standardised, high-quality assessment. Finally, a research agenda is proposed to sketch the main research issues.
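The structural linear equations behind MicroDYN-style items can be illustrated with a minimal sketch (all variable names and coefficients below are hypothetical, not taken from any actual item): each round, the microworld's output variables are updated as a linear function of their current values, where eigendynamics and indirect effects sit in the matrix A, and of the user-controlled inputs, where the degree of connectivity between inputs and outputs sits in the matrix B.

```python
import numpy as np

# Hypothetical 2-input, 2-output microworld in the MicroDYN style:
#   y_{t+1} = A @ y_t + B @ x_t
A = np.array([[1.0, 0.0],
              [0.0, 1.2]])   # 1.2: output Y2 feeds back on itself
                             # (eigendynamics / indirect effect)
B = np.array([[2.0, 0.0],
              [0.0, 1.0]])   # connectivity: X1 -> Y1, X2 -> Y2

def step(y, x):
    """Advance the microworld by one round."""
    return A @ y + B @ x

y = np.zeros(2)
inputs = [np.array([1.0, 0.0]),   # push input X1
          np.array([0.0, 1.0]),   # push input X2
          np.array([0.0, 0.0])]   # hands off: Y2 still changes
for x in inputs:
    y = step(y, x)
# final state: y == [2.0, 1.2] -- Y2 drifted without input,
# which is the behaviour the test taker must detect
```

The zero-input round is the standard strategy for detecting eigendynamics: any change observed while all inputs are at rest must come from the system itself.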

Peer Reviewed
How gender influences performance assessment. Teacher-student gender interaction in focus.
Krkovic, Katarina UL; Greiff, Samuel UL; Kupiainen, Sirkku et al

Scientific Conference (2014, August)

Collaborative problem solving. Concept, assessment, and first results.
Krkovic, Katarina UL; Greiff, Samuel UL

Scientific Conference (2014, April 30)

New technologies in psychological assessment. The example of computer-based collaborative problem solving assessment.
Krkovic, Katarina UL; Pasztor-Kovacs, Anita; Molnar, Gyöngyvér et al

in International Journal of e-Assessment (2014), 1

Peer Reviewed
Teacher evaluation of student ability: what roles do teacher gender, student gender, and their interaction play?
Krkovic, Katarina UL; Greiff, Samuel UL; Kupiainen, Sirkku et al

in Educational Research (2014)

Background: Recent decades have been marked by an extensive movement to analyze bias in people's thinking, especially in gender-related issues. Studies have addressed the question of gender bias in classrooms on different levels: the use of gender in books, learning opportunities determined by students' gender, or teachers' gender preferences. Purpose: In this study, we aim to answer the question of whether and under which circumstances the interaction between teacher gender and student gender positively or negatively influences teachers' evaluations of students' performance, while controlling for objective measures of students' performance. For instance, a teacher might evaluate same-gender students as better than opposite-gender students, independent of their objective performance. Sample: The sample consisted of n > 1,500 Finnish 6th-grade students (Mage = 12.67) and their respective class teachers. Design and methods: Students completed several academic skills tests, including a mathematical thinking test, a reading comprehension test, and a scientific reasoning test. Furthermore, teachers evaluated each student's performance in different school subjects and answered questions regarding the student's probability of academic success. To test whether the teacher-student gender interaction had an effect on the criterion variable, i.e. teachers' evaluation of the students' performance, multilevel analyses accounting for between- and within-class effects were applied. The effect of students' objective performance on teachers' evaluations and the main effects of gender were controlled for as covariates. Results: The main results indicated that the interaction between student and teacher gender did not influence teachers' evaluation of the students. However, regardless of their own gender, teachers tended to evaluate girls as better than boys in first-language performance (i.e. Finnish language) and in potential for success in school. Teacher gender did not influence the evaluation. Conclusions: The results of the study suggest that the interaction between teacher and student gender is unlikely to be a source of possible bias in the evaluations of students in the Finnish educational system.
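The interaction check at the heart of this design can be illustrated with a simplified sketch (simulated, noise-free data and a plain least-squares fit rather than the multilevel analyses the study actually used; all variable names and coefficients are hypothetical): when evaluations are generated from objective performance plus a student-gender main effect but no teacher-student gender interaction, the fitted interaction coefficient comes out at zero while the main effects are recovered.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated, noise-free data (illustrative only): evaluation depends on
# objective performance and a student-gender main effect, but NOT on
# teacher gender or the teacher-student gender interaction.
n = 200
obj = rng.uniform(0.0, 1.0, n)             # objective test performance
sg = rng.integers(0, 2, n).astype(float)   # student gender (1 = girl)
tg = rng.integers(0, 2, n).astype(float)   # teacher gender (1 = female)
evaluation = 0.5 + 1.0 * obj + 0.3 * sg    # no tg and no tg*sg term

# Least-squares fit of: evaluation ~ 1 + obj + tg + sg + tg:sg
X = np.column_stack([np.ones(n), obj, tg, sg, tg * sg])
coef, *_ = np.linalg.lstsq(X, evaluation, rcond=None)
interaction = coef[4]   # recovered tg:sg coefficient, ~0 here
```

In the study itself, the grouping of students within classes makes ordinary least squares inappropriate, which is why between- and within-class effects were modelled explicitly; the sketch only shows what "no interaction, but a girl main effect" looks like in the coefficients.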

Peer Reviewed
The systematic variation of task characteristics facilitates the understanding of task difficulty: A cognitive diagnostic modeling approach to complex problem solving
Greiff, Samuel UL; Krkovic, Katarina UL; Nagy, Gabriel

in Psychological Test and Assessment Modeling (2014), 56(1), 83-103

Since the 1960s, when pioneering research on Item Response Theory (IRT) was published, considerable progress has been made with regard to the psychometric quality of psychological assessment tools. One recent development building upon IRT is the introduction of Cognitive Diagnostic Modeling (CDM). The major goal of introducing CDM was to develop methods that allow for examining which cognitive processes are involved when a person is working on a specific assessment task. More precisely, CDM enables researchers to investigate whether assumed task characteristics drive item difficulty and, thus, person ability parameters. This may, at least according to the assumption inherent in CDM, allow conclusions about the cognitive processes involved in assessment tasks. In this study, the Least Square Distance Method (LSDM; Dimitrov, 2012), one of the numerous CDMs available, was applied to investigate the psychometric qualities of an assessment instrument measuring Complex Problem Solving (CPS) skills. For the purpose of the study, two task characteristics essential for mastering CPS tasks were identified ex ante: the degree of connectivity and the presence of indirect effects introduced by adding eigendynamics to the task. The study examined whether and how the two hypothesized task characteristics drive the item difficulty of two CPS dimensions, knowledge acquisition and knowledge application. The sample consisted of 490 German high school students, who completed the computer-based CPS assessment instrument MicroDYN. The two task characteristics in the MicroDYN items were varied systematically. Results obtained with LSDM indicated that the two hypothesized task characteristics, degree of connectivity and introduction of indirect effects, drove item difficulty only for knowledge acquisition. Hence, other task characteristics that may determine the item difficulty of knowledge application need to be investigated in future studies in order to provide a sound measurement of CPS.

Peer Reviewed
Komplexe Problemlösekompetenz. Einfluss von Alter und Schulform [Complex problem-solving competence. The influence of age and school type]
Rudolph, Julia UL; Krkovic, Katarina UL; Greiff, Samuel UL

Scientific Conference (2013, September)

Peer Reviewed
Geschlechterbias in der Bildung. Die Interaktion der Geschlechter als Ursache [Gender bias in education. Gender interaction as a cause]
Krkovic, Katarina UL; Greiff, Samuel UL

Scientific Conference (2013, September)

Peer Reviewed
New technologies in psychological assessment: The example of computer-based collaborative problem solving assessment
Krkovic, Katarina UL; Pasztor-Kovacs, A.; Molnar, G. et al

Scientific Conference (2013, July 09)

Collaborative Problem Solving: current assessment possibilities and issues.
Krkovic, Katarina UL; Rudolph, Julia UL; Greiff, Samuel UL

Scientific Conference (2013, April 16)
