References of "Greiff, Samuel 50001890"
Challenges for education in a connected world. Inaugural to the special issue Digital learning, data rich environments, and computer-based assessment
Ifenthaler, D.; Adcock, A. B.; Erlandson, B. E. et al

in Technology, Knowledge and Learning (2014), 19

Assessment with microworlds: factor structure, invariance, and latent mean comparison of the MicroDYN test
Greiff, Samuel; Wüstenberg, Sascha

in European Journal of Psychological Assessment (2014), 30

Editorial to the special section Computer-based assessment of cross-curricular skills and processes
Greiff, Samuel; Martin, Romain; Spinath, Birgit

in Journal of Educational Psychology (2014), 106

Editorial zum Themenheft Problemlösen in der Pädagogischen Psychologie [Editorial to the special issue on problem solving in educational psychology]
Greiff, Samuel; Kretzschmar, André; Leutner, Detlev

in Zeitschrift für Pädagogische Psychologie (2014), 28

Explaining Complex Problem Solving with a set of non-curricular cognitive competence tasks and task interest in low-stakes assessment
Kupiainen, Sirkku; Vainikainen, M. P.; Hautamäki, Jarkko et al

Presentation (2014)

The systematic variation of task characteristics facilitates the understanding of task difficulty: A cognitive diagnostic modeling approach to complex problem solving
Greiff, Samuel; Krkovic, Katarina; Nagy, Gabriel

in Psychological Test and Assessment Modeling (2014), 56(1), 83-103

Since the 1960s, when pioneering research on Item Response Theory (IRT) was published, considerable progress has been made with regard to the psychometric quality of psychological assessment tools. One recent development building upon IRT is the introduction of Cognitive Diagnostic Modeling (CDM). The major goal of introducing CDM was to develop methods that allow for examining which cognitive processes are involved when a person is working on a specific assessment task. More precisely, CDM enables researchers to investigate whether assumed task characteristics drive item difficulty and, thus, person ability parameters. This may, at least according to the assumption inherent in CDM, allow conclusions about the cognitive processes involved in assessment tasks. In this study, the Least Square Distance Method (LSDM; Dimitrov, 2012), one of the numerous CDMs available, was applied to investigate the psychometric qualities of an assessment instrument measuring Complex Problem Solving (CPS) skills. For the purpose of the study, two task characteristics essential for mastering CPS tasks were identified ex ante: the degree of connectivity and the presence of indirect effects introduced by adding eigendynamics to the task. The study examined whether and how the two hypothesized task characteristics drive the item difficulty of two CPS dimensions, knowledge acquisition and knowledge application. The sample consisted of 490 German high school students who completed the computer-based CPS assessment instrument MicroDYN, in which the two task characteristics were varied systematically. Results obtained with the LSDM indicated that the two hypothesized task characteristics, degree of connectivity and introduction of indirect effects, drove item difficulty only for knowledge acquisition. Hence, other task characteristics that may determine the item difficulty of knowledge application need to be investigated in future studies in order to provide a sound measurement of CPS.
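
To illustrate the Q-matrix logic behind such cognitive diagnostic analyses, the sketch below codes a few hypothetical MicroDYN-style items on the two task characteristics named in the abstract and runs a deliberately crude difficulty comparison. This is a toy illustration only, not the LSDM analysis reported in the article; the item labels, codings, and difficulty values are invented.

```python
# Illustrative sketch only: a Q-matrix coding two hypothesized task
# characteristics (degree of connectivity, presence of eigendynamics)
# for invented MicroDYN-style items, plus a naive difficulty check.
# This is NOT the LSDM analysis reported in the paper.
import numpy as np

items = ["item_1", "item_2", "item_3", "item_4"]      # hypothetical items
attributes = ["high_connectivity", "eigendynamics"]   # the two characteristics

# Q-matrix: rows = items, columns = attributes (1 = characteristic present)
Q = np.array([
    [0, 0],
    [1, 0],
    [0, 1],
    [1, 1],
])

# Invented IRT difficulty parameters for the same items (higher = harder)
difficulty = np.array([-0.8, 0.1, 0.3, 1.1])

# Naive check: mean difficulty of items with vs. without each characteristic
for j, attr in enumerate(attributes):
    with_attr = difficulty[Q[:, j] == 1].mean()
    without_attr = difficulty[Q[:, j] == 0].mean()
    print(f"{attr}: mean difficulty {with_attr:+.2f} (present) "
          f"vs {without_attr:+.2f} (absent)")
```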

Extending the Assessment of Complex Problem Solving to Finite State Automata: Embracing Heterogeneity
Neubert, Jonas; Kretzschmar, André; Wüstenberg, Sascha et al

in European Journal of Psychological Assessment (2014), Advance Online Publication

Recent advancements in the assessment of Complex Problem Solving (CPS) build on the use of homogeneous tasks that enable the reliable estimation of CPS skills. The range of problems featured in established instruments such as MicroDYN is consequently limited to a specific subset of homogeneous complex problems. This restriction is problematic when looking at domain-specific examples of complex problems, which feature characteristics absent from current assessment instruments (e.g., threshold states). We propose to utilize the formal framework of Finite State Automata (FSA) to extend the range of problems included in CPS assessment. An approach based on FSA, called MicroFIN, is presented, translated into specific tasks, and empirically investigated. We conducted an empirical study (N = 576), (1) inspecting the psychometric features of MicroFIN, (2) relating it to MicroDYN, and (3) investigating the relations to a measure of reasoning (i.e., the CogAT). MicroFIN (1) exhibited adequate measurement characteristics, and multitrait-multimethod models indicated (2) the convergence of the latent dimensions measured with MicroDYN. Relations to reasoning (3) were moderate and comparable to the ones previously found for MicroDYN. Empirical results and corresponding explanations are discussed. More importantly, MicroFIN highlights the feasibility of expanding CPS assessment to a larger spectrum of complex problems.
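
For readers unfamiliar with the formalism, the sketch below shows one minimal way to represent a finite state automaton as a set of states, an input alphabet, and a transition function. The three-state toy device and its transitions are invented for illustration and are not one of the actual MicroFIN tasks, whose specifications are not given here.

```python
# Minimal finite state automaton sketch (illustration only, not a MicroFIN task).
# A deterministic FSA is defined by states, an input alphabet, a transition
# function, and a start state; in a CPS task, the solver must discover the
# transitions by interacting with the system.
from dataclasses import dataclass, field

@dataclass
class FiniteStateAutomaton:
    states: set
    alphabet: set
    transitions: dict          # (state, input) -> next state
    start: str
    current: str = field(init=False)

    def __post_init__(self):
        self.current = self.start

    def step(self, symbol: str) -> str:
        """Apply one input and return the resulting state."""
        if symbol not in self.alphabet:
            raise ValueError(f"unknown input: {symbol}")
        self.current = self.transitions[(self.current, symbol)]
        return self.current

# Invented toy device with threshold-like behavior: "up" must be applied
# twice before the state changes from "low" to "high".
toy = FiniteStateAutomaton(
    states={"low", "mid", "high"},
    alphabet={"up", "down"},
    transitions={
        ("low", "up"): "mid", ("mid", "up"): "high", ("high", "up"): "high",
        ("low", "down"): "low", ("mid", "down"): "low", ("high", "down"): "mid",
    },
    start="low",
)

print([toy.step(s) for s in ["up", "up", "down"]])  # ['mid', 'high', 'mid']
```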

Discovering Students’ Complex Problem Solving Strategies in Educational Assessment
Toth, K.; Greiff, Samuel; Wüstenberg, Sascha et al

in Proceedings of the 7th International Conference on Educational Data Mining (2014), pp. 225-228

Domain-general problem solving skills and education in the 21st century
Greiff, Samuel; Wüstenberg, Sascha; Csapo, B. et al

in Educational Research Review (2014), 13

Komplexes Problemlösen, schulfachliche Kompetenzen und ihre Relation zu Schulnoten [Complex problem solving, school competencies, and their relation to school grades]
Kretzschmar, André; Neubert, Jonas; Greiff, Samuel

in Zeitschrift für Pädagogische Psychologie (2014), 28(4), 205-215

The importance of Complex Problem Solving (CPS) within the educational context is well established. This is one of the reasons why CPS plays a prominent role in educational large-scale assessments (e.g., PISA) alongside school competencies. However, recent research on CPS and its connection to school performance did not include school competencies as such, even though they have proven to be strong predictors of school grades. Consequently, the aim of this study is to close this gap and to examine the relation between CPS and competencies in mathematics and reading. Based on a sample of N = 1908 Finnish high school students, structural equation modeling was used to analyse the relation of CPS, school competencies, and school grades. In general, the results showed incremental predictive power of CPS on school grades in mathematics and mother tongue over and above school competencies. However, differential effects showed a higher importance of CPS in the mathematics domain than in the language domain, especially when controlling for reasoning. Implications for the construct of CPS and its importance within the educational context are discussed.
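
As a generic illustration of what "incremental predictive power over and above school competencies" means in such a structural model (symbols only, not estimates from this study), the grade criterion can be written as a regression on both predictors:

```latex
% Generic structural regression illustrating incremental prediction:
% CPS adds incrementally if \beta_{2} remains substantial with the
% school competency (and, in the stricter comparison, reasoning)
% already included in the model.
\begin{equation*}
  \text{Grade} \;=\; \beta_{0}
    + \beta_{1}\,\text{SchoolCompetence}
    + \beta_{2}\,\text{CPS}
    + \zeta
\end{equation*}
```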

Analysis of students’ problem solving behavior in PISA 2012
Greiff, Samuel

Scientific Conference (2014)

New technologies in psychological assessment. The example of computer-based collaborative problem solving assessment.
Krkovic, Katarina; Pasztor-Kovacs, Anita; Molnar, Gyöngyvér et al

in International Journal of e-Assessment (2014), 1

Computer-assisted testing
Greiff, Samuel; Martin, Romain

in Mayer, L. (Ed.) Oxford Bibliographies in Education (2014)

Dynamisches Problemlösen stärkt Innovationskompetenz [Dynamic problem solving strengthens innovation competence]
Ederer, P.; Warnke, A. J.; Greiff, Samuel et al

in Rosenberg, B. (Ed.) Strategisches Personalmanagement (2014)

Digital learning, data rich environments, and computer-based assessment. Special issue
Ifenthaler, D.; Adcock, A. B.; Erlandson, B. E. et al

in Technology, Knowledge and Learning (2014)

On the relation of Complex Problem Solving, personality, fluid intelligence, and academic achievement
Greiff, Samuel; Neubert, Jonas

in Learning & Individual Differences (2014), 36

What You See Is What You (Don’t) Get: A Comment on Funke’s (2014) Opinion Paper
Greiff, Samuel; Martin, Romain

in Frontiers in Psychology (2014), 5
