No full text
Eprint originally released on another site (E-prints, working papers and research blogs)
A Mixture of Generative Models Strategy Helps Humans Generalize across Tasks
Herce Castañón, Santiago; CARDOSO-LEITE, Pedro; Altarelli, Irene et al.
2021
 

Documents


Full text
No document available.

Details



Abstract:
[en] What role do generative models play in generalization of learning in humans? Our novel multi-task prediction paradigm—where participants complete four sequence learning tasks, each being a different instance of a common generative family—allows the separate study of within-task learning (i.e., finding the solution to each of the tasks), and across-task learning (i.e., learning a task differently because of past experiences). The very first responses participants make in each task are not yet affected by within-task learning and thus reflect their priors. Our results show that these priors change across successive tasks, increasingly resembling the underlying generative family. We conceptualize multi-task learning as arising from a mixture-of-generative-models learning strategy, whereby participants simultaneously entertain multiple candidate models which compete against each other to explain the experienced sequences. This framework predicts specific error patterns, as well as a gating mechanism for learning, both of which are observed in the data.
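The mixture-of-generative-models strategy described above can be illustrated with a minimal sketch. Nothing below comes from the paper itself: the binary sequence alphabet, the two candidate transition models, and the function predict_and_update are hypothetical, chosen only to show how posterior weights over competing candidate models can be updated as an observed sequence unfolds.

import numpy as np

# Two hypothetical candidate generative models for a binary sequence:
# each maps the previous symbol to the probability that the next symbol is 1.
# These transition tables are illustrative, not taken from the paper.
CANDIDATE_MODELS = [
    {"name": "repeat",    "p_one_given": {0: 0.1, 1: 0.9}},  # tends to repeat
    {"name": "alternate", "p_one_given": {0: 0.9, 1: 0.1}},  # tends to alternate
]

def predict_and_update(sequence, prior=None):
    """Predict each next symbol as a posterior-weighted mixture of the
    candidate models' predictions, updating the weights by Bayes' rule."""
    weights = (np.full(len(CANDIDATE_MODELS), 1.0 / len(CANDIDATE_MODELS))
               if prior is None else np.asarray(prior, dtype=float))
    predictions = []
    for prev, nxt in zip(sequence, sequence[1:]):
        # Each candidate model's probability that the next symbol is 1.
        p_one = np.array([m["p_one_given"][prev] for m in CANDIDATE_MODELS])
        # Mixture prediction: competing models weighted by current belief.
        predictions.append(float(weights @ p_one))
        # Likelihood of the symbol actually observed, under each model.
        lik = np.where(nxt == 1, p_one, 1.0 - p_one)
        weights = weights * lik
        weights /= weights.sum()  # renormalize to a posterior over models
    return predictions, weights

preds, posterior = predict_and_update([0, 1, 0, 1, 0, 1, 0])
print(preds)      # mixture predictions drift toward the alternating pattern
print(posterior)  # posterior mass concentrates on the "alternate" model

Running this on an alternating sequence shows the competition the abstract describes: both models start with equal weight, and the posterior shifts almost entirely onto the model that better explains the experienced sequence, while the mixture predictions shift with it.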
Disciplines:
Neurosciences & behavior
Author, co-author:
Herce Castañón, Santiago
CARDOSO-LEITE, Pedro ;  University of Luxembourg > Faculty of Humanities, Education and Social Sciences (FHSE) > Department of Behavioural and Cognitive Sciences (DBCS)
Altarelli, Irene
Green, C. Shawn
Schrater, Paul
Bavelier, Daphne
Document language:
English
Title:
A Mixture of Generative Models Strategy Helps Humans Generalize across Tasks
Publication/release date:
2021
FNR project:
FNR11242114 - Scientifically Validated Digital Learning Environments, 2016 (01/06/2017-31/01/2023) - Pedro Cardoso-Leite
Comment:
Preprint
Available on ORBilu:
since 4 March 2021

Statistics

Number of views: 195 (including 8 from Unilu)
Number of downloads: 0 (including 0 from Unilu)

OpenCitations citations: 1
OpenAlex citations: 3
