Article (Scientific journals)
Efficient Decoding of Affective States from Video-elicited EEG Signals: An Empirical Investigation
LATIFZADEH, Kayhan; GOZALPOUR, Nima; Traver, V. Javier et al.
2024, in ACM Transactions on Multimedia Computing, Communications, and Applications

Files


Full Text :
3663669.pdf (Author postprint, 2.16 MB)



Details



Keywords :
BCI; EEG; Neurophysiology; Affect; Emotions; Videos
Abstract :
[en] Affect decoding through brain-computer interfacing (BCI) holds great potential to capture users’ feelings and emotional responses via non-invasive electroencephalogram (EEG) sensing. Yet, little research has been conducted to understand efficient decoding when users are exposed to dynamic audiovisual contents. In this regard, we study EEG-based affect decoding from videos in arousal and valence classification tasks, considering the impact of signal length, window size for feature extraction, and frequency bands. We train both classic Machine Learning models (SVMs and k-NNs) and modern Deep Learning models (FCNNs and GTNs). Our results show that: (1) affect can be effectively decoded using less than 1 minute of EEG signal; (2) temporal windows of 6 and 10 seconds provide the best classification performance for classic Machine Learning models, whereas Deep Learning models benefit from much shorter windows of 2 seconds; and (3) any model trained on the Beta band alone achieves performance similar to (and sometimes better than) that obtained when training on all frequency bands. Taken together, our results indicate that affect decoding can work in more realistic conditions than currently assumed, thus becoming a viable technology for creating better interfaces and user models.
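
For readers who want a concrete picture of the pipeline described in the abstract, the following is a minimal illustrative sketch in Python, assuming Beta-band (13-30 Hz) power features computed over fixed-length temporal windows and a classic SVM classifier. The sampling rate, window length, toy data, and helper names (beta_band_power, extract_features) are assumptions for illustration, not the paper's actual implementation.

# Minimal sketch (an assumption, not the paper's exact code): windowed
# Beta-band power features + SVM for binary arousal/valence decoding.
import numpy as np
from scipy.signal import welch
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

FS = 128             # sampling rate in Hz (assumed, for illustration)
WINDOW_S = 6         # 6-second windows worked best for classic ML per the abstract
BETA = (13.0, 30.0)  # Beta band, in Hz

def beta_band_power(window, fs=FS, band=BETA):
    """Average Welch PSD power within the Beta band, per EEG channel."""
    freqs, psd = welch(window, fs=fs, nperseg=min(window.shape[-1], 2 * fs))
    mask = (freqs >= band[0]) & (freqs <= band[1])
    return psd[..., mask].mean(axis=-1)  # shape: (n_channels,)

def extract_features(trial, fs=FS, window_s=WINDOW_S):
    """Split one (n_channels, n_samples) trial into windows of band power."""
    step = window_s * fs
    n_windows = trial.shape[1] // step
    return np.stack([
        beta_band_power(trial[:, i * step:(i + 1) * step])
        for i in range(n_windows)
    ])  # shape: (n_windows, n_channels)

# Toy data standing in for video-elicited EEG: 40 trials, 32 channels, 60 s each.
rng = np.random.default_rng(0)
trials = rng.standard_normal((40, 32, 60 * FS))
labels = rng.integers(0, 2, size=40)  # binarized high/low arousal (or valence)

# One feature vector per window; each window inherits its trial's label.
X = np.concatenate([extract_features(t) for t in trials])
y = np.repeat(labels, X.shape[0] // len(labels))

print(cross_val_score(SVC(kernel="rbf"), X, y, cv=5).mean())

Per the abstract's findings, a Deep Learning variant (FCNN or GTN) would follow the same windowing scheme with WINDOW_S set to 2; only the feature extraction and classifier change.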
Research center :
ULHPC - University of Luxembourg: High Performance Computing
Disciplines :
Computer science
Author, co-author :
LATIFZADEH, Kayhan ; University of Luxembourg > Faculty of Science, Technology and Medicine (FSTM) > Department of Computer Science (DCS)
GOZALPOUR, Nima ; University of Luxembourg > Faculty of Science, Technology and Medicine (FSTM) > Department of Computer Science (DCS)
Traver, V. Javier ; INIT, Jaume I University, Castellón de la Plana, Spain
Ruotsalo, Tuukka ; University of Copenhagen, Copenhagen, Denmark ; LUT University, Lappeenranta, Finland
Kawala-Sterniuk, Aleksandra ; Opole University of Technology, Opole, Poland
LEIVA, Luis A. ; University of Luxembourg > Faculty of Science, Technology and Medicine (FSTM) > Department of Computer Science (DCS)
 These authors have contributed equally to this work.
External co-authors :
yes
Language :
English
Title :
Efficient Decoding of Affective States from Video-elicited EEG Signals: An Empirical Investigation
Publication date :
03 May 2024
Journal title :
ACM Transactions on Multimedia Computing, Communications, and Applications
ISSN :
1551-6857
eISSN :
1551-6865
Publisher :
Association for Computing Machinery (ACM)
Peer reviewed :
Peer Reviewed verified by ORBi
Focus Area :
Computational Sciences
European Projects :
HE - 101071147 - SYMBIOTIK - Context-aware adaptive visualizations for critical decision making
FnR Project :
FNR15722813 - Brainsourcing For Affective Attention Estimation, 2021 (01/02/2022-31/01/2025) - Luis Leiva
Funders :
European Union
Funding text :
Research supported by the Horizon 2020 FET program of the European Union (BANANA, grant CHIST-ERA-20-BCI-001), the European Innovation Council Pathfinder program (SYMBIOTIK, grant 101071147), the Academy of Finland (grants 313610, 322653, 328875), and the National Science Centre, Poland, under Grant Agreement no. 2021/03/Y/ST7/00008. This research is part of the project PCI2021-122036-2A, funded by MCIN/AEI/10.13039/501100011033 and by the European Union NextGenerationEU/PRTR.
Available on ORBilu :
since 12 July 2024
