Affect decoding through brain-computer interfacing (BCI) holds great potential to capture users’ feelings and emotional responses via non-invasive electroencephalogram (EEG) sensing. Yet, little research has been conducted to understand efficient decoding when users are exposed to dynamic audiovisual content. In this regard, we study EEG-based affect decoding from videos in arousal and valence classification tasks, considering the impact of signal length, window size for feature extraction, and frequency bands. We train both classic Machine Learning models (SVMs and k-NNs) and modern Deep Learning models (FCNNs and GTNs). Our results show that: (1) affect can be effectively decoded using less than 1 minute of EEG signal; (2) temporal windows of 6 and 10 seconds provide the best classification performance for classic Machine Learning models, whereas Deep Learning models benefit from much shorter windows of 2 seconds; and (3) any model trained on the Beta band alone achieves performance similar to, and sometimes better than, the same model trained on all frequency bands. Taken together, our results indicate that affect decoding can work in more realistic conditions than currently assumed, thus becoming a viable technology for creating better interfaces and user models.
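The preprocessing pipeline described above (band-limiting the EEG to a single frequency band, then segmenting it into fixed-length temporal windows for feature extraction) can be sketched as follows. This is a minimal illustration, not the authors' implementation: the 128 Hz sampling rate, the 13-30 Hz Beta range, the 2-second window, and the function name `beta_band_windows` are all assumptions chosen for the example.

```python
import numpy as np
from scipy.signal import butter, filtfilt

def beta_band_windows(eeg, fs=128, win_s=2.0, lo=13.0, hi=30.0):
    """Band-pass one EEG channel to the Beta band (assumed 13-30 Hz)
    and split it into non-overlapping windows of win_s seconds."""
    # 4th-order Butterworth band-pass, applied forward and backward
    # (filtfilt) so the filter introduces no phase shift.
    b, a = butter(4, [lo / (fs / 2), hi / (fs / 2)], btype="band")
    filtered = filtfilt(b, a, eeg)
    # Truncate to a whole number of windows, then reshape:
    # each row is one window ready for feature extraction.
    n = int(win_s * fs)
    n_win = len(filtered) // n
    return filtered[: n_win * n].reshape(n_win, n)

rng = np.random.default_rng(0)
signal = rng.standard_normal(128 * 60)   # 60 s of synthetic EEG at 128 Hz
windows = beta_band_windows(signal)
print(windows.shape)                     # (30, 256): thirty 2-s windows
```

Under these assumptions, one minute of signal yields thirty 2-second windows, matching the paper's observation that Deep Learning models can operate on short windows extracted from under a minute of EEG.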
Research center :
ULHPC - University of Luxembourg: High Performance Computing
Disciplines :
Computer science
Author, co-author :
LATIFZADEH, Kayhan ✱; University of Luxembourg > Faculty of Science, Technology and Medicine (FSTM) > Department of Computer Science (DCS)
GOZALPOUR, Nima ✱; University of Luxembourg > Faculty of Science, Technology and Medicine (FSTM) > Department of Computer Science (DCS)
Traver, V. Javier ; INIT, Jaume I University, Castellón de la Plana, Spain
Ruotsalo, Tuukka ; University of Copenhagen, Copenhagen, Denmark ; LUT University, Lappeenranta, Finland
Kawala-Sterniuk, Aleksandra ; Opole University of Technology, Opole, Poland
LEIVA, Luis A. ; University of Luxembourg > Faculty of Science, Technology and Medicine (FSTM) > Department of Computer Science (DCS)
✱ These authors have contributed equally to this work.
External co-authors :
yes
Language :
English
Title :
Efficient Decoding of Affective States from Video-elicited EEG Signals: An Empirical Investigation
Publication date :
03 May 2024
Journal title :
ACM Transactions on Multimedia Computing, Communications, and Applications
HE - 101071147 - SYMBIOTIK - Context-aware adaptive visualizations for critical decision making
FnR Project :
FNR15722813 - Brainsourcing For Affective Attention Estimation, 2021 (01/02/2022-31/01/2025) - Luis Leiva
Funders :
European Union
Funding text :
Research supported by the Horizon 2020 FET program of the European Union (BANANA, grant CHIST-ERA-20-BCI-001), the European Innovation Council Pathfinder program (SYMBIOTIK, grant 101071147), the Academy of Finland (grants 313610, 322653, 328875), and the National Science Centre, Poland, under Grant Agreement no. 2021/03/Y/ST7/00008. This research is part of the project PCI2021-122036-2A, funded by MCIN/AEI/10.13039/501100011033 and by the European Union NextGenerationEU/PRTR.