[en] Conducting user studies that involve physiological and behavioral measurements is time-consuming and expensive, as it requires not only careful experiment design and device calibration but also thorough software testing. We propose Thalamus, a software toolkit for collecting and simulating multimodal signals that helps experimenters prepare in advance for unexpected situations, before reaching out to actual study participants and even before installing or purchasing a specific device. Among other features, Thalamus allows the experimenter to modify, synchronize, and broadcast physiological signals (coming from various data streams) from different devices simultaneously, even when those devices are not located in the same place. Thalamus is cross-platform, cross-device, and simple to use, making it a valuable asset for HCI research.
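The record does not describe Thalamus's actual API, but the general idea of simulating and broadcasting a sensor stream can be sketched as follows. The hypothetical Python snippet below generates a noisy sine wave standing in for a physiological signal (e.g., a PPG trace) and broadcasts timestamped samples over UDP, so that receiving software can be tested before any real device is connected. All names, addresses, and parameters here are illustrative assumptions, not Thalamus's interface.

# Hypothetical sketch (not Thalamus's actual API): simulate a physiological
# signal and broadcast timestamped samples over UDP, so client software can
# be tested without a real sensor attached. Stop with Ctrl+C.
import json
import math
import random
import socket
import time

STREAM_ADDR = ("127.0.0.1", 9000)  # illustrative address/port
SAMPLE_RATE_HZ = 50                # e.g., a typical PPG sampling rate
SIGNAL_FREQ_HZ = 1.2               # ~72 bpm heartbeat stand-in

def simulate_sample(t: float) -> float:
    """Noisy sine wave standing in for a physiological signal."""
    return math.sin(2 * math.pi * SIGNAL_FREQ_HZ * t) + random.gauss(0, 0.05)

def main() -> None:
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    t0 = time.time()
    while True:
        t = time.time() - t0
        packet = {
            "stream": "ppg-sim",   # stream identifier
            "timestamp": t0 + t,   # shared clock, needed for synchronization
            "value": simulate_sample(t),
        }
        sock.sendto(json.dumps(packet).encode("utf-8"), STREAM_ADDR)
        time.sleep(1.0 / SAMPLE_RATE_HZ)

if __name__ == "__main__":
    main()

Timestamping each sample against a shared clock is what makes it possible to later synchronize several such streams, whether simulated or coming from real devices on different machines.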
Disciplines :
Computer science
Author, co-author :
LATIFZADEH, Kayhan ; University of Luxembourg > Faculty of Science, Technology and Medicine (FSTM) > Department of Computer Science (DCS)
LEIVA, Luis A. ; University of Luxembourg > Faculty of Science, Technology and Medicine (FSTM) > Department of Computer Science (DCS)
External co-authors :
no
Language :
English
Title :
Thalamus: A User Simulation Toolkit for Prototyping Multimodal Sensing Studies
Publication date :
12 June 2025
Event name :
The 33rd ACM Conference on User Modeling, Adaptation and Personalization (UMAP '25)
Event place :
New York City, United States
Event date :
June 16 - 19, 2025
Audience :
International
Main work title :
UMAP Adjunct '25: Adjunct Proceedings of the 33rd ACM Conference on User Modeling, Adaptation and Personalization
Publisher :
Association for Computing Machinery, New York, NY, United States
European Projects :
HE - 101071147 - SYMBIOTIK - Context-aware adaptive visualizations for critical decision making
FnR Project :
FNR15722813 - BANANA - Brainsourcing For Affective Attention Estimation, 2021 (01/02/2022-31/01/2025) - Luis Leiva
Funders :
Horizon 2020 FET program of the European Union ; Horizon Europe's European Innovation Council through the Pathfinder program ; European Union
Funding text :
This work is supported by the Horizon 2020 FET program of the European Union through the ERA-NET Cofund funding (BANANA, grant CHIST-ERA-20-BCI-001) and Horizon Europe's European Innovation Council through the Pathfinder program (SYMBIOTIK, grant 101071147).