Abstract:
The present notion of visual similarity is based on features derived from image content. This ignores users' emotional or affective experiences of the content, and how users feel when they search for images. Here we consider valence, a positive or negative quantification of affective appraisal, as a novel dimension of image similarity. We report the largest neuroimaging experiment to date that quantifies and predicts the valence of visual content using functional near-infrared spectroscopy (fNIRS) in a brain-computer interfacing setting. We show that affective similarity can be (1) decoded directly from brain signals in response to visual stimuli, (2) used to predict affective image similarity with an average accuracy of 0.58, rising to 0.65 for high-arousal stimuli, and (3) used effectively to complement the affective similarity estimates of content-based models; for example, fusing fNIRS-based and image-based rankings yields a retrieval F-measure@20 of 0.70. Our work opens new research avenues for affective multimedia analysis, retrieval, and user modeling.
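
The fusion result above combines brain-signal rankings with content-based rankings and evaluates retrieval with F-measure@20, the harmonic mean of precision and recall over the top 20 results. Below is a minimal sketch of the metric and of one plausible fusion scheme; the abstract does not specify the paper's actual fusion method, so reciprocal-rank fusion and all identifiers (fnirs_ranking, content_ranking, the image IDs) are illustrative assumptions.

from typing import Dict, List, Sequence, Set

def f_measure_at_k(ranked: Sequence[str], relevant: Set[str], k: int = 20) -> float:
    # F-measure@k: harmonic mean of precision@k and recall@k.
    top_k = ranked[:k]
    hits = sum(1 for item in top_k if item in relevant)
    if hits == 0:
        return 0.0
    precision = hits / len(top_k)
    recall = hits / len(relevant)
    return 2 * precision * recall / (precision + recall)

def reciprocal_rank_fusion(rankings: Sequence[Sequence[str]], c: float = 60.0) -> List[str]:
    # Assumed fusion scheme: sum reciprocal ranks across the input rankings,
    # then sort items by their combined score (highest first).
    scores: Dict[str, float] = {}
    for ranking in rankings:
        for rank, item in enumerate(ranking, start=1):
            scores[item] = scores.get(item, 0.0) + 1.0 / (c + rank)
    return sorted(scores, key=scores.get, reverse=True)

# Hypothetical rankings: one decoded from fNIRS responses, one content-based.
fnirs_ranking = ["img3", "img1", "img7", "img2"]
content_ranking = ["img1", "img2", "img3", "img9"]
fused = reciprocal_rank_fusion([fnirs_ranking, content_ranking])
print(f_measure_at_k(fused, relevant={"img1", "img3"}))  # ~0.57 on this toy data
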
Funding text:
This work is supported by the Academy of Finland (grants 352915, 350323, 336085, 322653), the Horizon 2020 FET program of the European Union through the ERA-NET Cofund grant CHISTERA-20-BCI-001, and the European Innovation Council Pathfinder program (SYMBIOTIK project, grant 101071147).