The video clips below show subjects selected from our study, in which visual cue-evoked emotional responses were assessed with EEG, ECG, audio, and other data. Results from these experiments demonstrate the correlation of neural activity patterns with the valence (positive, neutral, or negative) of the images presented. In combination with ECG and contextual data, this type of analysis enables Acrovirt to generate virtual emotion libraries that are used to predict responses and preferences in real-life situations.
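As an illustrative sketch of the idea, valence prediction can be framed as classifying a trial's physiological features against a labeled library of prior responses. The example below is a minimal, hypothetical nearest-centroid classifier over synthetic EEG band-power features; the feature values, class structure, and function names are assumptions for illustration, not Acrovirt's actual pipeline.

```python
import random
import statistics

random.seed(0)

VALENCES = ["negative", "neutral", "positive"]

def make_trial(valence):
    # Simulate an (alpha, beta) band-power feature pair for a given
    # valence; the class means here are arbitrary illustrative values.
    base = {"negative": (2.0, 8.0),
            "neutral": (5.0, 5.0),
            "positive": (8.0, 2.0)}[valence]
    return tuple(b + random.gauss(0, 0.5) for b in base)

# Build a small labeled training set (a toy "virtual emotion library")
# and summarize each valence class by its feature centroid.
train = {v: [make_trial(v) for _ in range(20)] for v in VALENCES}
centroids = {v: tuple(statistics.mean(t[i] for t in trials) for i in range(2))
             for v, trials in train.items()}

def classify(features):
    # Assign the valence whose centroid is closest (squared Euclidean
    # distance) to the observed feature pair.
    return min(centroids,
               key=lambda v: sum((f - c) ** 2
                                 for f, c in zip(features, centroids[v])))
```

In practice, the same library-and-match structure would extend to richer multimodal features (EEG spectra, ECG-derived heart-rate variability, contextual metadata) and a learned classifier rather than simple centroids.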
The next section is dedicated to research findings in our field of industry; please note that these articles are third-party institutional studies.
A brain-computer interface to detect responses to affective audiovisual stimuli from electroencephalogram.
Real-time EEG-based emotion recognition and its applications.
Correlation of EEG images and speech signals for emotion analysis.
Classification of emotional states from electrocardiogram signals: A non-linear approach based on Hurst.
Scientists locate brain area where value decisions are made. Original source: Cell, Volume 177, Issue 7, pp. 1858–1872.e15, June 13, 2019.
Multimodal fusion framework: A multiresolution approach for emotion classification and recognition from physiological signals.
Core affect and the psychological construction of emotion.
Emotion recognition based on weighted fusion strategy of multichannel physiological signals.
Orienting and emotional perceptions: Facilitation, attenuation, and interference.
Looking at pictures: Affective, facial, visceral, and behavioral reactions.