Artificial intelligence can emulate human normative judgments on emotional visual scenes

Bibliographic Details
Published in: Royal Society Open Science, Vol. 12, No. 7, Article 250128
Main Authors: Romeo, Zaira; Testolin, Alberto
Format: Journal Article
Language: English
Published: England: The Royal Society Publishing, 01.07.2025

Summary: Affective reactions have deep biological foundations; however, in humans, the development of emotion concepts is also shaped by language and higher-order cognition. A recent breakthrough in artificial intelligence (AI) has been the creation of multimodal language models that exhibit impressive intellectual capabilities, but their responses to affective stimuli have not been investigated. Here, we study whether state-of-the-art multimodal systems can emulate human emotional ratings on a standardized set of images, in terms of affective dimensions and basic discrete emotions. The AI judgements correlate surprisingly well with the average human ratings: given that these systems were not explicitly trained to match human affective reactions, this suggests that the ability to visually judge emotional content can emerge from statistical learning over large-scale databases of images paired with linguistic descriptions. Besides showing that language can support the development of rich emotion concepts in AI, these findings have broad implications for sensitive use of multimodal AI technology.
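The comparison described in the summary, correlating model-generated affective ratings with average human norms, can be illustrated with a minimal sketch. This is not the authors' code or data; the ratings below are hypothetical placeholder values on an assumed 1-9 valence scale, used only to show the shape of such an analysis.

```python
# Minimal sketch (not the authors' method): compare hypothetical model-generated
# valence ratings with mean human norms for the same images via Pearson correlation.
import numpy as np
from scipy.stats import pearsonr

# Placeholder data for five images (1-9 valence scale, purely illustrative).
human_valence = np.array([7.2, 2.1, 5.0, 8.3, 3.4])  # mean human ratings
model_valence = np.array([6.8, 2.9, 5.5, 7.9, 3.0])  # ratings from a multimodal model

# Pearson correlation between the two sets of ratings.
r, p = pearsonr(human_valence, model_valence)
print(f"Pearson r = {r:.2f} (p = {p:.3g})")
```

The same procedure would apply to other affective dimensions (e.g. arousal) or to discrete-emotion ratings, correlating each dimension separately against the human norms.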
ISSN: 2054-5703
DOI: 10.1098/rsos.250128