In bot we trust? Personality traits and reciprocity in human-bot trust games


Bibliographic Details
Published in: Frontiers in Behavioral Economics, Vol. 2
Main Authors: Upadhyaya, Nitish; Galizzi, Matteo M.
Format: Journal Article
Language: English
Published: 28.08.2023

Summary: People are increasingly interacting with forms of artificial intelligence (AI). It is crucial to understand whether accepted evidence for human-human reciprocity holds true for human-bot interactions. In a pre-registered online experiment (N = 539), we first replicate recent studies, finding that the identity of a player's counterpart in a one-shot binary Trust Game has a significant effect on the rate of reciprocity, with bot counterparts receiving lower returned amounts than human counterparts. We then explore whether individual differences in a player's personality traits (in particular Agreeableness, Extraversion, Honesty-Humility, and Openness) moderate the effect of the identity of the player's counterpart on the rate of reciprocity. In line with the literature on human-human interactions, participants exhibiting higher levels of Honesty-Humility, and to a lesser extent Agreeableness, are found to reciprocate more, regardless of the identity of their counterpart. No personality trait, however, moderates the effect of interacting with a bot. Finally, we consider whether general attitudes to AI affect reciprocity, but find no significant relationship.
ISSN: 2813-5296
DOI: 10.3389/frbhe.2023.1164259