Effects of morality and reputation on sharing behaviors in human-robot teams
Published in | Frontiers in Psychology, Vol. 14, p. 1280127
Main Authors |
Format | Journal Article
Language | English
Published | Switzerland: Frontiers Media S.A., 2023
Summary: The relationship between robots and humans is becoming increasingly close, and working alongside robots will become an inseparable part of work and life. Sharing, which involves distributing goods between oneself and others, makes the individual a potential beneficiary while also entailing the possibility of giving up benefits to others. In human teams, individual sharing behaviors are influenced by morality and reputation. However, how these factors affect individuals' sharing behaviors in human-robot collaborative teams remains unclear: individuals may weigh morality and reputation differently when sharing with robot partners than with human partners. In this study, three experiments were conducted using the dictator game paradigm to compare the effects and mechanisms of morality and reputation on sharing behaviors in human and human-robot teams.
Experiment 1 involved 18 participants, Experiment 2 involved 74 participants, and Experiment 3 involved 128 participants.
Experiment 1 validated the differences in human sharing behaviors when the partner agent was a robot versus a human. Experiment 2 verified that moral constraints and reputation constraints affect sharing behaviors in human-robot teams. Experiment 3 further revealed the mechanism behind these differences: reputation concern mediates the effect of moral constraint on sharing behaviors, and agent type moderates the effects of moral constraint on reputation concern and on sharing behaviors (an illustrative analysis sketch follows the summary).
The results of this study contribute to a better understanding of the interaction mechanisms of human-robot teams. In the future, the rules of human-robot collaborative teams and the design of interaction environments can take the motivational roles of morality and reputation in human behavior into account to achieve better work performance.
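As a rough illustration of the moderated-mediation structure described above (moral constraint → reputation concern → sharing, with agent type moderating both paths), the sketch below fits the corresponding regression models on simulated data. The variable names (`moral_constraint`, `agent_robot`, `reputation_concern`, `sharing`) and effect sizes are assumptions for illustration only; this is not the authors' analysis code.

```python
# Hypothetical sketch of a moderated-mediation analysis on simulated data.
# Mediator path: moral constraint -> reputation concern (moderated by agent type)
# Outcome path:  reputation concern + moral constraint -> sharing (moderated by agent type)
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 128  # same order of magnitude as Experiment 3's sample

df = pd.DataFrame({
    "moral_constraint": rng.integers(0, 2, n),  # 0 = absent, 1 = present
    "agent_robot": rng.integers(0, 2, n),       # 0 = human partner, 1 = robot partner
})
# Simulated mediator and outcome with placeholder effect sizes (assumptions, not results)
df["reputation_concern"] = (
    0.8 * df["moral_constraint"] * (1 - df["agent_robot"]) + rng.normal(0, 1, n)
)
df["sharing"] = (
    0.5 * df["reputation_concern"] + 0.3 * df["moral_constraint"] + rng.normal(0, 1, n)
)

# Mediator model: does moral constraint raise reputation concern, moderated by agent type?
mediator_model = smf.ols("reputation_concern ~ moral_constraint * agent_robot", data=df).fit()
# Outcome model: does reputation concern carry the effect of moral constraint on sharing?
outcome_model = smf.ols("sharing ~ reputation_concern + moral_constraint * agent_robot", data=df).fit()

print(mediator_model.params)
print(outcome_model.params)
```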
ISSN: 1664-1078
DOI: 10.3389/fpsyg.2023.1280127