Empowering Trust: The Role of Adaptable Design in AI Systems

Bibliographic Details
Published in IEEE International Inter-Disciplinary Conference on Cognitive Methods in Situation Awareness and Decision Support (Print), pp. 40-47
Main Authors Staab, V., Hein, I., Ramrath, M., Schluter, L., Stuckstatte, A., Hohn, M., Sieberg, P., Liebherr, M.
Format Conference Proceeding
Language English
Published IEEE 02.06.2025
ISSN 2379-1675
DOI 10.1109/CogSIMA64436.2025.11079500

More Information
Summary: The integration of artificial intelligence (AI) in workplaces has increased automation, but often at the cost of transparency, potentially undermining user trust. Adaptable, user-centered systems address this challenge by improving users' understanding of the system and tailoring interactions to their needs through elements that let them customize system settings. In an online study with a between-subjects design, 197 participants interacted with either an adaptable or a non-adaptable decision-making system to examine the influence of its features on perception (transparency, fairness, control), trust, and intention to use. Results showed that adaptability positively affected perceived transparency, fairness, and especially control. Furthermore, intention to use was positively influenced both directly by perceived fairness and indirectly through two serial mediation pathways: one from perceived fairness through trust, and another from perceived control through trust. The study underscores the importance of adaptable design elements in human-computer interaction, demonstrating that they enhance user perception and intention to use, whereas AI systems that restrict user involvement and autonomy risk diminishing both trust and intention to use.
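
The serial mediation pathways described in the summary (for example, adaptability to perceived control, perceived control to trust, and trust to intention to use) follow the standard serial mediation structure X -> M1 -> M2 -> Y. The sketch below is not from the paper: it uses simulated data and hypothetical variable names (adaptability, control, trust, intention) only to illustrate, in Python with statsmodels, how the serial indirect effect is commonly estimated as the product of the three path coefficients.

# Illustrative sketch only: simulated data and hypothetical variable names, not the paper's analysis.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 197  # sample size reported in the summary

# Simulated stand-ins for the study's measures.
adaptability = rng.integers(0, 2, n)                # 0 = non-adaptable, 1 = adaptable condition (X)
control = 0.6 * adaptability + rng.normal(size=n)   # perceived control (M1)
trust = 0.5 * control + rng.normal(size=n)          # trust (M2)
intention = 0.4 * trust + rng.normal(size=n)        # intention to use (Y)
df = pd.DataFrame({"adaptability": adaptability, "control": control,
                   "trust": trust, "intention": intention})

# Path coefficients of the serial mediation X -> M1 -> M2 -> Y.
a1 = smf.ols("control ~ adaptability", df).fit().params["adaptability"]      # X -> M1
d21 = smf.ols("trust ~ control + adaptability", df).fit().params["control"]  # M1 -> M2
b2 = smf.ols("intention ~ trust + control + adaptability",
             df).fit().params["trust"]                                       # M2 -> Y

# The serial indirect effect is the product of the three paths
# (bootstrap confidence intervals would be added in practice).
print("serial indirect effect a1*d21*b2:", a1 * d21 * b2)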