Enhancing Human-AI perceptual alignment through visual-haptic feedback system for autonomous drones

Bibliographic Details
Published in: International Journal of Industrial Ergonomics, Vol. 109, p. 103780
Main Authors: Wu, Jiahao; Sun, Bowen; You, Hengxu; Du, Jing
Format: Journal Article
Language: English
Published: Elsevier B.V., 01.09.2025
ISSN: 0169-8141
DOI: 10.1016/j.ergon.2025.103780

Summary: Artificial Intelligence (AI) has emerged as an effective agent for controlling autonomous drones in navigation and target search tasks across various applications with minimal human intervention. Despite these advantages, significant challenges exist in aligning human operators' perceptual understanding with the drone AI's assessment of environmental changes, particularly in dynamic and complex urban settings. This study addresses this issue by proposing a human-machine sensory sharing system that integrates visual and haptic feedback to enhance situational awareness, reduce cognitive load, and improve trust in the AI agent that controls the drones. By bridging the perceptual gap between humans and AI, our approach fosters a more cohesive and responsive interaction, enabling operators to make informed decisions in real time. Through a human-subject experiment (N = 30) in a simulated urban environment, participants assessed environmental changes and adjusted drone AI parameters based on multimodal sensory feedback. Eye-tracking data were collected to evaluate cognitive load and engagement under different feedback conditions. Results show that combining visual and haptic feedback significantly enhances user performance, satisfaction, and decision-making speed, reducing perceptual misalignment between humans and AI. Participants using multimodal feedback demonstrated faster response times and higher environmental assessment accuracy than those using single-modality feedback. This research advances the design of intuitive human-drone interaction systems, emphasizing the role of multimodal sensory integration and physiological monitoring in improving human-machine collaboration. These findings have implications for applications in logistics, search and rescue, surveillance, and environmental monitoring, where operator engagement and performance are critical.
Highlights:
• Developed a multimodal feedback system integrating visual and haptic feedback for perceiving environmental conditions.
• Improved human-drone interaction, reducing cognitive load and enhancing situational awareness during drone operation.
• Reduced perceptual misalignment between human operators and autonomous drone AI systems.
• Validated system effectiveness through a human-subject experiment in a simulated urban environment.