System for Analyzing User Interest Based on Eye Gaze Responses to Enhance Empathy with Users

Bibliographic Details
Published in: Journal of Advanced Computational Intelligence and Intelligent Informatics, Vol. 29, No. 3, pp. 641-648
Main Authors: Woo, Jinseok; Hu, Jiaren
Format: Journal Article
Language: English
Published: Tokyo: Fuji Technology Press Co. Ltd., 01.05.2025

Summary: Recently, robotic systems that offer practical services in everyday life, such as smart home systems, have been developed. The latest trend moves beyond user-controlled systems operated through traditional methods, such as remote controls, toward systems that autonomously understand user contexts and evolve to provide a more comfortable living environment. This research aims to advance the field by exploring a system capable of understanding human conditions and behavior, and by proposing or executing actions that align with an individual's intentions. To achieve this goal, we focused on analyzing users' gaze and developed an eyeglass-type wearable device. The primary objective of this study was to track a specific user's gaze, identify the object of focus, and analyze the user's level of attention and interest in that object. For the sensory configuration of the system, an analysis was performed on data collected from camera sensors for eye tracking and from sensors measuring environmental information. Based on the analysis results, we evaluated whether the system could accurately interpret and anticipate the actions that the user intended to perform.
ISSN: 1343-0130, 1883-8014
DOI: 10.20965/jaciii.2025.p0641