A novel multimodal interface for improving visually impaired people's web accessibility

Bibliographic Details
Published in: Virtual Reality: The Journal of the Virtual Reality Society, Vol. 9, No. 2-3, p. 133
Main Authors: Yu, Wai; Kuber, Ravi; Murphy, Emma; Strain, Philip; McAllister, Graham
Format: Journal Article
Language: English
Published: Godalming, Surrey: Springer Nature B.V., 01.08.2006

Summary: This paper introduces a novel interface designed to help blind and visually impaired people explore and navigate the Web. In contrast to traditionally used assistive tools such as screen readers and magnifiers, the new interface combines audio and haptic features to provide spatial and navigational information to users. The haptic features are presented via a low-cost force feedback mouse, allowing blind people to interact with the Web in a fashion similar to that of their sighted counterparts. The audio provides navigational and textual information through non-speech sounds and synthesised speech. Interacting with the multimodal interface offers a novel experience to target users, especially those with total blindness. A series of experiments was conducted to ascertain the usability of the interface and to compare its performance with that of a traditional screen reader. Results show the advantages that the new multimodal interface offers blind and visually impaired people, including enhanced perception of the spatial layout of Web pages and easier navigation towards elements on a page. Design issues regarding the haptic and audio features raised in the evaluation are discussed and presented as recommendations for future work.
ISSN: 1359-4338, 1434-9957
DOI: 10.1007/s10055-005-0009-z