Android-Based Visual Tag Detection for Visually Impaired Users: System Design and Testing
Author(s): Hao Dong (University of Massachusetts, Amherst, USA), Jieqi Kang (University of Massachusetts, Amherst, USA), James Schafer (University of Massachusetts, Amherst, USA), and Aura Ganz (University of Massachusetts, Amherst, USA)
Copyright: 2018
Pages: 18
Source title: Ophthalmology: Breakthroughs in Research and Practice
Source Author(s)/Editor(s): Information Resources Management Association (USA)
DOI: 10.4018/978-1-5225-5195-9.ch019
Abstract
In this paper the authors introduce PERCEPT-V, an indoor navigation system for the blind. PERCEPT-V enhances the PERCEPT system by enabling visually impaired users to navigate open indoor spaces that differ in size and lighting conditions. The authors deploy visual tags in the environment at specific landmarks and introduce a visual tag detection algorithm that uses a sampling probe and a cascading approach. They provide guidelines for the visual tag size as a function of various environmental and usage scenarios, which differ in lighting, dimensions of the indoor environment, and angle of usage. The authors also developed a smartphone-based user interface for visually impaired users that uses Android accessibility features.
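The abstract does not give the authors' actual sizing formula, but the general idea of choosing a tag size as a function of viewing distance can be illustrated with standard pinhole-camera geometry. The sketch below (an assumption-based illustration, not the PERCEPT-V method; the function name and parameters are hypothetical) estimates the minimum physical tag side length needed for the tag to span a required number of pixels at a given distance, given the camera's horizontal field of view and image width:

```python
import math

def min_tag_size(distance_m, fov_deg, image_width_px, required_px):
    """Minimum tag side length (meters) so the tag spans `required_px`
    pixels at `distance_m`, for a camera with horizontal field of view
    `fov_deg` and an image `image_width_px` pixels wide.

    Uses the pinhole model: the scene width visible at distance d is
    2 * d * tan(fov / 2), and pixels map linearly onto that width.
    """
    scene_width_m = 2.0 * distance_m * math.tan(math.radians(fov_deg) / 2.0)
    meters_per_pixel = scene_width_m / image_width_px
    return required_px * meters_per_pixel

# Example: a 1920-px-wide camera with a 60° horizontal FOV, viewing
# from 5 m, needs a tag of roughly 0.29 m per side to cover 96 pixels.
size = min_tag_size(distance_m=5.0, fov_deg=60.0,
                    image_width_px=1920, required_px=96)
print(f"{size:.3f} m")
```

In practice the required pixel count itself would grow under poor lighting or oblique viewing angles, which is consistent with the abstract's point that tag size depends on lighting, room dimensions, and angle of usage.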