Smart assistive navigation system for visually impaired people

Gabriel Iluebe Okolo, Turke Althobaiti*, Naeem Ramzan

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

Abstract

Moving independently is difficult for a visually impaired person, and a typical white cane cannot identify obstacles such as people, animals, crosswalks, pavements, and uneven terrain. This research proposes a smart assistive navigation system that combines object detection with voice-over guidance. The system aims to promote independent navigation by providing auditory feedback and tactile input to the visually impaired user when an object is recognised. The accompanying mobile application delivers navigational guidance through voice and audio only. The object detection model is YOLOv8, deployed on a Raspberry Pi equipped with a camera, speaker, ultrasonic sensor, and moisture sensor. Across nine tested obstacle classes, YOLOv8 achieved an average accuracy of 91.70%. Beyond identifying objects, the prototype lets the user move between locations while providing environmental information such as object distance, orientation, and surface wetness. The goal of this research is to enable visually impaired people to move about indoor and outdoor spaces safely and independently.
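
The abstract describes a pipeline of camera-based YOLOv8 detection, ultrasonic ranging, and spoken feedback on a Raspberry Pi. The sketch below is not the authors' implementation; it is a minimal illustration of how such a loop could be wired together, assuming the ultralytics, opencv-python, pyttsx3, and RPi.GPIO packages and hypothetical GPIO pin assignments for an HC-SR04-style ultrasonic sensor.

```python
import time

import cv2                      # camera capture
import pyttsx3                  # offline text-to-speech for audio feedback
import RPi.GPIO as GPIO         # ultrasonic sensor pins
from ultralytics import YOLO    # YOLOv8 model

TRIG, ECHO = 23, 24             # hypothetical GPIO pins for the ultrasonic sensor

GPIO.setmode(GPIO.BCM)
GPIO.setup(TRIG, GPIO.OUT)
GPIO.setup(ECHO, GPIO.IN)

def read_distance_cm() -> float:
    """Trigger the ultrasonic sensor and convert the echo time to centimetres."""
    GPIO.output(TRIG, True)
    time.sleep(0.00001)
    GPIO.output(TRIG, False)
    start = stop = time.time()
    while GPIO.input(ECHO) == 0:
        start = time.time()
    while GPIO.input(ECHO) == 1:
        stop = time.time()
    return (stop - start) * 34300 / 2   # speed of sound ~343 m/s, out and back

model = YOLO("yolov8n.pt")              # small pretrained YOLOv8 model
speaker = pyttsx3.init()
camera = cv2.VideoCapture(0)

try:
    while True:
        ok, frame = camera.read()
        if not ok:
            continue
        results = model(frame, verbose=False)
        distance = read_distance_cm()
        for box in results[0].boxes:
            label = model.names[int(box.cls[0])]
            # Announce the detected object together with the range estimate.
            speaker.say(f"{label} ahead, about {distance:.0f} centimetres")
            speaker.runAndWait()
        time.sleep(0.5)
finally:
    camera.release()
    GPIO.cleanup()
```

In a deployed system the detection loop, sensor polling, and speech output would typically run in separate threads so that audio announcements do not stall frame capture; the single-threaded loop above is kept deliberately simple.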
Original language: English
Article number: 20240086
Number of pages: 10
Journal: Journal of Disability Research
Volume: 4
Issue number: 1
DOIs
Publication status: Published - 3 Jan 2025

Keywords

  • object detection
  • artificial intelligence (AI)
  • visually impaired person
  • internet of things (IoT)
