Smart Visual System for Visually Impaired People

2025-06-28 16:24:58 - Adil Khan

Project Title

Smart Visual System for Visually Impaired People

Project Area of Specialization

Electrical/Electronic Engineering

Project Summary

In recent years, with the development of cameras and scene detection algorithms, a wide variety of Electronic Travel Aids (ETAs) for visually impaired people (VIPs) have been proposed. However, it is still challenging to convey scene information to visually impaired people efficiently. According to the World Health Organization, around 253 million people worldwide live with visual impairments. Visually impaired people face various difficulties when traveling in unfamiliar environments. Early ETAs usually use ultrasonic sensors to detect obstacles and alert VIPs through vibration and beeps. Due to their low spatial resolution, ultrasonic sensors acquire only limited information in each measurement, which is insufficient for VIPs to perceive environments in real time. We present three independent auditory-based interaction methods to convey scene information to VIPs. For all three methods, we use a VR box with a camera and an attitude sensor as our hardware platform to acquire environmental information. The three sonification methods are distance sonification, obstacle sonification, and path sonification, which convey distance, obstacle, and path information, respectively.
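As a concrete illustration, distance sonification could map a measured obstacle distance to the pitch of a tone. The sketch below is hypothetical: the function name, frequency range, and distance bounds are assumptions for illustration, not the project's final design.

```python
def distance_to_tone(distance_m, d_min=0.3, d_max=5.0,
                     f_near=880.0, f_far=220.0):
    """Map an obstacle distance to a tone frequency (Hz).

    Hypothetical mapping: nearer obstacles produce higher-pitched,
    more urgent tones. The exact curve is a design choice.
    """
    # Clamp the distance into the supported sensing range.
    d = max(d_min, min(d_max, distance_m))
    # Linearly interpolate between the near and far frequencies.
    t = (d - d_min) / (d_max - d_min)
    return f_near + t * (f_far - f_near)
```

A linear mapping is the simplest option; a logarithmic curve could also be used so that pitch changes are more pronounced at close range.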

Project Objectives

Project Implementation Method

A camera, an attitude sensor, an ultrasonic sensor, a Raspberry Pi, and a bone conduction headphone are integrated into a pair of smart glasses or a VR box. The camera’s horizontal and vertical fields of view (FoV) are 53 degrees and 41 degrees respectively, which is adequate to assist visually impaired people (VIPs) in most scenarios. The attitude sensor is a standalone system that uses inertial and magnetic sensors to estimate 3D orientation in the ENU coordinate system. We use bone conduction headphones since they do not prevent VIPs from perceiving sound from the environment. Short-range distance measurements are provided by the ultrasonic sensor.
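For the ultrasonic ranging step, converting an echo round-trip time into a distance is a standard time-of-flight calculation. A minimal sketch, assuming a speed of sound of 343 m/s at room temperature (the function name is illustrative):

```python
SPEED_OF_SOUND_M_S = 343.0  # approximate value at ~20 °C

def echo_to_distance_m(echo_duration_s):
    """Convert an ultrasonic echo round-trip time to a distance in metres.

    The pulse travels to the obstacle and back, so the one-way
    distance is half of (speed of sound x echo duration).
    """
    return SPEED_OF_SOUND_M_S * echo_duration_s / 2.0
```

On the actual device the echo duration would come from the sensor's echo pin timed by the Raspberry Pi's GPIO.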

Next, we introduce the software implementation. Several algorithms are available for object detection and recognition, such as R-CNN, Fast R-CNN, Faster R-CNN, YOLO (v3–v5), YOLOR, and SSD. These algorithms have pre-trained models that can be used to identify different objects.
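Whichever detector is chosen, its raw output is typically a set of overlapping candidate boxes that must be filtered by non-maximum suppression (NMS) before the results are sonified. A minimal pure-Python sketch of that filtering step (the box format and threshold are illustrative assumptions):

```python
def iou(a, b):
    """Intersection-over-union of two boxes given as (x1, y1, x2, y2)."""
    x1, y1 = max(a[0], b[0]), max(a[1], b[1])
    x2, y2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0.0, x2 - x1) * max(0.0, y2 - y1)
    area_a = (a[2] - a[0]) * (a[3] - a[1])
    area_b = (b[2] - b[0]) * (b[3] - b[1])
    return inter / (area_a + area_b - inter)

def nms(boxes, scores, thresh=0.5):
    """Greedy NMS: keep the highest-scoring box, drop boxes that
    overlap it by more than `thresh`, and repeat."""
    order = sorted(range(len(boxes)), key=lambda i: scores[i], reverse=True)
    keep = []
    while order:
        best = order.pop(0)
        keep.append(best)
        order = [i for i in order if iou(boxes[best], boxes[i]) < thresh]
    return keep
```

Detection frameworks provide optimized versions of this step; the sketch only shows the underlying logic.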

Benefits of the Project

Technical Benefits:

The purpose of this concept is to assist visually impaired people through smart visual glasses or a VR box. Distances to far obstacles in the user’s real-time surroundings will be computed with stereo vision and sent to the Raspberry Pi, while short-range distances are measured by the ultrasonic sensor.
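The long-distance stereo calculation rests on the standard disparity-to-depth relation Z = f·B/d. A minimal sketch, where the focal length and baseline are device-dependent values (the numbers in the usage note below are illustrative assumptions):

```python
def stereo_depth_m(focal_px, baseline_m, disparity_px):
    """Depth from stereo disparity: Z = f * B / d.

    focal_px: camera focal length in pixels.
    baseline_m: distance between the two camera centres in metres.
    disparity_px: horizontal shift of the same scene point between
    the left and right images, in pixels.
    """
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return focal_px * baseline_m / disparity_px
```

For example, with a hypothetical 700-pixel focal length and a 6 cm baseline, a 21-pixel disparity corresponds to an obstacle 2 m away. In practice the disparity map would come from a rectified stereo pair.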

When an obstacle is found, the speaker alerts the user to take precautions. All processing happens in real time.

If the user is traveling in a dark location, the IR camera will help identify the path.

Social Benefits:

Humans can estimate the size and distance of objects through the stereo vision of their two eyes; this project applies the same principle using a stereo camera.

The main aim of this project is to develop a visual aid device using stereo vision and a geometry-based obstacle detection algorithm. Conventional aids used by blind people, such as long canes, can be expensive and are not always easily available. The purpose of this project is to fulfill the basic needs of blind people: a large share of blind people live in poverty, and these glasses are easy to use and economically viable.

Technical Details of Final Deliverable

We propose three different auditory-based interaction methods that convey different levels of scene information as sound to assist VIPs. We first introduce some sonification preliminaries and then present the three scene sonification methods in detail. The first method, distance sonification, conveys distance information to VIPs. The latter two methods, obstacle sonification and path sonification, use scene detection algorithms and convey the detection results to VIPs.
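One way obstacle sonification could render a detection result spatially is constant-power stereo panning driven by the obstacle's bearing, with loudness falling off with distance. The function below is an illustrative sketch under those assumptions, not the project's specified design:

```python
import math

def obstacle_to_audio(azimuth_deg, distance_m, d_max=5.0):
    """Map a detected obstacle to (left_gain, right_gain, volume).

    Hypothetical rendering: azimuth (negative = to the user's left)
    drives constant-power stereo panning; nearer obstacles are louder.
    """
    # Constant-power pan: azimuth in [-90, 90] deg maps to [0, pi/2].
    az = max(-90.0, min(90.0, azimuth_deg))
    theta = (az + 90.0) / 180.0 * (math.pi / 2.0)
    left, right = math.cos(theta), math.sin(theta)
    # Volume falls off linearly with distance, reaching zero at d_max.
    volume = max(0.0, 1.0 - min(distance_m, d_max) / d_max)
    return left, right, volume
```

Constant-power panning keeps perceived loudness roughly constant as a source moves across the stereo field, which matters when the cue itself encodes distance.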

Final Deliverable of the Project: HW/SW integrated system
Core Industry: Others
Other Industries:
Core Technology: Wearables and Implantables
Other Technologies: Artificial Intelligence (AI)
Sustainable Development Goals: Good Health and Well-Being for People

Required Resources

Elapsed Time (months) | Milestone | Deliverable
Month 1 | Literature review | Project idea
Month 2 | Analysis of design & revisions | Complete analysis of the project
Month 3 | Analysis of design & revisions | Complete analysis of the project
Month 4 | Market survey | Market survey of the project
Month 5 | Hardware assembling | Hardware design and implementation
Month 6 | Software designing | Implementation of algorithms and data structures
Month 7 | Software designing | Implementation of algorithms and data structures
Month 8 | Software designing | Implementation of algorithms and data structures
Month 9 | Integration | Hardware & software integration
Month 10 | Testing & debugging | Hardware & software testing
