Mobility for blind people could be significantly improved by a new device that uses artificial intelligence. The technology offers real-time guidance, detecting obstacles and relaying instructions through spoken prompts or tactile signals.
A new wearable electronic system has been developed to assist blind or visually impaired people in navigating environments. The technology, based on artificial intelligence (AI), transforms images captured by a camera into voice and vibration guidance. The study was published in the journal Nature Machine Intelligence.
Voice and vibration guidance
The device created by Leilei Gu and his team combines spoken prompts with tactile signals. The system uses an AI algorithm that analyzes video from an attached camera and determines obstacle-free routes.
Guidance is delivered through bone-conduction headphones, complemented by vibrations from artificial skin patches worn on the wrists.
These skin patches vibrate when the user gets too close to objects at their sides, such as walls or furniture. The idea is that the person receives sound and tactile information about the environment in real time, allowing them to move with greater safety and autonomy.
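The division of labor described above — voice for the route ahead, wrist vibrations for lateral obstacles — can be sketched in a few lines of Python. This is purely illustrative: the function name, inputs and thresholds are invented for this example, and the study's actual pipeline is certainly far more sophisticated.

```python
# Illustrative sketch only: maps hypothetical vision-model outputs to the
# two feedback channels the article describes. All names are invented.

def route_to_feedback(heading_deg, left_dist_m, right_dist_m,
                      vibrate_threshold_m=0.5):
    """Convert a suggested heading and lateral clearances into cues.

    heading_deg: obstacle-free direction suggested by the vision model
                 (negative = left, positive = right, ~0 = straight ahead).
    left_dist_m / right_dist_m: distance to the nearest lateral obstacle.
    Returns (voice_cue, vibrate_left_wrist, vibrate_right_wrist).
    """
    # Voice channel: coarse route guidance via bone conduction.
    if heading_deg < -10:
        voice = "turn left"
    elif heading_deg > 10:
        voice = "turn right"
    else:
        voice = "go straight"

    # Tactile channel: a wrist patch buzzes when a wall or piece of
    # furniture on that side comes within the threshold distance.
    vibrate_left = left_dist_m < vibrate_threshold_m
    vibrate_right = right_dist_m < vibrate_threshold_m
    return voice, vibrate_left, vibrate_right
```

For example, a clear path ahead with a wall 0.3 m to the right would yield a "go straight" prompt plus a vibration on the right wrist only.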
Nature Machine Intelligence (2025). DOI: 10.1038/s42256-025-01018-6
Testing in real and virtual environments
To evaluate the effectiveness of the technology, the system was tested with humanoid robots and also with blind and visually impaired participants.
Testing took place in both simulated and real-world environments.
The results indicated significant improvements in the participants' mobility. They were able, for example, to navigate mazes without bumping into obstacles and to pick up objects in specific locations with greater precision.
Integration of the senses can expand use
Research suggests that combining the visual, auditory and tactile senses can make visual assistance systems more effective.
The study suggests that this integration improves usability and could be an important step for other assistive technologies in the future.
The authors advocate continuous improvement of the system and new applications in different contexts.