Innovating out of empathy, with a vision for the blind
When a friend described the daily hurdles of living without sight, four university students decided to turn their empathy into innovation.
Aiming to offer visually impaired individuals a new form of vision, National University of Singapore students Manas Bam, 21, and Sparsh, 19, along with Nanyang Technological University students Yajat Gulati, 19, and Shrivardhan Goenka, 20, created yaR – a wearable pendant driven by artificial intelligence (AI) to assist a person in navigating their surroundings.
The pendant houses a high-resolution camera and microphone, and leverages AI to recognise objects, read text and interpret surroundings.
The device is activated by a simple button press or voice command; within a few seconds, a concise answer is relayed to the user through the built-in speaker or connected headphones, using natural-sounding text-to-speech.
The innovation may just get the nod from British inventor James Dyson.
The team submitted the device to the national competition for the James Dyson Award – given to aspiring engineers and inventors who solve real-world problems through innovative designs.
Among 1,917 entries from 29 countries, including 49 from five universities in Singapore, yaR was named one of the runners-up on Sept 11, earning a spot in the international competition.
The winning entry was developed by Luke Goh of NUS, who created Mammosense – a tool that enhances patient comfort during mammograms by optimising breast compression.
In October, Dyson engineers will evaluate the final 87 entries, with Mr James Dyson himself picking the winners from the top 20 in November. Winners receive $50,500 in prize money.
Unlike smart glasses or apps that demand constant interaction, yaR – a play on “yaar”, Hindi for “friend” – offers a hands-free, user-friendly experience.
“It was specifically designed to be sleek and lightweight for daily wear,” said Manas. “The team also integrated LTE (long-term evolution) connectivity, allowing the device to work seamlessly without relying on Wi-Fi.”
The group began their journey in January this year during a hackathon organised by NUS, where they brainstormed ideas addressing various societal needs.
A friend of one of the team members, Mr Ajay Minocha, who is in his 30s and works as a consultant at Deloitte, inspired the group to focus on assistive technology for the visually impaired. Mr Minocha is blind.
With diverse skill sets in hardware and software, the team began working on their first prototype. The initial version was far from perfect – a bulky, wire-filled device that resembled an arc reactor from Iron Man – but as Yajat put it: “It worked.”
Manas said yaR went through continuous user testing and feedback over seven months, adding that the team collaborated with the Singapore Association of Visually Handicapped to test their prototypes.
Early testers pointed out that the camera was too large, and the buttons were difficult to use, said Yajat. So, the team miniaturised the device, improved the camera’s field of view and added tactile feedback to the buttons.
The team continues to refine the device, with plans to integrate LiDAR (light detection and ranging) technology so yaR can measure distances more accurately, thus improving the AI’s ability to navigate complex environments.
“The goal is to make the technology globally accessible, with plans to manufacture and sell the device at a price expected to remain under $500,” said Sparsh.