Getting Around is an Android application that lets the visually impaired navigate the world through sound, either by pinning sounds to places or by creating audio trails.
Motivation
So much of Augmented Reality today is focused on the narrow goal of augmenting one's visual sense, in the form of digital overlays on camera feeds or transparent displays. I wanted to explore what was possible when an application focused on augmenting sound instead of sight. I thought that was interesting because:
Even the most state-of-the-art optical see-through displays have a 40-degree field of view at best. With audio, you can do 360-degree sound spatialization with just two speakers, which means spatial audio can provide an incredibly high level of immersion.
Thanks to smartphone-based AR solutions like ARCore and ARKit, you can do localization inexpensively. That provides a platform for deploying applications to large numbers of people.
At the hackathon, I met an amazing team of people. Sunish, a visually impaired engineer and UX professor, wanted to build an AR application for accessibility. The rest of the team was interested in integrating film and design techniques with immersive media. Together, we decided to explore how spatial sound could help with navigation for the blind.
Implementation
Our application is an Android app built using ARCore. In its current form it has two main features.
Pinned ambisonics: A web app lets the user attach virtual sounds (either ambisonics or normal mp4s) to locations on a map. When the user walks past those places in the real world, the relevant sound plays back. The app uses the phone's GPS and IMU to determine the user's 6DOF pose and render the ambisonic file correctly.
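To make the trigger logic concrete, here is a minimal Kotlin sketch of proximity-triggered playback, assuming a hypothetical PinnedSound record and a plain haversine distance check; the real app layers ARCore/IMU pose tracking and ambisonic rendering on top of something like this.

```kotlin
import kotlin.math.*

// Hypothetical pinned-sound record: a lat/lng, a trigger radius,
// and the audio asset to play when the user comes within range.
data class PinnedSound(
    val lat: Double, val lng: Double,
    val radiusMeters: Double, val assetPath: String
)

// Great-circle distance between two GPS fixes (haversine formula).
fun distanceMeters(lat1: Double, lng1: Double,
                   lat2: Double, lng2: Double): Double {
    val r = 6_371_000.0 // mean Earth radius in meters
    val dLat = Math.toRadians(lat2 - lat1)
    val dLng = Math.toRadians(lng2 - lng1)
    val a = sin(dLat / 2).pow(2) +
            cos(Math.toRadians(lat1)) * cos(Math.toRadians(lat2)) *
            sin(dLng / 2).pow(2)
    return 2 * r * asin(sqrt(a))
}

// On each GPS update, return the pins the user is currently inside,
// so the caller can start spatialized playback for them.
fun soundsToTrigger(userLat: Double, userLng: Double,
                    pins: List<PinnedSound>): List<PinnedSound> =
    pins.filter {
        distanceMeters(userLat, userLng, it.lat, it.lng) <= it.radiusMeters
    }
```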
For the blind, objects that are not within arm's reach and do not intrinsically produce sound are virtually undetectable. By augmenting silent objects with digital audio, blind users can easily detect their presence. For our demo, we walked around Cambridge and used our app to attach curated sounds to bus stops, post boxes, entrances to specific buildings, and crosswalks. The creative challenge lay in finding sounds that are informative yet not distracting.
Real-time breadcrumbing: Inspired by the Hansel and Gretel folktale, we added a feature to create audio trails so that users can retrace their path later on. In our implementation, the breadcrumbs are audio orbs that emit sound at a volume inversely proportional to a function of the user's distance from them. To follow the path, the user simply follows the sound of the music, and sudden silence means they've ventured off the path. We use ARCore to localize the audio orbs and calculate distances. We also researched the ideal volume drop-off function, which takes into account the width of the road, the 'safe zone' around the user, and the user's hearing ability. Sunish found this feature especially exciting because it would let him go hiking and perform other activities that are traditionally difficult to do unassisted.
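To illustrate the shape of such a drop-off, here is a minimal Kotlin sketch. The parameter names and default values (safeZoneMeters, pathHalfWidthMeters, hearingGain) are illustrative assumptions, not the function we actually shipped; the key properties are full volume inside the safe zone, inverse falloff across the path, and a hard cutoff at the path edge so that straying off the trail produces sudden silence.

```kotlin
// Inverse drop-off with a hard cutoff. All defaults are illustrative.
fun breadcrumbVolume(
    distanceMeters: Float,
    safeZoneMeters: Float = 1.5f,       // assumed 'safe zone' radius
    pathHalfWidthMeters: Float = 4.0f,  // assumed half-width of the road
    hearingGain: Float = 1.0f           // per-user hearing-ability scale
): Float {
    if (distanceMeters >= pathHalfWidthMeters) return 0f       // off the path: sudden silence
    if (distanceMeters <= safeZoneMeters) return hearingGain   // in the safe zone: full volume
    return hearingGain * safeZoneMeters / distanceMeters       // inverse falloff in between
}
```

In practice the distance fed into this function comes from ARCore, by comparing the tracked camera pose against each orb's anchored position on every frame.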