The UX Development Story

The Problem


And though I am blind

Situational awareness for the visually impaired (VI) is severely limited without sight. Not knowing who is in front of them leads to confusing social situations, and learning to recognize people by voice alone takes time.

The Solution


but now I see

We propose integrating a deep-learning facial recognition system, such as Facebook's DeepFace, into the Google Glass to inform the blind user of who is approaching them.

Guided by the Glass: Enhancing the Accessibility of the Google Glass

Emily Dinnerman, Azad Balabanian

University of California, Santa Cruz



The goal of this paper is to develop an idea for a product that facilitates interactions between the visually impaired and the sighted by combining Google Glass's semi-augmented-reality technology with a facial recognition system (FRS). As machine learning gains momentum in the computer engineering field, the ability to detect, process, and display relevant results becomes more accurate. Facebook's DeepFace algorithm achieves a 97.35% accuracy rate in facial recognition, while Google's own research in artificial intelligence (AI) provides increased accuracy in object and image detection: both easy feats for humans, but near-impossible tasks for computers until now. By incorporating the DeepFace facial recognition system into Google Glass, we propose a new system to help the visually impaired detect who and what is in front of them. The revised Google product, EyeSight, will satisfy each user's physical and psychological needs by standing in for a visual aid, supplying the context cues that create a dynamic flow in common social interactions.
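The paper does not specify an implementation, but embedding-based recognition of the kind DeepFace popularized can be sketched as follows: a neural network maps each face image to a numeric vector (an embedding), and identifying a person reduces to comparing a new embedding against a gallery of stored ones. The minimal sketch below assumes this embedding step has already happened; the names, four-dimensional vectors, and similarity threshold are illustrative placeholders, not values from DeepFace.

```python
import math

# Hypothetical gallery of known people. In a real system each vector would be
# a high-dimensional embedding produced by a face-recognition network; these
# short made-up vectors only illustrate the matching step.
KNOWN_FACES = {
    "Alice": [0.9, 0.1, 0.0, 0.4],
    "Bob":   [0.1, 0.8, 0.5, 0.0],
}

def cosine_similarity(a, b):
    """Cosine of the angle between two vectors: 1.0 means identical direction."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def identify(embedding, threshold=0.8):
    """Return the best-matching known name, or None if no match clears the threshold."""
    name, score = max(
        ((n, cosine_similarity(embedding, e)) for n, e in KNOWN_FACES.items()),
        key=lambda item: item[1],
    )
    return name if score >= threshold else None

# A query embedding close to Alice's stored vector matches her;
# an unfamiliar embedding falls below the threshold and yields None.
print(identify([0.88, 0.12, 0.02, 0.38]))  # prints "Alice"
print(identify([0.0, 0.0, 1.0, 0.0]))      # prints "None"
```

In a wearable like the proposed EyeSight, a match above the threshold would be spoken to the user (e.g. "Alice is approaching"), while a below-threshold result would be announced as an unknown person.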

Link to Paper


The Paper