A Google Glass App to Help the Blind in Small Talk

In this paper we present a wearable prototype that, given an image, can automatically recognize affective cues such as the number of people present and their age and gender distributions. We customize the prototype to help people with visual impairments better navigate social scenarios. An experiment validating this technology in real social settings remains part of our future work.
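The aggregation step the abstract describes can be sketched as follows. This is a minimal illustration, not the paper's implementation: it assumes an off-the-shelf face detector with age and gender estimation has already produced one record per detected face (the `summarize_scene` function and its input format are hypothetical), and only shows how per-face attributes are rolled up into the cues mentioned above.

```python
from collections import Counter

def summarize_scene(detections):
    """Aggregate per-face records into the cues described in the
    abstract: headcount, gender distribution, and a coarse age
    distribution (bucketed by decade, e.g. 23 -> "20s")."""
    gender = Counter()
    age_bins = Counter()
    for d in detections:
        gender[d["gender"]] += 1
        age_bins[f"{(d['age'] // 10) * 10}s"] += 1
    return {
        "num_people": len(detections),
        "gender_distribution": dict(gender),
        "age_distribution": dict(age_bins),
    }

# Hypothetical detector output for one frame captured by the device.
faces = [
    {"age": 23, "gender": "F"},
    {"age": 31, "gender": "M"},
    {"age": 27, "gender": "F"},
]
summary = summarize_scene(faces)
print(summary)
# -> {'num_people': 3, 'gender_distribution': {'F': 2, 'M': 1},
#     'age_distribution': {'20s': 2, '30s': 1}}
```

In a wearable pipeline, a summary like this could then be rendered as short spoken feedback to the user.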