NavCog is a cognitive assistance app that aims to help visually impaired people navigate their environment through speech, audio cues, and vibrations from their smartphone.


Researchers from IBM Research and Carnegie Mellon University (CMU) have created a smartphone app called NavCog that helps blind users navigate their surroundings, improving accessibility. NavCog is part of a toolkit of "cognitive technologies" that augment weakened or missing abilities; the toolkit comprises the navigation app itself, a map editing tool, and localization algorithms.

NavCog offers several interface options for navigation, including speech through earphones, audio cues, and vibrations. Once the user enters a destination into the app, it calculates the route using signals from Bluetooth beacons already installed in the environment and provides turn-by-turn navigation with auditory feedback. NavCog helps the user identify their location, the direction they are facing, and additional information about the environment, with a localization accuracy of about 1.5 meters. Sighted people can also use the app to navigate via its visualized map function.
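The article doesn't detail NavCog's localization algorithm. As a rough illustration of how Bluetooth signal strength can be turned into a position estimate, here is a minimal weighted-centroid sketch; the path-loss constants, beacon coordinates, and RSSI readings are all assumptions for illustration, not NavCog's actual method or data:

```python
import math

def rssi_to_distance(rssi, tx_power=-59, path_loss_exponent=2.0):
    """Estimate distance (meters) to a beacon from received signal
    strength, using the log-distance path-loss model:
        rssi = tx_power - 10 * n * log10(d)
    tx_power is the expected RSSI at 1 meter (assumed value)."""
    return 10 ** ((tx_power - rssi) / (10 * path_loss_exponent))

def estimate_position(beacons):
    """Weighted-centroid estimate from (x, y, rssi) readings.
    Beacons that appear closer (stronger signal) get larger weights."""
    weighted = [(x, y, 1.0 / rssi_to_distance(rssi)) for x, y, rssi in beacons]
    total = sum(w for _, _, w in weighted)
    est_x = sum(x * w for x, _, w in weighted) / total
    est_y = sum(y * w for _, y, w in weighted) / total
    return est_x, est_y

# Three beacons at known positions (meters) with observed RSSI (dBm):
# the strong -65 dBm reading pulls the estimate toward the origin.
readings = [(0.0, 0.0, -65), (5.0, 0.0, -75), (0.0, 5.0, -75)]
print(estimate_position(readings))
```

Real deployments refine estimates like this with signal fingerprinting and filtering over time, which is how systems of this kind reach meter-level accuracy indoors.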

NavCog is part of an open platform developed collaboratively by IBM Research and CMU, with the goal of facilitating further research into cognitive assistance tools and improving quality of life for the visually impaired. The NavCog team is conducting additional research to improve location accuracy and to add points of interest, facial recognition, and characterization of people's activities in the environment. Although NavCog is currently in pilot testing only on the CMU campus, the hope is that other universities and public spaces will adopt it in the future for ubiquitous use.

The NavCog app is available for free on the App Store as of October 16, 2015.



Carnegie Mellon University source:

NavCog Website:





Written by Fiona Wong, PhD
