A computer science engineering team, Kostas Bekris and Eelke Folmer, presented their navigation system for people with visual impairments at two national conferences.
According to Science Daily, the team explained how they built a low-cost, accessible navigation system by combining human-computer interaction with motion-planning research. The system, called Navatar, runs on a standard smartphone.
According to Space Daily, Bekris explained that current indoor navigation systems usually require expensive, heavy sensors, or a handheld reader that detects radio-frequency tags to determine the user's location.
Bekris went on to say that this has made such systems expensive, which is one reason why only a few have been deployed in the past.
According to RedOrbit, Navatar can guide people with visual impairments down hallways and into rooms by giving audible instructions.
Folmer noted that the smartphone's sensors can pick up false signals. He went on to say that Navatar tracks landmarks such as doors, stairs, hallways, and elevators, and that the system localizes users in their environment by combining the natural landmark-detection abilities of people with visual impairments with probabilistic algorithms.
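The article does not publish Navatar's actual algorithm, but a common way to fuse noisy sensor data with user-confirmed landmarks is a particle filter. The sketch below is purely illustrative: the hallway map, door positions, noise values, and function names are all assumptions, not details from the system itself.

```python
import random

random.seed(42)

HALLWAY_LENGTH = 30.0               # metres; a single straight hallway (assumed map)
DOOR_POSITIONS = [5.0, 12.0, 21.0]  # known landmark (door) locations on that map

def move(particles, step, noise=0.5):
    """Dead-reckoning update: each particle advances by a noisy step."""
    return [p + step + random.gauss(0, noise) for p in particles]

def confirm_landmark(particles, door_positions, tolerance=1.5):
    """User confirms passing a door: reweight particles near any known door,
    then resample in proportion to those weights."""
    weights = [1.0 if any(abs(p - d) <= tolerance for d in door_positions) else 0.01
               for p in particles]
    return random.choices(particles, weights=weights, k=len(particles))

def estimate(particles):
    """Point estimate of position: the mean of the particle cloud."""
    return sum(particles) / len(particles)

# Start with no idea where the user is in the hallway.
particles = [random.uniform(0, HALLWAY_LENGTH) for _ in range(1000)]

# The user walks about 5 m, then confirms a door landmark.
particles = move(particles, 5.0)
particles = confirm_landmark(particles, DOOR_POSITIONS)
print(f"estimated position: {estimate(particles):.1f} m")
```

After the landmark confirmation, nearly all particles cluster around the known door locations, which is how a user's simple "yes, I'm at a door" can correct accumulated sensor drift.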
Folmer also said that the system can work alongside whatever navigation aids users already rely on, including canes. Navatar listens for spoken commands or for the press of a button on a Bluetooth device.
The team said they are considering several improvements, such as increasing accuracy and enabling the system to work from inside a pocket.