This project presents the design and implementation of an enhanced indoor navigation system that builds on existing assistive technology for the visually impaired. Loss of vision can drastically impair an individual's sense of direction and mobility, especially in unfamiliar surroundings. As a result, visually impaired individuals often need additional support and time to become familiar with new indoor settings. The system described in this report is customized to the needs of the visually impaired and is built from Bluetooth Low Energy (BLE) beacons, a BLE-capable Android device with in-built motion sensors, and an Android mobile application.
The mobile application can be operated in three modes: regular navigation, assisted navigation, and free-roam. The current implementation supports navigation on Level 1 of the School of Computer Science and Engineering (SCSE). BLE beacons are placed at important landmarks in this environment, and the user is given a list of destinations to navigate to. The report evaluates user interaction, feedback, and the effectiveness of the proposed indoor navigation system. Experimental results and observations indicate that VirtualEYE, the Android mobile application, improves indoor navigation.
Video Demonstration Link: Video Link
Features
- Receive instructions/commands from the user via:
  - User Interface (UI) input
  - Kinetic input (see the motion-gesture sketch after this list)
  - Voice input
- Interact with the BLE beacons placed around the test area to:
  - Identify the landmark closest to the user
  - Calculate the distance between the user and that landmark (see the RSSI distance sketch after this list)
- Perform obstacle detection to:
  - Warn the user of obstacles in the path during assisted navigation
- Provide tactile and audio feedback for visually impaired users (see the feedback sketch after this list):
  - Vibrate the phone to indicate the correct direction during navigation
  - Give audio warnings and instructions using Text-To-Speech (TTS) for easier interaction with the application
- Display an interactive map of the test area for visual navigation:
  - Present a list of available locations the user can select and navigate to
  - Update the map with markers for the selected locations and the calculated path
  - Provide written directions for navigation
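How the kinetic input is captured is not detailed above; given the report's mention of the phone's in-built motion sensors, one plausible reading is a shake gesture detected with the accelerometer. The Kotlin sketch below illustrates that assumption; the 2.5 g threshold, the debounce window, and the ShakeDetector class itself are illustrative, not taken from the project.

```kotlin
import android.content.Context
import android.hardware.Sensor
import android.hardware.SensorEvent
import android.hardware.SensorEventListener
import android.hardware.SensorManager
import kotlin.math.sqrt

// Hypothetical shake detector: invokes a callback when overall acceleration
// exceeds a threshold. The 2.5 g threshold and 1 s debounce are assumptions.
class ShakeDetector(context: Context, private val onShake: () -> Unit) : SensorEventListener {

    private val sensorManager =
        context.getSystemService(Context.SENSOR_SERVICE) as SensorManager
    private val accelerometer = sensorManager.getDefaultSensor(Sensor.TYPE_ACCELEROMETER)
    private var lastShakeMs = 0L

    fun start() {
        sensorManager.registerListener(this, accelerometer, SensorManager.SENSOR_DELAY_UI)
    }

    fun stop() = sensorManager.unregisterListener(this)

    override fun onSensorChanged(event: SensorEvent) {
        val (x, y, z) = event.values
        // Acceleration as a multiple of gravity; a deliberate shake is well above 1 g.
        val gForce = sqrt(x * x + y * y + z * z) / SensorManager.GRAVITY_EARTH
        val now = System.currentTimeMillis()
        if (gForce > 2.5f && now - lastShakeMs > 1000) {
            lastShakeMs = now
            onShake() // e.g. repeat the last spoken instruction
        }
    }

    override fun onAccuracyChanged(sensor: Sensor?, accuracy: Int) = Unit
}
```

A motion gesture along these lines suits the target users because it does not require locating a control on a touchscreen.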
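The report excerpt does not spell out the distance calculation; a common model for BLE beacons is the log-distance path-loss formula d = 10^((txPower − RSSI) / (10 · n)), where txPower is the beacon's calibrated RSSI at one metre and n is an environment-dependent path-loss exponent. The sketch below applies that model; the default txPower and exponent are illustrative assumptions, not the project's calibration values.

```kotlin
import kotlin.math.pow

// Log-distance path-loss model: estimates distance in metres from a beacon's RSSI.
// txPower is the calibrated RSSI at 1 m; pathLossExponent is ~2.0 in free space
// and higher indoors. Both defaults are illustrative assumptions.
fun estimateDistance(rssi: Int, txPower: Int = -59, pathLossExponent: Double = 2.2): Double =
    10.0.pow((txPower - rssi) / (10.0 * pathLossExponent))

// A scanned beacon advertisement, tagged with the landmark it marks.
data class Beacon(val landmark: String, val rssi: Int, val txPower: Int)

// The landmark in closest proximity is the beacon with the smallest estimate.
fun closestLandmark(scanned: List<Beacon>): Beacon? =
    scanned.minByOrNull { estimateDistance(it.rssi, it.txPower) }
```

For example, with txPower = −59 dBm, a reading of −70 dBm gives 10^(11/22) ≈ 3.2 m; the beacon with the smallest estimate is taken as the nearest landmark.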
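The feedback channel maps onto two standard Android APIs: Vibrator for the directional pulse and TextToSpeech for spoken warnings and instructions. A minimal sketch, with the pulse length, locale, and wrapper class as assumptions:

```kotlin
import android.content.Context
import android.os.Vibrator
import android.speech.tts.TextToSpeech
import java.util.Locale

// Wraps the two feedback channels: vibration for direction cues and
// Text-To-Speech for spoken warnings and instructions.
class FeedbackManager(context: Context) : TextToSpeech.OnInitListener {

    private val vibrator = context.getSystemService(Context.VIBRATOR_SERVICE) as Vibrator
    private val tts = TextToSpeech(context, this)
    private var ready = false

    override fun onInit(status: Int) {
        if (status == TextToSpeech.SUCCESS) {
            tts.setLanguage(Locale.US)
            ready = true
        }
    }

    // Short pulse to confirm the user is facing the correct direction.
    // (VibrationEffect is the non-deprecated route on API 26+.)
    fun confirmDirection() {
        @Suppress("DEPRECATION")
        vibrator.vibrate(200L)
    }

    // Spoken instruction, e.g. "Turn left and walk ten steps".
    fun speak(instruction: String) {
        if (ready) tts.speak(instruction, TextToSpeech.QUEUE_ADD, null, "nav")
    }

    fun shutdown() = tts.shutdown()
}
```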
- Work on Server Side code - Shortest Path calculation (a Dijkstra sketch follows this list)
- Work on Server Side code - Create Docker Image
- Work on Server Side code - Push server to cloud
- Work on Client Side code - Create Indoor Map
- Work on Client Side code - Test BLE Connections
- Work on Client Side code - Create proximity estimation algorithm
- Work on Client Side code - Work on Obstacle Detection
- Work on Client Side code - Create voice input and recognition (a SpeechRecognizer sketch follows this list)
- Work on Client Side code - Create navigation algorithm
- Set up Client-Server communication (a minimal HTTP sketch follows this list)
- Test pipeline
- TODO: Video Demonstration
- TODO: Multi-floor navigation
- TODO: AR-Navigation Feature
- TODO: Proximity Estimation for Obstacles
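The server-side shortest-path calculation is not shown here; the conventional choice for a weighted landmark graph is Dijkstra's algorithm. The sketch below is a generic Kotlin version, not the repository's actual code, and the adjacency-map representation is an assumption.

```kotlin
import java.util.PriorityQueue

// Dijkstra over a weighted landmark graph: returns the landmarks on the
// shortest path from start to goal, or null if the goal is unreachable.
fun shortestPath(
    graph: Map<String, List<Pair<String, Double>>>, // node -> (neighbour, edge weight in metres)
    start: String,
    goal: String
): List<String>? {
    val dist = mutableMapOf(start to 0.0)
    val prev = mutableMapOf<String, String>()
    val queue = PriorityQueue<Pair<String, Double>>(compareBy { it.second })
    queue.add(start to 0.0)

    while (queue.isNotEmpty()) {
        val (node, d) = queue.poll()
        if (node == goal) break
        if (d > dist.getValue(node)) continue // stale queue entry
        for ((next, w) in graph[node].orEmpty()) {
            val nd = d + w
            if (nd < (dist[next] ?: Double.MAX_VALUE)) {
                dist[next] = nd
                prev[next] = node
                queue.add(next to nd)
            }
        }
    }

    if (goal !in dist) return null
    // Walk predecessors back from the goal to reconstruct the path.
    return generateSequence(goal) { prev[it] }.toList().reversed()
}
```

For example, shortestPath(mapOf("Lobby" to listOf("Corridor" to 8.0), "Corridor" to listOf("Lab 1" to 12.0)), "Lobby", "Lab 1") returns [Lobby, Corridor, Lab 1].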
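Voice input on Android is typically built on the platform SpeechRecognizer; the sketch below follows that standard API, while the function name and the single-result configuration are illustrative. It must run on the main thread and requires the RECORD_AUDIO permission.

```kotlin
import android.content.Context
import android.content.Intent
import android.os.Bundle
import android.speech.RecognitionListener
import android.speech.RecognizerIntent
import android.speech.SpeechRecognizer

// Listens for one utterance and hands the top transcription to a callback,
// e.g. "navigate to the lecture theatre".
fun listenForDestination(context: Context, onResult: (String) -> Unit) {
    val recognizer = SpeechRecognizer.createSpeechRecognizer(context)
    val intent = Intent(RecognizerIntent.ACTION_RECOGNIZE_SPEECH).apply {
        putExtra(RecognizerIntent.EXTRA_LANGUAGE_MODEL, RecognizerIntent.LANGUAGE_MODEL_FREE_FORM)
        putExtra(RecognizerIntent.EXTRA_MAX_RESULTS, 1)
    }
    recognizer.setRecognitionListener(object : RecognitionListener {
        override fun onResults(results: Bundle) {
            results.getStringArrayList(SpeechRecognizer.RESULTS_RECOGNITION)
                ?.firstOrNull()
                ?.let(onResult)
            recognizer.destroy()
        }
        override fun onError(error: Int) = recognizer.destroy()
        // Remaining callbacks are not needed for this sketch.
        override fun onReadyForSpeech(params: Bundle?) = Unit
        override fun onBeginningOfSpeech() = Unit
        override fun onRmsChanged(rmsdB: Float) = Unit
        override fun onBufferReceived(buffer: ByteArray?) = Unit
        override fun onEndOfSpeech() = Unit
        override fun onPartialResults(partialResults: Bundle?) = Unit
        override fun onEvent(eventType: Int, params: Bundle?) = Unit
    })
    recognizer.startListening(intent)
}
```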
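The client-server exchange is not described beyond the roadmap entry above; one minimal form is an HTTP GET that asks the server for a path between two landmarks. The /path route, its query parameters, and the response shape below are all hypothetical.

```kotlin
import java.net.HttpURLConnection
import java.net.URL
import java.net.URLEncoder

// Requests a path from a hypothetical server endpoint, e.g.
//   GET http://<server>/path?from=Lobby&to=Lab+1
// and returns the raw response body (assumed to be a JSON list of landmarks).
// Network calls must run off the main thread on Android.
fun fetchPath(baseUrl: String, from: String, to: String): String {
    val query = "from=${URLEncoder.encode(from, "UTF-8")}&to=${URLEncoder.encode(to, "UTF-8")}"
    val connection = URL("$baseUrl/path?$query").openConnection() as HttpURLConnection
    return try {
        connection.requestMethod = "GET"
        connection.inputStream.bufferedReader().use { it.readText() }
    } finally {
        connection.disconnect()
    }
}
```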
Aishwarya Singh - LinkedIn
Project Link: GitHub Repository
The author would like to thank her advisor and mentor, Dr. Smitha Kavallur Pisharath Gopi, for her guidance and support throughout this research. Her advice and constructive feedback were invaluable in shaping the direction and quality of the work presented in this report.