An accessibility tool that converts user eye movements into communication messages.
Highlights of the project:
- Used the Keras library with a TensorFlow backend to train a binary Machine Learning classifier on a dataset of open and closed eyes (a model sketch follows this list).
- Translated blink input into Morse code, then decoded it into audio output and keystrokes (see the decoding sketch below).
- Utilized Python OpenCV for webcam input and NumPy for model data processing (see the capture-loop sketch below).
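
A minimal sketch of how such an open/closed-eye binary classifier might be defined in Keras. The architecture, the 24x24 grayscale input size, and the function name are illustrative assumptions, not the project's exact setup:

```python
# Sketch of an open/closed-eye binary classifier in Keras.
# Architecture and 24x24 grayscale input size are illustrative assumptions.
from tensorflow.keras import layers, models

def build_eye_classifier(input_shape=(24, 24, 1)):
    model = models.Sequential([
        layers.Conv2D(32, (3, 3), activation="relu", input_shape=input_shape),
        layers.MaxPooling2D((2, 2)),
        layers.Conv2D(64, (3, 3), activation="relu"),
        layers.MaxPooling2D((2, 2)),
        layers.Flatten(),
        layers.Dense(64, activation="relu"),
        # Single sigmoid unit: output near 0 = closed eye, near 1 = open eye.
        layers.Dense(1, activation="sigmoid"),
    ])
    model.compile(optimizer="adam",
                  loss="binary_crossentropy",
                  metrics=["accuracy"])
    return model
```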
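One way the Morse translation step could work: treat short blinks as dots, long blinks as dashes, and long pauses as letter boundaries. The timing thresholds and the `(blink_duration, pause_after)` input format below are assumptions for illustration:

```python
# Sketch: turn timed blinks into Morse code, then into text.
# Thresholds and input format are assumptions, not the project's exact values.
MORSE_TO_CHAR = {
    ".-": "A", "-...": "B", "-.-.": "C", "-..": "D", ".": "E",
    "..-.": "F", "--.": "G", "....": "H", "..": "I", ".---": "J",
    "-.-": "K", ".-..": "L", "--": "M", "-.": "N", "---": "O",
    ".--.": "P", "--.-": "Q", ".-.": "R", "...": "S", "-": "T",
    "..-": "U", "...-": "V", ".--": "W", "-..-": "X", "-.--": "Y",
    "--..": "Z",
}

DASH_THRESHOLD = 0.4   # blinks longer than this (seconds) count as a dash
LETTER_GAP = 1.0       # pauses longer than this end the current letter

def blinks_to_text(events):
    """Decode a list of (blink_duration, pause_after) tuples into text."""
    text, symbol = [], ""
    for duration, pause in events:
        symbol += "-" if duration >= DASH_THRESHOLD else "."
        if pause >= LETTER_GAP:
            text.append(MORSE_TO_CHAR.get(symbol, "?"))
            symbol = ""
    if symbol:
        text.append(MORSE_TO_CHAR.get(symbol, "?"))
    return "".join(text)

# Three short, three long, three short blinks -> "SOS"
print(blinks_to_text([(0.1, 0.2), (0.1, 0.2), (0.1, 1.5),
                      (0.6, 0.2), (0.6, 0.2), (0.6, 1.5),
                      (0.1, 0.2), (0.1, 0.2), (0.1, 1.5)]))
```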
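And a sketch of the OpenCV capture loop feeding frames to the classifier as NumPy arrays. Resizing the whole frame to 24x24 is a placeholder; the real project would first localize the eye region with a face/eye detector:

```python
# Sketch of the webcam loop: grab frames with OpenCV, preprocess with NumPy,
# and classify each frame as open/closed. The whole-frame crop is a placeholder.
import cv2
import numpy as np

def classify_frames(model, threshold=0.5):
    cap = cv2.VideoCapture(0)  # default webcam
    try:
        while True:
            ok, frame = cap.read()
            if not ok:
                break
            gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
            eye = cv2.resize(gray, (24, 24))  # placeholder eye region
            # Shape (1, 24, 24, 1), scaled to [0, 1] to match the model input.
            batch = eye.astype(np.float32)[np.newaxis, :, :, np.newaxis] / 255.0
            is_open = model.predict(batch, verbose=0)[0, 0] >= threshold
            print("open" if is_open else "closed")
            cv2.imshow("webcam", frame)
            if cv2.waitKey(1) & 0xFF == ord("q"):  # press q to quit
                break
    finally:
        cap.release()
        cv2.destroyAllWindows()
```

For a quick demonstration, `classify_frames(build_eye_classifier())` would run the (untrained) model from the first sketch over live webcam frames.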
Remarks:
- This project was created and completed during the Hack the North hackathon in September 2019.
- Built in collaboration with team members Nayan Saxena and Vicky Chen.
- More detailed documentation and a short demo video can be found here: https://devpost.com/software/dot-tfa5re