# EyeTrackRobot

This repository hosts the code and documentation for a novel system designed to assist individuals with severe speech and motor impairment (SSMI). Using eye-tracking technology, the system enables these individuals to control a robotic arm and a smart wheelchair, enhancing their ability to interact with their environment and giving them a greater degree of independence.

**Conference News:** We are thrilled to announce that our research paper has been accepted for presentation at the ICRoM 2023 conference. Link

## Key Components

- Eye-Tracking Module: Uses image processing and machine-learning techniques to track eye movements accurately.
- Robotic Arm Control: Converts eye-movement data into precise commands that drive the robotic arm.
- Wheelchair Navigation System: Interprets eye gestures for directional control of a smart wheelchair, with built-in safety features such as automatic obstacle avoidance (a simplified sketch of this pipeline follows the list).
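
The components above are described at a high level; as a rough illustration of how such a pipeline could fit together, the sketch below classifies the gaze direction from a cropped eye image and maps it to a motion command with a simple obstacle-avoidance guard. It is a minimal sketch under assumed thresholds and a hypothetical command vocabulary (`FORWARD`, `TURN_LEFT`, ...), not the code in this repository.

```python
# Illustrative sketch only (not the project's actual implementation):
# classify gaze direction from a cropped eye image, then map it to a
# wheelchair command with a simple obstacle-avoidance override.
import cv2


def estimate_gaze_direction(eye_roi_bgr, dead_zone=0.15):
    """Return 'left', 'right', 'up', 'down', or 'center' for a cropped eye image."""
    gray = cv2.cvtColor(eye_roi_bgr, cv2.COLOR_BGR2GRAY)
    gray = cv2.GaussianBlur(gray, (7, 7), 0)
    # The pupil is the darkest region; a fixed threshold is a simplification.
    _, mask = cv2.threshold(gray, 45, 255, cv2.THRESH_BINARY_INV)
    moments = cv2.moments(mask)
    if moments["m00"] == 0:
        return "center"  # pupil not found (e.g. during a blink); treat as neutral
    cx = moments["m10"] / moments["m00"]
    cy = moments["m01"] / moments["m00"]
    h, w = gray.shape
    # Normalised offset of the pupil centre from the eye-ROI centre
    # (image-frame left/right; mirroring depends on the camera setup).
    dx = (cx - w / 2) / (w / 2)
    dy = (cy - h / 2) / (h / 2)
    if abs(dx) < dead_zone and abs(dy) < dead_zone:
        return "center"
    if abs(dx) >= abs(dy):
        return "left" if dx < 0 else "right"
    return "up" if dy < 0 else "down"


# Hypothetical mapping from gaze gestures to wheelchair commands.
COMMANDS = {
    "up": "FORWARD",
    "down": "STOP",
    "left": "TURN_LEFT",
    "right": "TURN_RIGHT",
    "center": "HOLD",
}


def gaze_to_command(direction, obstacle_ahead):
    command = COMMANDS[direction]
    if command == "FORWARD" and obstacle_ahead:
        return "STOP"  # safety override: never drive toward a detected obstacle
    return command
```

The dead zone around the eye-ROI centre keeps the chair stationary while the user is simply looking at the scene rather than issuing a command.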

## Performance Evaluation

- Results of tests measuring the accuracy and responsiveness of the eye-tracking system (an illustrative accuracy metric is sketched after this list).
- User feedback and case studies illustrating the system's impact on individuals with SSMI.
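
For context, eye-tracker accuracy is commonly reported as the angular error between the estimated gaze point and the true target. The snippet below is an illustrative computation under an assumed screen geometry (the pixel density and viewing distance are placeholder values), not the evaluation code used in this study.

```python
# Illustrative only: angular gaze-accuracy computation under an assumed
# screen geometry (the defaults below are placeholders, not the study setup).
import numpy as np


def angular_error_deg(estimated_px, target_px, px_per_cm=37.8, viewing_distance_cm=60.0):
    """Angular error (degrees) between an estimated gaze point and the true
    target, both given in screen pixels."""
    offset_px = np.linalg.norm(np.asarray(estimated_px, float) - np.asarray(target_px, float))
    offset_cm = offset_px / px_per_cm
    return np.degrees(np.arctan2(offset_cm, viewing_distance_cm))


# Example: a roughly 50-pixel miss at a 60 cm viewing distance.
print(f"{angular_error_deg((980, 540), (960, 494)):.2f} deg")
```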

## Authors

- Kiana Hooshanfar
- Maryam Asad Samani
- Helia Shams Jey

## Contact Information

For any queries, please contact k.hooshanfar@ut.ac.ir.