This app is built with Streamlit and visualizes eye-gaze data over a video: it overlays the real-time gaze positions of selected subjects on the video as it plays, so you can see how each subject tracks the content.
- Video Player: Plays a video while displaying the corresponding eye gaze data.
- Subject Selection: Toggle individual subjects to display their gaze data.
- Gaze Tracking: Shows real-time eye gaze positions as the video plays.
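The core idea behind the gaze overlay can be sketched as follows. This is an illustrative example, not the app's actual code: it assumes gaze samples have already been loaded as dictionaries with the columns described in the data-format section (`SID`, `gaze_x`, `gaze_y`, `relative_timestamp`), and returns each selected subject's most recent gaze position at the current playback time.

```python
def gaze_at_time(samples, current_time, selected_subjects):
    """Return {SID: (gaze_x, gaze_y)} for the latest sample at or
    before current_time, restricted to the selected subjects."""
    latest = {}
    for s in samples:
        if s["SID"] not in selected_subjects:
            continue
        if s["relative_timestamp"] > current_time:
            continue
        prev = latest.get(s["SID"])
        if prev is None or s["relative_timestamp"] > prev["relative_timestamp"]:
            latest[s["SID"]] = s
    return {sid: (s["gaze_x"], s["gaze_y"]) for sid, s in latest.items()}

# Toy data: two subjects, normalized coordinates.
samples = [
    {"SID": "s1", "gaze_x": 0.2, "gaze_y": 0.3, "relative_timestamp": 0.0},
    {"SID": "s1", "gaze_x": 0.4, "gaze_y": 0.5, "relative_timestamp": 1.0},
    {"SID": "s2", "gaze_x": 0.9, "gaze_y": 0.1, "relative_timestamp": 0.5},
]
print(gaze_at_time(samples, 0.8, {"s1", "s2"}))
```

The viewer performs this lookup continuously as the video advances, drawing one marker per toggled-on subject.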
The application relies on the following key packages:
- Streamlit
- pandas
- NumPy
- video.js (for video rendering)
- Streamlit Component (for integrating the video player)
```bash
git clone https://github.com/cosanlab/peye_viewer.git
cd peye_viewer
```
You can use Conda to manage the environment for this app. Below are instructions to set up a Conda environment:
a. Create the Conda environment

```bash
conda create -n peye_viewer python=3.11
```

b. Activate the environment

```bash
conda activate peye_viewer
```

c. Install dependencies

```bash
pip install -r requirements.txt
```
Once all dependencies are installed, run the Streamlit app by executing the following command:
```bash
streamlit run peye_viewer/viewer.py
```
Ensure that your video and gaze data files are correctly set up. The paths to these resources are configured in the code:
- Video URL: Defined in the viewer.py script
- Eye Gaze Data File: Defined in the DATA_FILE constant in viewer.py
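For illustration, the configuration near the top of `viewer.py` might look like the fragment below. `DATA_FILE` is the constant named above; `VIDEO_URL` is an assumed name for the video setting, and both values are placeholders, not the repository's actual paths.

```python
# Illustrative placeholders; check viewer.py for the real values.
# VIDEO_URL is an assumed constant name, not confirmed by this README.
VIDEO_URL = "https://example.com/stimulus_video.mp4"  # placeholder URL
DATA_FILE = "data/gaze_data.csv"                      # placeholder path
```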
The CSV data should include at least the following columns for eye gaze data:
- SID: Subject ID
- gaze_x: Gaze position on the x-axis (normalized between 0 and 1)
- gaze_y: Gaze position on the y-axis (normalized between 0 and 1)
- relative_timestamp: Time in seconds relative to the start of the video
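As a quick sanity check before launching the app, a short standalone script (not part of the app itself) can verify that a CSV file carries these required columns, using only the standard library:

```python
import csv
import io

REQUIRED = {"SID", "gaze_x", "gaze_y", "relative_timestamp"}

def check_gaze_csv(f):
    """Return the set of required columns missing from the CSV header."""
    reader = csv.DictReader(f)
    return REQUIRED - set(reader.fieldnames or [])

# Example with an in-memory CSV matching the expected schema:
sample = "SID,gaze_x,gaze_y,relative_timestamp\ns1,0.1,0.2,0.0\n"
print(check_gaze_csv(io.StringIO(sample)))  # empty set -> no missing columns
```

To check a real file, open it and pass the handle: `check_gaze_csv(open("gaze.csv"))`.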
Feel free to submit a pull request or open an issue if you encounter any problems or have suggestions.
This project is licensed under the MIT License.