Project for constructing a deep neural network (NN) to classify emotion in real-time virtual environments.
Datasets used:
CEAP-360VR (https://github.com/cwi-dis/CEAP-360VR-Dataset): T. Xue, A. El Ali, T. Zhang, G. Ding, and P. Cesar, "CEAP-360VR: A Continuous Physiological and Behavioral Emotion Annotation Dataset for 360° Videos," IEEE Transactions on Multimedia, doi: 10.1109/TMM.2021.3124080.
VRFS (https://github.com/vremotions/vrfs): Vatsal, Ritik, et al., "An Analysis of Physiological and Psychological Responses in Virtual Reality and Flat Screen Gaming," arXiv preprint arXiv:2306.09690 (2023).
AVR (https://data.mendeley.com/datasets/y76vbw92y9/3): Ramirez-Lechuga, Sharon; Alonso-Valerdi, Luz Maria; Ibarra-Zarate, David I (2023), "Audiovirtual Reality to induce anger and happiness emotions: A physiological response (EEG, GSR, BVP and TMP) database," Mendeley Data, V3, doi: 10.17632/y76vbw92y9.3.
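The sketch below is only an illustrative assumption of what the classifier could look like, not the project's actual architecture: a small CNN+LSTM over fixed-length windows of physiological signals such as those in the datasets above. The channel count, window length, and number of emotion classes are placeholders.

```python
import torch
import torch.nn as nn

class EmotionClassifier(nn.Module):
    """Illustrative CNN+LSTM over windows of physiological signals.

    Assumptions (placeholders, not the project's actual design):
    - input windows of shape (batch, channels, samples), e.g. EEG/GSR/BVP channels
    - 4 discrete emotion classes
    """

    def __init__(self, n_channels: int = 8, n_classes: int = 4):
        super().__init__()
        # Temporal feature extraction over raw signal windows
        self.conv = nn.Sequential(
            nn.Conv1d(n_channels, 32, kernel_size=7, padding=3),
            nn.ReLU(),
            nn.MaxPool1d(2),
            nn.Conv1d(32, 64, kernel_size=5, padding=2),
            nn.ReLU(),
            nn.MaxPool1d(2),
        )
        # Sequence modelling over the extracted feature maps
        self.lstm = nn.LSTM(input_size=64, hidden_size=64, batch_first=True)
        self.head = nn.Linear(64, n_classes)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, channels, samples)
        feats = self.conv(x)           # (batch, 64, samples // 4)
        feats = feats.transpose(1, 2)  # (batch, time, 64) for the LSTM
        _, (h_n, _) = self.lstm(feats)
        return self.head(h_n[-1])      # (batch, n_classes) logits

if __name__ == "__main__":
    # Example: a batch of 2-second windows at 128 Hz with 8 channels
    model = EmotionClassifier(n_channels=8, n_classes=4)
    dummy = torch.randn(16, 8, 256)
    print(model(dummy).shape)  # torch.Size([16, 4])
```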
The model can be deployed in real-time applications using the Lab Streaming Layer (LSL) library (https://github.com/sccn/labstreaminglayer) to stream physiological signals.
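A minimal sketch of real-time use with the pylsl Python bindings is shown below. The stream type ("GSR"), the window length, and the `classify` call are illustrative assumptions; adapt them to the actual acquisition device and trained model.

```python
from collections import deque

import numpy as np
from pylsl import StreamInlet, resolve_byprop

# Resolve a physiological stream on the local network
# (stream type "GSR" is an assumption; adapt to the actual device).
streams = resolve_byprop("type", "GSR", timeout=10.0)
if not streams:
    raise RuntimeError("No LSL stream of type 'GSR' found")
inlet = StreamInlet(streams[0])

WINDOW = 256  # samples per classification window (assumed)
buffer = deque(maxlen=WINDOW)

while True:
    # Pull one sample (list of channel values) and its LSL timestamp
    sample, timestamp = inlet.pull_sample(timeout=1.0)
    if sample is None:
        continue
    buffer.append(sample)

    if len(buffer) == WINDOW:
        window = np.asarray(buffer, dtype=np.float32)  # (WINDOW, channels)
        # `classify` is a placeholder for the trained model's inference call:
        # emotion = classify(window)
        # print(timestamp, emotion)
        buffer.clear()
```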