MHAD: Multimodal Home Activity Dataset with Multi-Angle Videos and Synchronized Physiological Signals (ICASSP 2025)
This repository hosts MHAD: Multimodal Home Activity Dataset with Multi-Angle Videos and Synchronized Physiological Signals, collected jointly by JD Health, Huazhong University of Science and Technology, and Zhejiang University.
The MHAD Dataset is the first public dataset recorded with subjects in a real home environment, covering multiple camera angles and a range of household scenarios. It provides the most comprehensive set of synchronized physiological signals to date, making it a valuable resource for academic research in fields such as computer vision, machine learning, and biomedical engineering.
Physiological Signals: Comprehensive data including heart rate, respiratory rate, and other vital signs.
Video Data: Multi-angle video recordings of subjects in various household scenarios.
Annotations: Detailed annotations for each scenario and physiological signal.
This dataset is intended for academic use only. Any commercial use is prohibited.
To gain access to the MHAD Dataset, please follow these steps:
Send the request from your official academic email address and include the following information:
Your full name
Your academic institution
Your position (e.g., PhD student, Professor)
A brief description of your research and how you intend to use the dataset
A statement confirming that the dataset will be used solely for academic purposes and not for commercial use.
Email your request to feijintao3@jd.com with the subject line "MHAD Dataset Access Request".
Our team will review your request and verify your academic credentials. You will receive a response within 5-7 business days. Upon approval, you will receive an email with a secure download link to the MHAD Dataset. Please do not share this link with others. Each user must request access individually.
MHAD_Dataset
-----------------
MHAD/
|-- sub01/
| |-- a/ #Before exercise
| | |-- 1/ #Watching tv
| | | |-- output1.avi #frontal
| | | |-- output2.avi #90-degree side
| | | |-- output3.avi #45-degree side
| | | |-- sub01_a_1.csv #ground truth
| | |-- 2/ #Using phone
| | | |-- output1.avi
| | | |-- output2.avi
| | | |-- output3.avi
| | | |-- sub01_a_2.csv
| | |-- 3/ #Reading
| | | |-- output1.avi
| | | |-- output2.avi
| | | |-- output3.avi
| | | |-- sub01_a_3.csv
| | |-- 4/ #Talking
| | | |-- output1.avi
| | | |-- output2.avi
| | | |-- output3.avi
| | | |-- sub01_a_4.csv
| | |-- 5/ #Eating
| | | |-- output1.avi
| | | |-- output2.avi
| | | |-- output3.avi
| | | |-- sub01_a_5.csv
| | |-- 6/ #Drinking
| | | |-- output1.avi
| | | |-- output2.avi
| | | |-- output3.avi
| | | |-- sub01_a_6.csv
| |-- b/ #After exercise
| | |-- 1/
| | | |-- output1.avi
| | | |-- output2.avi
| | | |-- output3.avi
| | | |-- sub01_b_1.csv
| | |-- 2/
| | | |-- output1.avi
| | | |-- output2.avi
| | | |-- output3.avi
| | | |-- sub01_b_2.csv
| | |-- 3/
| | | |-- output1.avi
| | | |-- output2.avi
| | | |-- output3.avi
| | | |-- sub01_b_3.csv
| | |-- 4/
| | | |-- output1.avi
| | | |-- output2.avi
| | | |-- output3.avi
| | | |-- sub01_b_4.csv
| | |-- 5/
| | | |-- output1.avi
| | | |-- output2.avi
| | | |-- output3.avi
| | | |-- sub01_b_5.csv
| | |-- 6/
| | | |-- output1.avi
| | | |-- output2.avi
| | | |-- output3.avi
| | | |-- sub01_b_6.csv
|-- sub40/
| ...
-----------------
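Given the layout above, a small helper can build the paths for any recording. This is a minimal sketch, assuming files follow exactly the naming shown in the tree (the angle-to-file mapping output1/2/3 = frontal/90-degree/45-degree comes from the comments above); adjust the root path for your local copy:

```python
from pathlib import Path

# Camera-angle mapping taken from the directory tree above:
# output1 = frontal, output2 = 90-degree side, output3 = 45-degree side
ANGLES = {"frontal": 1, "side_90": 2, "side_45": 3}
SESSIONS = ("a", "b")    # a = before exercise, b = after exercise
SCENARIOS = range(1, 7)  # 1..6: watching TV, using phone, reading, talking, eating, drinking

def recording_paths(root, subject, session, scenario):
    """Return (video paths keyed by angle, ground-truth CSV path) for one recording."""
    sub = f"sub{subject:02d}"
    base = Path(root) / sub / session / str(scenario)
    videos = {name: base / f"output{idx}.avi" for name, idx in ANGLES.items()}
    gt_csv = base / f"{sub}_{session}_{scenario}.csv"
    return videos, gt_csv

videos, gt = recording_paths("MHAD", subject=1, session="a", scenario=1)
print(gt.as_posix())  # MHAD/sub01/a/1/sub01_a_1.csv
```

Iterating `SESSIONS` and `SCENARIOS` over subjects 1-40 enumerates the whole dataset.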
@misc{yu2024mhadmultimodalhomeactivity,
title={MHAD: Multimodal Home Activity Dataset with Multi-Angle Videos and Synchronized Physiological Signals},
author={Lei Yu and Jintao Fei and Xinyi Liu and Yang Yao and Jun Zhao and Guoxin Wang and Xin Li},
year={2024},
eprint={2409.09366},
archivePrefix={arXiv},
primaryClass={cs.CV},
url={https://arxiv.org/abs/2409.09366},
}