Dataset (Google Drive)
Data structure
Dataset root/
├── [Place_holder]/
│   ├── [Place_holder].bvh                   # MoCap data from Noitom Axis Studio (PNStudio)
│   ├── [Place_holder]_pos.csv               # Every joint's position, generated from `*.bvh`
│   ├── [Place_holder]_rot.csv               # Every joint's rotation, generated from `*.bvh`
│   ├── [Place_holder].pcap                  # Raw data from the LiDAR
│   └── [Place_holder]_lidar_trajectory.txt  # N×9 format file
├── ...
│
└── scenes/
    ├── [Place_holder].pcd
    ├── [Place_holder]_ground.pcd
    ├── ...
    └── ...
- `Place_holder` can be replaced with `campus_road`, `climbing_gym`, and `lab_building`.
- `*_lidar_trajectory.txt` is generated by our mapping method and manually calibrated against the corresponding scenes.
- `*.bvh` and `*.pcap` are raw data from the sensors; they are not used in the following steps.
- You can test your SLAM algorithm on the `*.pcap` files, captured with an Ouster OS1-64 at 1024×20 Hz.
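As a minimal sketch of working with the per-sequence files, the snippet below parses a `*_lidar_trajectory.txt` file as an N×9 array with NumPy. Note that the meaning of the nine columns is not specified here, so the code only validates the shape; the demo file it writes is synthetic, not real dataset content.

```python
# Sketch: loading a *_lidar_trajectory.txt file (N rows of 9
# whitespace-separated values). The semantics of each column are an
# assumption left open here; only the N×9 shape is checked.
import numpy as np

def load_trajectory(path):
    """Load a trajectory file into an (N, 9) float array."""
    traj = np.loadtxt(path, ndmin=2)
    assert traj.shape[1] == 9, "expected 9 values per row"
    return traj

# Tiny synthetic stand-in for a real trajectory file.
with open("demo_lidar_trajectory.txt", "w") as f:
    f.write("0 1.0 2.0 0.5 0.0 0.0 0.0 1.0 100.00\n")
    f.write("1 1.1 2.0 0.5 0.0 0.0 0.0 1.0 100.05\n")

traj = load_trajectory("demo_lidar_trajectory.txt")
print(traj.shape)  # (2, 9)
```

The `*_pos.csv` and `*_rot.csv` files can be read the same way with `np.loadtxt(path, delimiter=",")` or pandas, once you know their column layout.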
GitHub
Copyright
The HSC4D dataset is published under the Creative Commons Attribution-NonCommercial-ShareAlike 3.0 License. You must attribute the work in the manner specified by the authors, you may not use this work for commercial purposes, and if you alter, transform, or build upon this work, you may distribute the resulting work only under the same license. Contact us if you are interested in commercial usage.
Citation
@InProceedings{Dai_2022_CVPR,
author = {Dai, Yudi and Lin, Yitai and Wen, Chenglu and Shen, Siqi and Xu, Lan and Yu, Jingyi and Ma, Yuexin and Wang, Cheng},
title = {HSC4D: Human-Centered 4D Scene Capture in Large-Scale Indoor-Outdoor Space Using Wearable IMUs and LiDAR},
booktitle = {Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR)},
month = {June},
year = {2022},
pages = {6792-6802}
}
Further information and commercial licensing
For further information, or for commercial licensing, please contact us at the following email addresses:
cwang@xmu.edu.cn
clwen@xmu.edu.cn