From 79b2665191115f3ed905e6afdf09990a8d484362 Mon Sep 17 00:00:00 2001
From: Zhongang Cai <62529255+caizhongang@users.noreply.github.com>
Date: Fri, 10 Dec 2021 13:04:13 +0800
Subject: [PATCH] [Docs] Update documentations (#21)

- [x] list supported body models
- [x] provide link to download neutral smpl
- [x] provide links to download preprocessed files for vibe
- [x] citation modifications
- [x] minor layout adjustments

* Update cmu_mosh download link

* Give example of body model renaming

* add placeholders for vibe preprocessed data

* Add download links to vibe preprocessed files

Co-authored-by: panghuien
---
 CITATION.cff               |  2 +-
 README.md                  | 17 +++++++++++++++++
 README_CN.md               | 25 ++++++++++++++++++++-----
 docs/getting_started.md    |  3 +++
 docs/preprocess_dataset.md | 31 ++++++++++++++++++++++++++++++-
 5 files changed, 71 insertions(+), 7 deletions(-)

diff --git a/CITATION.cff b/CITATION.cff
index 0f97c1cd..0a6d84a3 100644
--- a/CITATION.cff
+++ b/CITATION.cff
@@ -2,7 +2,7 @@ cff-version: 1.2.0
 message: "If you use this software, please cite it as below."
 authors:
   - name: "MMHuman3D Contributors"
-title: "MMHuman3D: OpenMMLab Human Pose and Shape Estimation Toolbox and Benchmark"
+title: "MMHuman3D: OpenMMLab 3D Human Parametric Model Toolbox and Benchmark"
 date-released: 2021-12-01
 url: "https://github.com/open-mmlab/mmhuman3d"
 license: Apache-2.0
diff --git a/README.md b/README.md
index 8b2e5c8b..e29884b9 100644
--- a/README.md
+++ b/README.md
@@ -1,7 +1,13 @@
+<div align="center">
+    <img src="resources/mmhuman3d-logo.png" width="600"/>
+</div>
+
+<div align="center">
+
 [![Documentation](https://readthedocs.org/projects/mmhuman3d/badge/?version=latest)](https://mmhuman3d.readthedocs.io/en/latest/?badge=latest)
 [![actions](https://github.com/open-mmlab/mmhuman3d/workflows/build/badge.svg)](https://github.com/open-mmlab/mmhuman3d/actions)
 [![codecov](https://codecov.io/gh/open-mmlab/mmhuman3d/branch/master/graph/badge.svg)](https://codecov.io/gh/open-mmlab/mmhuman3d)
@@ -9,6 +15,7 @@
 [![LICENSE](https://img.shields.io/github/license/open-mmlab/mmhuman3d.svg)](https://github.com/open-mmlab/mmhuman3d/blob/main/LICENSE)
 [![Percentage of issues still open](https://isitmaintained.com/badge/open/open-mmlab/mmhuman3d.svg)](https://github.com/open-mmlab/mmhuman3d/issues)
 
+</div>
 
 ## Introduction
 
@@ -38,6 +45,16 @@ https://user-images.githubusercontent.com/62529255/144362861-e794b404-c48f-4ebe-
 
 More details can be found in [model_zoo.md](docs/model_zoo.md).
 
+Supported body models:
+
+<details open>
+<summary>(click to collapse)</summary>
+
+- [x] [SMPL](https://smpl.is.tue.mpg.de/) (SIGGRAPH Asia'2015)
+- [x] [SMPL-X](https://smpl-x.is.tue.mpg.de/) (CVPR'2019)
+
+</details>
+
 Supported methods:
 
 <details open>
diff --git a/README_CN.md b/README_CN.md
index 739e880a..a2963276 100644
--- a/README_CN.md
+++ b/README_CN.md
@@ -1,16 +1,21 @@
+
+
 
-
 
-
+[![LICENSE](https://img.shields.io/github/license/open-mmlab/mmhuman3d.svg)](https://github.com/open-mmlab/mmhuman3d/blob/main/LICENSE)
+[![Percentage of issues still open](https://isitmaintained.com/badge/open/open-mmlab/mmhuman3d.svg)](https://github.com/open-mmlab/mmhuman3d/issues)
 
-[![LICENSE](https://img.shields.io/github/license/open-mmlab/mmhuman3d.svg)](https://github.com/open-mmlab/mmhuman3d/blob/master/LICENSE)
 
+
 
 ## 简介
 
@@ -40,6 +45,16 @@ https://user-images.githubusercontent.com/62529255/144362861-e794b404-c48f-4ebe-
 
 更多详情可见 [模型库](docs/model_zoo.md)。
 
+已支持的人体参数化模型:
+
+<details open>
+<summary>(click to collapse)</summary>
+
+- [x] [SMPL](https://smpl.is.tue.mpg.de/) (SIGGRAPH Asia'2015)
+- [x] [SMPL-X](https://smpl-x.is.tue.mpg.de/) (CVPR'2019)
+
+</details>
+
 已支持的算法:
 
 <details open>
diff --git a/docs/getting_started.md b/docs/getting_started.md
index 7dfd619a..16c6db38 100644
--- a/docs/getting_started.md
+++ b/docs/getting_started.md
@@ -25,6 +25,9 @@ Please refer to [data_preparation.md](./preprocess_dataset.md) for data preparat
 ## Body Model Preparation
 
 - [SMPL](https://smpl.is.tue.mpg.de/) v1.0 is used in our experiments.
+  - The neutral model can be downloaded from [SMPLify](https://smplify.is.tue.mpg.de/).
+  - All body models have to be renamed to the `SMPL_{GENDER}.pkl` format. <br/>
+    For example, `mv basicModel_neutral_lbs_10_207_0_v1.0.0.pkl SMPL_NEUTRAL.pkl`
 - [J_regressor_extra.npy](https://openmmlab-share.oss-cn-hangzhou.aliyuncs.com/mmhuman3d/models/J_regressor_extra.npy?versionId=CAEQHhiBgIDD6c3V6xciIGIwZDEzYWI5NTBlOTRkODU4OTE1M2Y4YTI0NTVlZGM1)
 - [J_regressor_h36m.npy](https://openmmlab-share.oss-cn-hangzhou.aliyuncs.com/mmhuman3d/models/J_regressor_h36m.npy?versionId=CAEQHhiBgIDE6c3V6xciIDdjYzE3MzQ4MmU4MzQyNmRiZDA5YTg2YTI5YWFkNjRi)
 - [smpl_mean_params.npz](https://openmmlab-share.oss-cn-hangzhou.aliyuncs.com/mmhuman3d/models/smpl_mean_params.npz?versionId=CAEQHhiBgICN6M3V6xciIDU1MzUzNjZjZGNiOTQ3OWJiZTJmNThiZmY4NmMxMTM4)
diff --git a/docs/preprocess_dataset.md b/docs/preprocess_dataset.md
index a5a470bb..10b2bcf7 100644
--- a/docs/preprocess_dataset.md
+++ b/docs/preprocess_dataset.md
@@ -151,7 +151,7 @@ coco, pw3d, mpii, mpi_inf_3dhp, lsp_original, lsp_extended, h36m
 ```
 
 **Alternatively**, you may download the preprocessed files directly:
-- [cmu_mosh.npz](https://openmmlab-share.oss-cn-hangzhou.aliyuncs.com/mmhuman3d/datasets/cmu_mosh.npz?versionId=CAEQHhiBgMCglbPY6xciIDEyMTFmOGFkNWZjNDQxYjg4YjlhNjNmMjhhMjQzZTk0)
+- [cmu_mosh.npz](https://openmmlab-share.oss-cn-hangzhou.aliyuncs.com/mmhuman3d/datasets/cmu_mosh.npz?versionId=CAEQHhiBgIDoof_37BciIDU0OGU0MGNhMjAxMjRiZWI5YzdkMWEzMzc3YzBiZDM2)
 - [coco_2014_train.npz](https://openmmlab-share.oss-cn-hangzhou.aliyuncs.com/mmhuman3d/datasets/coco_2014_train.npz?versionId=CAEQHhiBgICUrvbS6xciIDFmZmFhMDk5OGQ3YzQ5ZDE5NzJkMGQxNzdmMmQzZDdi)
 - [h36m_train.npz](https://openmmlab-share.oss-cn-hangzhou.aliyuncs.com/mmhuman3d/datasets/h36m_train.npz?versionId=CAEQHhiBgMDrrfbS6xciIGY2NjMxMjgwMWQzNjRkNWJhYTNkZTYyYWUxNWQ4ZTE5)
 - [lsp_train.npz](https://openmmlab-share.oss-cn-hangzhou.aliyuncs.com/mmhuman3d/datasets/lsp_train.npz?versionId=CAEQHhiBgICnq_bS6xciIDU4ZTRhMDIwZTBkZjQ1YTliYTY0NGFmMDVmOGVhZjMy)
@@ -229,6 +229,35 @@ mmhuman3d
 ```
 
+For VIBE training and testing, the following datasets are required:
+  - [MPI-INF-3DHP](#mpi-inf-3dhp)
+  - [PW3D](#pw3d)
+
+
+The data converters are currently not available.
+
+**Alternatively**, you may download the preprocessed files directly:
+- [vibe_insta_variety.npz](https://pjlab-my.sharepoint.cn/:u:/g/personal/openmmlab_pjlab_org_cn/EYnlkp-69NBNlXDH-5ELZikBXDbSg8SZHqmdSX_3hK4EYg?e=QUl5nI)
+- [vibe_mpi_inf_3dhp_train.npz](https://openmmlab-share.oss-cn-hangzhou.aliyuncs.com/mmhuman3d/datasets/vibe_mpi_inf_3dhp_train.npz?versionId=CAEQHhiBgICTnq3U6xciIGUwMTc5YWQ2MjNhZDQ3NGE5MmYxOWJhMGQxMTcwNTll)
+- [vibe_pw3d_test.npz](https://openmmlab-share.oss-cn-hangzhou.aliyuncs.com/mmhuman3d/datasets/vibe_pw3d_test.npz?versionId=CAEQHhiBgMD5na3U6xciIGQ4MmU0MjczYTYzODQ1NDQ5M2JiNzY1N2E5MTNlOWY5)
+
+
+The preprocessed datasets should have this structure:
+```text
+mmhuman3d
+├── mmhuman3d
+├── docs
+├── tests
+├── tools
+├── configs
+└── data
+    ├── datasets
+    └── preprocessed_datasets
+        ├── vibe_insta_variety.npz
+        ├── vibe_mpi_inf_3dhp_train.npz
+        └── vibe_pw3d_test.npz
+```
+
 For HYBRIK training and testing, the following datasets are required:
   - [HybrIK](#hybrik)
   - [COCO](#coco)
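After the files above are in place, a quick check can confirm that the renaming and the downloads worked before a training run is launched. The snippet below is a minimal sketch rather than part of the patch: it assumes the `data/preprocessed_datasets/` layout shown in the VIBE section, a `data/body_models/smpl/` location for the renamed SMPL files (an assumed path; use whatever your configs point to), and an environment with `numpy` installed.

```python
import os

import numpy as np

# Renamed neutral SMPL model. The directory below is an assumption;
# adjust it to wherever your body-model configs expect the file.
smpl_neutral = 'data/body_models/smpl/SMPL_NEUTRAL.pkl'
print(f'{smpl_neutral} exists: {os.path.isfile(smpl_neutral)}')

# One of the preprocessed files listed in the VIBE section above.
# The stored keys depend on the converter that produced the file,
# so this only lists them.
npz_path = 'data/preprocessed_datasets/vibe_pw3d_test.npz'
with np.load(npz_path, allow_pickle=True) as data:
    print(f'{npz_path} contains: {sorted(data.files)}')
```

If either check fails, revisit the download links above and the renaming step described in `docs/getting_started.md`.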