Liang Pan · Jingbo Wang · Buzhen Huang · Junyu Zhang · Haofan Wang · Xu Tang · Yangang Wang

Southeast University · Shanghai AI Laboratory · Xiaohongshu Inc.
We propose InterScene, a novel method that generates physically plausible long-term motion sequences in 3D indoor scenes. Our approach enables physics-based characters to exhibit natural interaction-involved behaviors, such as sitting down (gray), getting up (blue), and walking while avoiding obstacles (pink).
- [2025-03-03] Update camera-ready paper and website.
- [2023-11-09] Release code for training and evaluating the sit policy.
- [2023-10-16] Paper got accepted by 3DV 2024.
To create the environment, follow these instructions:
- We recommend installing all the requirements through Conda:
conda create -n rlgpu python=3.7
conda activate rlgpu
pip install torch==1.8.1+cu111 torchvision==0.9.1+cu111 torchaudio==0.8.1 -f https://download.pytorch.org/whl/torch_stable.html
pip install -r requirements.txt
- Download IsaacGym Preview 4 from the official site and install it via pip.
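After installation, a quick import test can confirm that PyTorch with CUDA and IsaacGym are both usable in the `rlgpu` environment. This is a minimal sanity-check sketch, not part of the repository; note that `isaacgym` generally has to be imported before `torch`.

```python
# Sanity check for the rlgpu environment (illustrative, not part of the repo).
# IsaacGym is typically imported before torch to avoid import-order issues.
from isaacgym import gymapi
import torch

print("torch version:", torch.__version__)
print("CUDA available:", torch.cuda.is_available())

gym = gymapi.acquire_gym()
print("IsaacGym acquired:", gym is not None)
```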
To prepare data for training/evaluating InterCon (sit & get-up policies), follow these instructions:
- Download SMPL-X v1.1 from the official site and put the model files in the `body_models/smplx` folder.
- Download the SAMP motion dataset from the official site and put the files in the `samp` folder. Please download the Motion Clips (.pkl), which contain the SMPL-X parameters (a quick way to inspect these clips is sketched after the data-preparation steps below).
- The file structure should look like this:
|-- InterScene
|-- body_models
|-- smplx
|-- SMPLX_FEMALE.npz
|-- SMPLX_FEMALE.pkl
|-- SMPLX_MALE.npz
|-- ...
|-- samp
|-- chair_mo_stageII.pkl
|-- chair_mo001_stageII.pkl
|-- chair_mo002_stageII.pkl
|-- ...
- Run the following script to generate the reference motion dataset:
python InterScene/data/dataset_samp_sit/generate_motion.py --samp_pkl_dir ./samp --smplx_dir ./body_models/smplx
- Run the following script to generate the 3D object dataset:
python InterScene/data/dataset_samp_sit/generate_obj.py
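Before running the two scripts above, it can help to confirm the data layout and peek into one of the SAMP motion clips. The snippet below is a rough sketch, not part of the repository; it assumes the SAMP .pkl files are plain Python pickles and simply lists whatever keys they contain.

```python
# Hypothetical data check for the layout described above (not part of the repo).
import pickle
from pathlib import Path

# Verify a couple of the expected files from the tree above.
for path in [Path("body_models/smplx/SMPLX_MALE.npz"), Path("samp/chair_mo_stageII.pkl")]:
    print("ok" if path.exists() else "MISSING", path)

# Peek into one SAMP motion clip; keys are printed rather than assumed.
with open("samp/chair_mo_stageII.pkl", "rb") as f:
    clip = pickle.load(f, encoding="latin1")
if isinstance(clip, dict):
    for key, value in clip.items():
        shape = getattr(value, "shape", None)
        print(key, type(value).__name__, shape)
else:
    print(type(clip).__name__)
```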
## Training the sit policy
python InterScene/run.py --task HumanoidLocationSit --cfg_env InterScene/data/cfg/humanoid_location_sit.yaml --cfg_train InterScene/data/cfg/train/rlg/amp_task_location_sit.yaml --motion_file InterScene/data/dataset_samp_sit/dataset_samp_sit.yaml --num_envs 4096 --headless
## Evaluating the sit policy
python InterScene/run.py --task HumanoidLocationSit --cfg_env InterScene/data/cfg/humanoid_location_sit.yaml --cfg_train InterScene/data/cfg/train/rlg/amp_task_location_sit.yaml --motion_file InterScene/data/dataset_samp_sit/dataset_samp_sit.yaml --num_envs 4096 --headless --checkpoint InterScene/data/models/policy_sit.pth --test
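If evaluation fails to start, it can be useful to confirm that the checkpoint loads outside of the full pipeline. This is a minimal sketch; the path matches the evaluation command above, but the exact contents of the file depend on the training framework version, so the script only prints the top-level structure.

```python
# Peek inside a trained checkpoint (illustrative only; key names depend on the RL framework used).
import torch

ckpt = torch.load("InterScene/data/models/policy_sit.pth", map_location="cpu")
if isinstance(ckpt, dict):
    for key, value in ckpt.items():
        size = len(value) if hasattr(value, "__len__") else value
        print(key, type(value).__name__, size)
else:
    print(type(ckpt).__name__)
```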
@inproceedings{pan2024synthesizing,
title={Synthesizing Physically Plausible Human Motions in 3D Scenes},
author={Pan, Liang and Wang, Jingbo and Huang, Buzhen and Zhang, Junyu and Wang, Haofan and Tang, Xu and Wang, Yangang},
booktitle={2024 International Conference on 3D Vision (3DV)},
pages={1498--1507},
year={2024},
organization={IEEE}
}
This repository is built on top of the following amazing codebases:
Please follow the licenses of the above repositories when using this project.