
NUS-LinS-Lab/ManiFM


Zhixuan Xu* · Chongkai Gao* · Zixuan Liu* · Gang Yang*

Chenrui Tie · Haozhuo Zheng · Haoyu Zhou · Weikun Peng · Debang Wang · Tianyi Chen · Zhouliang Yu

Lin Shao

* denotes equal contribution


This is the official code repository of ManiFoundation Model for General-Purpose Robotic Manipulation of Contact Synthesis with Arbitrary Objects and Robots.

TODO

- [ ] Add dataset generation code.

- [ ] Add more hands.

- [ ] Upload the 200K dataset.

Installation

1. Create virtual environment

conda create -n mani python==3.8.18
conda activate mani

Other Python versions may work, but only 3.8.18 has been tested.
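Since only Python 3.8.18 has been tested, a small sanity check like the sketch below (purely illustrative, not part of the repository) can flag an untested interpreter before training:

```python
import sys

def is_tested_python(version_info=sys.version_info):
    """Return True when the interpreter is in the Python 3.8 series
    that ManiFM was tested on (3.8.18)."""
    return tuple(version_info[:2]) == (3, 8)

if not is_tested_python():
    print(f"Warning: ManiFM was tested on Python 3.8.18; "
          f"found {sys.version_info.major}.{sys.version_info.minor}.")
```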

2. Install

  1. Install PyTorch. ManiFM is tested with CUDA 11.7 and PyTorch 2.0.1; other versions may work but are untested.

  2. Install PyTorch3D. We recommend installing it with the following command:

pip install git+https://github.com/facebookresearch/pytorch3d.git

  3. Install the remaining Python dependencies:

pip install -r requirements.txt

Dataset

The dataset should be placed at the path specified in configs/train.json.

The 3k dataset can be downloaded here. The 20k dataset can be downloaded here.

You can visualize the dataset with scripts/vis_dataset.py.
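The README does not show the schema of configs/train.json; as a hypothetical sketch (the key name "dataset_path" is an assumption for illustration, so check the actual config file), the dataset location could be read like this:

```python
import json

def get_dataset_path(config_file="configs/train.json"):
    """Read the dataset root from the training config.

    The key name "dataset_path" is an assumption for illustration;
    inspect the actual configs/train.json for the real key.
    """
    with open(config_file) as f:
        cfg = json.load(f)
    return cfg["dataset_path"]
```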

Distributed Training

Distributed training is implemented with Hugging Face Accelerate integrated with DeepSpeed; detailed tutorials can be found on the official Accelerate website. The provided config file configs/acc_config.yaml trains on a single machine with two GPUs. You can generate your own config file through interactive prompts, replacing configs/acc_config.yaml, with the following command.

accelerate config --config_file configs/acc_config.yaml
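For orientation, an Accelerate + DeepSpeed config for a single-machine, two-GPU setup might look like the sketch below. The exact keys and values in the shipped configs/acc_config.yaml are authoritative; the values here are illustrative assumptions:

```yaml
# Illustrative sketch only -- the shipped configs/acc_config.yaml is authoritative.
compute_environment: LOCAL_MACHINE
distributed_type: DEEPSPEED
deepspeed_config:
  gradient_accumulation_steps: 1
  zero_stage: 2
mixed_precision: fp16
num_machines: 1
num_processes: 2   # two GPUs on one machine
```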

Training

bash scripts/train.sh

The saved models can be found at logs/models. Since Accelerate saves model parameters as safetensors, you need to load your model as follows:

from safetensors.torch import load_model, save_model

...

load_model(model, "model.safetensors")
# Instead of model.load_state_dict(load_file("model.safetensors"))

Evaluation

Network

We provide a checkpoint trained on the 200K dataset. You can use it to predict contact points and motions on rigid objects, clothes, ropes, and plasticine. We provide a demo for each object type in scripts/pred.py.

Optimization

We provide the optimization code for two types of hands, LeapHand and AllegroHand. Note that AllegroHand is not included in our training dataset.

Run scripts/pred_w_opt.py.

Cite

@misc{xu2024manifoundation,
      title={ManiFoundation Model for General-Purpose Robotic Manipulation of Contact Synthesis with Arbitrary Objects and Robots}, 
      author={Zhixuan Xu and Chongkai Gao and Zixuan Liu and Gang Yang and Chenrui Tie and Haozhuo Zheng and Haoyu Zhou and Weikun Peng and Debang Wang and Tianyi Chen and Zhouliang Yu and Lin Shao},
      year={2024},
      eprint={2405.06964},
      archivePrefix={arXiv},
      primaryClass={cs.RO}
}