[update] update README for two types of retargeting
yzqin committed Sep 8, 2023
1 parent 24fd53b commit 32da0d0
Showing 2 changed files with 53 additions and 46 deletions.
49 changes: 3 additions & 46 deletions README.md
@@ -24,51 +24,8 @@ pip3 install -e ".[example]"

### Retargeting from human hand video

1. **Generate the robot joint pose trajectory from our pre-recorded video.**
[Tutorial on retargeting from human hand video](example/vector_retargeting/README.md)

```shell
cd example/vector_retargeting
python3 detect_from_video.py \
--robot-name allegro \
--video-path data/human_hand_video.mp4 \
--retargeting-type vector \
--hand-type right \
--output-path data/allegro_joints.pkl
```

This command writes the joint trajectory to a pickle file at the path given by `--output-path`.

The pickle file is a Python dictionary with two keys: `meta_data` and `data`. `meta_data`, a dictionary, holds
details about the robot, while `data`, a list, contains the robot joint positions for each frame. For additional
options, refer to the help output. Note that the reported time includes both hand pose detection from the video
and hand pose retargeting, run in single-process mode.

```shell
python3 detect_from_video.py --help
```

2. **Use the pickle file to produce a video of the robot**

```shell
python3 render_robot_hand.py \
--pickle-path data/allegro_joints.pkl \
--output-video-path data/retargeted_allegro.mp4 \
--headless
```

This command uses the data saved from the previous step to create a rendered video.

3. **Record a video of your own hand**

```bash
python3 capture_webcam.py --video-path example/vector_retargeting/data/my_human_hand_video.mp4
```

This command accesses your webcam (which must be connected to your computer) and records the video stream in MP4
format. Press `q` to stop recording.

### Retargeting from hand-object pose trajectory
### Retarget from hand-object pose dataset

Here we use the [DexYCB]() dataset to show how to retarget a human hand-object interaction trajectory to a
robot hand-object interaction trajectory.
[Tutorial on retargeting from hand-object pose dataset](example/position_retargeting/README.md)
50 changes: 50 additions & 0 deletions example/vector_retargeting/README.md
Original file line number Diff line number Diff line change
@@ -0,0 +1,50 @@
## Retarget Robot Motion from Human Hand Video

### Generate the robot joint pose trajectory from our pre-recorded video

```shell
cd example/vector_retargeting
python3 detect_from_video.py \
--robot-name allegro \
--video-path data/human_hand_video.mp4 \
--retargeting-type vector \
--hand-type right \
--output-path data/allegro_joints.pkl
```

This command writes the joint trajectory to a pickle file at the path given by `--output-path`.

The pickle file is a Python dictionary with two keys: `meta_data` and `data`. `meta_data`, a dictionary, holds
details about the robot, while `data`, a list, contains the robot joint positions for each frame. For additional
options, refer to the help output. Note that the reported time includes both hand pose detection from the video
and hand pose retargeting, run in single-process mode.

```shell
python3 detect_from_video.py --help
```
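The output pickle can also be loaded directly for quick inspection. A minimal sketch, assuming only the structure described above (a `meta_data` dictionary and a per-frame `data` list); the exact keys inside `meta_data` are not specified here:

```python
import pickle


def load_joint_trajectory(path):
    """Load the retargeting output pickle described above.

    Returns (meta_data, data): a dict of robot details and a list of
    per-frame joint positions, per the documented file format.
    """
    with open(path, "rb") as f:
        payload = pickle.load(f)
    return payload["meta_data"], payload["data"]


# Example (path from the command above):
# meta, frames = load_joint_trajectory("data/allegro_joints.pkl")
# print(meta, len(frames))
```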

### Use the pickle file to produce a video of the robot

```shell
python3 render_robot_hand.py \
--pickle-path data/allegro_joints.pkl \
--output-video-path data/retargeted_allegro.mp4 \
--headless
```

This command uses the data saved from the previous step to create a rendered video.
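Conceptually, rendering replays the trajectory frame by frame. A minimal sketch of that loop, assuming the pickle format described above; `render_frame` is a hypothetical callback standing in for whatever `render_robot_hand.py` actually does with each frame (stepping the simulated robot and writing an image to the output video):

```python
import pickle


def replay_trajectory(pickle_path, render_frame):
    """Replay a retargeted joint trajectory through a renderer callback.

    render_frame(joint_positions) is a hypothetical stand-in for the
    real per-frame rendering; this sketch only shows the data flow.
    """
    with open(pickle_path, "rb") as f:
        payload = pickle.load(f)
    for qpos in payload["data"]:  # one entry per video frame
        render_frame(qpos)
    return payload["meta_data"]
```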

### Record a video of your own hand

```bash
python3 capture_webcam.py --video-path example/vector_retargeting/data/my_human_hand_video.mp4
```

This command accesses your webcam (which must be connected to your computer) and records the video stream in MP4
format. Press `q` to stop recording.
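For reference, a capture loop of this kind can be sketched as below. This is a hypothetical OpenCV-based sketch, not the actual implementation of `capture_webcam.py`; the frame-copying logic is kept separate from the camera I/O:

```python
def record_stream(read_frame, write_frame, key_pressed):
    """Copy frames from read_frame() to write_frame() until 'q' is pressed.

    read_frame() returns a frame, or None when the stream ends;
    key_pressed() returns the last key code (-1 for none).
    Returns the number of frames written.
    """
    count = 0
    while True:
        frame = read_frame()
        if frame is None or key_pressed() == ord("q"):
            break
        write_frame(frame)
        count += 1
    return count


def main(video_path):
    import cv2  # OpenCV; assumed available in the example environment

    cap = cv2.VideoCapture(0)  # default webcam
    fps = cap.get(cv2.CAP_PROP_FPS) or 30
    size = (int(cap.get(cv2.CAP_PROP_FRAME_WIDTH)),
            int(cap.get(cv2.CAP_PROP_FRAME_HEIGHT)))
    writer = cv2.VideoWriter(video_path, cv2.VideoWriter_fourcc(*"mp4v"),
                             fps, size)

    def read_frame():
        ok, frame = cap.read()
        if ok:
            cv2.imshow("recording", frame)  # live preview window
        return frame if ok else None

    # cv2.waitKey also pumps the preview window's event loop
    record_stream(read_frame, writer.write, lambda: cv2.waitKey(1))
    cap.release()
    writer.release()
```

Separating `record_stream` from `main` keeps the loop logic independent of any camera hardware.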

### Retargeting from hand-object pose trajectory

Here we use the [DexYCB]() dataset to show how to retarget a human hand-object interaction trajectory to a
robot hand-object interaction trajectory.
