Project Website • PDF • Poster • Videos • ICRA 2023
(a) Generalizes to various object sets: 92% grasp success rate in cluttered scenes.
(b) Enables fine control over the generated grasp poses (e.g., generating grasps with a specific contact location, as shown in the figure above).
(c) Real-time detection: more than 2,000 grasps per 15 ms on a common GPU.
Please check our paper for more details.
Step 1. (Recommended) Install conda and create a virtual environment with the commands below. (Note: if you want to visualize the grasps with mayavi, you need Python 3.7, since vtk does not support Python 3.8+ to the best of our knowledge.)
conda create -n edge_grasp python=3.7
conda activate edge_grasp
Install mayavi (you can skip this step if you don't need to visualize the generated grasps; we will provide the visualization code for grasps later):
pip install opencv-python pillow scipy matplotlib
conda install mayavi -c conda-forge
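To verify the mayavi install, here is a minimal sketch (our own example, not from this repo) that renders a random point cloud:

import numpy as np
from mayavi import mlab

# Placeholder point cloud; replace with points from your depth sensor or scene.
points = np.random.rand(1024, 3)
mlab.points3d(points[:, 0], points[:, 1], points[:, 2], scale_factor=0.01)
mlab.show()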
Step 2. Install PyTorch.
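For example, a typical conda command (our suggestion, assuming CUDA 11.3; check pytorch.org for the command matching your CUDA version):

conda install pytorch torchvision cudatoolkit=11.3 -c pytorch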
Step 3. Install PyG (PyTorch Geometric). We recommend installing from wheels.
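For example, assuming PyTorch 1.12 with CUDA 11.3 (our assumption; adjust the wheel URL to your torch/CUDA versions per the PyG installation docs):

pip install torch_geometric
pip install torch_scatter torch_sparse -f https://data.pyg.org/whl/torch-1.12.0+cu113.html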
Step 4. Install the other required packages (some packages are not listed here, but they should be easy to install with pip):
pip install open3d
pip install pybullet==2.7.9
Step 1. Download the 3.4-million-grasp dataset from Google Drive, or generate your own training and validation datasets with the following scripts:
python clutter_grasp_data_generator.py --scene packed --object-set packed/train --sample_number 32
python clutter_grasp_data_generator.py --scene pile --object-set pile/train --sample_number 32
Step 2. Train a model:
Note that if you are using the 3.4-million-grasp dataset, the first run takes more than 10 minutes to preprocess the data and cache it on disk for reuse. The cached dataset also needs at least 12 GB of free disk space.
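For example, to check the free space on the drive holding this repository before preprocessing:

df -h .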
To train a traditional PointNet++-based Model:
python train.py --train --verbose
To train a Vector-Neuron PointNet++-based Model:
python train.py --train --verbose --vn
To plot the validation loss curve and accuracy:
python plot_test_loss_and_accuracy.py
Please refer to the PointNet++ paper and the Vector Neuron paper if you would like more background.
Step 3. Test simulated grasping with pretrained parameters. (We provide pretrained parameters in the folders edge_grasp_net_pretrained_para and vn_edge_pretrained_para.) You can test the model directly, without the dataset (Step 1) or training (Step 2)!
python test_clutter_grasp.py (add --vn for the vector-neuron version)
Edge-Grasp-Net converges within 200 epochs but takes less time per step; VN-Edge-Grasp-Net converges within 100 epochs. Please read our paper for more details if needed.
- Instructions for adapting the code to your own gripper (we used the Panda gripper in simulation): coming soon
- Visualization function for inspecting the grasps
- Grasp filtering example.py (to select the grasp contact location, approach direction, etc., before feeding grasps to the model)
- ROS package that takes a raw point cloud as input and outputs the scored grasps (a rough sketch follows below)
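Until the ROS package is released, here is a minimal sketch of what such a node could look like (the topic name, node name, and callback body are hypothetical, not the actual package API):

import rospy
from sensor_msgs.msg import PointCloud2

def cloud_callback(msg):
    # Hypothetical placeholder: convert the cloud, run the trained model,
    # and publish the scored grasps.
    rospy.loginfo("Received a %d x %d point cloud", msg.height, msg.width)

rospy.init_node("edge_grasp_node")  # hypothetical node name
rospy.Subscriber("/camera/depth/points", PointCloud2, cloud_callback)  # hypothetical topic
rospy.spin()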