-
The code in this repository is the implementation of PhysGNN, described in the paper available at: https://arxiv.org/pdf/2109.04352.pdf
-
Please cite as:
@article{salehi2021physgnn,
  title={PhysGNN: A Physics-Driven Graph Neural Network Based Model for Predicting Soft Tissue Deformation in Image-Guided Neurosurgery},
  author={Salehi, Yasmin and Giannacopoulos, Dennis},
  journal={arXiv preprint arXiv:2109.04352},
  year={2021}
}
-
The code used to generate the datasets presented in the paper is also included.
-
The tumour and brain models are taken from the paper "A machine learning approach for real-time modelling of tissue deformation in image-guided neurosurgery" by Tonutti et al., available at: https://github.com/michetonu/MALTIDEM--Machine-Learning-for-Tissue-Deformation-Modelling
-
The `Graph_load_batch` function in `pg_dataset.py` has been adapted from the `Graph_load_batch` function in `dataset.py` of the Position-aware Graph Neural Networks code by You et al., available at: https://github.com/JiaxuanYou/P-GNN
-
Please download:
- FEBio 2.9.1 @ https://febio.org/febio/febio-downloads/
- export_fig-master @ https://www.mathworks.com/matlabcentral/fileexchange/23629-export_fig

and place all the downloaded files in the "code" folder.
-
Please install the following packages:
pip install torch-scatter torch-sparse torch-cluster torch-spline-conv torch-geometric -f https://data.pyg.org/whl/torch-1.12.0+cu113.html
pip install tensorboardX
pip install networkx
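Not part of the original instructions, but as a quick sanity check after these installations, a minimal Python snippet along the following lines can confirm that the required packages import correctly and report their versions:

```python
# Quick sanity check (illustrative, not part of the PhysGNN code base):
# confirm that the installed packages import correctly and print their versions.
import importlib

for pkg in ["torch", "torch_geometric", "torch_scatter", "torch_sparse",
            "torch_cluster", "torch_spline_conv", "tensorboardX", "networkx"]:
    try:
        module = importlib.import_module(pkg)
        print(f"{pkg}: {getattr(module, '__version__', 'version unknown')}")
    except ImportError as err:
        print(f"{pkg}: NOT INSTALLED ({err})")
```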
- Please run `data_generator_1.m` to generate Dataset 1, and `data_generator_2.m` to generate Dataset 2. The formatted data will be saved in the dataset_1 and dataset_2 folders.
- Run `dataset_full.py` to generate the preprocessed data.
- Run `pg_dataset.py` to create 1/11 of dataset1/dataset2 and pickle them. Each run requires 22 GB of RAM; Dataset 1 and Dataset 2 use up 16.79 GB of RAM each. (A sketch of loading a pickled part back is given after this list.)
- Select the desired configuration for training in `main.py` and simply run the code.
- Select the desired configuration in `reproduce.py` and simply run the code.
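As a minimal sketch of how one of the pickled parts could be inspected before training, assuming each pickle holds a list of PyTorch Geometric `Data` objects (the file name below is hypothetical; use whatever pickle name you selected in `pg_dataset.py`):

```python
# Illustrative only: the pickle path and its exact contents are assumptions,
# not the guaranteed output format of pg_dataset.py.
import pickle

with open("dataset_1/dataset_1_part_1.pickle", "rb") as f:  # hypothetical file name
    graphs = pickle.load(f)

print(f"Loaded {len(graphs)} graphs")
sample = graphs[0]
print(sample)                              # e.g. Data(x=[...], edge_index=[...], y=[...])
print(sample.num_nodes, sample.num_edges)  # standard torch_geometric.data.Data attributes
```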
(Requirement: A Google Colab Pro Account)
- Upload dataset_1 and dataset_2 folders to your Google Drive.
- Upload `Dataset_Generation.ipynb` to your Google Drive.
- Set up the paths in your Colab notebook (see the Drive-mount sketch after this list).
- Run all the cells consecutively. Ensure proper selection of the dataset to be configured as a PyTorch Geometric dataset and of its corresponding pickle name.
- After one PyTorch Geometric dataset is created, restart the runtime to clear the RAM.
- Each section of Dataset 1 and Dataset 2 takes 1 hour and 15 minutes to be generated.
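For the path-setup step above, a minimal sketch of mounting Google Drive in Colab is shown below; the folder locations under `MyDrive` are assumptions and should be adjusted to wherever dataset_1 and dataset_2 were uploaded:

```python
# Minimal Colab path-setup sketch. The directory names under MyDrive are
# assumptions; adjust them to where dataset_1 and dataset_2 were uploaded.
from google.colab import drive

drive.mount('/content/drive')

DATASET_1_DIR = '/content/drive/MyDrive/dataset_1'  # hypothetical location
DATASET_2_DIR = '/content/drive/MyDrive/dataset_2'  # hypothetical location
```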
- After successful generation of the datasets, upload `PhysGNN.ipynb` to your Google Drive.
- Run all the cells in consecutive order.
- In the "Final Run for Training" cell, select the dataset you wish to use.
- In the "Final Run" section, select the configuration you want to train.
- To reproduce the reported results, run all the cells up to "Final Run for Training", then run the "Reproducing the Results" cell instead.
- Select the dataset and the model whose results you wish to reproduce in the "Final Run" under the Reproducibility cell.
- Results generated from the pickled configurations can be saved by setting the `save` parameter to 1.
- Setting `mean_mag_results` to 1 generates the mean Euclidean errors reported (Table 4).
- Setting `max_error_results` to 1 generates the max error results reported (Table 5). (An illustrative combination of these flags is sketched below.)
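As an illustration of how these flags combine (the variable names come from the instructions above, but where exactly they are set inside `PhysGNN.ipynb` is an assumption):

```python
# Illustrative flag settings for a reproducibility run; the exact location of
# these variables inside PhysGNN.ipynb is an assumption.
save = 1               # save the results generated from the pickled configurations
mean_mag_results = 1   # report the mean Euclidean errors (Table 4)
max_error_results = 0  # set to 1 to report the max error results instead (Table 5)
```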