CrohnsDisease

Final year Masters project at Imperial College on tackling Crohn's Disease

Follow-up project to R. Holland's work:

Classifying Crohn's disease from MRI volumes using a 3D ResNet with a soft attention mechanism. The model achieves an average F1 score above 0.8 with both patient-specific and population-specific localisation.

For a description of the original project, please see the original repository: https://github.com/RobbieHolland/CrohnsDisease

Repo Guide

A brief explanation of the important files used in this iteration of the project (old files are left unchanged for reference).

Training

/run_crohns_pytorch.sh - Run config specifying training and model parameters (the root of execution) for a cross-validation experiment.

/run_crohns_pytorch_all.sh - Runs a batch of cross-validation experiments, testing each network configuration (with or without multimodal data, patient-specific localisation, and the attention mechanism).

/run_pytorch.py - Parses config options and starts training procedure.

/pytorch/pytorch_train.py - Constructs and iteratively trains the PyTorch network, logging performance at each step.

/pytorch/mri_dataset.py - Loads the data saved by /preprocessing/np_generator.py and performs data augmentation.

/pytorch/resnet.py - Specification of the 3D ResNet, including the soft attention mechanism.
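The soft attention mechanism in /pytorch/resnet.py is not reproduced here; below is a minimal, generic sketch of soft spatial attention over 3D feature maps (module and parameter names are illustrative, not the repository's):

```python
import torch
import torch.nn as nn


class SoftAttention3D(nn.Module):
    """Generic soft spatial attention for 3D feature maps.

    A 1x1x1 convolution produces a single-channel attention map; a softmax
    over all spatial positions normalises it, and the input features are
    reweighted by it. This is a sketch, not the project's exact module.
    """

    def __init__(self, channels):
        super().__init__()
        self.attend = nn.Conv3d(channels, 1, kernel_size=1)

    def forward(self, x):
        # x: (batch, channels, depth, height, width)
        logits = self.attend(x)  # (B, 1, D, H, W)
        # Softmax over the flattened spatial dimensions so weights sum to 1
        weights = torch.softmax(logits.flatten(2), dim=-1).view_as(logits)
        return x * weights, weights
```

Returning the attention weights alongside the reweighted features makes it possible to visualise which regions of the volume the network attends to.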

Preprocessing pipeline

Files under /preprocessing/ generate the .npy datasets used for training and testing.

/preprocessing/metadata.py - Loads labels and MRI data into memory.

/preprocessing/preprocess.py - Extracts region of interest from MRI volumes.

/preprocessing/np_generator.py - Generates a series of training and test .npy files for cross-fold evaluation.

/preprocessing/generate_np_datasets.py - Configures and executes the generation process (e.g. how many cross folds).
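As a rough illustration of how cross-fold splits like those produced by /preprocessing/np_generator.py can be built, here is a minimal sketch (the function name and fold scheme are hypothetical, not the project's):

```python
import numpy as np


def make_cross_folds(n_samples, n_folds=4, seed=0):
    """Return (train_idx, test_idx) index pairs for cross-fold evaluation.

    Samples are shuffled once, then partitioned so each sample appears in
    exactly one test fold. Illustrative sketch only; the actual generator
    also saves the corresponding volumes and labels as .npy files.
    """
    indices = np.random.default_rng(seed).permutation(n_samples)
    folds = np.array_split(indices, n_folds)
    # For each fold, the test set is that partition and the training set
    # is everything else.
    return [(np.setdiff1d(indices, fold), fold) for fold in folds]
```

Each `(train_idx, test_idx)` pair could then be used to index the volume array before saving the splits with `np.save`.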

Helpful notebooks

They were useful to me; they might be useful to you.

Much of the code consists of mini-experiments and tests used while developing the project. Think of them as scrap paper: the project has iterated over time, so they may no longer run or serve an obvious purpose.

/preprocessing/multimodal_precossing_test.ipynb - Steps through the process in /preprocessing/generate_np_datasets.py so that images can be inspected at each stage.

/pytorch/test_numpy_dataset.ipynb - Loads data into the MRIDataset of /pytorch/mri_dataset.py and tests the data augmentation methods.

/pytorch/view_numpy_dataset.ipynb - Manually loads a saved dataset from a .npy file and visualises examples.

/pytorch/test_trained_model.py - Loads a saved model and the relevant dataset for a specific experiment, and shows the results.

/pytorch/compare_noise_amounts.py - Loads datasets and investigates how noise affects the intensity distribution.
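As an illustration of the kind of check /pytorch/compare_noise_amounts.py performs, here is a minimal sketch comparing intensity statistics before and after adding zero-mean Gaussian noise (the helper name is hypothetical):

```python
import numpy as np


def intensity_stats_with_noise(volume, noise_std, seed=0):
    """Compare a volume's intensity distribution before and after adding
    zero-mean Gaussian noise. Illustrative helper, not the project's script.
    """
    rng = np.random.default_rng(seed)
    noisy = volume + rng.normal(0.0, noise_std, size=volume.shape)
    return {
        # Shift of the mean intensity (expected ~0 for zero-mean noise)
        "mean_shift": float(noisy.mean() - volume.mean()),
        # Ratio of standard deviations (expected > 1: noise widens the distribution)
        "std_ratio": float(noisy.std() / volume.std()),
    }
```

For independent zero-mean noise, the variances add, so the expected standard deviation ratio is roughly `sqrt(sigma_vol**2 + noise_std**2) / sigma_vol`.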