# Lightweight Event-based Optical Flow Estimation via Iterative Deblurring

Work accepted to the 2024 IEEE International Conference on Robotics and Automation (ICRA'24) [paper, video].

*(Figures: IDNet graphical abstract and flow visualization.)*

If you use this code in an academic context, please cite our work:

```bibtex
@InProceedings{Wu_2024_ICRA,
    author    = {Wu, Yilun and Paredes-Vall\'es, Federico and de Croon, Guido C. H. E.},
    title     = {Lightweight Event-based Optical Flow Estimation via Iterative Deblurring},
    booktitle = {Proceedings of IEEE International Conference on Robotics and Automation (ICRA'24)},
    month     = {May},
    year      = {2024},
    note      = {To Appear}
}
```

## Dependencies

Create a conda environment and install the dependencies by running:

```bash
conda env create --file environment.yml
```
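
For a quick start, the full setup might look like the following sketch; the environment name IDNet is taken from the activation commands used later in this README.

```bash
# Create the conda environment from the repository's environment.yml
conda env create --file environment.yml

# Activate it (the environment is named IDNet, matching the commands below)
conda activate IDNet
```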

## Download (For Evaluation)

The DSEC dataset for optical flow can be downloaded here. Use the script download_dsec_test.py for convenience; it downloads the dataset directly into <DATA_DIRECTORY> with the expected directory structure:

```bash
download_dsec_test.py <DATA_DIRECTORY>
```
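
As a hedged example, an invocation might look like this, assuming the script is run with the Python interpreter from the IDNet environment and /data/dsec/test is a hypothetical storage location:

```bash
# Hypothetical example path; replace with your own storage location.
python download_dsec_test.py /data/dsec/test
```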

Once downloaded, create a symbolic link data/test pointing to the data directory:

```bash
ln -s <DATA_DIRECTORY> data/test
```
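
A minimal sketch, assuming the test set was downloaded to the hypothetical path /data/dsec/test; using an absolute target path keeps the link valid regardless of where it is created:

```bash
# Create the data/ directory if it does not exist yet, then link the test split.
mkdir -p data
ln -s /data/dsec/test data/test
```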

## Download (For Training)

For training on DSEC, two more folders need to be downloaded. Place them under data/, or establish symbolic links under data/ pointing to the folders (see the sketch below).
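
A hypothetical sketch of the symbolic-link option; the folder names below are placeholders, so substitute the actual names of the two downloaded training folders:

```bash
# Placeholder folder names; substitute the two DSEC training folders you downloaded.
ln -s /data/dsec/<TRAIN_FOLDER_1> data/<TRAIN_FOLDER_1>
ln -s /data/dsec/<TRAIN_FOLDER_2> data/<TRAIN_FOLDER_2>
```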

## Download (MVSEC)

To run experiments on MVSEC, additionally download the outdoor day sequence .h5 files from https://drive.google.com/open?id=1rwyRk26wtWeRgrAx_fgPc-ubUzTFThkV and place them under data/, or create symbolic links to them under data/ (see the sketch below).
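
As a sketch, linking the downloaded files could look like this; the file name is a placeholder for one of the outdoor day sequence .h5 files:

```bash
# Placeholder file name; repeat for each outdoor day sequence file you downloaded.
ln -s /data/mvsec/<OUTDOOR_DAY_SEQUENCE>.h5 data/<OUTDOOR_DAY_SEQUENCE>.h5
```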

## Run Evaluation

To run eval:

```bash
cd idnet
conda activate IDNet
python -m idn.eval
```

If you prefer, change the save directory for the evaluation results in idn/config/validation/dsec_test.yaml; the default is /tmp/collect/XX.

To switch between the ID model at 1/4 and 1/8 resolution, change the model option in idn/config/id_eval.yaml.

To evaluate the TID model, change the function decorator above the main function in eval.py.

At the end of evaluation, a zip file containing the results is created in the save directory; you can upload it to the DSEC benchmark website to reproduce our results.

## Run Training

To train IDNet, run:

```bash
cd idnet
conda activate IDNet
python -m idn.train
```

Similarly, switch between the id-4x, id-8x, and tid models, or enable MVSEC training, by changing the hydra.main() decorator in train.py and the settings in the corresponding .yaml file.