# Radar Fields: Frequency-Space Neural Scene Representations for FMCW Radar

Project Page - Paper


## News

## Installation

```bash
# Set up conda environment & install dependencies
conda env create -f environment.yml
conda activate radarfields

# Install tinycudann
pip install git+https://github.com/NVlabs/tiny-cuda-nn/#subdirectory=bindings/torch

# Install Radar Fields
pip install -e .
python -c "import radarfields; print(radarfields.__version__)" # Should print "1.0.0"
```
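
Optionally, you can sanity-check the GPU setup before training. Below is a minimal sketch (not part of the official install steps), assuming a CUDA-capable GPU and the tiny-cuda-nn torch bindings installed above:

```python
# Optional sanity check: confirms PyTorch sees a CUDA GPU and that the
# tiny-cuda-nn torch bindings import cleanly. Not part of the install itself.
import torch
import tinycudann as tcnn

assert torch.cuda.is_available(), "a CUDA-capable GPU is required"
print("GPU:", torch.cuda.get_device_name(0))
print("tiny-cuda-nn bindings imported:", tcnn.__name__)
```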

## Running a Model

```bash
# Run a pre-trained demo model
python demo.py --config configs/radarfields.ini --demo --demo_name [DEMO_NAME]

# Run a training job
python main.py --config configs/radarfields.ini --name [NAME] --seq [SEQUENCE NAME] --preprocess_file [PATH TO PREPROCESS .JSON]

# More help
python main.py --help
```
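
To train on several sequences with the same config, the flags above can be scripted. A minimal sketch, where the sequence names and preprocess paths are hypothetical placeholders:

```bash
# Hypothetical batch run: one training job per sequence, using the flags above.
# Replace the sequence names and .json paths with your own.
for SEQ in seq_a seq_b seq_c; do
    python main.py --config configs/radarfields.ini \
        --name "radarfields_${SEQ}" \
        --seq "${SEQ}" \
        --preprocess_file "preprocess/${SEQ}.json"
done
```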

## Pre-trained Models

We have several pre-trained models available for download. These can be run without downloading any datasets.

## Citation

```bibtex
@inproceedings{radarfields,
  author    = {Borts, David and Liang, Erich and Broedermann, Tim and Ramazzina, Andrea and Walz, Stefanie and Palladin, Edoardo and Sun, Jipeng and Brueggemann, David and Sakaridis, Christos and Van Gool, Luc and Bijelic, Mario and Heide, Felix},
  title     = {Radar Fields: Frequency-Space Neural Scene Representations for FMCW Radar},
  year      = {2024},
  isbn      = {9798400705250},
  publisher = {Association for Computing Machinery},
  address   = {New York, NY, USA},
  url       = {https://doi.org/10.1145/3641519.3657510},
  doi       = {10.1145/3641519.3657510},
  booktitle = {ACM SIGGRAPH 2024 Conference Papers},
  articleno = {130},
  numpages  = {10},
  keywords  = {neural rendering, radar},
  location  = {Denver, CO, USA},
  series    = {SIGGRAPH '24}
}
```

## Acknowledgements

The general structure/layout of this codebase was inspired by LiDAR-NeRF & torch-ngp.

We also rely on tiny-cuda-nn for our networks and encodings, and on Nerfstudio for pose optimization.
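
For reference, here is a minimal sketch of the kind of hash-grid encoding plus fused MLP that tiny-cuda-nn provides. The hyperparameters below are illustrative defaults from the tiny-cuda-nn documentation, not the exact Radar Fields architecture:

```python
# Illustrative only: a hash-grid encoding feeding a fused MLP via tiny-cuda-nn's
# torch bindings. Hyperparameters are tiny-cuda-nn documentation defaults, not
# the settings used by Radar Fields.
import torch
import tinycudann as tcnn

model = tcnn.NetworkWithInputEncoding(
    n_input_dims=3,   # e.g. a 3D query point
    n_output_dims=2,  # e.g. two scene attributes
    encoding_config={
        "otype": "HashGrid",
        "n_levels": 16,
        "n_features_per_level": 2,
        "log2_hashmap_size": 19,
        "base_resolution": 16,
        "per_level_scale": 2.0,
    },
    network_config={
        "otype": "FullyFusedMLP",
        "activation": "ReLU",
        "output_activation": "None",
        "n_neurons": 64,
        "n_hidden_layers": 2,
    },
)

x = torch.rand(1024, 3, device="cuda")  # batch of query points in [0, 1]^3
y = model(x)                            # (1024, 2) outputs, computed on the GPU
```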