Code for "Adaptive Gradient Quantization for Data-Parallel SGD", published in NeurIPS 2020.
PyTorch code for the paper "NUQSGD: Provably Communication-efficient Data-parallel SGD via Nonuniform Quantization".
Code for numerical results in the ICASSP 2020 paper "Decentralized optimization with non-identical sampling in presence of stragglers".
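The repositories above all concern compressing gradients exchanged between workers in data-parallel SGD. As a rough illustration of the general idea, the sketch below shows plain uniform stochastic quantization (QSGD-style) applied to per-worker gradients before averaging; it is not the adaptive or nonuniform schemes from these papers, the workers are simulated in-process rather than with torch.distributed, and the model, step sizes, and helper names (quantize, data_parallel_step) are illustrative assumptions.

```python
# Minimal sketch: data-parallel SGD with uniform stochastic gradient quantization.
# NOT the adaptive/nonuniform schemes from the papers above; workers are simulated
# in-process on a toy least-squares problem instead of using torch.distributed.
import torch


def quantize(grad: torch.Tensor, levels: int = 16) -> torch.Tensor:
    """Uniform stochastic quantization: scale, round randomly (unbiased), rescale."""
    norm = grad.abs().max()
    if norm == 0:
        return grad.clone()
    scaled = grad.abs() / norm * levels        # magnitudes mapped to [0, levels]
    lower = scaled.floor()
    prob = scaled - lower                      # probability of rounding up
    rounded = lower + torch.bernoulli(prob)    # stochastic rounding keeps the estimate unbiased
    return grad.sign() * rounded / levels * norm


def data_parallel_step(params, worker_batches, lr=0.1, levels=16):
    """One step: each simulated worker computes, quantizes, and shares its gradient."""
    quantized_grads = []
    for x, y in worker_batches:
        pred = x @ params                      # simple linear model
        loss = ((pred - y) ** 2).mean()
        (grad,) = torch.autograd.grad(loss, params)
        quantized_grads.append(quantize(grad, levels))
    avg_grad = torch.stack(quantized_grads).mean(dim=0)   # stand-in for all-reduce
    return params - lr * avg_grad


# Toy usage: 4 simulated workers, each with its own mini-batch.
torch.manual_seed(0)
w = torch.randn(5, requires_grad=True)
batches = [(torch.randn(8, 5), torch.randn(8)) for _ in range(4)]
for _ in range(10):
    w = data_parallel_step(w, batches).detach().requires_grad_(True)
```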