[ICLR 2023] "More ConvNets in the 2020s: Scaling up Kernels Beyond 51x51 using Sparsity"; [ICML 2023] "Are Large Kernels Better Teachers than Transformers for ConvNets?"
[NeurIPS'21] "Chasing Sparsity in Vision Transformers: An End-to-End Exploration" by Tianlong Chen, Yu Cheng, Zhe Gan, Lu Yuan, Lei Zhang, Zhangyang Wang
A custom neural-network library that integrates with TensorFlow and PyTorch. It dynamically adjusts the sparsity of connections during training to reduce compute and memory costs.
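The dynamic adjustment of connection sparsity described above is typically done by periodically pruning low-magnitude weights and regrowing an equal number of inactive connections, as in SET-style dynamic sparse training. The sketch below is a minimal, hypothetical PyTorch illustration of that idea, not this library's actual API; the class name `DynamicSparseLinear`, the `density` and `prune_fraction` parameters, and the random-regrowth policy are all assumptions for illustration.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class DynamicSparseLinear(nn.Module):
    """Linear layer with a binary connectivity mask that is periodically
    rewired: low-magnitude active weights are pruned and the same number
    of inactive connections are regrown at random (SET-style)."""

    def __init__(self, in_features, out_features, density=0.2):
        super().__init__()
        self.linear = nn.Linear(in_features, out_features)
        # Random initial mask at the target density.
        mask = (torch.rand_like(self.linear.weight) < density).float()
        self.register_buffer("mask", mask)

    def forward(self, x):
        # Only masked-in connections contribute to the output.
        return F.linear(x, self.linear.weight * self.mask, self.linear.bias)

    @torch.no_grad()
    def rewire(self, prune_fraction=0.3):
        """Prune the smallest-magnitude active weights, then regrow an
        equal number of connections at random inactive positions."""
        magnitudes = (self.linear.weight * self.mask).abs()
        active = self.mask.bool()
        n_prune = int(prune_fraction * active.sum().item())
        if n_prune == 0:
            return
        # Threshold at the n_prune-th smallest active magnitude.
        threshold = magnitudes[active].kthvalue(n_prune).values
        drop = active & (magnitudes <= threshold)
        self.mask[drop] = 0.0
        # Regrow as many connections as were dropped, keeping density fixed.
        n_grow = int(drop.sum().item())
        candidates = torch.nonzero(~self.mask.bool())
        chosen = candidates[torch.randperm(len(candidates))[:n_grow]]
        self.mask[chosen[:, 0], chosen[:, 1]] = 1.0
        # Regrown connections restart from zero, a common choice in
        # prune-and-regrow schemes.
        self.linear.weight[chosen[:, 0], chosen[:, 1]] = 0.0

# Usage: call rewire() every few hundred training steps.
layer = DynamicSparseLinear(128, 64, density=0.2)
out = layer(torch.randn(8, 128))
layer.rewire(prune_fraction=0.3)
```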