# Training-Free Neural Architecture Search: A Review
Training-Free Neural Architecture Search (NAS) streamlines the discovery of high-performing neural network architectures by removing the training loop from the search process.
Unlike conventional NAS methods, which iteratively train and evaluate large numbers of candidate architectures to identify the best-performing one, training-free NAS scores architectures without training them. It instead relies on training-free score functions, zero-cost proxies, or analytical methods to estimate or directly predict an architecture's performance, drastically reducing computational overhead and resource requirements.
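To make the idea concrete, here is a minimal sketch of one well-known zero-cost proxy in the style of NASWOT (Mellor et al.): a randomly initialized ReLU network is scored by how distinctly it separates a minibatch of inputs into different binary activation patterns, via the log-determinant of a pattern-similarity kernel. The plain-NumPy MLP and the function name `activation_kernel_score` are illustrative assumptions, not code from any particular NAS library.

```python
import numpy as np

def activation_kernel_score(layer_widths, x, seed=0):
    """Score an MLP architecture (a list of hidden-layer widths)
    at random initialization, without any training."""
    rng = np.random.default_rng(seed)
    batch = x.shape[0]
    codes = []
    h = x
    for width in layer_widths:
        # Random Gaussian weights, scaled to keep activations well-behaved.
        w = rng.standard_normal((h.shape[1], width)) / np.sqrt(h.shape[1])
        pre = h @ w
        codes.append((pre > 0).astype(np.float64))  # binary ReLU pattern
        h = np.maximum(pre, 0.0)
    c = np.concatenate(codes, axis=1)               # (batch, total_units)
    # Kernel entry (i, j) counts units on which samples i and j agree;
    # a large log-determinant means the inputs induce distinct patterns.
    k = c @ c.T + (1.0 - c) @ (1.0 - c).T
    _, logdet = np.linalg.slogdet(k + 1e-6 * np.eye(batch))
    return logdet

if __name__ == "__main__":
    x = np.random.default_rng(42).standard_normal((16, 8))
    for widths in ([32, 32], [128, 128, 128]):
        print(widths, activation_kernel_score(widths, x))
```

A search algorithm would compute such a score for each candidate in the search space and keep the highest-scoring architecture, replacing hours of training with a single forward pass per candidate.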
These methods accelerate the architecture search process, making it more efficient, cost-effective, and accessible to researchers and practitioners. By expediting the discovery of novel architectures, training-free NAS also facilitates their deployment in domains such as computer vision and natural language processing.
This repository serves as a curated collection of research papers and benchmarks dedicated to training-free NAS, providing a comprehensive resource for understanding, exploring, and evaluating these approaches.
Abbreviations used for the search strategies in the paper lists:

| RS | BO | RL | DSA | MA |
|----|----|----|-----|----|
| Random Search | Bayesian Optimization | Reinforcement Learning | Differentiable Search Algorithm | Metaheuristic Algorithm |
Contributions and feedback are welcome! Feel free to open issues or pull requests to improve existing content.
This repository is licensed under the MIT License.
If you find this repository helpful, consider giving it a star ⭐️.
If you find this repository or the resources included herein helpful in your research or work, you can cite it using the following BibTeX entry:
@misc{Wu2023,
author = {Wu, Meng-Ting},
title = {Training-Free NAS},
year = {2023},
note = {GitHub repository},
howpublished = {\url{https://github.com/MarttiWu/Training-Free-NAS}},
}