(NeurIPS 2023 spotlight) Large-scale Dataset Distillation/Condensation: at 50 IPC (Images Per Class), achieves the highest accuracy of 60.8% on the original ImageNet-1K validation set.
Official PyTorch implementation of "Dataset Condensation via Efficient Synthetic-Data Parameterization" (ICML'22)
An Efficient Dataset Condensation Plugin and Its Application to Continual Learning. NeurIPS, 2023.
A collection of dataset distillation papers.