A simple and efficient implementation of Self-Supervised Learning from Images with a Joint-Embedding Predictive Architecture (I-JEPA)
Updated Aug 16, 2024 · Python
Enhancing LLMs with LoRAs
Modality-Agnostic Learning for Medical Image Segmentation Using Multi-modality Self-distillation
Self-supervised learning through self-distillation with no labels (DINO), using Vision Transformers on the PCAM dataset.
Self-Distillation with weighted ground-truth targets; ResNet and Kernel Ridge Regression
Bayesian Optimization Meets Self-Distillation, ICCV 2023
PyTorch implementation of "Emerging Properties in Self-Supervised Vision Transformers" (a.k.a. DINO)
Official implementation of Self-Distillation for Gaussian Processes
Self-Distillation and Knowledge Distillation Experiments with PyTorch.
A generalized self-supervised training paradigm for unimodal and multimodal alignment and fusion.
A minimalist unofficial implementation of "Self-Distillation from the Last Mini-Batch for Consistency Regularization"
[ACL 2024] The official codebase for the paper "Self-Distillation Bridges Distribution Gap in Language Model Fine-tuning".
(Unofficial) Data-Distortion Guided Self-Distillation for Deep Neural Networks (AAAI 2019)
Deep Hash Distillation for Image Retrieval - ECCV 2022
A PyTorch implementation of the paper "Be Your Own Teacher: Improve the Performance of Convolutional Neural Networks via Self Distillation" (https://arxiv.org/abs/1905.08094)
This repository collects papers for "A Survey on Knowledge Distillation of Large Language Models". We break down KD into Knowledge Elicitation and Distillation Algorithms, and explore the Skill & Vertical Distillation of LLMs.
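Many of the repositories above share the same basic recipe: train a network against the ground-truth labels while also matching the softened predictions of a "teacher" branch, which in self-distillation is a deeper classifier or an earlier snapshot of the same network. A minimal NumPy sketch of that combined loss is below; the function names, the mixing weight `alpha`, and the temperature `T` are illustrative assumptions, not the API of any listed repo.

```python
import numpy as np

def softmax(logits, T=1.0):
    # Numerically stable temperature-scaled softmax.
    z = logits / T
    z = z - z.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def self_distillation_loss(student_logits, teacher_logits, labels,
                           alpha=0.5, T=2.0):
    """Generic self-distillation objective (illustrative):
    (1 - alpha) * cross-entropy on ground-truth labels
    + alpha * T^2 * KL(teacher softened || student softened).
    In self-distillation the 'teacher' logits come from the same
    network (a deeper exit or a previous snapshot), so no gradient
    flows through them here."""
    eps = 1e-12
    p_student = softmax(student_logits)
    # Cross-entropy against the hard labels.
    ce = -np.mean(np.log(p_student[np.arange(len(labels)), labels] + eps))
    # Temperature-softened distributions for the distillation term.
    p_t = softmax(teacher_logits, T)
    p_s = softmax(student_logits, T)
    kl = np.mean(np.sum(p_t * (np.log(p_t + eps) - np.log(p_s + eps)),
                        axis=-1)) * T * T
    return (1 - alpha) * ce + alpha * kl
```

When the teacher and student branches produce identical logits, the KL term vanishes and only the scaled cross-entropy remains, which is why methods like "Be Your Own Teacher" add the distillation term as a consistency regularizer rather than a replacement for the supervised loss.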