updated for more talks
alexsnow348 committed Dec 23, 2024
1 parent: 2b7390d, commit: c539d9a
Showing 1 changed file with 1 addition and 1 deletion: talks/index.md
@@ -13,7 +13,7 @@ title: Talks
* Slides - <strong><a href ="https://drive.google.com/file/d/18GRYC2YsePyo6wLdQlYBuR2bflnDSxob/view?usp=sharing" target="_blank">Download</a></strong>


-* <strong><a href ="https://youtu.be/0LtoRcXcudc" target="_blank">Streamlining AI: Knowledge Distillation for Smaller, Efficient Models</a></strong>
+* <strong><a href ="https://youtu.be/0LtoRcXcudc" target="_blank">Knowledge Distillation: Making LLM Models Smaller - Streamlining AI</a></strong>

This talk explores the transformative potential of knowledge distillation in creating smaller, efficient AI models while preserving their performance. Delve into its role in flexible architectures, data augmentation, and resource-constrained applications like TinyML. The discussion covers key concepts, including the teacher-student framework, various distillation schemes, and objective loss functions. It also highlights practical tools like PyTorch's `torchdistill` library and examines real-world applications in NLP and object detection. Join us to uncover how knowledge distillation is shaping the future of efficient deep learning.

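The abstract above names the teacher-student framework and the objective loss functions that blend soft and hard targets. As an editorial illustration (not code from the talk or from `torchdistill`), here is a minimal PyTorch sketch of that combined objective; the function name and hyperparameter values are assumptions.

```python
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels,
                      temperature=4.0, alpha=0.5):
    """Hypothetical teacher-student objective: KL divergence on
    temperature-softened logits (soft targets) blended with
    cross-entropy on ground-truth labels (hard targets)."""
    # Soft targets: student log-probs vs. teacher probs, both softened by T.
    # The T^2 factor keeps gradient magnitudes comparable across temperatures.
    soft = F.kl_div(
        F.log_softmax(student_logits / temperature, dim=-1),
        F.softmax(teacher_logits / temperature, dim=-1),
        reduction="batchmean",
    ) * temperature ** 2
    # Hard targets: ordinary supervised cross-entropy.
    hard = F.cross_entropy(student_logits, labels)
    return alpha * soft + (1 - alpha) * hard

# Toy usage: 8 examples, 10 classes.
student = torch.randn(8, 10)
teacher = torch.randn(8, 10)
labels = torch.randint(0, 10, (8,))
print(distillation_loss(student, teacher, labels))
```

In practice, `alpha` and `temperature` are tuned per task; libraries such as `torchdistill` wrap this pattern behind configurable distillation schemes.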
