A curated list of papers offering interesting empirical studies and insights on deep learning. Continually updated.
Updated Nov 3, 2024
Code for the arXiv paper "Double Descent Demystified: Identifying, Interpreting & Ablating the Sources of a Deep Learning Puzzle".
MDL Complexity computations and experiments from the paper "Revisiting complexity and the bias-variance tradeoff".
Explores the double-descent phenomenon in the context of system identification. Companion code to the paper (https://arxiv.org/abs/2012.06341).
Double descent results for FCNNs on MNIST, extended with label noise (Reconciling Modern Machine-Learning Practice and the Classical Bias–Variance Trade-Off) [Python/PyTorch].
Official implementation of "Optimization Variance: Delve into the Epoch-Wise Double Descent of DNNs".
This project outlines four experiments exploring how several settings affect the bias-variance tradeoff curve.
A Review of Preetum Nakkiran's "More Data Can Hurt for Linear Regression: Sample-wise Double Descent"
Assignments for my CST Part II Deep Neural Networks unit.
Double descent experiments/repros on classical ML models and deep neural nets
ICLR 2022: Phenomenology of Double Descent in Finite-width Neural Networks
Implementation of the double-descent deep learning phenomenon from the article "Grokking: Generalization beyond overfitting".
Interpolating Neural Networks in Asset Pricing Data. Supports Distributed Training in TensorFlow.
DSC 261 Responsible Data Science Project
Toy dataset to study double descent optimization patterns in machine learning.