---
title: Automatic differentiation in PyTorch
authors:
  - Adam Paszke
  - S. Gross
  - Soumith Chintala
  - Gregory Chanan
  - E. Yang
  - Zach DeVito
  - Zeming Lin
  - Alban Desmaison
  - L. Antiga
  - Adam Lerer
fieldsOfStudy:
  - Computer Science
filesize_readable: 50.8 KB
meta_key: 2017-automatic-differentiation-in-pytorch
meta_relpath: paper-extra-data/pdf-meta/2017-automatic-differentiation-in-pytorch.yaml
numCitedBy: 10406
pdf_relpath: paper-repo/pdfs/2017-automatic-differentiation-in-pytorch.pdf
reading_status: TBD
ref_count: 7
tags:
  - gen-from-ref
  - paper
  - pytorch
url_slug: 2017-automatic-differentiation-in-pytorch
urls:
venue:
year: 2017
---

- [pdf(local)](paper-repo/pdfs/2017-automatic-differentiation-in-pytorch.pdf)
- semanticscholar url

# Automatic differentiation in PyTorch

## Abstract

In this article, we describe an automatic differentiation module of PyTorch, a library designed to enable rapid research on machine learning models. It builds upon a few projects, most notably Lua Torch, Chainer, and HIPS Autograd [4], and provides a high-performance environment with easy access to automatic differentiation of models executed on different devices (CPU and GPU). To make prototyping easier, PyTorch does not follow the symbolic approach used in many other deep learning frameworks, but focuses on differentiation of purely imperative programs, with an emphasis on extensibility and low overhead. Note that this preprint is a draft of certain sections from an upcoming paper covering all PyTorch features.
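
To make the define-by-run idea in the abstract concrete, here is a minimal sketch (my own illustration, not code from the paper) of differentiating a purely imperative program with PyTorch's autograd; the tensor shapes and names are arbitrary.

```python
import torch

# Two leaf tensors whose gradients we want; requires_grad=True asks
# autograd to record the operations performed on them.
x = torch.randn(3, requires_grad=True)
w = torch.randn(3, requires_grad=True)

# Ordinary, imperative Python: the trace is built on the fly as the
# program runs, so plain data-dependent control flow just works.
y = (w * x).sum()
if y.item() > 0:
    y = 2 * y

# Reverse-mode differentiation through whatever was actually executed.
y.backward()
print(x.grad)  # dy/dx
print(w.grad)  # dy/dw
```

Because the trace is rebuilt on every execution, the `if` branch above is differentiated exactly as it ran; moving the tensors to a GPU (e.g. with `.cuda()`, assuming a CUDA device is available) leaves the rest of the code unchanged.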

## Paper References

  1. Modeling, Inference and Optimization With Composable Differentiable Procedures
  2. DyNet - The Dynamic Neural Network Toolkit
  3. TensorFlow - Large-Scale Machine Learning on Heterogeneous Distributed Systems
  4. Evaluating derivatives - principles and techniques of algorithmic differentiation, Second Edition
  5. Compiling fast partial derivatives of functions given by algorithms
  6. SciPy - Open Source Scientific Tools for Python