jurajHasik/tensorgrad

Differentiable Programming Tensor Networks

arXiv:1903.09650, by Hai-Jun Liao, Jin-Guo Liu, Lei Wang, and Tao Xiang

Requirements

  • PyTorch 1.0+
  • A good GPU card if you are impatient or ambitious

Higher order gradient of free energy

Run this to compute the energy and specific heat of the 2D classical Ising model using automatic differentiation through the Tensor Renormalization Group (TRG) contraction.

$ cd 1_ising_TRG
$ python ising.py 

You can supply the command line argument -use_checkpoint to reduce the memory usage.


Variational optimization of iPEPS

Run this to optimize an iPEPS wavefunction for the 2D quantum Heisenberg model. Here, we use the Corner Transfer Matrix Renormalization Group (CTMRG) for the contraction and L-BFGS for the optimization.

$ cd 2_variational_iPEPS
$ python variational.py -D 3 -chi 30 

For help with the options, type python variational.py -h. To make use of the GPU, add -cuda <GPUID>. This code reaches state-of-the-art variational energy and staggered magnetization. You can also supply your own Hamiltonian of interest.
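The optimization loop follows the standard PyTorch L-BFGS pattern. As a hypothetical sketch (not variational.py itself), here is the same pattern on a tiny problem: minimizing the Rayleigh quotient ⟨ψ|H|ψ⟩/⟨ψ|ψ⟩ of a random symmetric matrix, which converges to its smallest eigenvalue.

```python
import torch

# Hypothetical sketch of gradient-based variational optimization with L-BFGS.
torch.manual_seed(0)
H = torch.randn(8, 8, dtype=torch.float64)
H = (H + H.t()) / 2                       # symmetric "Hamiltonian"
psi = torch.randn(8, dtype=torch.float64, requires_grad=True)

opt = torch.optim.LBFGS([psi], max_iter=50, line_search_fn="strong_wolfe")

def closure():
    opt.zero_grad()
    e = psi @ H @ psi / (psi @ psi)       # variational energy
    e.backward()
    return e

for _ in range(5):
    opt.step(closure)

e_min = (psi @ H @ psi / (psi @ psi)).item()
```

In the repo, the variational energy is instead produced by the CTMRG contraction of the iPEPS tensors, and AD supplies its gradient to the same optimizer.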


What is under the hood?

Reverse-mode AD computes gradients accurately and efficiently for you! Check the code in adlib for the backward functions that propagate gradients through tensor network contractions.
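Such backward functions are implemented as custom torch.autograd.Function subclasses. As a minimal hypothetical example in that spirit (not taken from adlib), here is a hand-written backward for the matrix trace, whose gradient with respect to A is the identity scaled by the upstream gradient.

```python
import torch

# Minimal custom autograd function: forward computes tr(A),
# backward returns d tr(A)/dA = I, scaled by the incoming gradient.
class Trace(torch.autograd.Function):
    @staticmethod
    def forward(ctx, A):
        ctx.save_for_backward(A)
        return torch.trace(A)

    @staticmethod
    def backward(ctx, grad_out):
        (A,) = ctx.saved_tensors
        return grad_out * torch.eye(A.shape[0], dtype=A.dtype)

A = torch.randn(4, 4, dtype=torch.float64, requires_grad=True)
Trace.apply(A).backward()
# A.grad is the 4x4 identity matrix
```

The backward functions in adlib play the same role for more involved primitives (e.g. linear-algebra steps inside the contraction), where a hand-derived gradient is more stable or memory-efficient than naive tracing.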
