an environment for interactive exploration of reinforcement learning
- View the releases on GitHub
- Choose the right artifacts for your operating system and hardware
  - CPU: does not require a CUDA-capable graphics card and drivers
  - GPU: some features require a CUDA-capable graphics card and drivers
- Download the artifacts (they are big, it will take a while)
  - If your platform contains multiple `.zip`, `.z01`, `.z02`, ... files, you need to download them all
    - a GUI or the `zip` command line tool will know how to decompress them correctly by using the `.zip` file, which should be smaller than the others (see the extraction sketch after this list)
- Follow the installer's instructions, which work like e.g. Miniconda's (a batch-mode sketch appears after this list)
  - on Windows, prefer installing to a short path on a fast device (e.g. an SSD) such as `C:/gtcl`
  - if you've already installed Anaconda, Miniconda, or Miniforge, it is recommended not to enable shell integration, environment variables, or registering as the default Python
- From the command line, activate the environment (see the activation and launch sketch after this list)
  - on Windows, you should see a Start Menu entry
- run `jupyter lab`, and you should see the JupyterLab interface open in your default browser
  - to choose your browser, start `jupyter lab --no-browser` and copy/paste the URL shown into your browser of choice
- If you run into problems, create an issue on GitHub
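If your download is a split archive, one way to recombine and extract it from the command line is sketched below. It assumes the Info-ZIP `zip` and `unzip` tools are available and that all parts sit in the same directory; the artifact name is a placeholder, not a real release file name.

```bash
# Recombine the split parts (.z01, .z02, ..., plus the final .zip) into a single
# archive, then extract it. "GTCOARLab-GPU-Linux-x86_64.zip" is a placeholder;
# substitute the .zip file you actually downloaded.
zip -s 0 GTCOARLab-GPU-Linux-x86_64.zip --out gtcoarlab-unsplit.zip
unzip gtcoarlab-unsplit.zip -d gtcoarlab-installer
```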
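Because the installer behaves like Miniconda's, it can usually also be run non-interactively. The sketch below assumes the unpacked artifact is a Miniconda-style shell installer on Linux/macOS; the file name and install prefix are placeholders, and the `-b`/`-p` switches follow the usual Miniconda conventions.

```bash
# Run the unpacked installer in batch mode (Miniconda-style conventions assumed):
#   -b  accept the defaults without prompting
#   -p  choose the installation prefix
bash GTCOARLab-GPU-Linux-x86_64.sh -b -p ~/gtcoarlab
```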
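Activating the environment and launching JupyterLab might then look like the following on Linux/macOS. The `~/gtcoarlab` prefix is the assumed install location from the previous sketch; on Windows, use the Start Menu entry or the `Scripts\activate.bat` script inside the install directory.

```bash
# Activate the environment (~/gtcoarlab is an assumed install location).
source ~/gtcoarlab/bin/activate

# Launch JupyterLab in your default browser...
jupyter lab

# ...or start it without opening a browser and paste the printed URL
# (including the token) into the browser of your choice.
jupyter lab --no-browser
```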
Copyright (c) 2021 University System of Georgia and GTCOARLab Contributors
Distributed under the terms of the BSD-3-Clause License