An out-of-the-box GUI tool for offline deep reinforcement learning
Learning representations for RL in Healthcare under a POMDP assumption
Direct port of TD3_BC to JAX using Haiku and Optax.
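For orientation, here is a minimal sketch of the TD3+BC actor objective (Fujimoto & Gu, 2021) that this repo ports; the sketch is written in PyTorch for consistency with the other examples below, and the function and argument names are illustrative, not the repo's API:

```python
import torch
import torch.nn.functional as F

def td3_bc_actor_loss(q, pi_action, dataset_action, alpha=2.5):
    # lambda normalizes the Q term by the average |Q| over the batch,
    # so the BC regularizer keeps a consistent relative weight
    lam = alpha / q.abs().mean().detach()
    # maximize Q(s, pi(s)) while cloning the dataset action
    return -lam * q.mean() + F.mse_loss(pi_action, dataset_action)
```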
PyTorch implementation of the implicit Q-learning algorithm (IQL)
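A minimal sketch of the expectile regression loss at the core of IQL, which fits V(s) toward an upper expectile of Q(s, a) using only dataset actions (names are illustrative, not this repo's interface):

```python
import torch

def expectile_loss(diff, tau=0.7):
    # diff = Q(s, a) - V(s); the asymmetric weight |tau - 1{diff < 0}|
    # penalizes underestimation more than overestimation when tau > 0.5,
    # approximating an in-sample max without querying out-of-dataset actions
    weight = torch.abs(tau - (diff < 0).float())
    return (weight * diff ** 2).mean()
```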
The Medkit-Learn(ing) Environment: Medical Decision Modelling through Simulation (NeurIPS 2021) by Alex J. Chan, Ioana Bica, Alihan Huyuk, Daniel Jarrett, and Mihaela van der Schaar.
ExORL: Exploratory Data for Offline Reinforcement Learning
Code for Continuous Doubly Constrained Batch Reinforcement Learning, NeurIPS 2021.
Code accompanying the paper "Offline Reinforcement Learning with Value-Based Episodic Memory" (ICLR 2022, https://arxiv.org/abs/2110.09796)
PyTorch Implementation of Offline Reinforcement Learning algorithms
A package for recording transitions in OpenAI Gym environments.
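A hypothetical sketch of what such a recorder might look like; this assumes the classic gym step API that returns a 4-tuple (gymnasium and gym >= 0.26 return a 5-tuple with separate terminated/truncated flags) and is not the package's actual interface:

```python
import gym

class TransitionRecorder(gym.Wrapper):
    """Store (obs, action, reward, next_obs, done) tuples for every step,
    e.g. to build an offline RL dataset from rollouts."""

    def __init__(self, env):
        super().__init__(env)
        self.transitions = []
        self._last_obs = None

    def reset(self, **kwargs):
        self._last_obs = self.env.reset(**kwargs)
        return self._last_obs

    def step(self, action):
        next_obs, reward, done, info = self.env.step(action)
        self.transitions.append((self._last_obs, action, reward, next_obs, done))
        self._last_obs = next_obs
        return next_obs, reward, done, info

# usage
env = TransitionRecorder(gym.make("CartPole-v1"))
obs = env.reset()
obs, reward, done, info = env.step(env.action_space.sample())
```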
Code for the paper "Showing Your Offline Reinforcement Learning Work: Online Evaluation Budget Matters", ICML 2022
PyTorch implementation of state-of-the-art offline reinforcement learning algorithms.
[MLHC 2021] Model Selection for Offline RL: Practical Considerations for Healthcare Settings. https://arxiv.org/abs/2107.11003
A large-scale multi-modal pre-trained model
Extreme Q-Learning: Max Entropy RL without Entropy
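A rough sketch, under my reading of the XQL paper, of the Gumbel regression loss it uses to fit the value function; the names and the clipping constant are illustrative:

```python
import torch

def gumbel_loss(q, v, beta=1.0, clip=5.0):
    # z = (Q - V) / beta; minimizing E[exp(z) - z - 1] drives V toward
    # a soft (max-entropy) maximum of Q over dataset actions, without
    # sampling from the entropy-regularized policy
    z = ((q - v) / beta).clamp(max=clip)  # exp() is clipped for stability
    return (torch.exp(z) - z - 1).mean()
```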
Code to reproduce experiments from "User-Interactive Offline Reinforcement Learning" (ICLR 2023)
Neural Laplace Control for Continuous-time Delayed Systems - an offline RL method that combines a Neural Laplace dynamics model with an MPC planner to achieve near-expert policy performance in environments with irregular time intervals and an unknown constant delay.
Offline-to-online RL: AWAC & IQL PyTorch implementation
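For context, a minimal sketch of the advantage-weighted policy update shared by AWAC-style methods: a behavioral-cloning loss in which dataset actions with higher advantage under the current critic get exponentially larger weight (illustrative names, not this repo's interface):

```python
import torch

def awac_policy_loss(log_prob, advantage, lam=1.0, max_weight=100.0):
    # exponentiated advantages weight the cloning term; the weight is
    # detached and clipped so gradients flow only through log_prob
    weight = torch.exp(advantage / lam).clamp(max=max_weight).detach()
    return -(weight * log_prob).mean()
```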
Code for the undergraduate final-year project “Offline Risk-Averse Actor-Critic with Curriculum Learning”