# Sufficient PPO

This repo contains a refactored version of the PPO implementation from stable-baselines3.

## Features

- Minor changes to improve performance
- Minor changes to mirror the OpenAI baselines ppo2 implementation
- Configs for running experiments on MuJoCo and Atari environments with the default OpenAI baselines parameters
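As a rough illustration of what "default OpenAI baselines parameters" means for MuJoCo, the sketch below lists the ppo2 defaults using stable-baselines3-style parameter names. The dictionary name and the exact grouping are illustrative; the config files shipped in this repo may organize these values differently.

```python
# Sketch of OpenAI baselines ppo2 defaults for MuJoCo,
# expressed with stable-baselines3-style parameter names.
# Illustrative only: the repo's own config files may differ.
mujoco_ppo_config = {
    "n_steps": 2048,        # rollout length per environment
    "batch_size": 64,       # 2048 steps split into 32 minibatches
    "n_epochs": 10,         # optimization epochs per rollout
    "gamma": 0.99,          # discount factor
    "gae_lambda": 0.95,     # GAE smoothing parameter
    "clip_range": 0.2,      # PPO clipping epsilon
    "ent_coef": 0.0,        # no entropy bonus for MuJoCo tasks
    "learning_rate": 3e-4,  # Adam step size
}

if __name__ == "__main__":
    # Baselines ppo2 splits each 2048-step rollout into 32 minibatches.
    n_minibatches = mujoco_ppo_config["n_steps"] // mujoco_ppo_config["batch_size"]
    print(n_minibatches)  # 32
```

These values can be passed as keyword arguments to a stable-baselines3-style `PPO` constructor.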