ddqn
Here are 158 public repositories matching this topic...
DQN, Double DQN, Dueling Network
Updated Jun 2, 2017 - Python
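For readers new to the topic: the defining change in Double DQN (DDQN), which repositories like this one implement, is that the online network selects the greedy next action while the target network evaluates it, reducing the overestimation bias of vanilla DQN. A minimal NumPy sketch of the target computation (function and argument names are illustrative, not taken from any listed repo):

```python
import numpy as np

def ddqn_targets(rewards, next_q_online, next_q_target, dones, gamma=0.99):
    """Compute Double DQN bootstrap targets for a batch of transitions.

    next_q_online / next_q_target: arrays of shape (batch, n_actions)
    holding Q-values for the next states from the two networks.
    """
    # Online network picks the greedy action for each next state.
    greedy_actions = np.argmax(next_q_online, axis=1)
    # Target network evaluates that chosen action (the "double" step).
    evaluated = next_q_target[np.arange(len(rewards)), greedy_actions]
    # One-step bootstrapped return; terminal transitions get no bootstrap.
    return rewards + gamma * evaluated * (1.0 - dones)
```

In vanilla DQN the target network would both select and evaluate the action (`max` over `next_q_target`); splitting the two roles is the entire difference.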
This repo contains all the practicals/homeworks assigned during the Reinforcement Learning course held by Prof. Roberto Capobianco at the AI & Robotics Master's Degree at Sapienza University of Rome, Italy.
Updated Nov 24, 2023 - Jupyter Notebook
Deep Q-Learning agent learns how to navigate a world full of bananas. Part of the coursework for Udacity's Deep RL Nanodegree.
Updated Sep 7, 2021 - Jupyter Notebook
Pong-Reinforcement-Learning
Updated Jun 7, 2021 - Python
Jaxplorer is a JAX reinforcement learning (RL) framework for exploring new ideas.
Updated Jul 19, 2024 - Python
Double DQN for OpenAI's Atari environments.
Updated Dec 11, 2017 - Python
My Little Reinforcement Learning
Updated Jul 13, 2021 - Python
Proposes a fully convolutional network with skip connections that is deeper than the network used in vanilla DQN.
Updated Mar 2, 2021 - Python
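A skip connection of the kind this description refers to simply adds a layer's input to its output, so the deeper network stays trainable. A minimal NumPy sketch, with a dense layer standing in for the convolution (names are illustrative, not from the repo above):

```python
import numpy as np

def relu(x):
    """Elementwise rectified linear unit."""
    return np.maximum(x, 0.0)

def residual_layer(x, weight):
    """One layer with a skip connection: ReLU(x @ weight) + x.

    The identity path lets the signal (and, in training, the gradient)
    bypass the transformation, which is what makes deeper networks
    easier to optimize than a plain stack of layers.
    """
    return relu(x @ weight) + x
```

Stacking such layers changes only the depth of the Q-network; the DQN/DDQN training loop around it is unchanged.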
Contains deep reinforcement learning algorithms I have implemented.
Updated Dec 12, 2022 - Python
A complete MATLAB laboratory for training and evaluating DQN and DDQN reinforcement learning models for EMG-based hand gesture recognition (HGR).
Updated Mar 7, 2024 - Jupyter Notebook
Simple implementation of DDQN for the Flappy_Bird game.
Updated Aug 10, 2018 - Python
Reinforcement learning algorithms and solved gym environments.
Updated Oct 30, 2020 - Jupyter Notebook
AI Models for Playing Super Mario Bros
Updated Jun 13, 2024 - Python
Updated Jun 23, 2017 - Python
Algorithmic trading is a fairly common example of reinforcement learning applied in practice.
Updated Feb 25, 2022 - Jupyter Notebook
Implementations of value-based and policy-gradient algorithms on single- and multi-agent environments.
Updated Oct 31, 2022 - Jupyter Notebook