An Optimistic Approach to the Q-Network Error in Actor-Critic Methods
Updated Jun 23, 2022 · Python
Active versus Passive exploration
Over-parameterization = exploration?
This project uses reinforcement learning to teach an agent to drive by itself, learning from its observations to maximize reward (180+ lines)
A simple exercise in reinforcement learning
The GitHub repository for "Accelerating Approximate Thompson Sampling with Underdamped Langevin Monte Carlo", AISTATS 2024.
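The repository above accelerates *approximate* Thompson sampling with underdamped Langevin Monte Carlo; as background only, here is a minimal sketch of exact Beta-Bernoulli Thompson sampling (function name, arm probabilities, and step count are all hypothetical, not taken from that repo):

```python
import random

def thompson_bernoulli(probs, steps=2000, seed=0):
    """Beta-Bernoulli Thompson sampling sketch: sample one value from
    each arm's Beta posterior, pull the argmax, update the counts."""
    rng = random.Random(seed)
    k = len(probs)
    alpha = [1] * k  # prior successes + 1
    beta = [1] * k   # prior failures + 1
    pulls = [0] * k
    for _ in range(steps):
        # Posterior sampling drives exploration: uncertain arms
        # occasionally produce large samples and get tried.
        samples = [rng.betavariate(alpha[i], beta[i]) for i in range(k)]
        a = samples.index(max(samples))
        reward = 1 if rng.random() < probs[a] else 0
        alpha[a] += reward
        beta[a] += 1 - reward
        pulls[a] += 1
    return pulls

pulls = thompson_bernoulli([0.2, 0.5, 0.8])
```

Over time the pull counts concentrate on the best arm, which is the sense in which Thompson sampling resolves the exploration-exploitation trade-off.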
Reinforcement Learning (COMP 579) Project
OpenAI Gym environment implementation
Repository comparing two methods for handling the exploration-exploitation dilemma in multi-armed bandits
OSPO is a novel metaheuristic algorithm with the potential to solve many kinds of optimization problems with promising performance.
Deep Intrinsically Motivated Exploration in Continuous Control
This repository contains a variety of projects related to reinforcement learning, showcasing different approaches to implementing it in various scenarios.
Human and simulated behavioral / small-scale neural data for the paper: https://www.biorxiv.org/content/10.1101/2022.10.03.510668v2
This project compares different reinforcement learning algorithms, including Monte Carlo, Q-learning, Q(λ)-learning, and ε-greedy variations.
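As a reference point for the Q-learning comparisons above, here is a minimal tabular Q-learning sketch with ε-greedy action selection on a hypothetical 5-state chain (the environment, hyperparameters, and function names are illustrative assumptions, not from any repo listed here):

```python
import random

random.seed(0)

# Hypothetical 5-state chain: action 0 moves left, action 1 moves
# right; reward 1 only on reaching the rightmost state, which ends
# the episode.
N_STATES, ACTIONS = 5, [0, 1]

def step(s, a):
    s2 = max(0, s - 1) if a == 0 else min(N_STATES - 1, s + 1)
    done = (s2 == N_STATES - 1)
    return s2, (1.0 if done else 0.0), done

def q_learning(episodes=200, alpha=0.5, gamma=0.9, eps=0.1):
    Q = {(s, a): 0.0 for s in range(N_STATES) for a in ACTIONS}
    for _ in range(episodes):
        s, done = 0, False
        while not done:
            # ε-greedy: explore with probability eps, else exploit
            if random.random() < eps:
                a = random.choice(ACTIONS)
            else:
                a = max(ACTIONS, key=lambda b: Q[(s, b)])
            s2, r, done = step(s, a)
            target = r + (0.0 if done
                          else gamma * max(Q[(s2, b)] for b in ACTIONS))
            Q[(s, a)] += alpha * (target - Q[(s, a)])  # TD update
            s = s2
    return Q
```

After training, the learned values should prefer moving right at every state, with Q[(3, 1)] close to the terminal reward of 1 and values decaying by γ per step away from the goal.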
A companion repository for 'Inverse Bayesian Optimization: Learning Human Acquisition Functions in an Exploration vs Exploitation Search Task'
Official implementation of LECO (NeurIPS'22)
An implementation of the multi-armed bandit reinforcement learning experiment using different exploration techniques.
Explore the 10-Arm Testbed Simulation! 🎲 Utilize Python to test various ε-greedy strategies in a reinforcement learning environment. Visualize and compare agents' performance as they balance exploration and exploitation. Perfect for learners and enthusiasts! 🚀📊
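For readers new to the 10-armed testbed mentioned above, a minimal ε-greedy sketch looks like this (function name, step count, and reward distributions are illustrative assumptions, not code from that repository):

```python
import numpy as np

rng = np.random.default_rng(0)

def run_bandit(epsilon, n_arms=10, steps=1000):
    """Run one ε-greedy agent on a 10-armed Gaussian testbed
    and return its average reward."""
    q_true = rng.normal(0.0, 1.0, n_arms)  # true arm means
    q_est = np.zeros(n_arms)               # sample-average estimates
    counts = np.zeros(n_arms)
    rewards = np.zeros(steps)
    for t in range(steps):
        if rng.random() < epsilon:
            a = int(rng.integers(n_arms))   # explore: random arm
        else:
            a = int(np.argmax(q_est))       # exploit: greedy arm
        r = rng.normal(q_true[a], 1.0)
        counts[a] += 1
        q_est[a] += (r - q_est[a]) / counts[a]  # incremental mean
        rewards[t] = r
    return float(rewards.mean())

greedy, eps_greedy = run_bandit(0.0), run_bandit(0.1)
```

Sweeping ε over values like 0.0, 0.01, and 0.1 and averaging over many random testbeds reproduces the classic exploration-exploitation comparison the repo visualizes.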