bandit-algorithm
Here are 20 public repositories matching this topic...
A small collection of Bandit Algorithms (ETC, E-Greedy, Elimination, UCB, Exp3, LinearUCB, and Thompson Sampling)
Updated May 25, 2022 - Python
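For orientation, here is a minimal sketch of one of the methods listed above, UCB1, run on a simulated Bernoulli bandit. The arm means, horizon, and the `ucb1` helper are illustrative assumptions, not code from the repository.

```python
import math
import random

def ucb1(true_means, horizon=10_000, seed=0):
    """Minimal UCB1 sketch on a Bernoulli bandit (illustrative, not from the repo)."""
    rng = random.Random(seed)
    k = len(true_means)
    counts = [0] * k          # pulls per arm
    sums = [0.0] * k          # cumulative reward per arm
    total_reward = 0.0
    for t in range(1, horizon + 1):
        if t <= k:
            arm = t - 1       # play each arm once to initialize
        else:
            # pick the arm with the highest upper confidence bound
            arm = max(range(k), key=lambda a: sums[a] / counts[a]
                      + math.sqrt(2 * math.log(t) / counts[a]))
        reward = 1.0 if rng.random() < true_means[arm] else 0.0
        counts[arm] += 1
        sums[arm] += reward
        total_reward += reward
    return total_reward, counts

reward, pulls = ucb1([0.3, 0.5, 0.7])
print(f"total reward: {reward:.0f}, pulls per arm: {pulls}")
```

With a long enough horizon the pull counts concentrate on the best arm, which is the behavior the confidence bonus is designed to produce.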
Online learning approaches to optimize database join operations in PostgreSQL.
Updated Nov 4, 2024 - C
A presentation giving a precise yet detailed explanation of the core concepts of Reinforcement Learning.
Updated Dec 25, 2017
A research project on automated A/B testing of software using evolutionary bandit algorithms.
Updated Jan 17, 2019 - MATLAB
Solutions to the assignments from Stanford's CS234 Reinforcement Learning course (2022).
Updated Jun 27, 2022 - Python
Adversarial multi-armed bandit algorithms
Updated Jul 8, 2019 - MATLAB
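As a rough illustration of the adversarial setting, here is a minimal Exp3 sketch: exponential weights over arms with importance-weighted reward estimates. The toy adversary, `gamma`, and the `exp3` helper are assumptions for the example, not code from the repository.

```python
import math
import random

def exp3(rewards_fn, k, horizon, gamma=0.1, seed=0):
    """Minimal Exp3 sketch; rewards are assumed to lie in [0, 1]."""
    rng = random.Random(seed)
    weights = [1.0] * k
    total = 0.0
    for t in range(horizon):
        wsum = sum(weights)
        # mix the exponential-weights distribution with uniform exploration
        probs = [(1 - gamma) * w / wsum + gamma / k for w in weights]
        arm = rng.choices(range(k), weights=probs)[0]
        reward = rewards_fn(t, arm)          # the adversary picks the rewards
        total += reward
        # unbiased importance-weighted estimate, credited to the chosen arm only
        est = reward / probs[arm]
        weights[arm] *= math.exp(gamma * est / k)
        # renormalize to avoid floating-point overflow over long horizons
        wmax = max(weights)
        weights = [w / wmax for w in weights]
    return total

# toy adversary: arm 0 pays off in the first half, arm 1 afterwards
horizon = 10_000
def adversary(t, arm):
    good = 0 if t < horizon // 2 else 1
    return 1.0 if arm == good else 0.0

print("Exp3 total reward:", exp3(adversary, k=2, horizon=horizon))
```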
Reinforcement learning
Updated Jun 20, 2020 - Scala
Movie recommendation using cascading bandits, namely CascadeLinTS and CascadeLinUCB.
Updated May 17, 2018 - MATLAB
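The repository above implements the linear variants; as a simpler stand-in, here is a tabular CascadeUCB1-style sketch of the underlying cascade click model, where a simulated user scans a ranked list top-down and clicks the first attractive item. The attraction probabilities and helper names are made up for illustration.

```python
import math
import random

def cascade_ucb1(attract, K=3, horizon=5000, seed=0):
    """Sketch of CascadeUCB1: recommend the K items with the highest UCBs
    under the cascade click model (illustrative, not the repo's code)."""
    rng = random.Random(seed)
    L = len(attract)
    counts = [0] * L
    sums = [0.0] * L
    clicks = 0
    for t in range(1, horizon + 1):
        def ucb(i):
            if counts[i] == 0:
                return float("inf")   # force initial exploration of every item
            return sums[i] / counts[i] + math.sqrt(1.5 * math.log(t) / counts[i])
        ranked = sorted(range(L), key=ucb, reverse=True)[:K]
        # cascade model: the user clicks the first attractive item, if any
        click_pos = next((p for p, i in enumerate(ranked)
                          if rng.random() < attract[i]), None)
        if click_pos is not None:
            clicks += 1
        # items at or above the click (or the whole list if no click) are observed
        observed = ranked[:click_pos + 1] if click_pos is not None else ranked
        for p, i in enumerate(observed):
            counts[i] += 1
            sums[i] += 1.0 if p == click_pos else 0.0
    return clicks

print("clicks:", cascade_ucb1([0.1, 0.2, 0.05, 0.4, 0.3]))
```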
Network-Oriented Repurposing of Drugs Python Package
Updated Oct 29, 2024 - Jupyter Notebook
A client that handles the administration of StreamingBandit online or straight from your desktop. Set up and run streaming (contextual) bandit experiments in your browser.
Updated Dec 7, 2022 - JavaScript
Solutions and figures for problems from Reinforcement Learning: An Introduction by Sutton & Barto.
Updated Jul 16, 2019 - Python
Privacy-Preserving Bandits (MLSys'20)
Updated Dec 8, 2022 - Jupyter Notebook
Another A/B test library
Updated Nov 13, 2024 - Scala
The contextual bandit algorithm LinUCB (Linear Upper Confidence Bound), as proposed by Li, Langford and Schapire.
Updated Feb 2, 2023 - Java
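For reference, a minimal sketch of disjoint LinUCB, with one ridge-regression model per arm and an exploration bonus on the linear reward estimate. The synthetic environment (`theta_star`, the Gaussian noise level, and `alpha`) is an illustrative assumption, not the repository's setup.

```python
import numpy as np

def linucb(contexts_fn, theta_star, d, k, horizon=2000, alpha=1.0, seed=0):
    """Minimal disjoint LinUCB sketch on a synthetic linear bandit."""
    rng = np.random.default_rng(seed)
    A = [np.eye(d) for _ in range(k)]      # per-arm regularized Gram matrices
    b = [np.zeros(d) for _ in range(k)]    # per-arm reward-weighted contexts
    total = 0.0
    for t in range(horizon):
        x = contexts_fn(rng)               # shared context for this round
        scores = []
        for a in range(k):
            A_inv = np.linalg.inv(A[a])
            theta_hat = A_inv @ b[a]       # ridge-regression estimate
            # point estimate plus confidence-width exploration bonus
            scores.append(x @ theta_hat + alpha * np.sqrt(x @ A_inv @ x))
        arm = int(np.argmax(scores))
        reward = x @ theta_star[arm] + 0.1 * rng.standard_normal()
        A[arm] += np.outer(x, x)
        b[arm] += reward * x
        total += reward
    return total

d, k = 5, 3
rng0 = np.random.default_rng(42)
theta_star = [rng0.normal(size=d) for _ in range(k)]
ctx = lambda rng: rng.normal(size=d) / np.sqrt(d)
print("LinUCB total reward: %.1f" % linucb(ctx, theta_star, d, k))
```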
Thompson Sampling Tutorial
Updated Jan 25, 2019 - Jupyter Notebook
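And in the spirit of such tutorials, a minimal Beta-Bernoulli Thompson sampling sketch: keep a Beta posterior per arm, sample from each, and play the argmax. The arm means are illustrative.

```python
import random

def thompson_bernoulli(true_means, horizon=10_000, seed=0):
    """Minimal Beta-Bernoulli Thompson sampling sketch (illustrative)."""
    rng = random.Random(seed)
    k = len(true_means)
    alpha = [1] * k            # Beta prior: 1 + observed successes
    beta = [1] * k             # Beta prior: 1 + observed failures
    total = 0
    for _ in range(horizon):
        # draw one posterior sample per arm, then exploit the best draw
        samples = [rng.betavariate(alpha[a], beta[a]) for a in range(k)]
        arm = max(range(k), key=lambda a: samples[a])
        reward = 1 if rng.random() < true_means[arm] else 0
        alpha[arm] += reward
        beta[arm] += 1 - reward
        total += reward
    return total

print("Thompson total reward:", thompson_bernoulli([0.3, 0.5, 0.7]))
```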