
Multi-Armed Bandits

Implementations of the UCB1, Bayesian UCB, Epsilon Greedy, and EXP3 bandit algorithms, applied to the MovieLens 20M dataset. The algorithms are evaluated offline using replay, which scores a policy against logged data by counting only the events where the policy's choice matches the action recorded in the log.
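The repo's scripts contain the actual implementations; as an illustration of the evaluation method, here is a minimal sketch of UCB1 scored with replay. It assumes the arms are a fixed set of movies and the log is a time-ordered stream of (movie_index, rating) pairs, with ratings at or above a threshold counted as reward 1. The function name ucb1_replay and the 4.0 threshold are illustrative choices, not taken from the repo.

import numpy as np

def ucb1_replay(log, n_arms, reward_threshold=4.0):
    """Run UCB1 over logged events, updating and scoring only on matches."""
    counts = np.zeros(n_arms)   # pulls per arm (matched events only)
    values = np.zeros(n_arms)   # running mean reward per arm
    rewards = []                # rewards from matched events

    for logged_arm, rating in log:
        if counts.min() == 0:
            # Initialization: try each arm once before using the UCB rule.
            chosen = int(np.argmin(counts))
        else:
            # UCB1: pick the arm maximizing mean + sqrt(2 ln t / n_i).
            t = counts.sum()
            ucb = values + np.sqrt(2.0 * np.log(t) / counts)
            chosen = int(np.argmax(ucb))

        # Replay: the event counts only if the policy picked the logged arm.
        if chosen == logged_arm:
            reward = 1.0 if rating >= reward_threshold else 0.0
            counts[chosen] += 1
            values[chosen] += (reward - values[chosen]) / counts[chosen]
            rewards.append(reward)

    return np.array(rewards)

# Hypothetical usage on a tiny hand-made log of (movie_index, rating) events:
history = [(0, 5.0), (2, 3.0), (1, 4.5), (0, 2.0), (1, 5.0)]
rewards = ucb1_replay(history, n_arms=3)
print("matched events:", rewards.size,
      "mean reward:", rewards.mean() if rewards.size else float("nan"))

One consequence of replay worth noting: only a fraction of the logged events match the policy's choices, so the effective sample size shrinks as the number of arms grows.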

To reproduce:

git clone https://github.com/jldbc/bandits
cd bandits/bandits
bash run.sh

Experiment setup details

Implementation details and results

Final results:
