# Multi-Armed Bandits

Implementations of the UCB1, Bayesian UCB, Epsilon Greedy, and EXP3 bandit algorithms on the MovieLens-20M dataset. Algorithms are evaluated offline using the replay method.
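
To illustrate what replay evaluation means, here is a minimal sketch of the method paired with a UCB1 policy. It assumes a log of `(arm, reward)` events; all class names, the event format, and the synthetic log below are illustrative, not this repo's actual code.

```python
import math
import random


class UCB1:
    """UCB1 policy: play each arm once, then pick the arm maximizing
    mean reward plus an exploration bonus."""

    def __init__(self, n_arms):
        self.counts = [0] * n_arms    # pulls per arm
        self.values = [0.0] * n_arms  # running mean reward per arm
        self.t = 0                    # total pulls so far

    def select_arm(self):
        for arm, count in enumerate(self.counts):
            if count == 0:
                return arm  # initialization: try every arm once
        return max(
            range(len(self.counts)),
            key=lambda a: self.values[a]
            + math.sqrt(2 * math.log(self.t) / self.counts[a]),
        )

    def update(self, arm, reward):
        self.t += 1
        self.counts[arm] += 1
        # incremental mean update
        self.values[arm] += (reward - self.values[arm]) / self.counts[arm]


def replay(policy, logged_events):
    """Replay evaluation: keep a logged (arm, reward) event only when
    the policy would have chosen the same arm the logger did."""
    matched_rewards = []
    for logged_arm, reward in logged_events:
        if policy.select_arm() == logged_arm:
            policy.update(logged_arm, reward)
            matched_rewards.append(reward)
    return sum(matched_rewards) / max(len(matched_rewards), 1)


if __name__ == "__main__":
    random.seed(0)
    # Synthetic uniformly-logged data: higher-numbered arms pay off more.
    log = []
    for _ in range(10_000):
        arm = random.randrange(3)
        log.append((arm, float(random.random() < 0.2 + 0.2 * arm)))
    print(f"mean replay reward: {replay(UCB1(n_arms=3), log):.3f}")
```

Replay scores only the events where the policy agrees with the logged action, which yields an unbiased offline estimate of the policy's online reward when the logged actions were chosen uniformly at random (Li et al., 2011).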

To reproduce:

```bash
git clone https://github.com/jldbc/bandits
cd bandits/bandits
bash run.sh
```

- Experiment setup details
- Implementation details and results

Final results: