Bandits

Multi-Armed Bandit algorithms applied to the MovieLens 20M dataset

Install / Use

/learn @jldbc/Bandits
About this skill

Quality Score

0/100

Supported Platforms

Universal

README

Multi-Armed Bandits

Implementations of the UCB1, Bayesian UCB, Epsilon Greedy, and EXP3 bandit algorithms on the MovieLens-20M dataset. Algorithms are evaluated offline using replay.
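As a rough illustration of one of the listed algorithms, here is a minimal sketch of the UCB1 selection rule (mean reward plus an exploration bonus). This is not the repo's implementation; the function name and state layout are illustrative.

```python
import math
import random

def ucb1_select(counts, rewards):
    """Pick an arm by the UCB1 rule: empirical mean + sqrt(2 ln t / n_arm).
    `counts[a]` is how many times arm a was pulled; `rewards[a]` is its
    cumulative reward. (Illustrative sketch, not the repo's code.)"""
    total = sum(counts)
    # Play each arm once before applying the formula.
    for arm, n in enumerate(counts):
        if n == 0:
            return arm
    scores = [
        rewards[arm] / counts[arm] + math.sqrt(2 * math.log(total) / counts[arm])
        for arm in range(len(counts))
    ]
    return max(range(len(scores)), key=scores.__getitem__)
```

In a simulation with two Bernoulli arms, the rule concentrates pulls on the higher-payoff arm while still occasionally sampling the other.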

To reproduce:

git clone https://github.com/jldbc/bandits
cd bandits/bandits
bash run.sh
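The offline replay evaluation mentioned above can be sketched as follows: the policy is scored only on logged rounds where its choice happens to match the logged action, so logged data can stand in for live interaction. This is a simplified sketch under assumed log and policy interfaces, not the repo's evaluation code.

```python
def replay_evaluate(policy, log):
    """Offline replay evaluation (sketch). `log` is a list of
    (logged_arm, reward) pairs; `policy` maps the matched history so far
    to an arm. Rounds where the policy disagrees with the log are skipped."""
    history, payoffs = [], []
    for logged_arm, reward in log:
        chosen = policy(history)
        if chosen == logged_arm:
            # Only matched rounds count toward the estimate.
            history.append((chosen, reward))
            payoffs.append(reward)
    return sum(payoffs) / max(len(payoffs), 1)
```

Because only matching rounds are kept, replay discards much of the log, which is why large datasets like MovieLens-20M are a natural fit for this style of evaluation.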

Experiment setup details

Implementation details and results

Final results:

<img src="/results/final_bandit_results.png" alt="Final bandit results">

View on GitHub
GitHub Stars: 57
Category: Development
Updated: 5 months ago
Forks: 17

Languages

Python

Security Score

92/100

Audited on Oct 12, 2025

No findings