# Bandits

Multi-Armed Bandit algorithms applied to the MovieLens 20M dataset.
## Multi-Armed Bandits

Implementations of the UCB1, Bayesian UCB, Epsilon Greedy, and EXP3 bandit algorithms on the MovieLens 20M dataset. Algorithms are evaluated offline using replay.
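To illustrate the offline replay idea: the evaluator steps through logged (arm, reward) events and only counts an event when the policy's choice matches the arm that was actually logged, discarding the rest. The sketch below pairs this with a minimal epsilon-greedy policy; the function and variable names are illustrative and are not the repo's actual API.

```python
import random

def epsilon_greedy_choose(counts, values, epsilon):
    """With probability epsilon explore a random arm, else exploit the best."""
    if random.random() < epsilon:
        return random.randrange(len(counts))
    return max(range(len(values)), key=lambda a: values[a])

def replay_evaluate(logged_events, n_arms, epsilon=0.1):
    """Offline replay evaluation: keep only events where the policy's
    chosen arm matches the logged arm; unmatched events are discarded."""
    counts = [0] * n_arms
    values = [0.0] * n_arms
    rewards = []
    for logged_arm, reward in logged_events:
        arm = epsilon_greedy_choose(counts, values, epsilon)
        if arm != logged_arm:
            continue  # policy disagreed with the log; event is unusable
        counts[arm] += 1
        values[arm] += (reward - values[arm]) / counts[arm]  # incremental mean
        rewards.append(reward)
    return sum(rewards) / max(len(rewards), 1)  # mean reward on matched events
```

Replay throws away a large share of the log (roughly 1/n_arms of events match under a uniform logging policy), which is why a dataset as large as MovieLens 20M is useful for this style of evaluation.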
To reproduce:

```shell
git clone https://github.com/jldbc/bandits
cd bandits/bandits
bash run.sh
```
## Implementation details and results

Final results:
<img src="/results/final_bandit_results.png" alt="Final bandit results" />
