bandit-algorithm
Here are 20 public repositories matching this topic...
A presentation giving a precise, detailed explanation of core reinforcement learning concepts.
Updated Dec 25, 2017
Updated Jan 23, 2018 - Python
Movie recommendation using cascading bandits, namely CascadeLinTS and CascadeLinUCB.
Updated May 17, 2018 - MATLAB
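For orientation, here is a minimal sketch of the cascading bandit setting this repo works in, using a simple UCB index per item (in the spirit of CascadeUCB1; the repo's CascadeLinTS and CascadeLinUCB add linear payoff models on top). Class and parameter names are illustrative, not the repo's API.

```python
import math

# Minimal cascading-bandit sketch in the spirit of CascadeUCB1.
# The user scans a ranked list of K items and clicks the first
# attractive one; items before the click count as non-clicks.

class CascadeUCB1:
    def __init__(self, n_items, k):
        self.n, self.k = n_items, k
        self.pulls = [0] * n_items   # observations per item
        self.clicks = [0] * n_items  # clicks per item
        self.t = 0

    def recommend(self):
        self.t += 1
        def ucb(i):
            if self.pulls[i] == 0:
                return float("inf")  # force initial exploration
            mean = self.clicks[i] / self.pulls[i]
            return mean + math.sqrt(1.5 * math.log(self.t) / self.pulls[i])
        return sorted(range(self.n), key=ucb, reverse=True)[:self.k]

    def update(self, ranking, click_pos):
        # click_pos = index of the first click in `ranking`, or None.
        last = click_pos if click_pos is not None else len(ranking) - 1
        for pos in range(last + 1):
            item = ranking[pos]
            self.pulls[item] += 1
            if pos == click_pos:
                self.clicks[item] += 1
```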
Research project on automated A/B testing of software by evolutionary bandits.
Updated Jan 17, 2019 - MATLAB
Thompson Sampling Tutorial
Updated Jan 25, 2019 - Jupyter Notebook
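The textbook case such tutorials cover is Beta-Bernoulli Thompson sampling: keep a Beta posterior per arm, sample from each posterior, and pull the arm with the best draw. Whether this repo uses exactly this parameterization is an assumption; the sketch below is self-contained.

```python
import random

# Beta-Bernoulli Thompson sampling: posterior per arm is
# Beta(successes + 1, failures + 1), starting from a uniform prior.

def thompson_sampling(true_rates, horizon=10_000, seed=0):
    rng = random.Random(seed)
    wins = [0] * len(true_rates)
    losses = [0] * len(true_rates)
    total_reward = 0
    for _ in range(horizon):
        samples = [rng.betavariate(wins[a] + 1, losses[a] + 1)
                   for a in range(len(true_rates))]
        arm = samples.index(max(samples))
        reward = 1 if rng.random() < true_rates[arm] else 0
        wins[arm] += reward
        losses[arm] += 1 - reward
        total_reward += reward
    return total_reward

print(thompson_sampling([0.1, 0.5, 0.7]))  # play concentrates on the 0.7 arm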
Updated Jul 1, 2019 - Jupyter Notebook
Adversarial multi-armed bandit algorithms
Updated Jul 8, 2019 - MATLAB
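The canonical algorithm in the adversarial setting is Exp3 (Auer et al.): exponential weights over arms with importance-weighted reward estimates, so unplayed arms are not penalized. A minimal Python sketch, assuming rewards in [0, 1] (the repo itself is MATLAB):

```python
import math
import random

# Exp3: exponential-weight algorithm for adversarial bandits.

def exp3(reward_fn, n_arms, horizon, gamma=0.1, seed=0):
    rng = random.Random(seed)
    weights = [1.0] * n_arms
    for t in range(horizon):
        total = sum(weights)
        # Mix the weight distribution with uniform exploration.
        probs = [(1 - gamma) * w / total + gamma / n_arms for w in weights]
        arm = rng.choices(range(n_arms), weights=probs)[0]
        reward = reward_fn(t, arm)      # adversary chooses rewards in [0, 1]
        estimate = reward / probs[arm]  # importance-weighted estimate
        weights[arm] *= math.exp(gamma * estimate / n_arms)
    return weights
```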
Solutions and figures for problems from Reinforcement Learning: An Introduction by Sutton & Barto.
Updated Jul 16, 2019 - Python
Updated Mar 4, 2020 - Python
Reinforcement learning
Updated Jun 20, 2020 - Scala
Updated Oct 28, 2020 - Jupyter Notebook
A small collection of Bandit Algorithms (ETC, E-Greedy, Elimination, UCB, Exp3, LinearUCB, and Thompson Sampling)
Updated May 25, 2022 - Python
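Of the algorithms listed, UCB is the standard baseline; a minimal UCB1 sketch (Auer, Cesa-Bianchi & Fischer) follows. This is a generic implementation, not the repo's code: play each arm once, then always pull the arm maximizing empirical mean plus sqrt(2 ln t / n_pulls).

```python
import math
import random

# UCB1 on Bernoulli arms: optimism in the face of uncertainty.

def ucb1(true_rates, horizon=10_000, seed=0):
    rng = random.Random(seed)
    n = len(true_rates)
    pulls, sums = [0] * n, [0.0] * n
    for t in range(1, horizon + 1):
        if t <= n:
            arm = t - 1  # initialization: pull each arm once
        else:
            arm = max(range(n), key=lambda a: sums[a] / pulls[a]
                      + math.sqrt(2 * math.log(t) / pulls[a]))
        reward = 1.0 if rng.random() < true_rates[arm] else 0.0
        pulls[arm] += 1
        sums[arm] += reward
    return [s / p for s, p in zip(sums, pulls)]  # empirical means
```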
Solutions to the Stanford CS234 Reinforcement Learning (2022) course assignments.
Updated Jun 27, 2022 - Python
Client that handles the administration of StreamingBandit online, or straight from your desktop. Set up and run streaming (contextual) bandit experiments in your browser.
Updated Dec 7, 2022 - JavaScript
Privacy-Preserving Bandits (MLSys'20)
Updated Dec 8, 2022 - Jupyter Notebook
The LinUCB (Linear Upper Confidence Bound) contextual bandit algorithm, as proposed by Li, Chu, Langford, and Schapire.
Updated Feb 2, 2023 - Java
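For reference, a minimal Python sketch of disjoint LinUCB as described in the Li et al. (2010) paper (the repo itself is Java): one ridge-regression model per arm, choosing the arm that maximizes the predicted reward plus an exploration bonus.

```python
import numpy as np

# Disjoint LinUCB: per-arm ridge regression (A = X^T X + I, b = X^T r).
# Score for context x: theta^T x + alpha * sqrt(x^T A^{-1} x).

class LinUCB:
    def __init__(self, n_arms, dim, alpha=1.0):
        self.alpha = alpha
        self.A = [np.eye(dim) for _ in range(n_arms)]
        self.b = [np.zeros(dim) for _ in range(n_arms)]

    def choose(self, x):
        scores = []
        for A, b in zip(self.A, self.b):
            A_inv = np.linalg.inv(A)
            theta = A_inv @ b  # ridge-regression estimate
            scores.append(theta @ x + self.alpha * np.sqrt(x @ A_inv @ x))
        return int(np.argmax(scores))

    def update(self, arm, x, reward):
        self.A[arm] += np.outer(x, x)
        self.b[arm] += reward * x
```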
Another A/B test library
Updated Nov 5, 2024 - Scala
Network-Oriented Repurposing of Drugs Python Package
Updated Oct 29, 2024 - Jupyter Notebook
Online learning approaches to optimize database join operations in PostgreSQL.
Updated Nov 4, 2024 - C
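The repo's description doesn't specify its method beyond "online learning"; one natural framing, shown below as a hypothetical sketch, treats candidate join orders for a recurring query as bandit arms and rewards low observed latency. `run_plan` and the plan list are stand-ins, not the repo's or PostgreSQL's API.

```python
import random

# Hypothetical epsilon-greedy over candidate join orders:
# each arm is one join order; cost is the measured latency.

def pick_join_order(plans, run_plan, rounds=100, eps=0.1, seed=0):
    rng = random.Random(seed)
    counts = [0] * len(plans)
    mean_cost = [0.0] * len(plans)
    for _ in range(rounds):
        if rng.random() < eps or 0 in counts:
            # Explore: try an untried plan first, else a random one.
            arm = counts.index(0) if 0 in counts else rng.randrange(len(plans))
        else:
            arm = min(range(len(plans)), key=lambda a: mean_cost[a])
        latency = run_plan(plans[arm])  # e.g. timing from EXPLAIN ANALYZE
        counts[arm] += 1
        mean_cost[arm] += (latency - mean_cost[arm]) / counts[arm]
    return plans[min(range(len(plans)), key=lambda a: mean_cost[a])]
```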