A multi-armed bandit implementation in Python
Updated Mar 7, 2017 · Python
Implementations of basic concepts under the Reinforcement Learning umbrella. This project is a collection of assignments from CS747: Foundations of Intelligent and Learning Agents (Autumn 2017) at IIT Bombay
Implementations of the bandit algorithms with unordered and ordered slates described in the paper "Non-Stochastic Bandit Slate Problems" by Kale et al. (2010).
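The slate algorithms of Kale et al. build on the EXP3 exponential-weights scheme for adversarial (non-stochastic) rewards. As background, here is a minimal sketch of basic single-arm EXP3 (the function name, `gamma` default, and reward-matrix interface are illustrative, not any particular repo's API):

```python
import math
import random

def exp3(k, rewards, T, gamma=0.1, seed=None):
    """Basic EXP3: exponential weights mixed with gamma-uniform exploration.
    rewards[t][a] is the (bounded in [0, 1]) reward of arm a at round t;
    only the pulled arm's reward is observed."""
    rng = random.Random(seed)
    weights = [1.0] * k
    pulls = []
    for t in range(T):
        total = sum(weights)
        probs = [(1 - gamma) * w / total + gamma / k for w in weights]
        arm = rng.choices(range(k), weights=probs)[0]
        # Importance-weighted reward estimate keeps the update unbiased.
        x_hat = rewards[t][arm] / probs[arm]
        weights[arm] *= math.exp(gamma * x_hat / k)
        pulls.append(arm)
    return pulls
```

Because exploration probability is bounded below by `gamma / k`, the importance weights stay finite even when one arm dominates.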
Implementation of the X-armed Bandits algorithm, as detailed in the paper "X-armed Bandits" by Bubeck et al. (2011).
Python implementations of the UCB, EXP3, and epsilon-greedy algorithms
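As a rough sketch of two of those selection rules (the function names and the incremental-mean update are illustrative, not any specific repo's interface):

```python
import math
import random

def epsilon_greedy(values, epsilon, rng=random):
    """With probability epsilon explore a uniform random arm, else exploit
    the arm with the highest estimated mean reward."""
    if rng.random() < epsilon:
        return rng.randrange(len(values))
    return max(range(len(values)), key=lambda a: values[a])

def ucb1(counts, values, t):
    """UCB1: play each arm once, then maximize mean + confidence bonus."""
    for a, n in enumerate(counts):
        if n == 0:
            return a
    return max(range(len(counts)),
               key=lambda a: values[a] + math.sqrt(2 * math.log(t) / counts[a]))

def update(counts, values, arm, reward):
    """Incremental update of the pulled arm's empirical mean."""
    counts[arm] += 1
    values[arm] += (reward - values[arm]) / counts[arm]
```

UCB1 needs no tuning parameter; epsilon-greedy trades simplicity for a fixed exploration rate that never decays unless you schedule it yourself.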
Decentralized Intelligent Resource Allocation for LoRaWAN Networks
Foundations Of Intelligent Learning Agents (FILA) Assignments
Assignments for CS747 - Foundations of Intelligent and Learning Agents
WIP: a library and AWS SDK for non-contextual and contextual multi-armed bandit (MAB) algorithms covering multiple use cases
DPE code - Code used in "Optimal Algorithms for Multiplayer Multi-Armed Bandits" (AISTATS 2020)
Bayesian Optimization for Categorical and Continuous Inputs
Development of algorithms for reinforcement learning; specifically, software implementations of the algorithms and policies described in the paper "Batched Multi-armed Bandits Problems" by Zijun Gao, Yanjun Han, Zhimei Ren, and Zhengqing Zhou.
A beer recommendation system using a multi-armed bandit approach to solve cold-start problems
Code for Policy Optimization as Online Learning with Mediator Feedback
An improved version of the TuRBO algorithm for the black-box optimization competition held at NeurIPS 2020
This repository addresses popular content recommender design for public transportation, as discussed in my Ph.D. thesis, "Popular Content Distribution in Public Transportation Using Artificial Intelligence Techniques". The code for the entire content recommender design is provided twice, in two programming languages: Pyth…
Implementation of multi-armed bandits from scratch
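A from-scratch implementation usually starts with the environment itself; a minimal sketch of a Bernoulli bandit that also tracks expected (pseudo-)regret might look like this (the class name and `seed` parameter are assumptions for illustration):

```python
import random

class BernoulliBandit:
    """A k-armed Bernoulli bandit: arm i pays 1 with probability probs[i]."""

    def __init__(self, probs, seed=None):
        self.probs = list(probs)
        self.rng = random.Random(seed)
        self.best_mean = max(self.probs)
        self.regret = 0.0  # cumulative expected regret vs. the best arm

    def pull(self, arm):
        """Sample a 0/1 reward and accumulate the expected regret of the pull."""
        self.regret += self.best_mean - self.probs[arm]
        return 1 if self.rng.random() < self.probs[arm] else 0
```

Tracking expected rather than realized regret removes sampling noise from the metric, which makes learning curves for different policies easier to compare.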
Multi-armed bandit implementations using the Jester dataset
Online Ranking with Multi-Armed-Bandits