OpenDILab Decision AI Engine. The Most Comprehensive Reinforcement Learning Framework B.P.
[NeurIPS 2023 Spotlight] LightZero: A Unified Benchmark for Monte Carlo Tree Search in General Sequential Decision Scenarios (awesome MCTS)
An artificial intelligence platform for StarCraft II with large-scale distributed training and grandmaster-level agents.
The official implementation of Self-Play Fine-Tuning (SPIN)
The official implementation of Self-Play Preference Optimization (SPPO)
A Massively Parallel Large Scale Self-Play Framework
AlphaZero implementation for Othello, Connect-Four and Tic-Tac-Toe based on "Mastering the game of Go without human knowledge" and "Mastering Chess and Shogi by Self-Play with a General Reinforcement Learning Algorithm" by DeepMind.
The exact code used by the team "liveinparis" in the Kaggle football competition, ranked 6th of 1,141 teams.
A very fast implementation of AlphaZero, applied to games like Splendor, Santorini, The Little Prince, … A browser version is available.
TD-Gammon implementation
A Backgammon environment for OpenAI Gym.
AI agents for the Bavarian card game Schafkopf, trained with reinforcement learning.
An implementation of the paper "Model-Free Episodic Control".
Using self-play, MCTS, and a deep neural network to create a Hearthstone AI player.
Code base for Social Robot Tree Search (SoRTS).
A gym environment to train chatbots.
Code repository for On the interaction between supervision and self-play in emergent communication (ICLR 2020)
A clean and easy-to-follow implementation of MuZero, AlphaZero, and self-play reinforcement learning algorithms for any game; a minimal self-play loop is sketched after this list.
An implementation of the AlphaZero algorithm for adversarial games to be used with the machine learning framework of your choice
An emulator and AI for Shadowverse.
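Most of the entries above follow the same self-play recipe: the agent plays games against itself, picks moves with a tree search such as MCTS, and labels every visited position with the final result so a policy or value model can later be trained on the data. The sketch below illustrates that loop under deliberately simple assumptions: Tic-Tac-Toe as the game, and plain UCT with random rollouts in place of a learned network. Every name in it (`Node`, `mcts_move`, `self_play_game`, and so on) is hypothetical and is not taken from any repository listed here.

```python
"""Minimal self-play sketch: plain UCT search with random rollouts on Tic-Tac-Toe.
Toy illustration only; all names here are hypothetical and not from any listed repo."""
import math
import random

WIN_LINES = [(0, 1, 2), (3, 4, 5), (6, 7, 8),
             (0, 3, 6), (1, 4, 7), (2, 5, 8),
             (0, 4, 8), (2, 4, 6)]

def winner(board):
    """+1 / -1 if that player won, 0 for a draw, None if the game continues."""
    for a, b, c in WIN_LINES:
        if board[a] != 0 and board[a] == board[b] == board[c]:
            return board[a]
    return 0 if all(v != 0 for v in board) else None

def legal_moves(board):
    return [i for i, v in enumerate(board) if v == 0]

def apply_move(board, move, player):
    nxt = list(board)
    nxt[move] = player
    return tuple(nxt)

class Node:
    """One search-tree node; `player` is the side to move in `board`."""
    def __init__(self, board, player):
        self.board, self.player = board, player
        self.children = {}   # move -> Node
        self.visits = 0
        self.value = 0.0     # summed game results from +1's point of view

def rollout(board, player):
    """Finish the game with uniformly random moves; result is from +1's view."""
    result = winner(board)
    while result is None:
        board = apply_move(board, random.choice(legal_moves(board)), player)
        player = -player
        result = winner(board)
    return result

def ucb_score(parent, child, c=1.4):
    """Exploitation (sign-flipped for the side to move) plus exploration bonus."""
    q = child.value / child.visits if child.visits else 0.0
    return parent.player * q + c * math.sqrt(math.log(parent.visits + 1) / (child.visits + 1))

def mcts_move(board, player, simulations=200):
    root = Node(board, player)
    for _ in range(simulations):
        node, path = root, [root]
        # Selection: walk down while the current node is fully expanded.
        while node.children and len(node.children) == len(legal_moves(node.board)):
            node = max(node.children.values(), key=lambda ch: ucb_score(path[-1], ch))
            path.append(node)
        # Expansion: add one untried move if the position is not terminal.
        if winner(node.board) is None:
            move = random.choice([m for m in legal_moves(node.board) if m not in node.children])
            child = Node(apply_move(node.board, move, node.player), -node.player)
            node.children[move] = child
            node = child
            path.append(node)
        # Simulation and backpropagation.
        result = rollout(node.board, node.player)
        for n in path:
            n.visits += 1
            n.value += result
    # Play the most-visited root move.
    return max(root.children.items(), key=lambda kv: kv[1].visits)[0]

def self_play_game():
    """Play one game of the agent against itself; return outcome-labelled positions."""
    board, player, history = (0,) * 9, 1, []
    result = winner(board)
    while result is None:
        history.append((board, player))
        move = mcts_move(board, player)
        board = apply_move(board, move, player)
        player = -player
        result = winner(board)
    # Label every visited position with the final outcome, as AlphaZero-style
    # pipelines do before training a value network on self-play data.
    return [(b, p, result) for b, p in history]

if __name__ == "__main__":
    positions = self_play_game()
    print(f"self-play game produced {len(positions)} positions, outcome {positions[0][2]:+d}")
```

In the AlphaZero- and MuZero-style projects above, the random rollout is replaced by a neural-network value estimate and a learned policy prior guides the search, while the labelled positions are fed back into training; the surrounding self-play loop keeps the same shape.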