From ed9fe216aa31f4c74c9893f3c215f0376f59da2f Mon Sep 17 00:00:00 2001
From: Nikhil Barhate <30340547+nikhilbarhate99@users.noreply.github.com>
Date: Fri, 9 Apr 2021 16:20:24 +0530
Subject: [PATCH] Update README.md

---
 README.md | 13 ++++++++++++-
 1 file changed, 12 insertions(+), 1 deletion(-)

diff --git a/README.md b/README.md
index 0d0c992..9080ce0 100644
--- a/README.md
+++ b/README.md
@@ -37,7 +37,18 @@ A concise explaination of PPO algorithm can be found [here](https://stackoverflo
 
 #### [Open in Google Colab](https://colab.research.google.com/github/nikhilbarhate99/PPO-PyTorch/blob/master/PPO_colab.ipynb) [![Open In Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/github/nikhilbarhate99/PPO-PyTorch/blob/master/PPO_colab.ipynb)
 
-
+## Citing
+
+Please use this BibTeX entry if you want to cite this repository in your publications:
+
+    @misc{pytorch_minimal_ppo,
+        author = {Barhate, Nikhil},
+        title = {Minimal PyTorch Implementation of Proximal Policy Optimization},
+        year = {2021},
+        publisher = {GitHub},
+        journal = {GitHub repository},
+        howpublished = {\url{https://github.com/nikhilbarhate99/PPO-PyTorch}},
+    }
 ## Results
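
A minimal LaTeX sketch of using the BibTeX entry this patch adds; the file name `references.bib` and the surrounding document are assumptions, not part of the patch:

```latex
\documentclass{article}
\begin{document}
% Cite the repository via the key defined in the @misc entry above.
This work uses a minimal PyTorch PPO implementation~\cite{pytorch_minimal_ppo}.

% Assumes the @misc entry has been saved to references.bib;
% the plain style silently ignores nonstandard @misc fields
% such as journal = {GitHub repository}.
\bibliographystyle{plain}
\bibliography{references}
\end{document}
```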