
Forgot to copy policy to policy_old during PPO initialization? #10

Closed
YilunZhou opened this issue Sep 25, 2019 · 1 comment
@YilunZhou

Should there be a `self.policy_old.load_state_dict(self.policy.state_dict())` on line 85 of PPO.py, after the initialization of the PPO object? PyTorch's random initialization does not guarantee that the two policies start with the same weights. The same issue applies to PPO_continuous.py.
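For reference, here is a minimal, self-contained sketch of the problem and the suggested fix. The `Policy` class below is a hypothetical stand-in for the repo's actual `ActorCritic` network; only the final `load_state_dict` call mirrors the proposed change.

```python
import torch
import torch.nn as nn

# Hypothetical stand-in for the ActorCritic network in PPO.py.
class Policy(nn.Module):
    def __init__(self, state_dim=4, action_dim=2):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(state_dim, 64), nn.Tanh(), nn.Linear(64, action_dim)
        )

    def forward(self, x):
        return self.net(x)

policy = Policy()
policy_old = Policy()  # constructed separately, so its random weights differ

x = torch.randn(1, 4)
print(torch.allclose(policy(x), policy_old(x)))  # almost certainly False

# Suggested fix: copy the weights so both policies start identical.
policy_old.load_state_dict(policy.state_dict())
print(torch.allclose(policy(x), policy_old(x)))  # True
```

Without the copy, the first update computes importance ratios between two unrelated random networks rather than between the new and old versions of the same policy.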

nikhilbarhate99 added a commit that referenced this issue Sep 26, 2019
@nikhilbarhate99 (Owner)

Thanks!
