Error when 'learning_starts' Parameter is Smaller Than 'topk' in Notebook #10

KoOBaALT opened this issue Nov 21, 2023 · 0 comments


I encountered an error in a notebook when the learning_starts parameter is smaller than the k used in topk. Specifically, the error occurs in the following call:

lge.explore(num_timesteps)

It seems related to the default learning_starts parameter of lge being set to 100:

def __init__(... , learning_starts: int = 100, ...)

This causes an error because topk raises a RuntimeError when the number of elements along the reduced dimension is smaller than the parameter k:

    cdist = torch.cdist(x, samples)  # pairwise distances, shape (num_x, num_samples)
    dist_to_kst = cdist.topk(k, largest=False)[0][:, -1]  # distance to the k-th nearest sample
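
For context, a minimal standalone reproduction (the shapes here are assumed purely for illustration):

    import torch

    # Assumed shapes for illustration: far fewer stored samples than k = 1000.
    x = torch.randn(5, 3)         # 5 query points in a 3-d latent space
    samples = torch.randn(50, 3)  # only 50 stored samples so far

    cdist = torch.cdist(x, samples)  # shape (5, 50)
    # topk reduces over the last dimension (size 50), so k = 1000 raises
    # a RuntimeError ("selected index k out of range").
    dist_to_kst = cdist.topk(1000, largest=False)[0][:, -1]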

I was thinking of replacing k with:

k = min(1000, x.shape[0])

However, I'm uncertain if there's a specific intention behind hardcoding k = 1000. Could this be a viable solution, or is there a particular reason for the current implementation?
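
For reference, here is a minimal sketch of the clamped variant; dist_to_kth_nearest is a hypothetical helper name, not the actual function in lge, and since topk reduces over the samples dimension, this sketch clamps k against samples.shape[0]:

    import torch

    def dist_to_kth_nearest(x: torch.Tensor, samples: torch.Tensor, k: int = 1000) -> torch.Tensor:
        # Hypothetical helper, not the actual lge implementation: distance
        # from each row of x to its k-th nearest stored sample.
        k = min(k, samples.shape[0])     # clamp so topk never exceeds the available samples
        cdist = torch.cdist(x, samples)  # shape (num_x, num_samples)
        return cdist.topk(k, largest=False)[0][:, -1]

With clamping like this, early calls (before enough samples have been collected) would fall back to the farthest available neighbor instead of raising.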
