
Nit: Tensor padding utility function (DRY) #154

Closed
josephdviviano opened this issue Feb 13, 2024 · 1 comment
Assignees
Labels
small Small enhancement

Comments

@josephdviviano
Collaborator

In `trajectories.py` we have the following:

                    # TODO: This should be a single reused function 
                    # The size of self needs to grow to match other along dim=0.
                    if self_shape[0] < other_shape[0]:
                        pad_dim = required_first_dim - self_shape[0]
                        pad_dim_full = (pad_dim,) + tuple(self_shape[1:])
                        output_padding = torch.full(
                            pad_dim_full,
                            fill_value=-float("inf"),
                            dtype=self.estimator_outputs.dtype,  # TODO: This isn't working! Hence the cast below...
                            device=self.estimator_outputs.device,
                        )
                        self.estimator_outputs = torch.cat(
                            (self.estimator_outputs, output_padding),
                            dim=0,
                        )


This padding logic appears multiple times in the library and should be abstracted into a single utility function.
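A minimal sketch of such a utility, extracted from the snippet above. The function name `pad_dim0_to` and its signature are hypothetical (the actual PR may differ); it assumes a floating-point tensor, since `-inf` is not representable in integer dtypes:

```python
import torch


def pad_dim0_to(
    tensor: torch.Tensor,
    required_first_dim: int,
    fill_value: float = -float("inf"),
) -> torch.Tensor:
    """Pad `tensor` along dim=0 with `fill_value` until its first
    dimension reaches `required_first_dim`. Returns the tensor
    unchanged if it is already large enough."""
    pad_dim = required_first_dim - tensor.shape[0]
    if pad_dim <= 0:
        return tensor
    # Padding block: same trailing shape, dtype, and device as the input.
    pad_shape = (pad_dim,) + tuple(tensor.shape[1:])
    padding = torch.full(
        pad_shape,
        fill_value=fill_value,
        dtype=tensor.dtype,
        device=tensor.device,
    )
    return torch.cat((tensor, padding), dim=0)
```

Call sites like the one quoted above would then reduce to a single line, e.g. `self.estimator_outputs = pad_dim0_to(self.estimator_outputs, required_first_dim)`.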

@josephdviviano
Collaborator Author

Closing, as we have a PR.

josephdviviano added a commit that referenced this issue Feb 19, 2024
removed utility function (DRY) addressing #154