Fix the nonlinear bug and add mnist example #110

Merged
merged 2 commits into from
Jan 18, 2023

Conversation

Orienfish (Contributor)

Add one more step to the nonlinear encoding: normalize the weights, similar to the random projection case.
To test it, I created a mnist_nonlinear.py in the examples.
Without this normalization, the accuracy of running python mnist_nonlinear.py is 10.75%.
After adding this normalization, the accuracy becomes 83.93%.

Another example that justifies this change is in the random_projection.py example:

    def encode(self, x):
        enc = self.project(x)  # random projection with normalized weights
        sample_hv = torch.cos(enc + self.bias) * torch.sin(enc)  # nonlinear encoding
        return torchhd.hard_quantize(sample_hv)  # quantize to bipolar {-1, +1}

This encoding function is nothing but a random projection followed by the nonlinear encoding.
Since the weights are already normalized in the random projection, they should also be normalized in the nonlinear encoding.
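The idea above can be sketched as follows. This is a minimal, illustrative version of the proposed fix, not torchhd's actual code: each row of the random-projection weight is normalized to unit norm before the sinusoidal nonlinearity is applied, and the quantization helper here is a stand-in for torchhd.hard_quantize.

```python
import math
import torch

torch.manual_seed(0)
in_features, dimensions = 784, 10000  # MNIST pixels -> hypervector size

# Random projection weight; normalize each row to unit norm (the fix).
weight = torch.randn(dimensions, in_features)
weight = torch.nn.functional.normalize(weight, dim=1)

# Random phase shift, as in the nonlinear (sinusoid) encoding.
bias = torch.empty(dimensions).uniform_(0, 2 * math.pi)

def encode(x):
    enc = torch.nn.functional.linear(x, weight)  # random projection
    sample_hv = torch.cos(enc + bias) * torch.sin(enc)  # nonlinearity
    # Hard quantize to a bipolar {-1, +1} hypervector.
    ones = torch.ones_like(sample_hv)
    return torch.where(sample_hv > 0, ones, -ones)

hv = encode(torch.randn(in_features))
```

Without the normalize step, the projected values grow with the input dimension and the cos/sin terms oscillate essentially at random, which is consistent with the near-chance 10.75% accuracy reported above.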

Please let me know if you have any questions or concerns.

@mikeheddes (Member) left a comment:


Thank you for the PR and for finding this issue. I verified the accuracy gains that you reported. If you could remove the unused NUM_LEVELS variable then I will go ahead and merge your request.


    DIMENSIONS = 10000
    IMG_SIZE = 28
    NUM_LEVELS = 1000
@mikeheddes (Member) commented on this snippet:

This variable is not used anymore, could you remove it?

Orienfish (Contributor, Author)

Thanks for your quick review! I removed the NUM_LEVELS variable as you suggested. I also added a one-line comment at the beginning of the mnist_nonlinear.py example.

@mikeheddes (Member)
Congrats on your first contribution to the library! Thank you for your help :)

@mikeheddes mikeheddes merged commit 909e12d into hyperdimensional-computing:main Jan 18, 2023