This repository was archived by the owner on Jun 14, 2024. It is now read-only.

Fix a bug for SVGP regression with minibatch training #148

Merged
zhenwendai merged 2 commits into amzn:develop on Jan 25, 2019

Conversation

zhenwendai
Contributor

  • Fix a bug for SVGP regression with minibatch training
  • Implement the Matern kernel family
  • Remove broadcast_w_samples from the stationary, RBF and linear kernel implementations for efficiency.
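For context, the Matern kernel family mentioned above can be sketched in plain NumPy. These are hypothetical standalone functions illustrating the standard Matern 3/2 and 5/2 covariance formulas, not MXFusion's actual kernel classes (which are built on MXNet operators):

```python
import numpy as np

def matern32(x1, x2, lengthscale=1.0, variance=1.0):
    # Matern 3/2: k(r) = v * (1 + sqrt(3) r / l) * exp(-sqrt(3) r / l)
    r = np.abs(x1[:, None] - x2[None, :])
    s = np.sqrt(3.0) * r / lengthscale
    return variance * (1.0 + s) * np.exp(-s)

def matern52(x1, x2, lengthscale=1.0, variance=1.0):
    # Matern 5/2: k(r) = v * (1 + sqrt(5) r / l + 5 r^2 / (3 l^2)) * exp(-sqrt(5) r / l)
    # note s^2 / 3 == 5 r^2 / (3 l^2) with s = sqrt(5) r / l
    r = np.abs(x1[:, None] - x2[None, :])
    s = np.sqrt(5.0) * r / lengthscale
    return variance * (1.0 + s + s**2 / 3.0) * np.exp(-s)
```

Both kernels produce symmetric positive semi-definite covariance matrices equal to `variance` on the diagonal; 3/2 gives once-differentiable sample paths, 5/2 twice-differentiable.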

By submitting this pull request, I confirm that my contribution is made under the terms of the Apache 2.0 license.

Zhenwen Dai added 2 commits January 25, 2019 10:28
Remove the dependency on broadcast_w_samples operators in RBF and Linear kernel.
@codecov-io

codecov-io commented Jan 25, 2019

Codecov Report

Merging #148 into develop will increase coverage by 0.08%.
The diff coverage is 92.72%.

Impacted file tree graph

@@             Coverage Diff             @@
##           develop     #148      +/-   ##
===========================================
+ Coverage     85.1%   85.19%   +0.08%     
===========================================
  Files           77       78       +1     
  Lines         3814     3850      +36     
  Branches       653      654       +1     
===========================================
+ Hits          3246     3280      +34     
- Misses         375      376       +1     
- Partials       193      194       +1
Impacted Files Coverage Δ
...on/components/distributions/gp/kernels/__init__.py 100% <100%> (ø) ⬆️
mxfusion/inference/minibatch_loop.py 77.77% <100%> (ø) ⬆️
...sion/components/distributions/gp/kernels/linear.py 96.77% <100%> (-0.11%) ⬇️
...xfusion/components/distributions/gp/kernels/rbf.py 100% <100%> (ø) ⬆️
...sion/components/distributions/gp/kernels/matern.py 100% <100%> (ø)
mxfusion/modules/module.py 79.48% <100%> (+0.32%) ⬆️
.../components/distributions/gp/kernels/stationary.py 94.28% <100%> (-0.16%) ⬇️
mxfusion/modules/gp_modules/svgp_regression.py 82.48% <50%> (ø) ⬆️
mxfusion/modules/gp_modules/sparsegp_regression.py 80.95% <50%> (+0.1%) ⬆️
mxfusion/modules/gp_modules/gp_regression.py 81.53% <66.66%> (-0.48%) ⬇️
... and 1 more

Continue to review full report at Codecov.

Legend
Δ = absolute <relative> (impact), ø = not affected, ? = missing data
Powered by Codecov. Last update aa45a3b...6dfaa21. Read the comment docs.

@@ -95,7 +95,7 @@ def compute(self, F, variables):
         logL = logL + F.sum(F.sum(F.square(LinvKuf)/noise_var_m, axis=-1),
                             axis=-1)*D/2.
         logL = logL + F.sum(F.sum(Linvmu*LinvKufY, axis=-1), axis=-1)
-        logL = logL + self.model.U.factor.log_pdf_scaling*KL_u
+        logL = self.log_pdf_scaling*logL + KL_u
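The one-line change above is the minibatch bug fix: under minibatch training the expected log-likelihood of the batch must be scaled up to the full dataset, while the KL term over the inducing outputs is counted exactly once, unscaled. A minimal sketch of the corrected objective, with hypothetical names (not MXFusion's API):

```python
def svgp_minibatch_elbo(batch_expected_log_lik, kl_qu_pu, n_total, batch_size):
    """Sketch of the stochastic SVGP bound (Hensman et al., 2013).

    The data-dependent term is scaled by n_total / batch_size so the
    minibatch estimate is unbiased for the full-data ELBO; the KL term
    between q(u) and p(u) does not depend on the batch and is unscaled.
    """
    scale = n_total / batch_size
    return scale * batch_expected_log_lik - kl_qu_pu
```

Scaling the KL term along with the likelihood (the pre-fix behaviour) makes the objective depend incorrectly on the batch size.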
Contributor

Can you add a reference to the doc here (and to the other GP implementations please) about where you get the maths from?

Contributor Author

This relates to mini-batch learning. We should have a tutorial explaining this.
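For reference, the mini-batch objective in question is the standard stochastic variational GP bound; a sketch of why only the data term is scaled:

```latex
% Full-data SVGP ELBO with inducing outputs u:
%   L = \sum_{i=1}^{N} \mathbb{E}_{q(f_i)}\!\left[\log p(y_i \mid f_i)\right]
%       - \mathrm{KL}\!\left(q(u)\,\|\,p(u)\right)
% Minibatch estimator over a batch B with |B| = b: scale only the data term,
%   \hat{L} = \frac{N}{b} \sum_{i \in B} \mathbb{E}_{q(f_i)}\!\left[\log p(y_i \mid f_i)\right]
%       - \mathrm{KL}\!\left(q(u)\,\|\,p(u)\right)
% Then \mathbb{E}_B[\hat{L}] = L, so the stochastic gradient is unbiased.
```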

@zhenwendai zhenwendai merged commit 0d98f32 into amzn:develop Jan 25, 2019
@zhenwendai zhenwendai added this to the MXFusion v0.3.0 milestone Feb 20, 2019