Hybrid/simplify #1388

Merged: 29 commits from hybrid/simplify into develop, Jan 17, 2023
Conversation

@dellaert (Member, author) commented on Jan 17, 2023:

This PR implements the scheme in the (included) doc/Hybrid.pdf, where GaussianMixture::likelihood returns a single GaussianMixtureFactor instance that also encodes the log-normalization constant, by adding a row to A and b, respectively. This is only needed when the noise is mode-dependent.

We also now enforce error>0, and define the normalization constant for the GaussianMixture as the max of all normalization constants of its components.

With this scheme, there is no more need for FactorAndConstant and GraphAndConstant, which have been removed.
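As a rough illustration of the "extra row" idea (a minimal sketch with made-up names, not the PR's actual implementation): appending an all-zero row to A and sqrt(2c) to b adds the constant c to the factor's error 0.5*||Ax - b||^2, which is how a mode-dependent log-normalization constant can be folded into a Jacobian-style factor.

```cpp
// Sketch only: augment (A, b) so the quadratic error picks up a constant c.
// The function name and use of raw Eigen types are illustrative assumptions.
#include <cmath>
#include <utility>
#include <Eigen/Dense>

std::pair<Eigen::MatrixXd, Eigen::VectorXd> addConstantRow(
    const Eigen::MatrixXd& A, const Eigen::VectorXd& b, double c) {
  Eigen::MatrixXd A2(A.rows() + 1, A.cols());
  Eigen::VectorXd b2(b.size() + 1);
  A2 << A, Eigen::RowVectorXd::Zero(A.cols());  // extra all-zero row
  b2 << b, std::sqrt(2.0 * c);                  // 0.5 * (sqrt(2c))^2 == c
  return {A2, b2};
}
```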

To make elimination work, I use the correct formula in the hybrid elimination step:

q(μ;m) = exp(-error(μ;m)) * sqrt(det(2π Σ_m))
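
A sketch of where the sqrt(det(2π Σ_m)) comes from, under the usual assumption that elimination splits the error as error(μ_F, μ; m) = ½‖R_m μ_F − d_m(μ)‖² + error(μ; m), with Σ_m = (R_mᵀ R_m)⁻¹ the conditional covariance:

```math
\int \exp\left(-\tfrac{1}{2}\,\lVert R_m\,\mu_F - d_m(\mu)\rVert^2\right) d\mu_F = \sqrt{\det(2\pi\,\Sigma_m)}
\quad\Longrightarrow\quad
\int \exp\left(-\mathrm{error}(\mu_F,\mu;m)\right) d\mu_F = \exp\left(-\mathrm{error}(\mu;m)\right)\sqrt{\det(2\pi\,\Sigma_m)} = q(\mu;m)
```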

There are two tests I had to disable:

  1. a pruning test that now yields infinity. @varunagrawal, could you look into that?
  2. a complex example where both the noise model and the motion model are governed by discrete variables; it fails the ratio test after elimination. Please take a detailed look to check whether I made a silly mistake. I'm hoping this one does not point to yet another issue with elimination.

@dellaert (Member, author) commented:
OK @varunagrawal, @ProfFan, I fixed the last issue. One of the things GraphAndConstant did was to correct for the newly created GaussianConditional normalization factors. I now do this in a second pass in the hybrid elimination step, using HessianFactor::constantTerm().

Could you review? I'm on a bit of a deadline, as I want to land 4.2 and this was the missing piece. Also, @varunagrawal, could you consult on the pruning? I had to change the regression tests in testHybridBayesNet, but I believe the results are correct now (and were not before).
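
For reference, a minimal sketch of the API mentioned here (assumed usage for illustration, not the PR's actual correction logic): gtsam::HessianFactor represents the quadratic 0.5 x'Gx − x'g + 0.5 f, and constantTerm() returns f; for a factor converted from Jacobian form 0.5‖Ax − b‖², that constant is b'b.

```cpp
// Sketch only: read the constant term of a factor after converting it to
// Hessian form. How the PR uses this inside hybrid elimination is not shown.
#include <gtsam/linear/HessianFactor.h>
#include <gtsam/linear/JacobianFactor.h>

double constantTermOf(const gtsam::JacobianFactor& jf) {
  const gtsam::HessianFactor hessian(jf);  // 0.5 x'Gx - x'g + 0.5 f
  return hessian.constantTerm();           // f, i.e. b'b for Jacobian input
}
```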

Review thread on the following snippet:

```cpp
static const VectorValues kEmpty;
// If the factor is not null, it has no keys, just contains the residual.
const auto &factor = pair.second;
if (!factor) return 1.0;  // TODO(dellaert): not loving this.
```
A collaborator asked:

How could this branch be taken?

@dellaert (Member, author) replied:

Yeah, whenever we use a BayesTree there is some way (that I don't yet understand) in which nullptrs are used, and we are dealing with them all over the place. Can you explain? Or @varunagrawal?

(Two resolved review threads on gtsam/hybrid/HybridGaussianFactorGraph.cpp.)
@varunagrawal (Collaborator) left a review comment:

This could be broken down into smaller PRs...

(Resolved review threads on gtsam/hybrid/GaussianMixture.cpp and gtsam/hybrid/tests/testHybridBayesNet.cpp.)
@varunagrawal (Collaborator) commented on Jan 17, 2023:

@dellaert I'm a bit confused. Have we gone from using probabilities/errors in the discrete trees to logProbabilities? If that's the case, then the pruning methods need to be updated.

@dellaert (Member, author) commented:

> @dellaert I'm a bit confused. Have we gone from using probabilities/errors in the discrete trees to logProbabilities? If that's the case, then the pruning methods need to be updated.

Yeah, this is why I asked for your consult. I'm looking at it now.

@dellaert (Member, author) commented:

> > @dellaert I'm a bit confused. Have we gone from using probabilities/errors in the discrete trees to logProbabilities? If that's the case, then the pruning methods need to be updated.
>
> Yeah, this is why I asked for your consult. I'm looking at it now.

Unfortunately, I am not able to decode the pruning code. Can we chat?

@varunagrawal (Collaborator) commented:

> > > @dellaert I'm a bit confused. Have we gone from using probabilities/errors in the discrete trees to logProbabilities? If that's the case, then the pruning methods need to be updated.
> >
> > Yeah, this is why I asked for your consult. I'm looking at it now.
>
> Unfortunately, I am not able to decode the pruning code. Can we chat?

Yeah, the fix is easy: the pruning code sets the probability to 0, so log(0) = -inf. We just have to update the regression parameter.
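
In other words (a tiny illustration, not code from the PR): a leaf pruned to probability 0 becomes -inf once the tree stores log-probabilities, so regression values that used to be 0 must change accordingly.

```cpp
#include <cassert>
#include <cmath>
#include <limits>

int main() {
  const double prunedProb = 0.0;                       // leaf value after pruning
  const double prunedLogProb = std::log(prunedProb);   // -inf, not 0
  assert(prunedLogProb == -std::numeric_limits<double>::infinity());
  return 0;
}
```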

@varunagrawal (Collaborator) commented on Jan 17, 2023:

@dellaert I'm making a PR for this now.

@dellaert (Member, author) commented:

> @dellaert I'm making a PR for this now.

@varunagrawal No please don't, I'd like to chat to get on the same page first.

@dellaert (Member, author) commented:

OK,

dellaert merged commit 0571af9 into develop on Jan 17, 2023, and deleted the hybrid/simplify branch at 20:37.
Review thread on this hunk in discreteElimination:

```
@@ -190,12 +189,13 @@ discreteElimination(const HybridGaussianFactorGraph &factors,
// If any GaussianFactorGraph in the decision tree contains a nullptr, convert
// that leaf to an empty GaussianFactorGraph. Needed since the DecisionTree will
// otherwise create a GFG with a single (null) factor.
// TODO(dellaert): still a mystery to me why this is needed.
```
A collaborator commented:

@dellaert about this TODO, the explanation is given above: the assembleGraphTree method causes an empty GaussianFactorGraph to be created when there is a nullptr, hence we need to explicitly check for and remove any factor graph with nullptrs (by setting it to nullptr).
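
A hypothetical sketch of that check (illustrative names, not the PR's code): replace any decision-tree leaf whose GaussianFactorGraph contains a nullptr with an empty graph, so elimination never receives a graph holding a single null factor.

```cpp
#include <gtsam/linear/GaussianFactorGraph.h>

gtsam::GaussianFactorGraph replaceIfHasNull(const gtsam::GaussianFactorGraph& graph) {
  for (const auto& factor : graph) {
    if (!factor) return gtsam::GaussianFactorGraph();  // empty leaf instead
  }
  return graph;
}
// This could then be mapped over the tree's leaves, e.g. with DecisionTree::apply.
```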
