Hybrid/simplify #1388
Conversation
OK @varunagrawal, @ProfFan, I fixed the last issue. One of the things `GraphAndConstant` did was to correct for the normalization constants of the newly created `GaussianConditional`s. I now do this in a second pass in the hybrid elimination step. Could you review? A bit of a deadline, as I want to land 4.2, and this was the missing piece. Also, @varunagrawal, could I consult you on the pruning? I had to change the regression tests in
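As a very rough sketch of what such a second-pass correction could look like (the container, the names, and even the sign of the correction are my assumptions for illustration, not the PR's actual code):

```cpp
#include <cmath>
#include <cstddef>
#include <map>

// Hypothetical second pass (assumed, for illustration only): after eliminating
// the continuous variables, each discrete mode m has produced a
// GaussianConditional with its own log-normalization constant c(m). This pass
// folds that constant back into the discrete weight of m so that modes with
// different noise models remain comparable.
// NOTE: exp(-c) is an assumed sign convention, not necessarily the PR's.
std::map<std::size_t, double> correctForConstants(
    const std::map<std::size_t, double>& weights,       // unnormalized weight per mode
    const std::map<std::size_t, double>& logConstants)  // c(m) per mode
{
  std::map<std::size_t, double> corrected;
  for (const auto& [mode, weight] : weights) {
    corrected[mode] = weight * std::exp(-logConstants.at(mode));
  }
  return corrected;
}
```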
static const VectorValues kEmpty;
// If the factor is not null, it has no keys, just contains the residual.
const auto &factor = pair.second;
if (!factor) return 1.0;  // TODO(dellaert): not loving this.
How could this branch be taken?
Yeah, whenever we do BayesTree there is some way (that I don't yet understand) in which nullptrs are used, and we are dealing with them all over the place. Can you explain? Or @varunagrawal?
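For what it's worth, here is a hedged guess at how the snippet above might be used, with the null case treated as a no-op (the function name and surrounding assumptions are mine, not the actual implementation):

```cpp
#include <cmath>

#include <gtsam/linear/GaussianFactor.h>
#include <gtsam/linear/VectorValues.h>

// Illustrative only: a per-leaf probability where the factor may be null
// (that branch contributes nothing, i.e. the neutral element 1.0 of a product)
// or keyless (its error is just a constant residual).
double leafProbability(const gtsam::GaussianFactor::shared_ptr& factor) {
  static const gtsam::VectorValues kEmpty;
  if (!factor) return 1.0;                  // null leaf: no contribution
  return std::exp(-factor->error(kEmpty));  // keyless factor: constant error
}
```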
This could be broken down into smaller PRs...
@dellaert I'm a bit confused. Have we gone from using
Yeah, this is why I asked for your consult. I'm looking at it now.
Unfortunately, I am not able to decode the pruning code. Can we chat?
Yeah, the fix is easy: the pruning code sets the probability to 0, so
@dellaert I'm making a PR for this now.
@varunagrawal No please don't, I'd like to chat to get on the same page first.
OK,
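For reference, a rough, assumed sketch of the pruning behaviour being discussed (not the actual pruning code): keep the N most probable leaves and set every other leaf's probability to 0, which is exactly the zero-probability case downstream code then has to tolerate.

```cpp
#include <algorithm>
#include <cstddef>
#include <vector>

// Assumed illustration of prune-to-N-leaves: every probability below the N-th
// largest is set to 0; these zeroed leaves are what later surface as
// null / zero-probability branches elsewhere in the hybrid code.
std::vector<double> pruneToTopN(std::vector<double> probs, std::size_t maxLeaves) {
  if (maxLeaves == 0) {
    std::fill(probs.begin(), probs.end(), 0.0);
    return probs;
  }
  if (probs.size() <= maxLeaves) return probs;
  std::vector<double> sorted = probs;
  std::nth_element(sorted.begin(), sorted.end() - maxLeaves, sorted.end());
  const double threshold = *(sorted.end() - maxLeaves);
  for (double& p : probs) {
    if (p < threshold) p = 0.0;  // pruned leaf: probability set to 0
  }
  return probs;
}
```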
@@ -190,12 +189,13 @@ discreteElimination(const HybridGaussianFactorGraph &factors,
// If any GaussianFactorGraph in the decision tree contains a nullptr, convert
// that leaf to an empty GaussianFactorGraph. Needed since the DecisionTree will
// otherwise create a GFG with a single (null) factor.
// TODO(dellaert): still a mystery to me why this is needed.
@dellaert about this TODO, the explanation is given above. The `assembleGraphTree` method causes an empty `GaussianFactorGraph` to be created when there is a nullptr, hence we need to explicitly check and remove any factor graph with nullptrs (by setting it to nullptr).
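A hedged sketch of that check (the GTSAM-like names, headers, and the `apply`-based traversal are my assumptions; this is an illustration, not the PR's exact code):

```cpp
#include <algorithm>

#include <gtsam/discrete/DecisionTree.h>
#include <gtsam/inference/Key.h>
#include <gtsam/linear/GaussianFactorGraph.h>

using GraphTree = gtsam::DecisionTree<gtsam::Key, gtsam::GaussianFactorGraph>;

// Map over the leaves: if a leaf's GaussianFactorGraph holds any nullptr
// factor, replace the whole leaf with an empty graph, so later code never
// sees a GFG that consists of a single null factor.
GraphTree removeNullLeaves(const GraphTree& tree) {
  auto emptyIfNull = [](const gtsam::GaussianFactorGraph& graph) {
    const bool hasNull = std::any_of(
        graph.begin(), graph.end(),
        [](const gtsam::GaussianFactor::shared_ptr& f) { return !f; });
    return hasNull ? gtsam::GaussianFactorGraph() : graph;
  };
  return tree.apply(emptyIfNull);  // leaf-wise transform of the decision tree
}
```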
This PR implements the scheme in the (included) doc/Hybrid.pdf, where `GaussianMixture::likelihood` returns a single `GaussianMixtureFactor` instance that encodes the log-normalization constant as well, by adding a row to A and b, respectively. This is only needed when the noise is mode-dependent.
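To make the row-augmentation idea concrete, here is a small hedged illustration (my own Eigen sketch under the usual convention that a Jacobian-style factor has error 0.5 * ||Ax - b||^2; it is not the PR's code, and the sign convention for the constant is assumed):

```cpp
#include <cmath>
#include <utility>

#include <Eigen/Dense>

// Append one row with a zero A-block and b_extra = sqrt(2*c), assuming c >= 0.
// The extra residual contributes 0.5 * (sqrt(2*c))^2 = c to the quadratic
// error, so exp(-error) picks up a factor exp(-c), i.e. the constant is now
// encoded inside the factor itself.
std::pair<Eigen::MatrixXd, Eigen::VectorXd> augmentWithConstant(
    const Eigen::MatrixXd& A, const Eigen::VectorXd& b, double c) {
  Eigen::MatrixXd A2(A.rows() + 1, A.cols());
  A2 << A, Eigen::RowVectorXd::Zero(A.cols());
  Eigen::VectorXd b2(b.size() + 1);
  b2 << b, std::sqrt(2.0 * c);
  return {A2, b2};
}
```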
We also now enforce error > 0, and define the normalization constant for the `GaussianMixture` as the max of all normalization constants of its components. With this scheme, there is no more need for `FactorAndConstant` and `GraphAndConstant`, which have been removed.

To make the elimination work, I use the correct formula in the hybrid elimination:
There are two tests I had to disable: