
Hybrid Factor Graph Implementation #1203

Merged
76 commits merged into borglab:develop on Jun 3, 2022

Conversation

ProfFan
Collaborator

@ProfFan ProfFan commented May 21, 2022

Original draft PR: varunagrawal#42

Member

@dellaert dellaert left a comment

Fantastic progress. Some more comments (and some important comments were added before I did a re-review), because I still don't see eye to eye yet on some naming etc. Also, I think we should go for HybridGaussianXXX rather than GaussianHybridXXX, because everything in this folder should mimic what is in linear and nonlinear, except preceded by Hybrid.

#include <gtsam/linear/GaussianConditional.h>

namespace gtsam {
class GaussianMixtureConditional
Member

The math in the comment is too generic. I think this class is a mixture of Gaussian, so expect to see that detail :-) Please check GaussianConditional.h for an example.

* @param continuousFrontals the continuous frontals.
* @param continuousParents the continuous parents.
* @param discreteParents the discrete parents. Will be placed last.
* @param conditionals a decision tree of GaussianConditionals.
Member

Not resolved


/* ************************************************************************ */
std::pair<HybridConditional::shared_ptr, HybridFactor::shared_ptr> //
EliminateHybrid(const HybridFactorGraph &factors, const Ordering &frontalKeys) {
Member

Why can't we do this in this PR? It would also make the pruning PR easier to review.

// discrete variable, no continuous factor is allowed (it escapes the CG
// regime), so we panic; if discrete only, we do the discrete elimination.

// However it is not that simple. During elimination it is possible that the
Member

I’d love this important PR to be self-contained, so could the comment be adapted?

/**
* Hybrid Factor Graph
* -----------------------
* This is the linear version of a hybrid factor graph. Everything inside needs
Member

I agree, let’s rename this, but since we have hybridGaussianFactor should this not be HybridGaussianFactorGraph?

* -------------------------------------------------------------------------- */

/**
* @file HybridISAM.h
Member

HybridGaussianISAM, to be consistent.

* -------------------------------------------------------------------------- */

/**
* @file HybridJunctionTree.cpp
Member

HybridGaussianJunctionTree?

Member

@dellaert dellaert left a comment

Awesome. Still some nits, and I am not sure all classes have the right name yet.
Some HybridXX need to become HybridGaussianXX?
Since this is still in alpha I think we can leave to next PR...

* variable, M is the selection of discrete variables corresponding to a subset
* of the Gaussian variables, and Z is the parent of this node
*
* The negative log-probability is given by \f$ \sum_{m=1}^M \pi_m \frac{1}{2}
Member

Nit: this is probably wrong. The mixture is \sum N(x; R,S,T,...d) = \sum exp -0.5 |...|^2 so the negative log would be -log \sum exp -0.5 |...|^2 which is (unfortunately) not \sum 0.5 |...|^2. But, in Conditionals we don't really care about the neg log, so we can just say

The probability P(x|y,z,...) is proportional to \f$ \sum_i k_i \exp - \frac{1}{2} |R_i x - (d_i - S_i y - T_i z - ...)|^2 \f$ where i indexes the components and k_i is a component-wise normalization constant.
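
Written out, the point being made is that the logarithm does not distribute over the sum of mixture components, so the negative log of the mixture is a log-sum-exp rather than a sum of quadratics:

$$-\log \sum_i k_i \exp\left(-\tfrac{1}{2} |R_i x - (d_i - S_i y - T_i z - \dots)|^2\right) \;\neq\; \sum_i \tfrac{1}{2} |R_i x - (d_i - S_i y - T_i z - \dots)|^2$$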

Collaborator

I was thinking about this today morning. Let me redo it.

Collaborator

Also, github now supports latex:

$\sum_i k_i \exp - \frac{1}{2} |R_i x - (d_i - S_i y - T_i z - ...)|^2$

}

auto result = EliminatePreferCholesky(gfg, frontalKeys);
return std::make_pair(
Member

does just

return {result.first, result.second};

not work?

Collaborator

Not sure. Most of the comments in this round are on Fan's code, which I didn't feel needed any updates. :)

Collaborator Author

No, CI will fail (I initially did {})

Collaborator Author

The main culprit is old GCC; we will be free of these problems once we move to C++14/17.


auto result = EliminateDiscrete(dfg, frontalKeys);

return std::make_pair(
Member

same

GaussianFactorGraph gfg;
for (auto &fp : factors) {
auto ptr = boost::dynamic_pointer_cast<HybridGaussianFactor>(fp);
if (ptr) {
Member

I think you can do (here and below)

    if (auto hgf = boost::dynamic_pointer_cast<HybridGaussianFactor>(fp)) {...

DiscreteKeys discreteSeparator(discreteSeparatorSet.begin(),
discreteSeparatorSet.end());

// sum out frontals, this is the factor on the separator
Member

Can we make this a separate function as well?


gttoc(sum);

using EliminationPair = GaussianFactorGraph::EliminationResult;
Member

Comment on what's happening...

@varunagrawal
Collaborator

varunagrawal commented Jun 3, 2022

Awesome. Still some nits, and I am not sure all classes have the right name yet. Some HybridXX need to become HybridGaussianXX? Since this is still in alpha I think we can leave to next PR...

I was wondering about this. We have GaussianEliminationTree, SymbolicEliminationTree and DiscreteEliminationTree, so should this be just HybridEliminationTree? Same question for HybridJunctionTree since these two only come into play after linearization so Gaussian is implied.

@dellaert once this question is resolved, I can merge.

@ProfFan
Collaborator Author

ProfFan commented Jun 3, 2022

Thanks @varunagrawal!

*
*/
class GTSAM_EXPORT GaussianMixture
: public HybridFactor,
Collaborator

@ProfFan @dellaert why is this inheriting from HybridFactor and not HybridGaussianMixtureFactor like in the previous scheme? Do we not need to record the conditionals as factors in the base class anymore?

Member

I lost track of this - @ProfFan might have the answer

Collaborator Author

I think this can be done that way; what was on my mind is that I want to keep the two things separate, as they don't necessarily need to have an inheritance relationship. We can revisit this later.

Collaborator

Sorry but this is an early design decision that needs to be resolved ASAP.

My understanding was that eliminating a HybridMixtureFactor should result in a GaussianMixture conditional, so there already exists an intrinsic relationship between the two.

My main concern is what happens when we only eliminate a partial set of the keys. If the factor is F(X1, X2, M1) and we only eliminate X1, shouldn't the Gaussian factors for X1 still be captured in the conditional until X2 is eliminated as well?

Member

I actually agree with Varun: a conditional is a factor. No sense to have extra storage in this conditional for the Gaussian components, if that is going on.

However, I don't fully understand how the elimination example is relevant. We don't eliminate conditionals, we eliminate factors, so not sure how it bears on this discussion.

Collaborator

The GaussianMixtureFactor is a container for other factors via the decision tree with Gaussian FGs as the leaves. So if we eliminate one variable, I don't remember how that affects the other variables within the tree. What is the expected behavior?

My understanding is that it should be a conditional on the variable with the remaining variables as parents, so does this mean the conditional would be dependent on variables still a part of factors? In that case we should be recording the factors. However the final Bayes network will only have conditionals, so the GaussianMixtures will have to be retroactively updated to reflect the new conditionals for parents.

Collaborator

I'll go back and look at the behavior from the previous scheme.

@dellaert
Member

dellaert commented Jun 3, 2022

Awesome. Still some nits, and I am not sure all classes have the right name yet. Some HybridXX need to become HybridGaussianXX? Since this is still in alpha I think we can leave to next PR...

I was wondering about this. We have GaussianEliminationTree, SymbolicEliminationTree and DiscreteEliminationTree, so should this be just HybridEliminationTree? Same question for HybridJunctionTree since these two only come into play after linearization so Gaussian is implied.

@dellaert once this question is resolved, I can merge.

OK, that makes sense. Yes, HybridEliminationTree and HybridJunctionTree I guess.

@varunagrawal varunagrawal merged commit 9a1eb02 into borglab:develop Jun 3, 2022
@varunagrawal varunagrawal deleted the fan/prototype-hybrid-tr branch October 2, 2022 20:25