Add logDensity and evaluate to GaussianBN and HybridBN #1352

Merged: 10 commits merged into develop from feature/HBN-evaluate on Dec 29, 2022

Conversation

dellaert (Member) opened this pull request:
  • Add logDensity and evaluate to GaussianBN and HybridBN (see the usage sketch after this list)
  • Added a test to make sure the pdf integrates to 1 for 1D GaussianConditionals
  • Added a sampling test that does a Monte Carlo estimate of the variance
  • Also:
    • renamed asDiscreteConditional to asDiscrete to be consistent
    • changed the static cast to a dynamic cast for more elegant code
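
A minimal usage sketch of the two new methods, assuming the GaussianConditional(key, d, R, sigmas) constructor and the noiseModel::Isotropic call shown here match the GTSAM version this PR targets:

```cpp
// Illustrative sketch only; evaluate() and logDensity() are the methods
// added by this PR, everything else is assumed standard GTSAM usage.
#include <gtsam/linear/GaussianBayesNet.h>
#include <gtsam/linear/GaussianConditional.h>
#include <gtsam/linear/NoiseModel.h>
#include <iostream>

using namespace gtsam;

int main() {
  // One conditional p(x0) = N(mean = 5, sigma = 2), written as R*x0 = d
  // with R = I and an isotropic noise model.
  GaussianBayesNet gbn;
  gbn.emplace_shared<GaussianConditional>(
      0, Vector1(5.0), I_1x1, noiseModel::Isotropic::Sigma(1, 2.0));

  VectorValues x;
  x.insert(0, Vector1(5.0));  // evaluate at the mean

  const double p = gbn.evaluate(x);       // density; expect ~ 1/(2*sqrt(2*pi))
  const double logp = gbn.logDensity(x);  // its log; expect ~ log(p)

  std::cout << "p(x) = " << p << ", log p(x) = " << logp << std::endl;
  return 0;
}
```

The "pdf integrates to 1" and Monte Carlo variance tests mentioned above presumably exercise these same two calls on 1D GaussianConditionals.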

dellaert self-assigned this on Dec 29, 2022
```diff
-          boost::make_shared<GaussianMixture>(*gaussianMixture);
-  prunedGaussianMixture->prune(*decisionTree);
+  auto prunedGaussianMixture = boost::make_shared<GaussianMixture>(*gm);
+  prunedGaussianMixture->prune(*decisionTree);  // imperative :-(
```
Collaborator commented:
Should we make an issue for this? There may have been a reason why I (a Lisp lover) made this imperative, so it'll be good to re-examine this now.
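
Regarding the imperative/functional point, one hedged sketch of how the copy-and-mutate could be hidden behind a value-returning helper. PrunedCopy is hypothetical, and the DecisionTreeFactor parameter type is an assumption read off the *decisionTree argument in the diff above, not a confirmed GTSAM signature.

```cpp
// Hypothetical helper, not GTSAM API: give callers a fresh, already-pruned
// copy so the mutation stays an implementation detail.
#include <gtsam/discrete/DecisionTreeFactor.h>
#include <gtsam/hybrid/GaussianMixture.h>
#include <boost/make_shared.hpp>

namespace {

gtsam::GaussianMixture::shared_ptr PrunedCopy(
    const gtsam::GaussianMixture &gm,
    const gtsam::DecisionTreeFactor &decisionTree) {  // assumed parameter type
  auto pruned = boost::make_shared<gtsam::GaussianMixture>(gm);  // copy first
  pruned->prune(decisionTree);  // existing imperative prune, applied to the copy
  return pruned;
}

}  // namespace
```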

```diff
 }

 /* ************************************************************************* */
 GaussianBayesNet HybridBayesNet::choose(
     const DiscreteValues &assignment) const {
   GaussianBayesNet gbn;
   for (auto &&conditional : *this) {
-    if (conditional->isHybrid()) {
+    if (auto gm = conditional->asMixture()) {
```
varunagrawal (Collaborator) commented on Dec 29, 2022:
I feel this is less readable just to save 1 line of code, but okay.
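
For readers skimming the diff, here is a self-contained toy version of the two styles being compared. It is illustrative only: GaussianMixtureLike and the free asMixture() function are stand-ins, not GTSAM types, and the old-style cast is an assumption about the pre-PR code (the PR description only says a static cast was replaced by a dynamic cast).

```cpp
// Toy illustration of the idiom discussed above, not GTSAM code.
#include <boost/make_shared.hpp>
#include <boost/shared_ptr.hpp>
#include <iostream>

struct Conditional {
  virtual ~Conditional() = default;
  virtual bool isHybrid() const { return false; }
};

struct GaussianMixtureLike : Conditional {  // stand-in for GaussianMixture
  bool isHybrid() const override { return true; }
  void use() const { std::cout << "mixture\n"; }
};

// Stand-in for asMixture(): dynamic cast that returns null on failure.
boost::shared_ptr<GaussianMixtureLike> asMixture(
    const boost::shared_ptr<Conditional>& c) {
  return boost::dynamic_pointer_cast<GaussianMixtureLike>(c);
}

int main() {
  boost::shared_ptr<Conditional> conditional =
      boost::make_shared<GaussianMixtureLike>();

  // Old style: explicit type check, then a separate (static) cast.
  if (conditional->isHybrid()) {
    auto gm = boost::static_pointer_cast<GaussianMixtureLike>(conditional);
    gm->use();
  }

  // New style: the dynamic cast doubles as the check, saving a line.
  if (auto gm = asMixture(conditional)) {
    gm->use();
  }
  return 0;
}
```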

```diff
   }
 }

   DiscreteValues mpe = DiscreteFactorGraph(discrete_bn).optimize();

   // Given the MPE, compute the optimal continuous values.
-  GaussianBayesNet gbn = this->choose(mpe);
+  GaussianBayesNet gbn = choose(mpe);
```
Collaborator commented:
I generally like using this-> because:

  1. It specifies that this is a class method, similar to self in Python.
  2. Auto-complete works better.

```diff
@@ -284,23 +300,20 @@ AlgebraicDecisionTree<Key> HybridBayesNet::error(

   // Iterate over each conditional.
   for (auto &&conditional : *this) {
-    if (conditional->isHybrid()) {
+    if (auto gm = conditional->asMixture()) {
```
Collaborator commented:
Again, this is more clever but sacrifices readability...

varunagrawal (Collaborator) left a review comment:
I feel some readability has been reduced in the Hybrid* classes but otherwise LGTM.

varunagrawal merged commit a849eab into develop on Dec 29, 2022
varunagrawal deleted the feature/HBN-evaluate branch on December 29, 2022 03:50
dellaert added this to the Hybrid Inference milestone on Feb 7, 2023