
Refactored hybrid inference #1863

Merged: 56 commits into develop on Oct 10, 2024
Conversation

dellaert
Member

@dellaert commented Oct 8, 2024

This is a big PR that gets rid of "hiding" the conditional constants in the Jacobian factors in HybridGaussianFactor (HGF), in favor of storing a scalar alongside each factor in the HybridGaussianFactorGraph (HGFG), as is done in the HybridNonlinearFactorGraph (HNFG). It also adds extensive testing.

Details

  • GaussianFactorGraphTree is now its own class, called HybridGaussianProductFactor
  • asGaussianFactorGraphTree is renamed to collectProductFactor, and now exists only in HGF, no longer in HGC
  • scalars are initialized with each conditional's adjusted negLogConstant, i.e. the scalars are zero if the constants are the same for every mode.
  • augment is GONE
  • added discretePosterior methods, which yield much easier "ratio tests"
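The "scalar alongside each factor" design described above can be illustrated with a toy sketch. ToyFactor, FactorScalarPair, and totalError are hypothetical names for illustration only, not GTSAM API:

```cpp
#include <cassert>
#include <cmath>
#include <utility>

// Toy stand-in for a Gaussian factor: error(x) = 0.5 * ((x - mu)/sigma)^2.
struct ToyFactor {
  double mu, sigma;
  double error(double x) const {
    double r = (x - mu) / sigma;
    return 0.5 * r * r;
  }
};

// As in the refactor: each component factor is stored alongside a scalar
// (e.g. a mode-dependent negative log constant), instead of hiding that
// constant inside the factor itself.
using FactorScalarPair = std::pair<ToyFactor, double>;

// Total negative log density for one mode: the stored scalar plus the
// factor's own error.
inline double totalError(const FactorScalarPair& p, double x) {
  return p.second + p.first.error(x);
}
```

The point of the pairing is that the original factor errors stay untouched; the constant is only added when the total is actually needed.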

Testing

  • added many more tests in testHGC, testHBN, and testHGFG
  • moved all multi-frontal elimination tests to testHybridBayesTree

TODO:

  • replace all ratio tests
  • get rid of HGC::conditionals_ if it improves performance

@varunagrawal
Collaborator

Would it be possible to break this into a separate PR for the HybridGaussianProductFactor?

auto [factor, scalar] = pair;
// If factor is null, it has been pruned, hence return potential zero
if (!factor) return 0.0;
return exp(-scalar - factor->error(kEmpty));
Collaborator
This is a major bug!!!
When we don't normalize the values and we have large factor errors (e.g. even 80), then exp(-80) ~= 0.0 and everything collapses. This was one of the major issues I fixed last December that got things working.

Unfortunately, this is not revealed in the GTSAM tests because the sigmas are too simple, but it shows up in the hybrid estimation code (the same reason the pruning PR didn't show any issues in GTSAM but caused experiments to fail).
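The collapse described here, and the min-shift that avoids it, can be reproduced in a small standalone sketch. naivePotentials and shiftedPotentials are hypothetical helpers, not GTSAM code:

```cpp
#include <algorithm>
#include <cmath>
#include <vector>

// Naive conversion of negative log-values to unnormalized potentials.
// With large errors the results are vanishingly small (exp(-80) is about
// 1.8e-35), so downstream thresholding or pruning treats every mode as zero.
inline std::vector<double> naivePotentials(const std::vector<double>& errors) {
  std::vector<double> out;
  for (double e : errors) out.push_back(std::exp(-e));
  return out;
}

// Shifted conversion: subtract the minimum error first, so the best mode
// maps to exp(0) = 1 and the ratios between modes are preserved exactly.
inline std::vector<double> shiftedPotentials(const std::vector<double>& errors) {
  double minError = *std::min_element(errors.begin(), errors.end());
  std::vector<double> out;
  for (double e : errors) out.push_back(std::exp(-(e - minError)));
  return out;
}
```

Since every entry is divided by the same exp(-minError), the shift changes only the overall scale of the factor, never the relative potentials.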

Collaborator
Confirming that I am running into a similar issue as before.

Member Author

Ok, I'll fix this

Member Author

I will fix it only here, when the exp is taken. I'd like the original errors preserved in the factors, not shifted before it's needed. I can explain in person if you vehemently object.

Member Author

OK, for this case and the case below, I added a function:

/// Take negative log-values, shift them so that the minimum value is 0, and
/// then exponentiate to create a DecisionTreeFactor (not normalized yet!).
static DecisionTreeFactor::shared_ptr DiscreteFactorFromErrors(
    const DiscreteKeys &discreteKeys,
    const AlgebraicDecisionTree<Key> &errors) {
  double min_log = errors.min();
  AlgebraicDecisionTree<Key> potentials = DecisionTree<Key, double>(
      errors, [&min_log](const double x) { return exp(-(x - min_log)); });
  return std::make_shared<DecisionTreeFactor>(discreteKeys, potentials);
}

Please note I realized it is not necessary to normalize to probabilities: it is still just a factor that we need, not a conditional. That was an error on my part before, although it only cost some computation, nothing else.

Collaborator

Also, I would like to hear the explanation for my own curiosity. I have a suspicion we may need to normalize somewhere.

Member Author

Actually, it might still need to happen in the discretePosterior methods. It seems what we need is a constructor in DecisionTreeFactor. Factors are not normalized, so for discretePosterior it should instead be a constructor in DiscreteConditional (which is normalized).

Collaborator

I can do it and piggyback it off of #1867

Collaborator

Apparently we can't create a DiscreteConditional since we don't have the cardinality information of the Keys from an AlgebraicDecisionTree. Thoughts?

Collaborator

Also, discretePosterior already normalizes, due to the p / p.sum().

@varunagrawal
Collaborator

Here are my clang-format settings:

"clang-format.executable": "/usr/local/bin/clang-format",
"clang-format.language.c.style": "Google",
"clang-format.language.cpp.style": "Google",
"clang-format.fallbackStyle": "Google",

@varunagrawal
Collaborator

varunagrawal commented Oct 8, 2024

I re-added the normalization, and tests that initially passed are now failing. :(
Nvm, I was missing the scalar value.

@dellaert
Member Author

dellaert commented Oct 9, 2024

Would it be possible to break this into a separate PR for the HybridGaussianProductFactor?

It's possible to split the PR - but not with this version of HybridGaussianProductFactor

@dellaert
Member Author

dellaert commented Oct 9, 2024

Here are my clang-format settings:

"clang-format.executable": "/usr/local/bin/clang-format",
"clang-format.language.c.style": "Google",
"clang-format.language.cpp.style": "Google",
"clang-format.fallbackStyle": "Google",

This file is checked in:

BasedOnStyle: Google

BinPackArguments: false
BinPackParameters: false
ColumnLimit: 100
DerivePointerAlignment: false
IncludeBlocks: Preserve
PointerAlignment: Left

Maybe we can just change that file to not have any modifications on top of Google style.

@dellaert
Member Author

dellaert commented Oct 9, 2024

Pushed new changes. I have no idea how to fix Windows. Because istream is involved, could it be serialization?

@varunagrawal
Collaborator

Pushed new changes. I have no idea how to fix Windows. Because istream is involved, could it be serialization?

I figured out a hack to fix the Windows CI. It is in #1866.

@dellaert
Member Author

@varunagrawal Can we now merge this? Do your experiments work?

@varunagrawal
Collaborator

Yes, but I wanted to fix all the pending TODOs. I'll do them in the other PR.

Collaborator

@varunagrawal left a comment

LGTM

@varunagrawal merged commit def8b25 into develop Oct 10, 2024
33 checks passed
@varunagrawal deleted the feature/no_hiding branch October 10, 2024 13:31