
Allow combining of Prob Scorers #2469


Closed

Conversation

benthecarman
Contributor

Could be useful when getting scorers from multiple different sources

@tnull
Contributor

tnull commented Aug 3, 2023

Mh, I think it would be nice to enable this, but it currently leaves out the most important part: combining the historical liquidity data, which likely will not be as easy as just merging by replacing with the most recent data items.

Comment on lines 957 to 967
if current.last_updated.duration_since(item.last_updated) == Duration::from_secs(0) {
	channel_liquidities.insert(id, item);
}
Contributor

@jkczyz jkczyz Aug 3, 2023

The most recent isn't necessarily better. Either the min or the max liquidity would have been affected, but not both. So you may be discarding a better estimate of one for a worse estimate of the other -- or just getting a worse estimate for both if the updated one was based on a smaller value. You probably want the one that has the smallest delta between min and max, which indicates a tighter bound. Taking into account the decay should account for older data.
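The "keep the tighter bound" idea can be sketched as follows. This is a minimal illustration, not LDK's actual API: `SimpleLiquidity`, `delta_msat`, and `tighter` are hypothetical names, and the bounds are assumed to already be decayed to a common time (the real `ChannelLiquidity` stores undecayed offsets).

```rust
// Stand-in for a channel's liquidity estimate with already-decayed bounds.
#[derive(Clone, Debug, PartialEq)]
struct SimpleLiquidity {
    min_liquidity_msat: u64,
    max_liquidity_msat: u64,
}

impl SimpleLiquidity {
    /// Width of the [min, max] bound; smaller means a tighter estimate.
    fn delta_msat(&self) -> u64 {
        self.max_liquidity_msat.saturating_sub(self.min_liquidity_msat)
    }
}

/// Of two estimates for the same channel, keep the one with the tighter bound,
/// rather than simply the more recently updated one.
fn tighter(a: SimpleLiquidity, b: SimpleLiquidity) -> SimpleLiquidity {
    if a.delta_msat() <= b.delta_msat() { a } else { b }
}
```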

Contributor Author

Changed to this. The best solution is probably being able to combine the HistoricalBucketRangeTrackers, but that is above my pay grade haha

@codecov-commenter

codecov-commenter commented Aug 3, 2023

Codecov Report

Patch coverage has no change and project coverage change: -0.04% ⚠️

Comparison is base (6f58072) 90.34% compared to head (9606fa1) 90.31%.
Report is 13 commits behind head on main.


Additional details and impacted files
@@            Coverage Diff             @@
##             main    #2469      +/-   ##
==========================================
- Coverage   90.34%   90.31%   -0.04%     
==========================================
  Files         106      106              
  Lines       55784    55779       -5     
==========================================
- Hits        50398    50376      -22     
- Misses       5386     5403      +17     
Files Changed Coverage Δ
lightning/src/routing/scoring.rs 92.74% <0.00%> (-0.76%) ⬇️

... and 10 files with indirect coverage changes


impl<T: Time> ChannelLiquidity<T> {
	/// The difference between the upper and lower liquidity bounds.
	fn liquidity_offset_delta(&self) -> u64 {
		self.max_liquidity_offset_msat - self.min_liquidity_offset_msat
	}
}
Contributor

These are relative offsets (i.e., from the channel capacity / max htlc and zero, respectively). You want the absolute, decayed values instead. Otherwise, no knowledge (i.e., offsets of zero) would be preferred. See min_liquidity_msat and max_liquidity_msat instead, which also account for decays.

Please also add tests. :)

Contributor Author

Those are available for the directed version. I could put in fake info to get those; otherwise I'm not sure of the best way to handle this.

Contributor

I think just adding a private function using the canonical order is fine. It doesn't matter which way they are directed so long as they are in the same direction when compared.
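A private canonical-order helper might look like the sketch below. All names here (`Liquidity`, `canonical_bounds_msat`, `bound_width_msat`) are illustrative, not LDK's real API; the point is only that both values are viewed in the same fixed direction before comparing.

```rust
// Stand-in for an undirected per-channel liquidity record.
struct Liquidity {
    min_liquidity_offset_msat: u64,
    max_liquidity_offset_msat: u64,
    capacity_msat: u64,
}

impl Liquidity {
    /// Bounds as seen from one fixed ("canonical") direction: the min offset
    /// raises the lower bound, the max offset lowers the upper bound from
    /// capacity. Which direction is chosen doesn't matter, only that it is
    /// the same for every value being compared.
    fn canonical_bounds_msat(&self) -> (u64, u64) {
        (
            self.min_liquidity_offset_msat,
            self.capacity_msat.saturating_sub(self.max_liquidity_offset_msat),
        )
    }

    /// Width of the canonical bound, comparable across values because both
    /// sides use the same direction.
    fn bound_width_msat(&self) -> u64 {
        let (lo, hi) = self.canonical_bounds_msat();
        hi.saturating_sub(lo)
    }
}
```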

@TheBlueMatt
Collaborator

Combining the buckets (which I think we need to do) shouldn't be too hard, I think. Basically just add each bucket up and divide by two, I think. Set the last-update time to the newest one and call it a day.
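The bucket-averaging idea above can be sketched roughly as follows. `Buckets` is a hypothetical stand-in for LDK's `HistoricalBucketRangeTracker` (the real bucket count and element type differ); this just shows the "add each bucket up and divide by two" merge.

```rust
// Illustrative bucket count; LDK's actual tracker differs.
const NUM_BUCKETS: usize = 8;

#[derive(Clone, Debug, PartialEq)]
struct Buckets([u16; NUM_BUCKETS]);

impl Buckets {
    /// Merge `other` into `self` by averaging each bucket pairwise.
    fn combine(&mut self, other: &Buckets) {
        for (b, o) in self.0.iter_mut().zip(other.0.iter()) {
            // Widen before adding to avoid u16 overflow, then halve.
            *b = ((*b as u32 + *o as u32) / 2) as u16;
        }
    }
}
```

Per the suggestion, the merged record would then take the newer of the two last-update times.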

@benthecarman
Contributor Author

I think I am now correctly combining the channel liquidities

Comment on lines +821 to +822
self.min_liquidity_offset_msat = self.min_liquidity_offset_msat.max(other.min_liquidity_offset_msat);
self.max_liquidity_offset_msat = self.max_liquidity_offset_msat.min(other.max_liquidity_offset_msat);
Contributor

Please add a test. You need to use the methods on DirectedChannelLiquidity otherwise these aren't decayed and thus aren't directly comparable. Each ChannelLiquidity will likely have a different last_updated, meaning one may need to be decayed more than the other.

self.max_liquidity_offset_history.combine(other.max_liquidity_offset_history);

if self.last_updated.duration_since(other.last_updated) < Duration::from_secs(0) {
	self.last_updated = other.last_updated;
}
Contributor

Changing last_updated without decaying the liquidity offsets to reflect the new time won't work. The data is stored as undecayed offsets. When the penalty is calculated, the offsets are decayed based on elapsed time since the last_updated. And any time last_updated is modified, any unaffected offset from the update is set to its decayed value. You'll need to do something similar here, otherwise offsets may not be properly decayed.

See DirectedChannelLiquidity::set_min_liquidity_msat and DirectedChannelLiquidity::set_max_liquidity_msat. I think it would be easiest to normalize self and other by setting each offset to its decayed value and setting last_updated to T::now(). Then they can be directly compared.
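The normalize-then-compare step might be sketched like this. This is an illustration only: `Entry`, `normalize_to`, `decayed`, and the half-life constant are hypothetical, and the integer halving is a crude stand-in for the scorer's actual decay formula.

```rust
use std::time::Duration;

// Hypothetical half-life; LDK's is configurable via its scoring parameters.
const HALF_LIFE: Duration = Duration::from_secs(6 * 60 * 60);

/// Halve `offset_msat` once per elapsed half-life (coarse integer decay).
fn decayed(offset_msat: u64, elapsed: Duration) -> u64 {
    let half_lives = (elapsed.as_secs() / HALF_LIFE.as_secs()).min(63) as u32;
    offset_msat >> half_lives
}

// Stand-in for a stored, undecayed liquidity record.
struct Entry {
    min_liquidity_offset_msat: u64,
    max_liquidity_offset_msat: u64,
    /// Seconds since some common epoch at which this entry was last updated.
    last_updated_secs: u64,
}

impl Entry {
    /// Decay both offsets to `now_secs` and stamp the entry with that time,
    /// so two entries with different `last_updated` become directly comparable.
    fn normalize_to(&mut self, now_secs: u64) {
        let elapsed =
            Duration::from_secs(now_secs.saturating_sub(self.last_updated_secs));
        self.min_liquidity_offset_msat = decayed(self.min_liquidity_offset_msat, elapsed);
        self.max_liquidity_offset_msat = decayed(self.max_liquidity_offset_msat, elapsed);
        self.last_updated_secs = now_secs;
    }
}
```

After normalizing both `self` and `other` to the same time, a merge can compare the offsets directly without one side being over-weighted by staler data.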

@wpaulino
Contributor

Is this still on your radar @benthecarman?

@benthecarman benthecarman deleted the combine-scorer branch October 31, 2023 17:58
@benthecarman
Contributor Author

This would be cool functionality but I'm not really sure what I'm doing here

@TheBlueMatt
Collaborator

Shame, I don't think this PR was that far away.

@TheBlueMatt TheBlueMatt mentioned this pull request Nov 5, 2023
6 participants