
Allow combining of Prob Scorers #2469


Closed
51 changes: 51 additions & 0 deletions lightning/src/routing/scoring.rs
@@ -650,6 +650,20 @@ struct HistoricalBucketRangeTracker {

impl HistoricalBucketRangeTracker {
	fn new() -> Self { Self { buckets: [0; 8] } }

	/// Returns a tracker whose buckets are the element-wise average of the two trackers' buckets.
	pub(crate) fn combine(mut self, other: Self) -> Self {
		let mut buckets: [u16; 8] = [0; 8];
		for i in 0..8 {
			let current = self.buckets[i];
			let other = other.buckets[i];
			// Average in a wider type so the u16 addition cannot overflow.
			buckets[i] = ((current as u32 + other as u32) / 2) as u16;
		}

		self.buckets = buckets;
		self
	}

	fn track_datapoint(&mut self, liquidity_offset_msat: u64, capacity_msat: u64) {
		// We have 8 leaky buckets for min and max liquidity. Each bucket tracks the amount of time
		// we spend in each bucket as a 16-bit fixed-point number with a 5 bit fractional part.
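As a rough aside (not part of this diff), the sketch below walks through what the element-wise averaging in combine means for the fixed-point buckets described in the comment above; the standalone Tracker type is a hypothetical stand-in for HistoricalBucketRangeTracker, mirroring only its buckets field.

// Illustrative stand-in for HistoricalBucketRangeTracker, reduced to its `buckets` field.
struct Tracker { buckets: [u16; 8] }

// Element-wise average of two trackers, matching the combine() logic above.
fn combine(a: Tracker, b: Tracker) -> Tracker {
	let mut buckets = [0u16; 8];
	for i in 0..8 {
		buckets[i] = ((a.buckets[i] as u32 + b.buckets[i] as u32) / 2) as u16;
	}
	Tracker { buckets }
}

fn main() {
	// With a 5-bit fractional part, 1 << 5 == 32 represents one full unit of tracked time.
	let a = Tracker { buckets: [32, 0, 0, 0, 0, 0, 0, 64] };
	let b = Tracker { buckets: [0, 0, 0, 0, 0, 0, 0, 32] };
	assert_eq!(combine(a, b).buckets, [16, 0, 0, 0, 0, 0, 0, 48]);
}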
@@ -787,6 +801,7 @@ impl HistoricalMinMaxBuckets<'_> {
/// Direction is defined in terms of [`NodeId`] partial ordering, where the source node is the
/// first node in the ordering of the channel's counterparties. Thus, swapping the two liquidity
/// offset fields gives the opposite direction.
#[derive(Clone)]
struct ChannelLiquidity<T: Time> {
	/// Lower channel liquidity bound in terms of an offset from zero.
	min_liquidity_offset_msat: u64,
@@ -801,6 +816,21 @@ struct ChannelLiquidity<T: Time> {
	max_liquidity_offset_history: HistoricalBucketRangeTracker,
}
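To illustrate the direction convention described in the doc comment above (purely as a hypothetical sketch, not code from this PR), flipping the direction of a channel amounts to swapping which stored offset plays the "min" and "max" role; plain integers stand in for NodeIds here.

// Hypothetical, simplified stand-in for the two offsets stored per channel.
struct Offsets { min_offset_msat: u64, max_offset_msat: u64 }

// View the stored offsets from `source` towards `target`: when `source` orders first
// they apply as stored, otherwise the two bounds swap roles (the opposite direction).
fn directed(offsets: &Offsets, source: u32, target: u32) -> (u64, u64) {
	if source < target {
		(offsets.min_offset_msat, offsets.max_offset_msat)
	} else {
		(offsets.max_offset_msat, offsets.min_offset_msat)
	}
}

fn main() {
	let offsets = Offsets { min_offset_msat: 1_000, max_offset_msat: 9_000 };
	assert_eq!(directed(&offsets, 1, 2), (1_000, 9_000));
	assert_eq!(directed(&offsets, 2, 1), (9_000, 1_000));
}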

impl<T: Time> ChannelLiquidity<T> {
	pub(crate) fn combine(mut self, other: Self) -> Self {
		self.min_liquidity_offset_msat = self.min_liquidity_offset_msat.max(other.min_liquidity_offset_msat);
		self.max_liquidity_offset_msat = self.max_liquidity_offset_msat.min(other.max_liquidity_offset_msat);
Contributor commented on lines +821 to +822:

Please add a test. You need to use the methods on DirectedChannelLiquidity otherwise these aren't decayed and thus aren't directly comparable. Each ChannelLiquidity will likely have a different last_updated, meaning one may need to be decayed more than the other.

		self.min_liquidity_offset_history = self.min_liquidity_offset_history.combine(other.min_liquidity_offset_history);
		self.max_liquidity_offset_history = self.max_liquidity_offset_history.combine(other.max_liquidity_offset_history);

		// Keep the more recent `last_updated` of the two.
		if other.last_updated.duration_since(self.last_updated) > Duration::from_secs(0) {
			self.last_updated = other.last_updated;
		}

Contributor commented:

Changing last_updated without decaying the liquidity offsets to reflect the new time won't work. The data is stored as undecayed offsets. When the penalty is calculated, the offsets are decayed based on elapsed time since the last_updated. And any time last_updated is modified, any unaffected offset from the update is set to its decayed value. You'll need to do something similar here, otherwise offsets may not be properly decayed.

See DirectedChannelLiquidity::set_min_liquidity_msat and DirectedChannelLiquidity::set_max_liquidity_msat. I think it would be easiest to normalize self and other by setting each offset to their decayed values and set last_updated to T::now(). Then they can be directly compared.

		self
	}
}
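To make the reviewer's decay point concrete, here is a rough, hypothetical sketch of the suggested normalization: decay both sides to a common "now" before comparing, then set last_updated to that time. The Liquidity struct, the decayed() helper, the half_life parameter, and the halve-once-per-half-life arithmetic are all illustrative stand-ins, not the crate's actual DirectedChannelLiquidity logic.

use std::time::{Duration, Instant};

// Hypothetical stand-in for ChannelLiquidity, reduced to one offset and its timestamp.
struct Liquidity {
	min_liquidity_offset_msat: u64,
	last_updated: Instant,
}

// Decay an undecayed offset by halving it once per elapsed half-life (a simplification
// of the crate's decay; the real logic lives in DirectedChannelLiquidity).
fn decayed(offset_msat: u64, last_updated: Instant, now: Instant, half_life: Duration) -> u64 {
	let half_lives = now.saturating_duration_since(last_updated).as_secs() / half_life.as_secs().max(1);
	offset_msat.checked_shr(half_lives as u32).unwrap_or(0)
}

// Normalize both sides to `now`, after which the decayed values can be compared directly.
fn combine(a: Liquidity, b: Liquidity, half_life: Duration) -> Liquidity {
	let now = Instant::now();
	let a_min = decayed(a.min_liquidity_offset_msat, a.last_updated, now, half_life);
	let b_min = decayed(b.min_liquidity_offset_msat, b.last_updated, now, half_life);
	Liquidity { min_liquidity_offset_msat: a_min.max(b_min), last_updated: now }
}

fn main() {
	let half_life = Duration::from_secs(3600);
	let now = Instant::now();
	let a = Liquidity { min_liquidity_offset_msat: 4_000, last_updated: now };
	let b = Liquidity { min_liquidity_offset_msat: 1_500, last_updated: now };
	// Both sides were just updated, so no decay applies and the larger min offset wins.
	assert_eq!(combine(a, b, half_life).min_liquidity_offset_msat, 4_000);
}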

/// A snapshot of [`ChannelLiquidity`] in one direction assuming a certain channel capacity and
/// decayed with a given half life.
struct DirectedChannelLiquidity<L: Deref<Target = u64>, BRT: Deref<Target = HistoricalBucketRangeTracker>, T: Time, U: Deref<Target = T>> {
@@ -942,6 +972,27 @@ impl<G: Deref<Target = NetworkGraph<L>>, L: Deref, T: Time> ProbabilisticScorerU
		}
		None
	}

	/// Combines two `ProbabilisticScorerUsingTime` into one.
	///
	/// This merges the channel liquidity information of both scorers.
	pub fn combine(mut self, other: Self) -> Self {
		let mut channel_liquidities = self.channel_liquidities;

		for (id, item) in other.channel_liquidities {
			match channel_liquidities.get(&id) {
				None => { channel_liquidities.insert(id, item); },
				Some(current) => {
					let liquidity = current.clone();
					channel_liquidities.insert(id, liquidity.combine(item));
				}
			}
		}

		self.channel_liquidities = channel_liquidities;

		self
	}
}
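As a usage-level illustration (not from this PR), the per-channel merge policy above (insert entries that exist in only one scorer, combine the ones that overlap) can be sketched on a plain HashMap, with u64 values and max standing in for ChannelLiquidity and its combine; a real call would simply be something like scorer_a.combine(scorer_b) on two owned scorers.

use std::collections::HashMap;

// Merge `other` into `ours`: keep entries unique to either side, merge overlapping ones.
fn merge(mut ours: HashMap<u64, u64>, other: HashMap<u64, u64>) -> HashMap<u64, u64> {
	for (id, theirs) in other {
		match ours.get(&id) {
			None => { ours.insert(id, theirs); },
			Some(current) => {
				let merged = (*current).max(theirs);
				ours.insert(id, merged);
			}
		}
	}
	ours
}

fn main() {
	let ours = HashMap::from([(1, 10), (2, 20)]);
	let other = HashMap::from([(2, 25), (3, 30)]);
	let merged = merge(ours, other);
	assert_eq!(merged.get(&1), Some(&10));
	assert_eq!(merged.get(&2), Some(&25));
	assert_eq!(merged.get(&3), Some(&30));
}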

impl<T: Time> ChannelLiquidity<T> {