
Conversation

@erskingardner
Member

@erskingardner erskingardner commented Sep 11, 2025

I would like to have a section in the developer settings screen where you can inspect and delete all your key packages that have been published to relays.

This might end up in a key package settings screen in the future to help users clean up their own key packages.

I believe we might also need to build a little modal in the app to get people to publish a new key package - many early users will have key packages that are outdated at this point (I ran into this error yesterday testing an account).

Summary by CodeRabbit

  • New Features

    • New public operations to publish, fetch, single-delete, and bulk-delete an account's MLS key packages from configured relays, with optional local key cleanup and per-item/result logging.
  • Bug Fixes

    • Clearer error when an account has no key-package relays configured.
    • Minor efficiency improvements reducing unnecessary cloning in relay checks and tests.
    • Key-package publish now only occurs when a prior deletion actually completed, avoiding redundant publishes.

@coderabbitai
Contributor

coderabbitai bot commented Sep 11, 2025

Walkthrough

This PR avoids cloning single-element relay slices, adds public Whitenoise APIs to fetch/delete account key-package events (with optional local MLS cleanup), changes the key-package deletion API to return a boolean that gates publishing, and adds the AccountMissingKeyPackageRelays error variant.
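
For orientation, here is a rough caller-side sketch of how a developer-settings flow might use the new surface. The method names come from the summary above; whitenoise, account, the return-value handling, and the overall flow are illustrative assumptions rather than code from this PR.

// Hypothetical flow only: list, bulk-delete, then republish one key package.
async fn inspect_and_clean_key_packages(
    whitenoise: &Whitenoise,
    account: &Account,
) -> Result<()> {
    // Show everything currently published so the settings screen can render it.
    let events = whitenoise.fetch_all_key_packages_for_account(account).await?;
    tracing::info!("{} key package event(s) found on relays", events.len());

    // Bulk-delete from relays; `false` keeps locally stored MLS keys intact.
    whitenoise
        .delete_all_key_packages_for_account(account, false)
        .await?;

    // Publish a fresh key package so the account stays reachable for new Welcomes.
    whitenoise.publish_key_package_for_account(account).await?;
    Ok(())
}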

Changes

Cohort / File(s): Summary of Changes

  • Nostr subscriptions (src/nostr_manager/subscriptions.rs)
    Replaced ensure_relays_connected(&[relay_url.clone()]) with ensure_relays_connected(std::slice::from_ref(&relay_url)) to avoid cloning; subscription call unchanged.
  • DB test (Whitenoise) (src/whitenoise/database/group_information.rs)
    Test updated to pass a single-element slice via std::slice::from_ref(&group_id) instead of cloning the element.
  • Whitenoise key packages (src/whitenoise/key_packages.rs)
    publish_key_package_for_account made pub; deletion API renamed to delete_key_package_for_account(...) -> Result<bool> and now returns whether a deletion occurred; added fetch_all_key_packages_for_account and delete_all_key_packages_for_account; per-event streaming (10s window), optional MLS local-key cleanup, and richer logging and per-event error handling.
  • Whitenoise errors (src/whitenoise/error.rs)
    Added AccountMissingKeyPackageRelays error variant ("Account has no key package relays configured").
  • Event handler update (src/whitenoise/event_processor/event_handlers/handle_giftwrap.rs)
    Replaced unconditional delete-and-publish with let deleted = self.delete_key_package_for_account(...).await?; and only publish when deleted is true; logs when publish is skipped.

Sequence Diagram(s)

sequenceDiagram
  autonumber
  actor Caller
  participant W as Whitenoise
  participant R as Relays

  Caller->>W: fetch_all_key_packages_for_account(account)
  alt relays configured
    W->>R: stream key-package events (10s window)
    R-->>W: events[]
    W-->>Caller: Vec<Event>
  else no relays
    W-->>Caller: Err(AccountMissingKeyPackageRelays)
  end
sequenceDiagram
  autonumber
  actor Handler
  participant W as Whitenoise
  participant L as MLS_Storage
  participant R as Relays

  Handler->>W: delete_key_package_for_account(account, delete_local?)
  W->>W: find_all_key_packages_for_account(account)
  alt events found
    loop per event
      opt delete_local?
        W->>L: remove local MLS key
        L-->>W: OK/Err
      end
      W->>R: publish deletion (signed)
      R-->>W: Ack/Result
    end
    W-->>Handler: Ok(true/false or count)
  else no events
    W-->>Handler: Ok(false)
  end

Estimated code review effort

🎯 4 (Complex) | ⏱️ ~45 minutes

Possibly related PRs

Suggested reviewers

  • delcin-raj
  • jgmontoya

Pre-merge checks (3 passed)

✅ Passed checks (3 passed)
  • Description Check (✅ Passed): Check skipped - CodeRabbit’s high-level summary is enabled.
  • Title Check (✅ Passed): The title "Add methods to manage key_packages" is a concise, single-sentence summary that accurately captures the primary change (adding key-package management methods and related delete/fetch/publish behavior) described in the PR and raw summary. It is specific enough for a reviewer to understand the main intent, avoids noise or vague wording, and aligns with the PR objectives about inspecting and deleting key packages.
  • Docstring Coverage (✅ Passed): Docstring coverage is 100.00%, which is sufficient. The required threshold is 80.00%.

Poem

A rabbit nibbles at the code so bright,
I borrow slices, skip the needless bite.
I find old keys and sweep their track,
Only publish when deletions come back —
Hoppity hops, relays rest through the night. 🥕


Contributor

@coderabbitai coderabbitai bot left a comment


Actionable comments posted: 1

🧹 Nitpick comments (3)
src/whitenoise/key_packages.rs (3)

136-139: Deduplicate events returned from multiple relays.

The same event can arrive from several relays; without de-duping, downstream deletion will publish duplicates and attempt duplicate local deletions.

Apply:

@@
-        let mut key_package_events = Vec::new();
+        let mut key_package_events = Vec::new();
         while let Some(event) = key_package_stream.next().await {
             key_package_events.push(event);
         }
+
+        // Dedupe by event id
+        let mut seen = std::collections::HashSet::new();
+        key_package_events.retain(|e| seen.insert(e.id));

Also applies to: 141-149


196-229: Avoid re-creating NostrMls per event; create once when needed.

Constructing storage for every event is I/O-heavy. Build it once when delete_mls_stored_keys is true, reuse it across the loop, and keep existing “warn and continue” behavior on init failure.

Apply:

@@
-        let mut deleted_count = 0;
+        let mut deleted_count = 0;
+
+        // Create NostrMls once if we need to delete locally stored keys
+        let nostr_mls = if delete_mls_stored_keys {
+            match Account::create_nostr_mls(account.pubkey, &self.config.data_dir) {
+                Ok(v) => Some(v),
+                Err(e) => {
+                    tracing::warn!(
+                        target: "whitenoise::delete_all_key_packages_for_account",
+                        "Failed to create NostrMls instance: {}",
+                        e
+                    );
+                    None
+                }
+            }
+        } else {
+            None
+        };
@@
-            if delete_mls_stored_keys {
-                match Account::create_nostr_mls(account.pubkey, &self.config.data_dir) {
-                    Ok(nostr_mls) => match nostr_mls.parse_key_package(event) {
-                        Ok(key_package) => {
-                            if let Err(e) = nostr_mls.delete_key_package_from_storage(&key_package)
-                            {
-                                tracing::warn!(
-                                    target: "whitenoise::delete_all_key_packages_for_account",
-                                    "Failed to delete key package from storage for event {}: {}",
-                                    event.id,
-                                    e
-                                );
-                            }
-                        }
-                        Err(e) => {
-                            tracing::warn!(
-                                target: "whitenoise::delete_all_key_packages_for_account",
-                                "Failed to parse key package for event {}: {}",
-                                event.id,
-                                e
-                            );
-                        }
-                    },
-                    Err(e) => {
-                        tracing::warn!(
-                            target: "whitenoise::delete_all_key_packages_for_account",
-                            "Failed to create NostrMls instance: {}",
-                            e
-                        );
-                    }
-                }
-            }
+            if let (true, Some(nostr_mls)) = (delete_mls_stored_keys, &nostr_mls) {
+                match nostr_mls.parse_key_package(event) {
+                    Ok(key_package) => {
+                        if let Err(e) = nostr_mls.delete_key_package_from_storage(&key_package) {
+                            tracing::warn!(
+                                target: "whitenoise::delete_all_key_packages_for_account",
+                                "Failed to delete key package from storage for event {}: {}",
+                                event.id,
+                                e
+                            );
+                        }
+                    }
+                    Err(e) => {
+                        tracing::warn!(
+                            target: "whitenoise::delete_all_key_packages_for_account",
+                            "Failed to parse key package for event {}: {}",
+                            event.id,
+                            e
+                        );
+                    }
+                }
+            }

196-253: Optional: publish deletions with bounded concurrency.

For many key packages, publishing sequentially can be slow. Consider FuturesUnordered with a small concurrency cap (e.g., 8–16) to improve latency while not overloading relays. Keep counters based on per-future results.
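
A minimal sketch of that idea, assuming the futures crate's buffer_unordered (which is backed by FuturesUnordered) and reusing key_package_events, key_package_relays, and signer from the surrounding function; the cap value and success criterion are illustrative, not the PR's code.

use futures::stream::{self, StreamExt};

// Illustrative cap; the comment above suggests something in the 8–16 range.
const MAX_CONCURRENT_DELETIONS: usize = 8;

let deleted_count = stream::iter(key_package_events.iter())
    .map(|event| {
        let signer = signer.clone();
        async move {
            // Publish the deletion; yield whether this particular publish succeeded.
            self.nostr
                .publish_event_deletion_with_signer(&event.id, &key_package_relays, signer)
                .await
                .is_ok()
        }
    })
    .buffer_unordered(MAX_CONCURRENT_DELETIONS)
    .filter(|ok| std::future::ready(*ok))
    .count()
    .await;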

📜 Review details

Configuration used: CodeRabbit UI

Review profile: CHILL

Plan: Pro

📥 Commits

Reviewing files that changed from the base of the PR and between e70462b and 90b560e.

📒 Files selected for processing (3)
  • src/nostr_manager/subscriptions.rs (1 hunks)
  • src/whitenoise/database/group_information.rs (1 hunks)
  • src/whitenoise/key_packages.rs (1 hunks)
🧰 Additional context used
🧠 Learnings (3)
📓 Common learnings
Learnt from: jgmontoya
PR: parres-hq/whitenoise#310
File: src/whitenoise/accounts/mod.rs:669-692
Timestamp: 2025-08-11T10:25:55.436Z
Learning: In the Whitenoise codebase, the `NostrManager::publish_event_builder_with_signer` method automatically ensures relay connectivity by calling `self.ensure_relays_connected()` before publishing events. This means explicit relay connectivity checks are not needed before calling methods like `publish_relay_list_for_pubkey` that internally use `publish_event_builder_with_signer`.
Learnt from: jgmontoya
PR: parres-hq/whitenoise#310
File: src/whitenoise/accounts/mod.rs:669-692
Timestamp: 2025-08-11T10:25:55.436Z
Learning: In the Whitenoise codebase, the `NostrManager::publish_event_builder_with_signer` method automatically ensures relay connectivity by calling `self.add_relays()` before publishing events. This means explicit relay connectivity checks are not needed before calling publishing methods that use this function.
Learnt from: jgmontoya
PR: parres-hq/whitenoise#324
File: src/integration_tests/test_cases/subscription_processing/publish_metadata_update.rs:42-45
Timestamp: 2025-08-23T16:27:41.511Z
Learning: In Whitenoise integration tests, using `unwrap()` on `test_client.add_relay()` for dev relay connections is acceptable since these are controlled test environments where failing fast is preferred over complex error handling.
Learnt from: jgmontoya
PR: parres-hq/whitenoise#340
File: src/whitenoise/event_processor/mod.rs:121-158
Timestamp: 2025-09-04T16:58:00.736Z
Learning: Repository parres-hq/whitenoise: For retry scheduling hardening (schedule_retry in src/whitenoise/event_processor/mod.rs and backoff logic in RetryInfo in src/types.rs), the team prefers to land the improvements in a separate PR, tracked via a GitHub issue, as requested by jgmontoya in PR #340.
📚 Learning: 2025-09-04T03:38:54.953Z
Learnt from: jgmontoya
PR: parres-hq/whitenoise#340
File: src/nostr_manager/subscriptions.rs:72-89
Timestamp: 2025-09-04T03:38:54.953Z
Learning: In the global subscription batching system for Whitenoise (src/nostr_manager/subscriptions.rs), the current hash(pubkey) % batch_count approach causes batch ID reshuffling when user count changes, potentially leading to stale/duplicate author lists since only the triggering user's batch is refreshed. This is a known limitation documented in PR #340 that the team is aware of and will address in future iterations.

Applied to files:

  • src/nostr_manager/subscriptions.rs
📚 Learning: 2025-08-11T10:25:55.436Z
Learnt from: jgmontoya
PR: parres-hq/whitenoise#310
File: src/whitenoise/accounts/mod.rs:669-692
Timestamp: 2025-08-11T10:25:55.436Z
Learning: In the Whitenoise codebase, the `NostrManager::publish_event_builder_with_signer` method automatically ensures relay connectivity by calling `self.ensure_relays_connected()` before publishing events. This means explicit relay connectivity checks are not needed before calling methods like `publish_relay_list_for_pubkey` that internally use `publish_event_builder_with_signer`.

Applied to files:

  • src/nostr_manager/subscriptions.rs
🧬 Code graph analysis (1)
src/whitenoise/key_packages.rs (1)
src/whitenoise/accounts.rs (6)
  • account (1070-1076)
  • account (1168-1174)
  • key_package_relays (110-116)
  • new (51-70)
  • create_nostr_mls (250-257)
  • create_nostr_mls (957-959)
⏰ Context from checks skipped due to timeout of 90000ms. You can increase the timeout in your CodeRabbit configuration to a maximum of 15 minutes (900000ms). (3)
  • GitHub Check: check (macos-14, native)
  • GitHub Check: check (macos-latest, native)
  • GitHub Check: check (ubuntu-latest, native)
🔇 Additional comments (3)
src/whitenoise/database/group_information.rs (1)

312-314: Nice borrow: avoid cloning in the single-element slice.

Using std::slice::from_ref(&group_id) removes an unnecessary clone and keeps intent clear.

src/nostr_manager/subscriptions.rs (1)

170-173: Good: zero-copy relay slice.

Switching to std::slice::from_ref(&relay_url) avoids a temporary Vec/clone and keeps ownership of relay_url for the subsequent move.

src/whitenoise/key_packages.rs (1)

231-235: No change needed — publish_event_deletion_with_signer already ensures relay connectivity.
publish_event_deletion_with_signer delegates to publish_event_builder_with_signer, which calls self.ensure_relays_connected(&urls).

Contributor

@coderabbitai coderabbitai bot left a comment


Actionable comments posted: 0

Caution

Some comments are outside the diff and can’t be posted inline due to platform limitations.

⚠️ Outside diff range comments (1)
src/whitenoise/key_packages.rs (1)

60-70: Potentially missing events: streaming only from currently-connected relays. Ensure connection or stream from explicit relays.

Without connecting to the account’s key-package relays first, stream_events may miss the target event on disconnected relays.

Apply:

@@
-        let key_package_filter = Filter::new()
+        let key_package_filter = Filter::new()
             .id(*event_id)
             .kind(Kind::MlsKeyPackage)
             .author(account.pubkey);
 
-        let mut key_package_stream = self
-            .nostr
-            .client
-            .stream_events(key_package_filter, Duration::from_secs(5))
-            .await?;
+        // Ensure we're querying the account's key-package relays
+        let key_package_relays = account.key_package_relays(self).await?;
+        if key_package_relays.is_empty() {
+            return Err(WhitenoiseError::AccountMissingKeyPackageRelays);
+        }
+        let relay_urls: Vec<RelayUrl> =
+            key_package_relays.iter().map(|r| r.url.clone()).collect();
+        // Option A: connect first, then stream
+        // self.nostr.ensure_relays_connected(&relay_urls).await?;
+        // let mut key_package_stream = self
+        //     .nostr
+        //     .client
+        //     .stream_events(key_package_filter, Duration::from_secs(5))
+        //     .await?;
+        // Option B (preferred): stream directly from explicit relays
+        let mut key_package_stream = self
+            .nostr
+            .client
+            .stream_events_from(relay_urls, key_package_filter, Duration::from_secs(5))
+            .await?;
@@
-            let key_package_relays = account.key_package_relays(self).await?;
-            if key_package_relays.is_empty() {
-                return Err(WhitenoiseError::AccountMissingKeyPackageRelays);
-            }
-
             self.nostr
                 .publish_event_deletion_with_signer(
                     &event.id,
                     &key_package_relays,
                     signer,
                 )
                 .await?;

Also applies to: 86-95

♻️ Duplicate comments (1)
src/whitenoise/key_packages.rs (1)

128-144: Resolved: now streaming from explicit relay URLs and using the computed list.

This addresses the earlier concern about unused relay_urls and disconnected-relay queries.

🧹 Nitpick comments (6)
src/whitenoise/key_packages.rs (6)

24-27: Propagate Nostr MLS errors with the typed variant instead of wrapping into Configuration.

Preserve error fidelity and existing #[from] nostr_mls::Error conversion.

Apply:

-        let result = nostr_mls
-            .create_key_package_for_event(&account.pubkey, key_package_relay_urls)
-            .map_err(|e| WhitenoiseError::Configuration(format!("NostrMls error: {}", e)))?;
+        let result = nostr_mls
+            .create_key_package_for_event(&account.pubkey, key_package_relay_urls)?;

37-39: Remove redundant empty-relays check (already enforced in encoded_key_package).

Avoid double-fetch and double-branch; rely on the helper’s check.

Apply:

-        if relays.is_empty() {
-            return Err(WhitenoiseError::AccountMissingKeyPackageRelays);
-        }

145-158: De-duplicate events across relays before returning.

Multiple relays may yield the same event; avoid downstream duplicate deletions.

Apply:

-        let mut key_package_events = Vec::new();
-        while let Some(event) = key_package_stream.next().await {
-            key_package_events.push(event);
-        }
+        use std::collections::HashSet;
+        let mut seen = HashSet::new();
+        let mut key_package_events = Vec::new();
+        while let Some(event) = key_package_stream.next().await {
+            if seen.insert(event.id) {
+                key_package_events.push(event);
+            }
+        }

199-206: Minor: avoid secrets-store access when relays are missing.

Fetch and validate relays first, then load signing keys.

Apply:

-        let signer = self
-            .secrets_store
-            .get_nostr_keys_for_pubkey(&account.pubkey)?;
-        let key_package_relays = account.key_package_relays(self).await?;
-
-        if key_package_relays.is_empty() {
-            return Err(WhitenoiseError::AccountMissingKeyPackageRelays);
-        }
+        let key_package_relays = account.key_package_relays(self).await?;
+        if key_package_relays.is_empty() {
+            return Err(WhitenoiseError::AccountMissingKeyPackageRelays);
+        }
+        let signer = self
+            .secrets_store
+            .get_nostr_keys_for_pubkey(&account.pubkey)?;

208-241: Avoid per-event MLS storage re-initialization; create once, reuse.

Cuts repeated filesystem/DB work and log spam while preserving non-fatal behavior on errors.

Apply:

         let mut deleted_count = 0;
 
-        for event in &key_package_events {
-            if delete_mls_stored_keys {
-                match Account::create_nostr_mls(account.pubkey, &self.config.data_dir) {
-                    Ok(nostr_mls) => match nostr_mls.parse_key_package(event) {
-                        Ok(key_package) => {
-                            if let Err(e) = nostr_mls.delete_key_package_from_storage(&key_package)
-                            {
-                                tracing::warn!(
-                                    target: "whitenoise::delete_all_key_packages_for_account",
-                                    "Failed to delete key package from storage for event {}: {}",
-                                    event.id,
-                                    e
-                                );
-                            }
-                        }
-                        Err(e) => {
-                            tracing::warn!(
-                                target: "whitenoise::delete_all_key_packages_for_account",
-                                "Failed to parse key package for event {}: {}",
-                                event.id,
-                                e
-                            );
-                        }
-                    },
-                    Err(e) => {
-                        tracing::warn!(
-                            target: "whitenoise::delete_all_key_packages_for_account",
-                            "Failed to create NostrMls instance: {}",
-                            e
-                        );
-                    }
-                }
-            }
+        // Initialize MLS storage once if requested; continue without it on failure.
+        let nostr_mls = if delete_mls_stored_keys {
+            match Account::create_nostr_mls(account.pubkey, &self.config.data_dir) {
+                Ok(m) => Some(m),
+                Err(e) => {
+                    tracing::warn!(
+                        target: "whitenoise::delete_all_key_packages_for_account",
+                        "Failed to create NostrMls instance (will skip local key deletion): {}",
+                        e
+                    );
+                    None
+                }
+            }
+        } else {
+            None
+        };
+
+        for event in &key_package_events {
+            if let Some(nostr_mls) = &nostr_mls {
+                match nostr_mls.parse_key_package(event) {
+                    Ok(key_package) => {
+                        if let Err(e) = nostr_mls.delete_key_package_from_storage(&key_package) {
+                            tracing::warn!(
+                                target: "whitenoise::delete_all_key_packages_for_account",
+                                "Failed to delete key package from storage for event {}: {}",
+                                event.id,
+                                e
+                            );
+                        }
+                    }
+                    Err(e) => {
+                        tracing::warn!(
+                            target: "whitenoise::delete_all_key_packages_for_account",
+                            "Failed to parse key package for event {}: {}",
+                            event.id,
+                            e
+                        );
+                    }
+                }
+            }

139-144: Optional: expose the stream timeout as a configurable constant.

Makes tests and UX tuning easier without code changes.
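
A tiny illustrative sketch of that; the constant name, where it lives, and the default value are choices made for this example only.

// Could sit at module scope (or on the config struct) so tests can shrink it.
const KEY_PACKAGE_STREAM_TIMEOUT: Duration = Duration::from_secs(10);

let mut key_package_stream = self
    .nostr
    .client
    .stream_events_from(relay_urls, key_package_filter, KEY_PACKAGE_STREAM_TIMEOUT)
    .await?;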

📜 Review details

Configuration used: CodeRabbit UI

Review profile: CHILL

Plan: Pro

📥 Commits

Reviewing files that changed from the base of the PR and between d2a88da and 232af19.

📒 Files selected for processing (2)
  • src/whitenoise/error.rs (1 hunks)
  • src/whitenoise/key_packages.rs (5 hunks)
🧰 Additional context used
🧠 Learnings (4)
📚 Learning: 2025-08-17T19:34:30.333Z
Learnt from: jmcorgan
PR: parres-hq/whitenoise#318
File: src/whitenoise/accounts/messages.rs:193-195
Timestamp: 2025-08-17T19:34:30.333Z
Learning: In the whitenoise codebase, `Account::create_nostr_mls()` returns `Result<NostrMls<NostrMlsSqliteStorage>, AccountError>`, and `WhitenoiseError` has an `Account(#[from] AccountError)` variant that automatically converts. When handling MLS creation errors, use the `?` operator to let `AccountError` convert naturally to `WhitenoiseError::Account` rather than manually mapping to `WhitenoiseError::Configuration`.

Applied to files:

  • src/whitenoise/error.rs
📚 Learning: 2025-08-17T19:35:53.520Z
Learnt from: jmcorgan
PR: parres-hq/whitenoise#318
File: src/whitenoise/accounts/welcomes.rs:53-53
Timestamp: 2025-08-17T19:35:53.520Z
Learning: In the Whitenoise codebase, Account::create_nostr_mls should have its errors mapped to WhitenoiseError::NostrMlsError instead of using anyhow::anyhow!() wrapping, as this preserves type safety and specific error information.

Applied to files:

  • src/whitenoise/error.rs
📚 Learning: 2025-08-11T10:25:55.436Z
Learnt from: jgmontoya
PR: parres-hq/whitenoise#310
File: src/whitenoise/accounts/mod.rs:669-692
Timestamp: 2025-08-11T10:25:55.436Z
Learning: In the Whitenoise codebase, the `NostrManager::publish_event_builder_with_signer` method automatically ensures relay connectivity by calling `self.ensure_relays_connected()` before publishing events. This means explicit relay connectivity checks are not needed before calling methods like `publish_relay_list_for_pubkey` that internally use `publish_event_builder_with_signer`.

Applied to files:

  • src/whitenoise/key_packages.rs
📚 Learning: 2025-08-11T10:23:59.406Z
Learnt from: jgmontoya
PR: parres-hq/whitenoise#310
File: src/whitenoise/accounts/relays.rs:260-278
Timestamp: 2025-08-11T10:23:59.406Z
Learning: In `src/whitenoise/accounts/relays.rs`, the `publish_relay_list_for_account` method is only called after the account setup is complete, which guarantees that `account.nip65_relays` is already populated and will never be empty. Therefore, no fallback logic is needed when `target_relays` is None.

Applied to files:

  • src/whitenoise/key_packages.rs
🧬 Code graph analysis (1)
src/whitenoise/key_packages.rs (2)
src/whitenoise/accounts.rs (8)
  • key_package_relays (110-116)
  • account (1070-1076)
  • account (1168-1174)
  • relays (85-93)
  • relays (488-491)
  • new (51-70)
  • create_nostr_mls (250-257)
  • create_nostr_mls (957-959)
src/whitenoise/database/users.rs (1)
  • relays (166-198)
⏰ Context from checks skipped due to timeout of 90000ms. You can increase the timeout in your CodeRabbit configuration to a maximum of 15 minutes (900000ms). (2)
  • GitHub Check: check (macos-latest, native)
  • GitHub Check: check (ubuntu-latest, native)
🔇 Additional comments (2)
src/whitenoise/error.rs (1)

38-40: New error variant is appropriate and well-scoped.

Clear message, consistent naming, and placement. No issues.

src/whitenoise/key_packages.rs (1)

16-18: Good: explicit guard for missing key-package relays.

Prevents misleading publishes and surfaces a precise error.

Contributor

@coderabbitai coderabbitai bot left a comment


Actionable comments posted: 1

Caution

Some comments are outside the diff and can’t be posted inline due to platform limitations.

⚠️ Outside diff range comments (1)
src/whitenoise/key_packages.rs (1)

67-76: Ensure streaming targets the account's key-package relays and connect first

Replace the client.stream_events(...) call in src/whitenoise/key_packages.rs (lines ~67–76) with an explicit lookup of the account's key-package relays, call self.nostr.ensure_relays_connected(&relay_urls).await? and then use client.stream_events_from(relay_urls, ...). Add use futures_util::StreamExt; if the trait import is missing.

@@
-        let mut key_package_stream = self
-            .nostr
-            .client
-            .stream_events(key_package_filter, Duration::from_secs(5))
-            .await?;
+        // Query explicitly against the account's key-package relays
+        let relay_urls: Vec<RelayUrl> = account
+            .key_package_relays(self)
+            .await?
+            .into_iter()
+            .map(|r| r.url)
+            .collect();
+
+        // Optional but recommended: ensure connectivity to avoid partial discovery
+        self.nostr.ensure_relays_connected(&relay_urls).await?;
+
+        let mut key_package_stream = self
+            .nostr
+            .client
+            .stream_events_from(relay_urls, key_package_filter, Duration::from_secs(5))
+            .await?;

Also add this import if absent:

+use futures_util::StreamExt;
♻️ Duplicate comments (1)
src/whitenoise/key_packages.rs (1)

132-149: Ensure connections before streaming from specific relays; also dedupe results

Good: using stream_events_from with explicit relays. However, unless stream_events_from also dials, you may still miss events on disconnected relays. Ensure connectivity first. Additionally, dedupe events by id across relays.

@@
-        let mut key_package_stream = self
+        // Ensure we're connected to these relays before streaming
+        self.nostr.ensure_relays_connected(&relay_urls).await?;
+
+        let mut key_package_stream = self
             .nostr
             .client
             .stream_events_from(relay_urls, key_package_filter, Duration::from_secs(10))
             .await?;
@@
-        let mut key_package_events = Vec::new();
-        while let Some(event) = key_package_stream.next().await {
-            key_package_events.push(event);
-        }
+        let mut seen = std::collections::HashSet::new();
+        let mut key_package_events = Vec::new();
+        while let Some(event) = key_package_stream.next().await {
+            if seen.insert(event.id) {
+                key_package_events.push(event);
+            }
+        }

Additional imports (outside this hunk):

+use futures_util::StreamExt;
+use std::collections::HashSet;
Does nostr-sdk Client::stream_events_from connect to relays automatically, or must callers ensure connections first?
🧹 Nitpick comments (4)
src/whitenoise/event_processor/event_handlers/handle_giftwrap.rs (1)

74-80: Include the key-package event id in logs for traceability

Make debugging easier by logging the event id you attempted to delete.

-                tracing::debug!(target: "whitenoise::event_processor::process_welcome", "Deleted used key package from relays");
+                tracing::debug!(target: "whitenoise::event_processor::process_welcome", "Deleted used key package from relays: {}", key_package_event_id);
@@
-                tracing::debug!(target: "whitenoise::event_processor::process_welcome", "Key package not found or already deleted");
+                tracing::debug!(target: "whitenoise::event_processor::process_welcome", "Key package not found or already deleted: {}", key_package_event_id);
src/whitenoise/key_packages.rs (3)

37-39: Redundant empty-relays check

publish_key_package_for_account already calls encoded_key_package(), which returns AccountMissingKeyPackageRelays when empty. This second check is redundant; remove to reduce duplicate I/O and branching.

-        if relays.is_empty() {
-            return Err(WhitenoiseError::AccountMissingKeyPackageRelays);
-        }

104-106: Update the tracing target to match the function name

The log target still references the old function name. Rename for consistency.

-            tracing::warn!(target: "whitenoise::delete_key_package_from_relays_for_account", "Key package event not found for account: {}", account.pubkey.to_hex());
+            tracing::warn!(target: "whitenoise::delete_key_package_for_account", "Key package event not found for account: {}", account.pubkey.to_hex());

249-256: Count deletions only when any relay reports success

Treat a publish that returned with an empty success set as a non-deletion, so metrics reflect actual state.

-                Ok(_) => {
-                    deleted_count += 1;
+                Ok(res) => {
+                    if !res.success.is_empty() {
+                        deleted_count += 1;
+                    }
📜 Review details

Configuration used: CodeRabbit UI

Review profile: CHILL

Plan: Pro

📥 Commits

Reviewing files that changed from the base of the PR and between 232af19 and 1ce5b44.

📒 Files selected for processing (2)
  • src/whitenoise/event_processor/event_handlers/handle_giftwrap.rs (1 hunks)
  • src/whitenoise/key_packages.rs (4 hunks)
🧰 Additional context used
🧠 Learnings (2)
📓 Common learnings
Learnt from: jgmontoya
PR: parres-hq/whitenoise#310
File: src/whitenoise/accounts/relays.rs:260-278
Timestamp: 2025-08-11T10:23:59.406Z
Learning: In `src/whitenoise/accounts/relays.rs`, the `publish_relay_list_for_account` method is only called after the account setup is complete, which guarantees that `account.nip65_relays` is already populated and will never be empty. Therefore, no fallback logic is needed when `target_relays` is None.
📚 Learning: 2025-08-11T10:25:55.436Z
Learnt from: jgmontoya
PR: parres-hq/whitenoise#310
File: src/whitenoise/accounts/mod.rs:669-692
Timestamp: 2025-08-11T10:25:55.436Z
Learning: In the Whitenoise codebase, the `NostrManager::publish_event_builder_with_signer` method automatically ensures relay connectivity by calling `self.ensure_relays_connected()` before publishing events. This means explicit relay connectivity checks are not needed before calling methods like `publish_relay_list_for_pubkey` that internally use `publish_event_builder_with_signer`.

Applied to files:

  • src/whitenoise/key_packages.rs
🧬 Code graph analysis (2)
src/whitenoise/event_processor/event_handlers/handle_giftwrap.rs (1)
src/whitenoise/accounts.rs (2)
  • account (1070-1076)
  • account (1168-1174)
src/whitenoise/key_packages.rs (2)
src/whitenoise/accounts.rs (8)
  • key_package_relays (110-116)
  • account (1070-1076)
  • account (1168-1174)
  • relays (85-93)
  • relays (488-491)
  • new (51-70)
  • create_nostr_mls (250-257)
  • create_nostr_mls (957-959)
src/whitenoise/database/users.rs (1)
  • relays (166-198)
⏰ Context from checks skipped due to timeout of 90000ms. You can increase the timeout in your CodeRabbit configuration to a maximum of 15 minutes (900000ms). (3)
  • GitHub Check: check (ubuntu-latest, native)
  • GitHub Check: check (macos-14, native)
  • GitHub Check: check (macos-latest, native)
🔇 Additional comments (1)
src/whitenoise/key_packages.rs (1)

14-18: Good guard: fail fast when no key-package relays are configured

Early return with AccountMissingKeyPackageRelays is correct here and prevents generating unusable packages.

Contributor

@coderabbitai coderabbitai bot left a comment


Actionable comments posted: 0

Caution

Some comments are outside the diff and can’t be posted inline due to platform limitations.

⚠️ Outside diff range comments (1)
src/whitenoise/key_packages.rs (1)

67-77: Stream from the account’s key-package relays to avoid false negatives.

stream_events only queries currently-connected relays. If we're not connected, we'll miss the event and incorrectly return false, which can suppress publishing in the Welcome path. Query the configured key-package relays explicitly and reuse them for deletion.

Apply:

-        let mut key_package_stream = self
-            .nostr
-            .client
-            .stream_events(key_package_filter, Duration::from_secs(5))
-            .await?;
+        // Query only the account's key-package relays to avoid connectivity-related false negatives
+        let key_package_relays = account.key_package_relays(self).await?;
+        if key_package_relays.is_empty() {
+            return Err(WhitenoiseError::AccountMissingKeyPackageRelays);
+        }
+        let relay_urls: Vec<RelayUrl> = key_package_relays.iter().map(|r| r.url.clone()).collect();
+        let mut key_package_stream = self
+            .nostr
+            .client
+            .stream_events_from(relay_urls, key_package_filter, Duration::from_secs(5))
+            .await?;
@@
-            let key_package_relays = account.key_package_relays(self).await?;
-            if key_package_relays.is_empty() {
-                return Err(WhitenoiseError::AccountMissingKeyPackageRelays);
-            }
-
             let result = self
                 .nostr
                 .publish_event_deletion_with_signer(&event.id, &key_package_relays, signer)
                 .await?;

Also applies to: 88-96

🧹 Nitpick comments (7)
src/whitenoise/event_processor/event_handlers/handle_giftwrap.rs (2)

67-74: Switch to boolean-returning delete API looks good; consider logging the event id for traceability.

Including the event_id in the debug path makes relay-side investigations easier.

Apply:

             let deleted = self
                 .delete_key_package_for_account(
                     account,
                     &key_package_event_id,
                     false, // For now we don't want to delete the key packages from MLS storage
                 )
                 .await?;

Optionally:

-            let deleted = self
+            let deleted = self
                 .delete_key_package_for_account(
                     account,
                     &key_package_event_id,
                     false, // For now we don't want to delete the key packages from MLS storage
                 )
                 .await?;

Follow-up logging change is in the next hunk.


75-81: Keep publish gated on deletion success; add a brief comment and include event_id in logs.

This aligns with the intended behavior for Welcome handling (publish only if we actually deleted the used key package).

Apply:

-            if deleted {
-                tracing::debug!(target: "whitenoise::event_processor::process_welcome", "Deleted used key package from relays");
-                self.publish_key_package_for_account(account).await?;
-                tracing::debug!(target: "whitenoise::event_processor::process_welcome", "Published new key package");
-            } else {
-                tracing::debug!(target: "whitenoise::event_processor::process_welcome", "Key package already deleted, skipping publish");
-            }
+            // Intentional: publish a replacement only if we successfully deleted the used key package.
+            // Prevents redundant publishes when another client already deleted it.
+            if deleted {
+                tracing::debug!(
+                    target: "whitenoise::event_processor::process_welcome",
+                    event_id = %key_package_event_id,
+                    "Deleted used key package from relays"
+                );
+                self.publish_key_package_for_account(account).await?;
+                tracing::debug!(
+                    target: "whitenoise::event_processor::process_welcome",
+                    event_id = %key_package_event_id,
+                    "Published new key package"
+                );
+            } else {
+                tracing::debug!(
+                    target: "whitenoise::event_processor::process_welcome",
+                    event_id = %key_package_event_id,
+                    "Key package already deleted or not found; skipping publish"
+                );
+            }
src/whitenoise/key_packages.rs (5)

14-18: Avoid unnecessary NostrMls initialization when relays are missing.

Move create_nostr_mls after the empty-relays check to skip opening storage on early return.

Apply:

-        let nostr_mls = Account::create_nostr_mls(account.pubkey, &self.config.data_dir)?;
-        let key_package_relays = account.key_package_relays(self).await?;
+        let key_package_relays = account.key_package_relays(self).await?;
 
         if key_package_relays.is_empty() {
             return Err(WhitenoiseError::AccountMissingKeyPackageRelays);
         }
 
-        let key_package_relay_urls = key_package_relays
+        let key_package_relay_urls = key_package_relays
             .iter()
             .map(|r| r.url.clone())
             .collect::<Vec<RelayUrl>>();
+        let nostr_mls = Account::create_nostr_mls(account.pubkey, &self.config.data_dir)?;

37-39: Duplicate empty-relays check; consider returning relays from encoded_key_package.

encoded_key_package already enforces non-empty relays. To avoid a second DB fetch, return relays alongside (encoded, tags) and reuse here.

If you prefer to keep the signature, consider at least documenting why the second check is retained for defense-in-depth.
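
A hedged sketch of what the publish path could look like if encoded_key_package also handed back the relays it already validated; the tuple shape and the publish helper's argument order are assumptions for illustration, not taken from the diff.

// Assumed return shape: (encoded key package, tags, relays already checked non-empty).
let (encoded, tags, key_package_relays) = self.encoded_key_package(account).await?;

// No second relays.is_empty() check here: the helper has already enforced it.
self.nostr
    .publish_key_package_with_signer(&encoded, &key_package_relays, &tags, signer)
    .await?;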


101-103: Fix tracing target to match the function name.

Prevents confusion in logs and makes searching easier.

Apply:

-            tracing::warn!(target: "whitenoise::delete_key_package_from_relays_for_account", "Key package event not found for account: {}", account.pubkey.to_hex());
+            tracing::warn!(target: "whitenoise::delete_key_package_for_account", "Key package event not found for account: {}", account.pubkey.to_hex());

129-146: Deduplicate events returned from multiple relays.

Multiple relays can return the same event; dedupe by event.id before returning/counting.

Apply:

         let mut key_package_events = Vec::new();
         while let Some(event) = key_package_stream.next().await {
             key_package_events.push(event);
         }
+        // Deduplicate across relays
+        let mut seen = std::collections::HashSet::new();
+        key_package_events.retain(|e| seen.insert(e.id));

And add (outside this hunk):

use std::collections::HashSet;

Also applies to: 147-151


246-269: Count deletions only when at least one relay acknowledges.

Treating any Ok(_) as success can overcount when no relay accepts the deletion.

Apply:

-            match self
+            match self
                 .nostr
                 .publish_event_deletion_with_signer(&event.id, &key_package_relays, signer.clone())
                 .await
             {
-                Ok(_) => {
-                    deleted_count += 1;
-                    tracing::debug!(
-                        target: "whitenoise::delete_all_key_packages_for_account",
-                        "Published deletion event for key package {}",
-                        event.id
-                    );
-                }
+                Ok(res) => {
+                    if !res.success.is_empty() {
+                        deleted_count += 1;
+                        tracing::debug!(
+                            target: "whitenoise::delete_all_key_packages_for_account",
+                            "Published deletion for key package {} (ok: {}, failed: {})",
+                            event.id,
+                            res.success.len(),
+                            res.failed.len()
+                        );
+                    } else {
+                        tracing::warn!(
+                            target: "whitenoise::delete_all_key_packages_for_account",
+                            "Deletion for key package {} yielded no successful relays",
+                            event.id
+                        );
+                    }
+                }
                 Err(e) => {
                     tracing::error!(
                         target: "whitenoise::delete_all_key_packages_for_account",
                         "Failed to publish deletion event for key package {}: {}",
                         event.id,
                         e
                     );
                 }
             }
📜 Review details

Configuration used: CodeRabbit UI

Review profile: CHILL

Plan: Pro

📥 Commits

Reviewing files that changed from the base of the PR and between 1ce5b44 and 507be72.

📒 Files selected for processing (2)
  • src/whitenoise/event_processor/event_handlers/handle_giftwrap.rs (1 hunks)
  • src/whitenoise/key_packages.rs (4 hunks)
🧰 Additional context used
🧠 Learnings (3)
📓 Common learnings
Learnt from: erskingardner
PR: parres-hq/whitenoise#354
File: src/whitenoise/event_processor/event_handlers/handle_giftwrap.rs:0-0
Timestamp: 2025-09-11T13:04:33.312Z
Learning: In the whitenoise project, when processing MLS Welcome messages in handle_giftwrap.rs, the code should only publish a replacement key package if the deletion of the used key package was successful (deleted == true). The conditional publishing behavior is intentional and should not be changed to always publish regardless of deletion success.
Learnt from: jgmontoya
PR: parres-hq/whitenoise#310
File: src/whitenoise/accounts/relays.rs:260-278
Timestamp: 2025-08-11T10:23:59.406Z
Learning: In `src/whitenoise/accounts/relays.rs`, the `publish_relay_list_for_account` method is only called after the account setup is complete, which guarantees that `account.nip65_relays` is already populated and will never be empty. Therefore, no fallback logic is needed when `target_relays` is None.
📚 Learning: 2025-09-11T13:04:33.312Z
Learnt from: erskingardner
PR: parres-hq/whitenoise#354
File: src/whitenoise/event_processor/event_handlers/handle_giftwrap.rs:0-0
Timestamp: 2025-09-11T13:04:33.312Z
Learning: In the whitenoise project, when processing MLS Welcome messages in handle_giftwrap.rs, the code should only publish a replacement key package if the deletion of the used key package was successful (deleted == true). The conditional publishing behavior is intentional and should not be changed to always publish regardless of deletion success.

Applied to files:

  • src/whitenoise/key_packages.rs
  • src/whitenoise/event_processor/event_handlers/handle_giftwrap.rs
📚 Learning: 2025-08-11T10:25:55.436Z
Learnt from: jgmontoya
PR: parres-hq/whitenoise#310
File: src/whitenoise/accounts/mod.rs:669-692
Timestamp: 2025-08-11T10:25:55.436Z
Learning: In the Whitenoise codebase, the `NostrManager::publish_event_builder_with_signer` method automatically ensures relay connectivity by calling `self.ensure_relays_connected()` before publishing events. This means explicit relay connectivity checks are not needed before calling methods like `publish_relay_list_for_pubkey` that internally use `publish_event_builder_with_signer`.

Applied to files:

  • src/whitenoise/key_packages.rs
🧬 Code graph analysis (1)
src/whitenoise/key_packages.rs (2)
src/whitenoise/accounts.rs (8)
  • key_package_relays (110-116)
  • account (1070-1076)
  • account (1168-1174)
  • relays (85-93)
  • relays (488-491)
  • new (51-70)
  • create_nostr_mls (250-257)
  • create_nostr_mls (957-959)
src/whitenoise/database/users.rs (1)
  • relays (166-198)
⏰ Context from checks skipped due to timeout of 90000ms. You can increase the timeout in your CodeRabbit configuration to a maximum of 15 minutes (900000ms). (3)
  • GitHub Check: check (macos-14, native)
  • GitHub Check: check (macos-latest, native)
  • GitHub Check: check (ubuntu-latest, native)
🔇 Additional comments (1)
src/whitenoise/key_packages.rs (1)

53-61: API change ripple-check — confirm callers handle Result

Search found only one caller: src/whitenoise/event_processor/event_handlers/handle_giftwrap.rs (it assigns to deleted), but the provided output didn't show whether the call is awaited and the returned boolean is handled or propagated. Ensure no callers still expect Result<()> or ignore the deletion outcome.

)
.await?;

if deleted {
Member Author


Previously we were publishing no matter what here, which meant that if we processed multiple giftwraps that referenced the same key package, we'd end up publishing way more key packages than we needed. Now we only publish a new key package if we actually removed one from relays.

Contributor

@jgmontoya jgmontoya left a comment


Looks good, left some suggestions!

Contributor

@coderabbitai coderabbitai bot left a comment


Actionable comments posted: 1

Caution

Some comments are outside the diff and can’t be posted inline due to platform limitations.

⚠️ Outside diff range comments (1)
src/whitenoise/key_packages.rs (1)

1-6: Add missing StreamExt import (fixes compile error on .next().await).

next() is from futures_util::StreamExt. Without the trait in scope this file won’t compile.

 use crate::whitenoise::accounts::Account;
 use crate::whitenoise::error::{Result, WhitenoiseError};
 use crate::whitenoise::Whitenoise;
 use nostr_sdk::prelude::*;
+use futures_util::StreamExt;
 use std::time::Duration;
♻️ Duplicate comments (3)
src/whitenoise/key_packages.rs (3)

62-76: Stream from the account’s key-package relays and ensure connectivity before querying.

As previously noted, streaming against only currently-connected relays can miss events; also avoid refetching relays later in the function.

-        let key_package_filter = Filter::new()
-            .id(*event_id)
-            .kind(Kind::MlsKeyPackage)
-            .author(account.pubkey);
-
-        let mut key_package_stream = self
-            .nostr
-            .client
-            .stream_events(key_package_filter, Duration::from_secs(5))
-            .await?;
+        let key_package_relays = account.key_package_relays(self).await?;
+        if key_package_relays.is_empty() {
+            return Err(WhitenoiseError::AccountMissingKeyPackageRelays);
+        }
+        let relay_urls: Vec<RelayUrl> = key_package_relays.iter().map(|r| r.url.clone()).collect();
+        // Ensure we are connected to the target relays before streaming
+        self.nostr.ensure_relays_connected(&relay_urls).await?;
+
+        let key_package_filter = Filter::new()
+            .id(*event_id)
+            .kind(Kind::MlsKeyPackage)
+            .author(account.pubkey);
+
+        let mut key_package_stream = self
+            .nostr
+            .client
+            .stream_events_from(relay_urls, key_package_filter, Duration::from_secs(5))
+            .await?;
@@
-            let key_package_relays = account.key_package_relays(self).await?;
-            if key_package_relays.is_empty() {
-                return Err(WhitenoiseError::AccountMissingKeyPackageRelays);
-            }
-
             let result = self
                 .nostr
                 .publish_event_deletion_with_signer(&event.id, &key_package_relays, signer)
                 .await?;

Also applies to: 88-96


124-139: Ensure connectivity before stream_events_from (and keep it within nostr_manager if possible).

Even with stream_events_from, make the connection step explicit to avoid transient misses; alternatively, add a nostr_manager helper that wraps both connect-and-stream.

+        // Ensure we are connected to these relays before streaming
+        self.nostr.ensure_relays_connected(&relay_urls).await?;
+
         let mut key_package_stream = self
             .nostr
             .client
             .stream_events_from(relay_urls, key_package_filter, Duration::from_secs(10))
             .await?;

206-233: Good: instantiate NostrMls once outside the loop.

Avoids per-event initialization overhead; aligns with prior feedback.

🧹 Nitpick comments (3)
src/whitenoise/key_packages.rs (3)

33-33: Remove stale comment about “holding the lock”.

There’s no lock here; the comment is misleading.

-        // Extract key package data while holding the lock
+        // Extract key package data

131-134: Consider scoping the filter (limit/since) to avoid large backfills.

If a user has many key packages, this may fetch more than needed. Add limit() or a since() bound if appropriate.

-        let key_package_filter = Filter::new()
-            .kind(Kind::MlsKeyPackage)
-            .author(account.pubkey);
+        let key_package_filter = Filter::new()
+            .kind(Kind::MlsKeyPackage)
+            .author(account.pubkey)
+            .limit(1000);

124-126: Minor: relay_urls is now used — keep it, but consider a small helper to DRY it up.

You build relay URLs in three places. A tiny helper (e.g., fn key_package_relay_urls(&self, account) -> Result<Vec<RelayUrl>>) would reduce repetition.
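
A small sketch of that helper, assembled from the repeated pattern shown throughout this review (the relay lookup, the empty check, and the URL mapping); treat the exact placement and visibility as a judgment call.

/// Fetch the account's key-package relays and return their URLs,
/// erroring if none are configured.
async fn key_package_relay_urls(&self, account: &Account) -> Result<Vec<RelayUrl>> {
    let key_package_relays = account.key_package_relays(self).await?;
    if key_package_relays.is_empty() {
        return Err(WhitenoiseError::AccountMissingKeyPackageRelays);
    }
    Ok(key_package_relays.iter().map(|r| r.url.clone()).collect())
}

The publish, single-delete, and bulk-delete paths could then each call this once instead of repeating the three-line pattern.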

📜 Review details

Configuration used: CodeRabbit UI

Review profile: CHILL

Plan: Pro

📥 Commits

Reviewing files that changed from the base of the PR and between 507be72 and 3a25178.

📒 Files selected for processing (1)
  • src/whitenoise/key_packages.rs (4 hunks)
🧰 Additional context used
🧠 Learnings (3)
📓 Common learnings
Learnt from: erskingardner
PR: parres-hq/whitenoise#354
File: src/whitenoise/event_processor/event_handlers/handle_giftwrap.rs:0-0
Timestamp: 2025-09-11T13:04:33.312Z
Learning: In the whitenoise project, when processing MLS Welcome messages in handle_giftwrap.rs, the code should only publish a replacement key package if the deletion of the used key package was successful (deleted == true). The conditional publishing behavior is intentional and should not be changed to always publish regardless of deletion success.
Learnt from: jgmontoya
PR: parres-hq/whitenoise#310
File: src/whitenoise/accounts/relays.rs:260-278
Timestamp: 2025-08-11T10:23:59.406Z
Learning: In `src/whitenoise/accounts/relays.rs`, the `publish_relay_list_for_account` method is only called after the account setup is complete, which guarantees that `account.nip65_relays` is already populated and will never be empty. Therefore, no fallback logic is needed when `target_relays` is None.
Learnt from: jgmontoya
PR: parres-hq/whitenoise#340
File: src/whitenoise/event_processor/mod.rs:121-158
Timestamp: 2025-09-04T16:58:00.736Z
Learning: Repository parres-hq/whitenoise: For retry scheduling hardening (schedule_retry in src/whitenoise/event_processor/mod.rs and backoff logic in RetryInfo in src/types.rs), the team prefers to land the improvements in a separate PR, tracked via a GitHub issue, as requested by jgmontoya in PR #340.
📚 Learning: 2025-09-11T13:04:33.312Z
Learnt from: erskingardner
PR: parres-hq/whitenoise#354
File: src/whitenoise/event_processor/event_handlers/handle_giftwrap.rs:0-0
Timestamp: 2025-09-11T13:04:33.312Z
Learning: In the whitenoise project, when processing MLS Welcome messages in handle_giftwrap.rs, the code should only publish a replacement key package if the deletion of the used key package was successful (deleted == true). The conditional publishing behavior is intentional and should not be changed to always publish regardless of deletion success.

Applied to files:

  • src/whitenoise/key_packages.rs
📚 Learning: 2025-08-11T10:25:55.436Z
Learnt from: jgmontoya
PR: parres-hq/whitenoise#310
File: src/whitenoise/accounts/mod.rs:669-692
Timestamp: 2025-08-11T10:25:55.436Z
Learning: In the Whitenoise codebase, the `NostrManager::publish_event_builder_with_signer` method automatically ensures relay connectivity by calling `self.ensure_relays_connected()` before publishing events. This means explicit relay connectivity checks are not needed before calling methods like `publish_relay_list_for_pubkey` that internally use `publish_event_builder_with_signer`.

Applied to files:

  • src/whitenoise/key_packages.rs
🧬 Code graph analysis (1)
src/whitenoise/key_packages.rs (1)
src/whitenoise/accounts.rs (8)
  • key_package_relays (110-116)
  • account (1070-1076)
  • account (1168-1174)
  • relays (85-93)
  • relays (488-491)
  • new (51-70)
  • create_nostr_mls (250-257)
  • create_nostr_mls (957-959)
⏰ Context from checks skipped due to timeout of 90000ms. You can increase the timeout in your CodeRabbit configuration to a maximum of 15 minutes (900000ms). (3)
  • GitHub Check: check (macos-14, native)
  • GitHub Check: check (ubuntu-latest, native)
  • GitHub Check: check (macos-latest, native)
🔇 Additional comments (4)
src/whitenoise/key_packages.rs (4)

14-23: Good: early relay check and correct tagging inputs.

Fail-fast on missing key-package relays and passing explicit relay URLs into create_key_package_for_event looks right.


54-56: Docs: return semantics are clear.

The boolean meaning is explicit and helpful.


93-100: Returning !result.success.is_empty() matches the intended gating behavior.

This aligns with the requirement to publish a replacement key package only if deletion succeeded.


37-39: Remove relays.is_empty() check — publish_event_builder_with_signer already ensures relay connectivity.

publish_key_package_with_signer delegates to publish_event_builder_with_signer (src/nostr_manager/publisher.rs:136–146), and per team learning publish_event_builder_with_signer auto-connects relays (calls add_relays/ensure_relays_connected), so the relays.is_empty() guard at src/whitenoise/key_packages.rs:37–39 is redundant and can be removed.

Comment on lines +239 to +261
match self
.nostr
.publish_event_deletion_with_signer(&event.id, &key_package_relays, signer.clone())
.await
{
Ok(_) => {
deleted_count += 1;
tracing::debug!(
target: "whitenoise::delete_all_key_packages_for_account",
"Published deletion event for key package {}",
event.id
);
}
Err(e) => {
tracing::error!(
target: "whitenoise::delete_all_key_packages_for_account",
"Failed to publish deletion event for key package {}: {}",
event.id,
e
);
}
}
}
🛠️ Refactor suggestion

Count deletions only when at least one relay reports success.

Right now any Ok(_) increments the count even if no relay accepted the deletion. Match the single-delete path’s semantics.

-            match self
+            match self
                 .nostr
                 .publish_event_deletion_with_signer(&event.id, &key_package_relays, signer.clone())
                 .await
             {
-                Ok(_) => {
-                    deleted_count += 1;
-                    tracing::debug!(
-                        target: "whitenoise::delete_all_key_packages_for_account",
-                        "Published deletion event for key package {}",
-                        event.id
-                    );
-                }
+                Ok(result) => {
+                    if result.success.is_empty() {
+                        tracing::warn!(
+                            target: "whitenoise::delete_all_key_packages_for_account",
+                            "Deletion published but no relay reported success for {}",
+                            event.id
+                        );
+                    } else {
+                        deleted_count += 1;
+                        tracing::debug!(
+                            target: "whitenoise::delete_all_key_packages_for_account",
+                            "Published deletion event for key package {} ({} relay(s) ok)",
+                            event.id,
+                            result.success.len()
+                        );
+                    }
+                }
                 Err(e) => {
                     tracing::error!(
                         target: "whitenoise::delete_all_key_packages_for_account",
                         "Failed to publish deletion event for key package {}: {}",
                         event.id,
                         e
                     );
                 }
             }
📝 Committable suggestion

‼️ IMPORTANT
Carefully review the code before committing. Ensure that it accurately replaces the highlighted code, contains no missing lines, and has no issues with indentation. Thoroughly test & benchmark the code to ensure it meets the requirements.

Suggested change
match self
.nostr
.publish_event_deletion_with_signer(&event.id, &key_package_relays, signer.clone())
.await
{
Ok(_) => {
deleted_count += 1;
tracing::debug!(
target: "whitenoise::delete_all_key_packages_for_account",
"Published deletion event for key package {}",
event.id
);
}
Err(e) => {
tracing::error!(
target: "whitenoise::delete_all_key_packages_for_account",
"Failed to publish deletion event for key package {}: {}",
event.id,
e
);
}
}
}
match self
.nostr
.publish_event_deletion_with_signer(&event.id, &key_package_relays, signer.clone())
.await
{
Ok(result) => {
if result.success.is_empty() {
tracing::warn!(
target: "whitenoise::delete_all_key_packages_for_account",
"Deletion published but no relay reported success for {}",
event.id
);
} else {
deleted_count += 1;
tracing::debug!(
target: "whitenoise::delete_all_key_packages_for_account",
"Published deletion event for key package {} ({} relay(s) ok)",
event.id,
result.success.len()
);
}
}
Err(e) => {
tracing::error!(
target: "whitenoise::delete_all_key_packages_for_account",
"Failed to publish deletion event for key package {}: {}",
event.id,
e
);
}
}
}
🤖 Prompt for AI Agents
In src/whitenoise/key_packages.rs around lines 239 to 261, the code increments
deleted_count on any Ok(_) from publish_event_deletion_with_signer, even if no
relay actually accepted the deletion. Change the Ok arm to inspect the returned
success information and only increment deleted_count when at least one relay
reports success (for example, check the per-relay results or the accepted
count). Log a debug message listing which relays accepted when incrementing,
and treat an Ok with no accepts as a non-deletion: log a warning and do not
increment. Keep the Err branch unchanged.

@erskingardner erskingardner merged commit 0bff7e7 into master Sep 11, 2025
3 of 4 checks passed
@erskingardner erskingardner deleted the key-package-mgmt branch September 11, 2025 15:41
@coderabbitai coderabbitai bot left a comment

Actionable comments posted: 0

♻️ Duplicate comments (3)
src/whitenoise/key_packages.rs (3)

67-76: Stream from the account’s key‑package relays (and ensure connectivity) to avoid false negatives.
Using client.stream_events queries only currently-connected relays; if the client isn't connected, you'll miss the event and wrongly return Ok(false). Stream from the account’s relays (and connect first), then reuse the same key_package_relays for deletion.

Apply:

+        // Ensure we're connected to the account's key-package relays and stream from them
+        let key_package_relays = account.key_package_relays(self).await?;
+        if key_package_relays.is_empty() {
+            return Err(WhitenoiseError::AccountMissingKeyPackageRelays);
+        }
+        let relay_urls: Vec<RelayUrl> = key_package_relays.iter().map(|r| r.url.clone()).collect();
+        self.nostr.ensure_relays_connected(&relay_urls).await?;
-        let mut key_package_stream = self
-            .nostr
-            .client
-            .stream_events(key_package_filter, Duration::from_secs(5))
-            .await?;
+        let mut key_package_stream = self
+            .nostr
+            .client
+            .stream_events_from(relay_urls, key_package_filter, Duration::from_secs(5))
+            .await?;
@@
-            let key_package_relays = account.key_package_relays(self).await?;
-            if key_package_relays.is_empty() {
-                return Err(WhitenoiseError::AccountMissingKeyPackageRelays);
-            }

Also applies to: 88-92


127-143: Ensure relay connectivity before streaming, or confirm stream_events_from dials on demand.
To avoid empty results due to connection state, connect first.

Apply:

+        // Connect to the relays before streaming so we don't miss events due to
+        // connection state. This is a no-op if we're already connected, and harmless
+        // if the SDK already dials on `stream_events_from`.
+        self.nostr.ensure_relays_connected(&relay_urls).await?;
         let mut key_package_stream = self
             .nostr
             .client
             .stream_events_from(relay_urls, key_package_filter, Duration::from_secs(10))
             .await?;

Should we add this connect-first call, or do we have guarantees that nostr_sdk::Client::stream_events_from establishes connections automatically?


241-263: Count a deletion only when at least one relay reports success.
Currently any Ok(_) increments the count even if no relay accepted the deletion, inflating the result.

Apply:

-            {
-                Ok(_) => {
-                    deleted_count += 1;
-                    tracing::debug!(
-                        target: "whitenoise::delete_all_key_packages_for_account",
-                        "Published deletion event for key package {}",
-                        event.id
-                    );
-                }
+            {
+                Ok(result) => {
+                    if result.success.is_empty() {
+                        tracing::warn!(
+                            target: "whitenoise::delete_all_key_packages_for_account",
+                            "Deletion published but no relay reported success for {}",
+                            event.id
+                        );
+                    } else {
+                        deleted_count += 1;
+                        tracing::debug!(
+                            target: "whitenoise::delete_all_key_packages_for_account",
+                            "Published deletion event for key package {} ({} relay(s) ok)",
+                            event.id,
+                            result.success.len()
+                        );
+                    }
+                }
                 Err(e) => {
                     tracing::error!(
                         target: "whitenoise::delete_all_key_packages_for_account",
                         "Failed to publish deletion event for key package {}: {}",
                         event.id,
                         e
                     );
                 }
             }
🧹 Nitpick comments (3)
src/whitenoise/key_packages.rs (3)

32-39: Minor duplication: avoid re-checking empty relays.
encoded_key_package already errors on empty relays. Either:

  • return the relays from encoded_key_package to reuse here, or
  • keep fetching but drop the second emptiness check.
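
A rough sketch of the first option — the helper hands back the relays it already validated so the caller does not re-fetch and re-check them (the return type and helper names are assumptions):

// Assumed helper shape; the real encoded_key_package may return different types.
async fn encoded_key_package(
    &self,
    account: &Account,
) -> Result<(String, Vec<Tag>, Vec<RelayUrl>)> {
    let relays = account.key_package_relays(self).await?;
    if relays.is_empty() {
        return Err(WhitenoiseError::AccountMissingKeyPackageRelays);
    }
    let relay_urls: Vec<RelayUrl> = relays.iter().map(|r| r.url.clone()).collect();
    let (encoded, tags) = self
        .create_key_package_for_event(&account.pubkey, &relay_urls)
        .await?;
    // Hand the validated relay URLs back so callers can reuse them at the publish site.
    Ok((encoded, tags, relay_urls))
}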

54-56: Clarify boolean semantics in the doc comment.
Spell out that “deleted” means “acknowledged by at least one relay.”

Apply:

-/// Returns `true` if a key package was found and deleted, `false` if no key package was found.
+/// Returns `true` if a matching key package was found and at least one relay acknowledged the deletion; `false` otherwise.

144-147: De-duplicate events returned by multiple relays.
Multiple relays may return the same event; dedupe by event.id.

Apply:

-        let mut key_package_events = Vec::new();
-        while let Some(event) = key_package_stream.next().await {
-            key_package_events.push(event);
-        }
+        let mut seen = HashSet::new();
+        let mut key_package_events = Vec::new();
+        while let Some(event) = key_package_stream.next().await {
+            if seen.insert(event.id) {
+                key_package_events.push(event);
+            }
+        }

And add (outside this hunk) at the top of the file if not already present:

use std::collections::HashSet;
📜 Review details

Configuration used: CodeRabbit UI

Review profile: CHILL

Plan: Pro

📥 Commits

Reviewing files that changed from the base of the PR and between 3a25178 and dd87494.

📒 Files selected for processing (1)
  • src/whitenoise/key_packages.rs (4 hunks)
🧰 Additional context used
🧠 Learnings (3)
📓 Common learnings
Learnt from: erskingardner
PR: parres-hq/whitenoise#354
File: src/whitenoise/event_processor/event_handlers/handle_giftwrap.rs:0-0
Timestamp: 2025-09-11T13:04:33.312Z
Learning: In the whitenoise project, when processing MLS Welcome messages in handle_giftwrap.rs, the code should only publish a replacement key package if the deletion of the used key package was successful (deleted == true). The conditional publishing behavior is intentional and should not be changed to always publish regardless of deletion success.
Learnt from: jgmontoya
PR: parres-hq/whitenoise#310
File: src/whitenoise/accounts/relays.rs:260-278
Timestamp: 2025-08-11T10:23:59.406Z
Learning: In `src/whitenoise/accounts/relays.rs`, the `publish_relay_list_for_account` method is only called after the account setup is complete, which guarantees that `account.nip65_relays` is already populated and will never be empty. Therefore, no fallback logic is needed when `target_relays` is None.
Learnt from: jgmontoya
PR: parres-hq/whitenoise#340
File: src/whitenoise/event_processor/mod.rs:121-158
Timestamp: 2025-09-04T16:58:00.736Z
Learning: Repository parres-hq/whitenoise: For retry scheduling hardening (schedule_retry in src/whitenoise/event_processor/mod.rs and backoff logic in RetryInfo in src/types.rs), the team prefers to land the improvements in a separate PR, tracked via a GitHub issue, as requested by jgmontoya in PR #340.
📚 Learning: 2025-09-11T13:04:33.312Z
Learnt from: erskingardner
PR: parres-hq/whitenoise#354
File: src/whitenoise/event_processor/event_handlers/handle_giftwrap.rs:0-0
Timestamp: 2025-09-11T13:04:33.312Z
Learning: In the whitenoise project, when processing MLS Welcome messages in handle_giftwrap.rs, the code should only publish a replacement key package if the deletion of the used key package was successful (deleted == true). The conditional publishing behavior is intentional and should not be changed to always publish regardless of deletion success.

Applied to files:

  • src/whitenoise/key_packages.rs
📚 Learning: 2025-08-11T10:25:55.436Z
Learnt from: jgmontoya
PR: parres-hq/whitenoise#310
File: src/whitenoise/accounts/mod.rs:669-692
Timestamp: 2025-08-11T10:25:55.436Z
Learning: In the Whitenoise codebase, the `NostrManager::publish_event_builder_with_signer` method automatically ensures relay connectivity by calling `self.ensure_relays_connected()` before publishing events. This means explicit relay connectivity checks are not needed before calling methods like `publish_relay_list_for_pubkey` that internally use `publish_event_builder_with_signer`.

Applied to files:

  • src/whitenoise/key_packages.rs
🧬 Code graph analysis (1)
src/whitenoise/key_packages.rs (2)
src/whitenoise/accounts.rs (8)
  • key_package_relays (110-116)
  • account (1070-1076)
  • account (1168-1174)
  • relays (85-93)
  • relays (488-491)
  • new (51-70)
  • create_nostr_mls (250-257)
  • create_nostr_mls (957-959)
src/whitenoise/database/users.rs (1)
  • relays (166-198)
⏰ Context from checks skipped due to timeout of 90000ms. You can increase the timeout in your CodeRabbit configuration to a maximum of 15 minutes (900000ms). (1)
  • GitHub Check: check (ubuntu-latest, native)
🔇 Additional comments (3)
src/whitenoise/key_packages.rs (3)

14-26: Good guard on missing relays; URL extraction looks solid.
This precheck prevents generating unusable key packages and passes only validated relay URLs to MLS.


93-98: Correct gating on per-relay success.
Ok(!result.success.is_empty()) is the right signal for higher-level “only publish replacement if deletion succeeded” logic.


209-213: Nice: instantiate NostrMls once outside the loop.
Avoids repeated open/close of storage and reduces I/O.
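
The shape being praised, roughly (the constructor and the per-event cleanup call below are assumptions, not the repo's exact API):

// Build the MLS handle once so the underlying storage is opened a single time.
let nostr_mls = account.create_nostr_mls(self)?; // assumed constructor
for event in &key_package_events {
    if delete_mls_stored_keys {
        // Assumed cleanup call; reusing `nostr_mls` avoids reopening storage per event.
        if let Err(e) = nostr_mls.delete_key_package_from_storage(&event.id) {
            tracing::warn!(
                target: "whitenoise::delete_all_key_packages_for_account",
                "Failed to delete local key material for {}: {}",
                event.id,
                e
            );
        }
    }
    // ...then publish the NIP-09 deletion event for `event` as shown above...
}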
