Conversation

@erskingardner erskingardner commented Aug 11, 2025

We'll change the names of the files and tables to drop the "_new" parts once we've fully migrated things.

Summary by CodeRabbit

  • New Features
    • Added first-class support for accounts, users, followers, relays, and user-relay associations, including automatic created_at/updated_at tracking.
  • Bug Fixes
    • Improved validation and error handling for keys, URLs, metadata, and timestamps to reduce runtime failures.
  • Tests
    • Added extensive unit tests covering parsing, timestamp edge cases, and various data scenarios.
  • Chores
    • Added a database migration to create the new tables and update triggers.

coderabbitai bot commented Aug 11, 2025

Caution

Review failed

The pull request is closed.

Walkthrough

Adds a SQL migration creating five new tables with AFTER UPDATE triggers; adds three Rust DB submodules (accounts_new, relays_new, users_new), sqlx::FromRow mappings with parsing/validation, new AccountNew type and nostr MLS wiring, a DatabaseError::InvalidTimestamp variant, and comprehensive unit tests.

Changes

  • DB Migration (db_migrations/0005_new_tables.sql)
    Adds tables accounts_new, users_new, account_followers, relays, and user_relays with AUTOINCREMENT primary keys, created_at/updated_at columns defaulting to CURRENT_TIMESTAMP, UNIQUE pubkey where applicable, and AFTER UPDATE triggers that set updated_at = CURRENT_TIMESTAMP.
  • Database module wiring (src/whitenoise/database/mod.rs)
    Exposes the new submodules accounts_new, relays_new, users_new; adds DatabaseError::InvalidTimestamp { timestamp: i64 }.
  • Accounts mapping & API + tests (src/whitenoise/database/accounts_new.rs)
    Adds a private AccountNewRow with a sqlx::FromRow impl that parses pubkey (hex → PublicKey), settings (JSON → AccountSettings), and timestamps (epoch ms → DateTime), mapping parse/convert errors to sqlx::Error::ColumnDecode; adds Whitenoise::load_account_new and save_account_new; includes extensive unit tests and roundtrip tests using in-memory SQLite. (See the sketch of this mapping pattern after this list.)
  • Relays mapping + tests (src/whitenoise/database/relays_new.rs)
    Adds a private RelayNewRow with a sqlx::FromRow impl that parses url (String → RelayUrl) and timestamps with InvalidTimestamp error mapping; maps URL parse failures to ColumnDecode; extensive tests for valid/invalid URLs and timestamp edge cases.
  • Users mapping + tests (src/whitenoise/database/users_new.rs)
    Adds a private UserRow with a sqlx::FromRow impl that parses pubkey (hex → PublicKey), metadata (JSON → Metadata), and timestamps with InvalidTimestamp mapping; maps errors to ColumnDecode; extensive unit tests covering valid data, optional metadata, and error cases.
  • Account model & MLS visibility (src/whitenoise/accounts/mod.rs)
    Adds pub struct AccountNew (public fields) with a crate-private nostr_mls handle; changes create_nostr_mls visibility to pub(crate) to allow internal initialization.
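
For illustration, the row-mapping pattern that accounts_new, relays_new, and users_new are described as sharing could look roughly like the sketch below. Only the names mentioned in the summary (AccountNewRow, AccountSettings, the hex pubkey / JSON settings / epoch-millisecond timestamp columns) are taken from the PR; the imports, field set, and exact signatures are assumptions for illustration, not the PR's actual code.

use chrono::{DateTime, Utc};
use nostr_sdk::prelude::PublicKey;
use sqlx::{sqlite::SqliteRow, FromRow, Row};

// Stand-in for the project's AccountSettings type (the real fields live in the PR).
#[derive(Debug, serde::Deserialize)]
struct AccountSettings {}

// Assumed shape of the private row type; id/user_id/last_synced_at are omitted for brevity.
struct AccountNewRow {
    pubkey: PublicKey,
    settings: AccountSettings,
    created_at: DateTime<Utc>,
    updated_at: DateTime<Utc>,
}

impl<'r> FromRow<'r, SqliteRow> for AccountNewRow {
    fn from_row(row: &'r SqliteRow) -> Result<Self, sqlx::Error> {
        // Hex-encoded pubkey column -> PublicKey; failures become ColumnDecode errors.
        let pubkey_hex: String = row.try_get("pubkey")?;
        let pubkey = PublicKey::from_hex(&pubkey_hex).map_err(|e| sqlx::Error::ColumnDecode {
            index: "pubkey".into(),
            source: Box::new(e),
        })?;

        // JSON settings column -> AccountSettings.
        let settings_json: String = row.try_get("settings")?;
        let settings: AccountSettings = serde_json::from_str(&settings_json).map_err(|e| {
            sqlx::Error::ColumnDecode {
                index: "settings".into(),
                source: Box::new(e),
            }
        })?;

        // Epoch-millisecond timestamps -> DateTime<Utc>; out-of-range values are rejected.
        let parse_ts = |column: &str, millis: i64| {
            DateTime::<Utc>::from_timestamp_millis(millis).ok_or_else(|| {
                sqlx::Error::ColumnDecode {
                    index: column.to_string(),
                    source: format!("invalid timestamp: {millis}").into(),
                }
            })
        };
        let created_at = parse_ts("created_at", row.try_get("created_at")?)?;
        let updated_at = parse_ts("updated_at", row.try_get("updated_at")?)?;

        Ok(Self { pubkey, settings, created_at, updated_at })
    }
}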

Sequence Diagram(s)

sequenceDiagram
  participant App
  participant Whitenoise as Whitenoise API
  participant SQL as SQLite/sqlx
  participant FromRow as sqlx::FromRow impl
  participant Parser as Parsers (pubkey/json/url/timestamps)

  App->>Whitenoise: load_account_new(pubkey)
  Whitenoise->>SQL: SELECT ... FROM accounts_new WHERE pubkey = ?
  SQL->>FromRow: row -> FromRow::from_row
  FromRow->>Parser: parse pubkey, settings, timestamps
  alt all parses succeed
    Parser-->>FromRow: typed values
    FromRow-->>Whitenoise: AccountNewRow
    Whitenoise->>App: Ok(AccountNew) (with nostr_mls created)
  else parse/convert failure
    Parser-->>FromRow: error (e.g., ColumnDecode/InvalidTimestamp)
    FromRow-->>Whitenoise: Err(sqlx::Error::ColumnDecode)
    Whitenoise->>App: Err(...)
  end

Estimated code review effort

🎯 4 (Complex) | ⏱️ ~35 minutes

Possibly related PRs

Suggested reviewers

  • delcin-raj
  • jgmontoya

Poem

I thump my paws on freshly tilled schema round,
New tables sprout in tidy rows and sound.
Hex keys, JSON, relays I gently comb,
Timestamps tick — bad ones sent back home.
A happy rabbit hops — our data’s found! 🐇✨


📜 Recent review details

Configuration used: CodeRabbit UI
Review profile: CHILL
Plan: Pro

📥 Commits

Reviewing files that changed from the base of the PR and between 9354438 and 70c04d0.

📒 Files selected for processing (2)
  • src/whitenoise/accounts/mod.rs (3 hunks)
  • src/whitenoise/database/accounts_new.rs (1 hunks)

@jgmontoya jgmontoya self-requested a review August 11, 2025 10:41

@coderabbitai coderabbitai bot left a comment

Actionable comments posted: 6

🔭 Outside diff range comments (1)
src/whitenoise/database/accounts_new.rs (1)

1-69: Add unit tests for AccountNewRow

This module lacks unit tests, unlike relays_new.rs and users_new.rs which have comprehensive test coverage. Consider adding tests for:

  • Valid data mapping
  • Invalid pubkey handling
  • Invalid settings JSON handling
  • Timestamp edge cases (0, future timestamps, invalid timestamps)

Would you like me to generate a comprehensive test suite similar to the ones in the other modules?
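
As a rough illustration, a roundtrip test in this module could look like the sketch below. It assumes a tokio test runtime, an in-memory sqlx SqlitePool, and an epoch-millisecond integer layout for the timestamp columns (inferred from the walkthrough, not copied from the PR); column names follow the migration.

#[cfg(test)]
mod tests {
    use super::*;
    use sqlx::SqlitePool;

    #[tokio::test]
    async fn account_new_row_roundtrip() {
        // In-memory database so the test needs no external state.
        let pool = SqlitePool::connect("sqlite::memory:").await.unwrap();
        sqlx::query(
            "CREATE TABLE accounts_new (
                 id INTEGER PRIMARY KEY AUTOINCREMENT,
                 pubkey TEXT NOT NULL,
                 settings JSONB NOT NULL,
                 created_at INTEGER NOT NULL,
                 updated_at INTEGER NOT NULL
             )",
        )
        .execute(&pool)
        .await
        .unwrap();

        // Insert a valid row: hex pubkey, JSON settings, epoch-millisecond timestamps.
        let keys = nostr_sdk::prelude::Keys::generate();
        sqlx::query(
            "INSERT INTO accounts_new (pubkey, settings, created_at, updated_at) VALUES (?, ?, ?, ?)",
        )
        .bind(keys.public_key().to_hex())
        .bind("{}")
        .bind(1_700_000_000_000_i64)
        .bind(1_700_000_000_000_i64)
        .execute(&pool)
        .await
        .unwrap();

        // Decoding through the FromRow impl should succeed and preserve the pubkey.
        let row: AccountNewRow = sqlx::query_as("SELECT * FROM accounts_new")
            .fetch_one(&pool)
            .await
            .unwrap();
        assert_eq!(row.pubkey, keys.public_key());
    }
}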

📜 Review details

Configuration used: CodeRabbit UI
Review profile: CHILL
Plan: Pro

📥 Commits

Reviewing files that changed from the base of the PR and between a7bf49f and fbec7bc.

📒 Files selected for processing (5)
  • db_migrations/0005_new_tables.sql (1 hunks)
  • src/whitenoise/database/accounts_new.rs (1 hunks)
  • src/whitenoise/database/mod.rs (2 hunks)
  • src/whitenoise/database/relays_new.rs (1 hunks)
  • src/whitenoise/database/users_new.rs (1 hunks)
🧰 Additional context used
🧬 Code Graph Analysis (2)
src/whitenoise/database/accounts_new.rs (2)
src/whitenoise/database/users_new.rs (1)
  • from_row (27-73)
src/whitenoise/database/relays_new.rs (1)
  • from_row (25-62)
src/whitenoise/database/relays_new.rs (3)
src/whitenoise/database/accounts_new.rs (1)
  • from_row (30-67)
src/whitenoise/database/users_new.rs (1)
  • from_row (27-73)
src/whitenoise/database/mod.rs (1)
  • new (43-87)
⏰ Context from checks skipped due to timeout of 90000ms. You can increase the timeout in your CodeRabbit configuration to a maximum of 15 minutes (900000ms). (1)
  • GitHub Check: check (ubuntu-latest, native)
🔇 Additional comments (3)
src/whitenoise/database/mod.rs (1)

13-16: LGTM! Well-structured module organization and error handling

The new public modules and InvalidTimestamp error variant are properly integrated. The error message includes the problematic timestamp value which aids debugging.

Also applies to: 31-32
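
For reference, the variant being praised would look roughly like this, assuming the enum derives thiserror's Error (an assumption, not confirmed by the diff shown here):

#[derive(Debug, thiserror::Error)]
pub enum DatabaseError {
    // ...existing variants elided...
    /// Returned when a stored epoch-millisecond value cannot be converted to a DateTime.
    #[error("invalid timestamp: {timestamp}")]
    InvalidTimestamp { timestamp: i64 },
}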

src/whitenoise/database/relays_new.rs (1)

1-383: Excellent implementation with comprehensive test coverage!

The RelayNewRow implementation demonstrates:

  • Proper error handling for URL parsing and timestamp conversions
  • Consistent use of DatabaseError::InvalidTimestamp
  • Comprehensive test suite covering edge cases
  • Clear error messages with appropriate column indices
src/whitenoise/database/users_new.rs (1)

1-348: Well-implemented with thorough error handling and testing!

The UserRow implementation shows:

  • Proper error handling for pubkey parsing, JSON deserialization, and timestamp conversions
  • Consistent error mapping with appropriate column indices
  • Comprehensive test coverage including invalid data scenarios
  • Good use of the DatabaseError::InvalidTimestamp pattern

Comment on lines 1 to 10
-- New accounts table
CREATE TABLE accounts_new (
    id INTEGER PRIMARY KEY AUTOINCREMENT,
    pubkey TEXT NOT NULL, -- Hex encoded nostr public key
    user_id INTEGER NOT NULL, -- Foreign key to the users table
    settings JSONB NOT NULL,
    last_synced_at DATETIME,
    created_at DATETIME NOT NULL DEFAULT CURRENT_TIMESTAMP,
    updated_at DATETIME NOT NULL DEFAULT CURRENT_TIMESTAMP
);

⚠️ Potential issue

Add foreign key constraint and indexes for accounts_new table

The table has a comment indicating user_id is a foreign key, but no actual foreign key constraint is defined. Also missing indexes on frequently queried columns.

Consider adding:

-- Add foreign key constraint (inside the CREATE TABLE accounts_new definition)
FOREIGN KEY (user_id) REFERENCES users_new(id),

-- Add indexes (after the table definition)
CREATE INDEX idx_accounts_new_pubkey ON accounts_new(pubkey);
CREATE INDEX idx_accounts_new_user_id ON accounts_new(user_id);
CREATE UNIQUE INDEX idx_accounts_new_pubkey_user_id ON accounts_new(pubkey, user_id);
🤖 Prompt for AI Agents
In db_migrations/0005_new_tables.sql around lines 1 to 10, the accounts_new
table defines user_id in a comment but lacks an actual foreign key constraint
and has no indexes; update the CREATE TABLE statement to declare user_id as a
foreign key referencing users_new(id) and then add indexes for pubkey and
user_id plus a unique composite index on (pubkey, user_id) to improve query
performance and enforce uniqueness.

Comment on lines 12 to 19
-- New users table
CREATE TABLE users_new (
    id INTEGER PRIMARY KEY AUTOINCREMENT,
    pubkey TEXT NOT NULL,
    metadata JSONB,
    created_at DATETIME NOT NULL DEFAULT CURRENT_TIMESTAMP,
    updated_at DATETIME NOT NULL DEFAULT CURRENT_TIMESTAMP
);

🛠️ Refactor suggestion

Add UNIQUE constraint and index on pubkey for users_new table

The pubkey field should likely be unique to prevent duplicate users.

 CREATE TABLE users_new (
     id INTEGER PRIMARY KEY AUTOINCREMENT,
-    pubkey TEXT NOT NULL,
+    pubkey TEXT NOT NULL UNIQUE,
     metadata JSONB,
     created_at DATETIME NOT NULL DEFAULT CURRENT_TIMESTAMP,
     updated_at DATETIME NOT NULL DEFAULT CURRENT_TIMESTAMP
 );
+
+-- Add index for pubkey lookups
+CREATE INDEX idx_users_new_pubkey ON users_new(pubkey);
📝 Committable suggestion

‼️ IMPORTANT
Carefully review the code before committing. Ensure that it accurately replaces the highlighted code, contains no missing lines, and has no issues with indentation. Thoroughly test & benchmark the code to ensure it meets the requirements.

Suggested change
-- Current:
-- New users table
CREATE TABLE users_new (
    id INTEGER PRIMARY KEY AUTOINCREMENT,
    pubkey TEXT NOT NULL,
    metadata JSONB,
    created_at DATETIME NOT NULL DEFAULT CURRENT_TIMESTAMP,
    updated_at DATETIME NOT NULL DEFAULT CURRENT_TIMESTAMP
);

-- Suggested:
-- New users table
CREATE TABLE users_new (
    id INTEGER PRIMARY KEY AUTOINCREMENT,
    pubkey TEXT NOT NULL UNIQUE,
    metadata JSONB,
    created_at DATETIME NOT NULL DEFAULT CURRENT_TIMESTAMP,
    updated_at DATETIME NOT NULL DEFAULT CURRENT_TIMESTAMP
);

-- Add index for pubkey lookups
CREATE INDEX idx_users_new_pubkey ON users_new(pubkey);
🤖 Prompt for AI Agents
In db_migrations/0005_new_tables.sql around lines 12 to 19, the users_new table
lacks a uniqueness constraint on pubkey; update the migration so pubkey is
enforced unique (since it's NOT NULL) by adding a UNIQUE constraint to the
CREATE TABLE (e.g., include UNIQUE(pubkey)) or by adding a separate CREATE
UNIQUE INDEX idx_users_new_pubkey ON users_new(pubkey) after the table creation;
ensure the chosen approach matches the target DB dialect and migration ordering
so the unique constraint/index is created as part of this migration.

Comment on lines 21 to 28
-- New account_followers table
CREATE TABLE account_followers (
    id INTEGER PRIMARY KEY AUTOINCREMENT,
    account_id INTEGER NOT NULL,
    followed_id INTEGER NOT NULL, -- This is the user_id of the followed user
    created_at DATETIME NOT NULL DEFAULT CURRENT_TIMESTAMP,
    updated_at DATETIME NOT NULL DEFAULT CURRENT_TIMESTAMP
);

⚠️ Potential issue

Add foreign key constraints and prevent duplicate follows

The account_followers table needs foreign key constraints and should prevent duplicate follow relationships.

 CREATE TABLE account_followers (
     id INTEGER PRIMARY KEY AUTOINCREMENT,
-    account_id INTEGER NOT NULL,
-    followed_id INTEGER NOT NULL,
+    account_id INTEGER NOT NULL REFERENCES accounts_new(id),
+    followed_id INTEGER NOT NULL REFERENCES users_new(id),
     created_at DATETIME NOT NULL DEFAULT CURRENT_TIMESTAMP,
-    updated_at DATETIME NOT NULL DEFAULT CURRENT_TIMESTAMP
+    updated_at DATETIME NOT NULL DEFAULT CURRENT_TIMESTAMP,
+    UNIQUE(account_id, followed_id)
 );
+
+-- Add indexes for foreign keys
+CREATE INDEX idx_account_followers_account_id ON account_followers(account_id);
+CREATE INDEX idx_account_followers_followed_id ON account_followers(followed_id);
🤖 Prompt for AI Agents
In db_migrations/0005_new_tables.sql around lines 21 to 28, the
account_followers table lacks foreign key constraints and a uniqueness
constraint to prevent duplicate follow relationships; modify the CREATE TABLE to
add FOREIGN KEY (account_id) REFERENCES accounts(id) ON DELETE CASCADE and
FOREIGN KEY (followed_id) REFERENCES accounts(id) ON DELETE CASCADE, and add a
UNIQUE(account_id, followed_id) constraint to prevent duplicates (optionally
also add a CHECK(account_id <> followed_id) to prevent self-follows).

Comment on lines +30 to +36
-- New relays table
CREATE TABLE relays (
    id INTEGER PRIMARY KEY AUTOINCREMENT,
    url TEXT NOT NULL,
    created_at DATETIME NOT NULL DEFAULT CURRENT_TIMESTAMP,
    updated_at DATETIME NOT NULL DEFAULT CURRENT_TIMESTAMP
);

🛠️ Refactor suggestion

Add UNIQUE constraint on URL for relays table

Relay URLs should be unique to prevent duplicates.

 CREATE TABLE relays (
     id INTEGER PRIMARY KEY AUTOINCREMENT,
-    url TEXT NOT NULL,
+    url TEXT NOT NULL UNIQUE,
     created_at DATETIME NOT NULL DEFAULT CURRENT_TIMESTAMP,
     updated_at DATETIME NOT NULL DEFAULT CURRENT_TIMESTAMP
 );
🤖 Prompt for AI Agents
In db_migrations/0005_new_tables.sql around lines 30 to 36, the relays table
allows duplicate URLs; add a UNIQUE constraint to the url column to prevent
duplicates by either declaring "url TEXT NOT NULL UNIQUE" in the CREATE TABLE
statement or adding "CREATE UNIQUE INDEX idx_relays_url ON relays(url);"
immediately after the table creation; pick the approach consistent with our
migrations (prefer inline UNIQUE in the CREATE TABLE) and ensure the migration
will fail if existing duplicate URLs exist so they can be resolved before
applying.

Comment on lines +38 to +46
-- New user_relays table
CREATE TABLE user_relays (
    id INTEGER PRIMARY KEY AUTOINCREMENT,
    user_id INTEGER NOT NULL,
    relay_id INTEGER NOT NULL,
    relay_type TEXT NOT NULL,
    created_at DATETIME NOT NULL DEFAULT CURRENT_TIMESTAMP,
    updated_at DATETIME NOT NULL DEFAULT CURRENT_TIMESTAMP
);

⚠️ Potential issue

Add constraints and validation for user_relays table

The table needs foreign key constraints, unique constraint to prevent duplicates, and validation for relay_type.

 CREATE TABLE user_relays (
     id INTEGER PRIMARY KEY AUTOINCREMENT,
-    user_id INTEGER NOT NULL,
-    relay_id INTEGER NOT NULL,
-    relay_type TEXT NOT NULL,
+    user_id INTEGER NOT NULL REFERENCES users_new(id),
+    relay_id INTEGER NOT NULL REFERENCES relays(id),
+    relay_type TEXT NOT NULL CHECK(relay_type IN ('read', 'write', 'both')),
     created_at DATETIME NOT NULL DEFAULT CURRENT_TIMESTAMP,
-    updated_at DATETIME NOT NULL DEFAULT CURRENT_TIMESTAMP
+    updated_at DATETIME NOT NULL DEFAULT CURRENT_TIMESTAMP,
+    UNIQUE(user_id, relay_id, relay_type)
 );
+
+-- Add indexes for foreign keys
+CREATE INDEX idx_user_relays_user_id ON user_relays(user_id);
+CREATE INDEX idx_user_relays_relay_id ON user_relays(relay_id);
📝 Committable suggestion

‼️ IMPORTANT
Carefully review the code before committing. Ensure that it accurately replaces the highlighted code, contains no missing lines, and has no issues with indentation. Thoroughly test & benchmark the code to ensure it meets the requirements.

Suggested change
-- Current:
-- New user_relays table
CREATE TABLE user_relays (
    id INTEGER PRIMARY KEY AUTOINCREMENT,
    user_id INTEGER NOT NULL,
    relay_id INTEGER NOT NULL,
    relay_type TEXT NOT NULL,
    created_at DATETIME NOT NULL DEFAULT CURRENT_TIMESTAMP,
    updated_at DATETIME NOT NULL DEFAULT CURRENT_TIMESTAMP
);

-- Suggested:
-- New user_relays table
CREATE TABLE user_relays (
    id INTEGER PRIMARY KEY AUTOINCREMENT,
    user_id INTEGER NOT NULL REFERENCES users_new(id),
    relay_id INTEGER NOT NULL REFERENCES relays(id),
    relay_type TEXT NOT NULL CHECK(relay_type IN ('read', 'write', 'both')),
    created_at DATETIME NOT NULL DEFAULT CURRENT_TIMESTAMP,
    updated_at DATETIME NOT NULL DEFAULT CURRENT_TIMESTAMP,
    UNIQUE(user_id, relay_id, relay_type)
);

-- Add indexes for foreign keys
CREATE INDEX idx_user_relays_user_id ON user_relays(user_id);
CREATE INDEX idx_user_relays_relay_id ON user_relays(relay_id);
🤖 Prompt for AI Agents
In db_migrations/0005_new_tables.sql around lines 38 to 46, the user_relays
table needs referential integrity and validation: modify the CREATE TABLE to add
FOREIGN KEY (user_id) REFERENCES users(id) ON DELETE CASCADE and FOREIGN KEY
(relay_id) REFERENCES relays(id) ON DELETE CASCADE, add a UNIQUE constraint on
(user_id, relay_id) to prevent duplicate entries, and add a CHECK constraint on
relay_type restricting it to the allowed set (e.g. CHECK (relay_type IN
('TYPE_A','TYPE_B')) — replace TYPE_A/TYPE_B with your app’s actual relay type
values); ensure the constraints are included in the CREATE TABLE statement so
migrations run atomically.

@erskingardner erskingardner merged commit 2140e36 into master Aug 11, 2025
3 of 4 checks passed
@erskingardner erskingardner deleted the create-new-tables branch August 11, 2025 11:25

Comment on the relays table definition:

-- New relays table
CREATE TABLE relays (

It's best to not have this table.
A single user_relays table, with user_id as a foreign key referencing the users table and a

UNIQUE (user_id, url)

constraint, would be optimal.

@jgmontoya jgmontoya Aug 11, 2025

A single relay can belong to multiple lists, so we're using the relays table to avoid duplicating the relay information/URL. It will also allow us to do some additional things in the future if we want, like this: #296 (comment)


Then we need a UNIQUE(user_id, relay_id, relay_type) constraint for the user_relays table.
There's no need for an id PRIMARY KEY.
