fix: migration postgres azure #681


Merged: 2 commits merged into main on Feb 13, 2025

Conversation

gfyrag
Contributor

@gfyrag gfyrag commented Feb 13, 2025

  • fix: migration using azure postgres with no superuser privileges
  • fix: next migration

@gfyrag gfyrag requested a review from a team as a code owner February 13, 2025 09:05

coderabbitai bot commented Feb 13, 2025

Walkthrough

The changes update two migration scripts related to the transactions table. They remove session-level disabling and enabling of triggers, replacing it with ledger-specific trigger creation. The script in migration 18 introduces new variables (_ledger record and _vsql text) and sets up a trigger for updating transaction metadata upon changes. The migration 19 script similarly removes the global session trigger modifications and retains batch processing for updating transactions and enforcing constraints.

Changes

File(s):
  • internal/.../18-transactions-fill-inserted-at/up.sql
  • internal/.../19-transactions-fill-pcv/up.sql

Change Summary: Both migration scripts drop the session-level trigger disabling/enabling. In migration 18, new variables (_ledger record and _vsql text) are added and per-ledger triggers are created that call update_transaction_metadata_history when the metadata column changes. Batch processing and the default-value alteration for inserted_at remain in place.
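Pulling the summary together, the per-ledger trigger setup can be sketched roughly as follows. This is a simplified sketch based on the snippets quoted later in this review; the format/%I/%L quoting reflects the reviewer's hardening suggestion and is not necessarily what the merged code does:

```sql
-- Sketch of the per-ledger trigger creation (simplified; quoting via format/%I/%L
-- follows the review suggestion rather than the exact merged code).
do $$
declare
    _ledger record;
    _vsql   text;
begin
    for _ledger in select * from _system.ledgers where bucket = current_schema loop
        _vsql = format(
            'create or replace trigger %I after update of metadata on transactions '
            'for each row when (new.ledger = %L) '
            'execute procedure update_transaction_metadata_history()',
            'update_transaction_metadata_history_' || _ledger.id,
            _ledger.name
        );
        execute _vsql;
    end loop;
end
$$;
```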

Sequence Diagram(s)

```mermaid
sequenceDiagram
    participant MS as Migration Script
    participant LS as Ledgers Table
    participant TT as Temporary Table (logs_transactions)
    participant TX as Transactions Table
    participant UTM as update_transaction_metadata_history Function

    loop For each ledger in LS
        MS->>LS: Fetch ledger record
        MS->>MS: Create trigger for metadata update
        MS->>UTM: Link trigger to update_transaction_metadata_history
    end
    MS->>TT: Create and populate temporary logs
    MS->>TX: Update transactions based on TT
    MS->>TX: Alter inserted_at column default
    MS->>TT: Drop temporary table
```

Suggested labels

build-images

Suggested reviewers

  • flemzord
  • paul-nicolas

Poem

In a burrow of code I delight,
Hopping through migrations late in the night.
Triggers now dance per ledger’s call,
No more global pauses to stall.
With SQL carrots crunching in sight, 🥕
This rabbit cheers for change done right!
🐇 Happy hops to new updates!


@gfyrag gfyrag enabled auto-merge February 13, 2025 09:07

@coderabbitai coderabbitai bot left a comment


Actionable comments posted: 1

🔭 Outside diff range comments (1)
internal/storage/bucket/migrations/18-transactions-fill-inserted-at/up.sql (1)

20-22: 💡 Verification agent

🧩 Analysis chain

Avoid hard-coding migration version IDs.

The query uses a hard-coded version_id = 12, which could cause issues if migration numbering changes.

Consider using a more robust way to identify the migration, such as by name or timestamp. Run this script to check other migrations for similar patterns:


🏁 Script executed:

```shell
#!/bin/bash
# Description: Check for hard-coded version IDs in migrations
# Expected: This should be a unique case or follow a consistent pattern

rg -g "*.sql" "version_id\s*=\s*\d+" internal/storage/bucket/migrations/
```

Length of output: 303


Hard-coded migration version IDs and missing error handling

  • The migration in internal/storage/bucket/migrations/18-transactions-fill-inserted-at/up.sql uses a hard-coded version_id = 12, and the same pattern is also present in internal/storage/bucket/migrations/19-transactions-fill-pcv/up.sql. This could lead to issues if migration numbering changes.
  • Consider adopting a more robust mechanism to reference migration versions—such as using migration names, timestamps, or a dedicated version management strategy.
  • Additionally, the migration script lacks explicit error handling for failed batches. Please add appropriate error handling to address potential failures during migration.
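One possible shape for a name-based lookup is sketched below. This is purely illustrative: the actual schema of the migrations bookkeeping table is not shown in this PR, so the table and column names here are hypothetical.

```sql
-- Hypothetical sketch: resolve the version by migration name instead of a
-- hard-coded number. Assumes a "migrations" table with "id" and "name"
-- columns, which this PR does not show.
select id
from migrations
where name = '18-transactions-fill-inserted-at';
```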
🧹 Nitpick comments (2)
internal/storage/bucket/migrations/18-transactions-fill-inserted-at/up.sql (2)

11-13: Great architectural improvement!

The change from session-level trigger management to per-ledger triggers is a solid improvement that:

  1. Fixes Azure PostgreSQL compatibility
  2. Optimizes performance by only triggering on metadata updates

36-52: Consider adding error handling for batch processing.

The batch processing loop lacks error handling for failed updates. A failure in one batch could leave the data in an inconsistent state.

Consider adding error handling:

```sql
for i in 0.._count-1 by _batch_size loop
  begin
    with _rows as (
      select *
      from logs_transactions
      order by ledger, transaction_id
      offset i
      limit _batch_size
    )
    update transactions
    set inserted_at = _rows.date
    from _rows
    where transactions.ledger = _rows.ledger and transactions.id = _rows.transaction_id;

    commit;

    perform pg_notify('migrations-{{ .Schema }}', 'continue: ' || _batch_size);
  exception when others then
    perform pg_notify('migrations-{{ .Schema }}', 'error: ' || sqlerrm);
    raise;
  end;
end loop;
```
📜 Review details

Configuration used: CodeRabbit UI
Review profile: CHILL
Plan: Pro

📥 Commits

Reviewing files that changed from the base of the PR and between f7cf431 and ab5804d.

📒 Files selected for processing (2)
  • internal/storage/bucket/migrations/18-transactions-fill-inserted-at/up.sql (1 hunks)
  • internal/storage/bucket/migrations/19-transactions-fill-pcv/up.sql (0 hunks)
💤 Files with no reviewable changes (1)
  • internal/storage/bucket/migrations/19-transactions-fill-pcv/up.sql
⏰ Context from checks skipped due to timeout of 90000ms (1)
  • GitHub Check: Tests
🔇 Additional comments (1)
internal/storage/bucket/migrations/18-transactions-fill-inserted-at/up.sql (1)

3-4: LGTM! Well-structured variable declarations.

The new variables _ledger and _vsql are appropriately typed for their intended use in dynamic trigger creation.

Comment on lines +14 to +17
```sql
for _ledger in select * from _system.ledgers where bucket = current_schema loop
_vsql = 'create or replace trigger "update_transaction_metadata_history_' || _ledger.id || '" after update of metadata on "transactions" for each row when (new.ledger = ''' || _ledger.name || ''') execute procedure update_transaction_metadata_history()';
execute _vsql;
end loop;
```

💡 Verification agent

🧩 Analysis chain

Review SQL injection risk in dynamic trigger creation.

The dynamic SQL construction using string concatenation with _ledger.id and _ledger.name could be vulnerable to SQL injection if these values contain malicious content.

Consider using proper quoting functions. Run this script to check if PostgreSQL's quote_ident and quote_literal functions are available in your environment:

If available, consider this safer approach:

```sql
_vsql = format(
  'create or replace trigger %s after update of metadata on transactions for each row when (new.ledger = %s) execute procedure update_transaction_metadata_history()',
  quote_ident('update_transaction_metadata_history_' || _ledger.id),
  quote_literal(_ledger.name)
);
```

🏁 Script executed:

```shell
#!/bin/bash
# Description: Check if proper quoting functions are available
# Expected: Functions should be available in PostgreSQL

curl -s https://www.postgresql.org/docs/current/functions-string.html | grep -A 2 "quote_ident"
```

Length of output: 1275


SQL Injection Risk: Use PostgreSQL Quoting Functions for Safe Dynamic SQL Generation

  • File: internal/storage/bucket/migrations/18-transactions-fill-inserted-at/up.sql (Lines 14-17)

  • Issue: The dynamic SQL construction concatenates _ledger.id and _ledger.name directly, which may lead to SQL injection if these values ever contain unexpected or malicious content.

  • Verification: A shell script confirmed that PostgreSQL’s quote_ident function is available (and by extension, quote_literal is available), so safe quoting functions can be used.

  • Recommendation: Replace the current string concatenation with a safer version using the format function combined with quote_ident and quote_literal. For example:

    ```sql
    _vsql = format(
      'create or replace trigger %s after update of metadata on transactions for each row when (new.ledger = %s) execute procedure update_transaction_metadata_history()',
      quote_ident('update_transaction_metadata_history_' || _ledger.id),
      quote_literal(_ledger.name)
    );
    ```

Please update the trigger creation code to use this format, ensuring that all identifiers and literals are safely quoted to mitigate any SQL injection risks.

@gfyrag gfyrag added this pull request to the merge queue Feb 13, 2025

codecov bot commented Feb 13, 2025

Codecov Report

All modified and coverable lines are covered by tests ✅

Project coverage is 81.58%. Comparing base (f7cf431) to head (ab5804d).
Report is 1 commit behind head on main.

Additional details and impacted files
@@            Coverage Diff             @@
##             main     #681      +/-   ##
==========================================
- Coverage   81.63%   81.58%   -0.05%     
==========================================
  Files         131      131              
  Lines        7061     7061              
==========================================
- Hits         5764     5761       -3     
- Misses        994      996       +2     
- Partials      303      304       +1     


Merged via the queue into main with commit a348229 Feb 13, 2025
9 of 10 checks passed
@gfyrag gfyrag deleted the fix/migration-azure-postgres branch February 13, 2025 09:14
@gfyrag gfyrag mentioned this pull request Mar 14, 2025