[Assets] Initial implementation #9545
Merged: Kosinkadink merged 71 commits into comfyanonymous:asset-management from bigcat88:asset-management on Sep 18, 2025.
Changes from all commits (71 commits, all authored by bigcat88):
- `f92307c` dev: Everything is Assets
- `5c1b597` dev: refactor; populate models in more nodes; use Pydantic in endpoin…
- `8d46bec` use Pydantic for output; finished Tags endpoints
- `0755e53` remove timezone; download asset, delete asset endpoints
- `f2ea0bc` added create_asset_from_hash endpoint
- `a82577f` auto-creation of tags and fixed population DB when cloned asset is al…
- `d7464e9` implemented assets scaner
- `09dabf9` refactoring: use the same code for "scan task" and realtime DB popula…
- `a763cbd` add upload asset endpoint
- `6fade5d` add AssetsResolver support
- `7c1b0be` add Get Asset endpoint
- `026b7f2` add "--multi-user" support
- `0379eff` allow Upload Asset endpoint to accept hash (as documentation requires)
- `eb7008a` removed not used "added_by" column
- `871e41a` removed not needed "refcount" column
- `bdf4ba2` removed not needed "assets.updated_at" column
- `6b86be3` use UUID instead of autoincrement Integer for Assets ID field
- `bf8363e` always autofill "filename" in the metadata
- `ce270ba` added Assets Autoscan feature
- `84384ca` temporary restore ModelManager
- `789a62c` assume that DB packages always present; refactoring & cleanup
- `2d9be46` add support for assets duplicates
- `b8ef9bb` add detection of the missing files for existing assets
- `6282d49` corrected detection of missing files for assets
- `3fa0fc4` fix: use UPSERT to eliminate rare race condition during ingesting man…
- `e3311c9` feat: support for in-memory SQLite databases
- `0e9de2b` feat: add first test
- `dfb5703` feat: remove Asset when there is no references left + bugfixes + more…
- `faa1e4d` fixed another test
- `0ef73e9` fixed validation error + more tests
- `357193f` fixed metadata filtering + tests
- `1886f10` add download tests
- `964de8a` add more list_assets tests + fix one found bug
- `a9096f6` removed non-needed code, fix tests, +1 new test
- `6eaed07` add some logic tests
- `72548a8` added additional tests; sorted tests
- `0df1cca` GitHub CI test for Assets
- `3c9bf39` Merge pull request #1 from bigcat88/asset-management-ci
- `934377a` removed currently unnecessary "asset_locations" functionality
- `bb9ed04` global refactoring; add support for Assets without the computed hash
- `9b8e88b` added more tests for the Assets logic
- `4a71365` added more tests for the Assets logic
- `9756500` concurrency upload test + fixed 2 related bugs
- `37b81e6` fixed new PgSQL bug
- `cdd8d16` +2 tests for checking Asset downloading logic
- `47f7c7e` rework + add test for concurrent AssetInfo delete
- `0b795dc` removed non-needed code
- `a2ec1f7` simplify code
- `6cfa94e` fixed metadata[filename] feature + new tests for this
- `a7f2546` fix: use ".rowcount" instead of ".returning" on SQLite
- `a2fc2bb` corrected formatting
- `1d97038` added final tests
- `dda31de` rework: AssetInfo.name is only a display name
- `7becb84` fixed tests on SQLite file
- `025fc49` optimization: DB Queries (Tags)
- `5f187fe` optimization: make list_unhashed_candidates_under_prefixes single-que…
- `f3cf99d` fix+test: escape "_" symbol in tags filtering
- `f1fb743` fix+test: escape "_" symbol in assets filtering
- `0be513b` fix: escape "_" symbol in all other places
- `24a95f5` removed default scanning of "input" and "output" folders; added separ…
- `77332d3` optimization: fast scan: commit to the DB in chunks
- `a336c7c` refactor(1): use general fast_asset_file_check helper for fast check
- `31ec744` refactor(2)/fix: skip double checking the existing files during fast …
- `677a0e2` refactor(3): unite logic for Asset fast check
- `d0aa64d` refactor(4): use one query to init DB with all tags for assets
- `621faaa` refactor(5): use less DB queries to create seed asset
- `5b6810a` fixed hash calculation during model loading in ComfyUI
- `85ef084` optimization: initial scan speed(batching tags)
- `f960245` optimization: initial scan speed(batching metadata[filename])
- `1a37d14` refactor(6): fully batched initial scan
- `283cd27` final adjustments
New file: `.github/workflows/test-assets.yml`

```yaml
name: Asset System Tests

on:
  push:
    paths:
      - 'app/**'
      - 'alembic_db/**'
      - 'tests-assets/**'
      - '.github/workflows/test-assets.yml'
      - 'requirements.txt'
  pull_request:
    branches: [master]
  workflow_dispatch:

permissions:
  contents: read

env:
  PIP_DISABLE_PIP_VERSION_CHECK: '1'
  PYTHONUNBUFFERED: '1'

jobs:
  sqlite:
    name: SQLite (${{ matrix.sqlite_mode }}) • Python ${{ matrix.python }}
    runs-on: ubuntu-latest
    timeout-minutes: 40
    strategy:
      fail-fast: false
      matrix:
        python: ['3.9', '3.12']
        sqlite_mode: ['memory', 'file']

    steps:
      - uses: actions/checkout@v4
      - name: Set up Python
        uses: actions/setup-python@v5
        with:
          python-version: ${{ matrix.python }}

      - name: Install dependencies
        run: |
          python -m pip install -U pip wheel
          pip install torch torchvision torchaudio --index-url https://download.pytorch.org/whl/cpu
          pip install -r requirements.txt
          pip install pytest pytest-aiohttp pytest-asyncio

      - name: Set deterministic test base dir
        id: basedir
        shell: bash
        run: |
          BASE="$RUNNER_TEMP/comfyui-assets-tests-${{ matrix.python }}-${{ matrix.sqlite_mode }}-${{ github.run_id }}-${{ github.run_attempt }}"
          echo "ASSETS_TEST_BASE_DIR=$BASE" >> "$GITHUB_ENV"
          echo "ASSETS_TEST_LOGS=$BASE/logs" >> "$GITHUB_ENV"
          mkdir -p "$BASE/logs"
          echo "ASSETS_TEST_BASE_DIR=$BASE"

      - name: Set DB URL for SQLite
        id: setdb
        shell: bash
        run: |
          if [ "${{ matrix.sqlite_mode }}" = "memory" ]; then
            echo "ASSETS_TEST_DB_URL=sqlite+aiosqlite:///:memory:" >> "$GITHUB_ENV"
          else
            DBFILE="$RUNNER_TEMP/assets-tests.sqlite"
            mkdir -p "$(dirname "$DBFILE")"
            echo "ASSETS_TEST_DB_URL=sqlite+aiosqlite:///$DBFILE" >> "$GITHUB_ENV"
          fi

      - name: Run tests
        run: python -m pytest tests-assets

      - name: Show ComfyUI logs
        if: always()
        shell: bash
        run: |
          echo "==== ASSETS_TEST_BASE_DIR: $ASSETS_TEST_BASE_DIR ===="
          echo "==== ASSETS_TEST_LOGS: $ASSETS_TEST_LOGS ===="
          ls -la "$ASSETS_TEST_LOGS" || true
          for f in "$ASSETS_TEST_LOGS"/stdout.log "$ASSETS_TEST_LOGS"/stderr.log; do
            if [ -f "$f" ]; then
              echo "----- BEGIN $f -----"
              sed -n '1,400p' "$f"
              echo "----- END $f -----"
            fi
          done

      - name: Upload ComfyUI logs
        if: always()
        uses: actions/upload-artifact@v4
        with:
          name: asset-logs-sqlite-${{ matrix.sqlite_mode }}-py${{ matrix.python }}
          path: ${{ env.ASSETS_TEST_LOGS }}/*.log
          if-no-files-found: warn

  postgres:
    name: PostgreSQL ${{ matrix.pgsql }} • Python ${{ matrix.python }}
    runs-on: ubuntu-latest
    timeout-minutes: 40
    strategy:
      fail-fast: false
      matrix:
        python: ['3.9', '3.12']
        pgsql: ['14', '16']

    services:
      postgres:
        image: postgres:${{ matrix.pgsql }}
        env:
          POSTGRES_DB: assets
          POSTGRES_USER: postgres
          POSTGRES_PASSWORD: postgres
        ports:
          - 5432:5432
        options: >-
          --health-cmd "pg_isready -U postgres -d assets"
          --health-interval 10s
          --health-timeout 5s
          --health-retries 12

    steps:
      - uses: actions/checkout@v4
      - name: Set up Python
        uses: actions/setup-python@v5
        with:
          python-version: ${{ matrix.python }}

      - name: Install dependencies
        run: |
          python -m pip install -U pip wheel
          pip install torch torchvision torchaudio --index-url https://download.pytorch.org/whl/cpu
          pip install -r requirements.txt
          pip install pytest pytest-aiohttp pytest-asyncio
          pip install greenlet psycopg

      - name: Set deterministic test base dir
        id: basedir
        shell: bash
        run: |
          # NOTE: the original used ${{ matrix.sqlite_mode }} here, which is empty
          # in this job; the PostgreSQL version keeps the base dir distinct instead.
          BASE="$RUNNER_TEMP/comfyui-assets-tests-${{ matrix.python }}-pg${{ matrix.pgsql }}-${{ github.run_id }}-${{ github.run_attempt }}"
          echo "ASSETS_TEST_BASE_DIR=$BASE" >> "$GITHUB_ENV"
          echo "ASSETS_TEST_LOGS=$BASE/logs" >> "$GITHUB_ENV"
          mkdir -p "$BASE/logs"
          echo "ASSETS_TEST_BASE_DIR=$BASE"

      - name: Set DB URL for PostgreSQL
        shell: bash
        run: |
          echo "ASSETS_TEST_DB_URL=postgresql+psycopg://postgres:postgres@localhost:5432/assets" >> "$GITHUB_ENV"

      - name: Run tests
        run: python -m pytest tests-assets

      - name: Show ComfyUI logs
        if: always()
        shell: bash
        run: |
          echo "==== ASSETS_TEST_BASE_DIR: $ASSETS_TEST_BASE_DIR ===="
          echo "==== ASSETS_TEST_LOGS: $ASSETS_TEST_LOGS ===="
          ls -la "$ASSETS_TEST_LOGS" || true
          for f in "$ASSETS_TEST_LOGS"/stdout.log "$ASSETS_TEST_LOGS"/stderr.log; do
            if [ -f "$f" ]; then
              echo "----- BEGIN $f -----"
              sed -n '1,400p' "$f"
              echo "----- END $f -----"
            fi
          done

      - name: Upload ComfyUI logs
        if: always()
        uses: actions/upload-artifact@v4
        with:
          name: asset-logs-pgsql-${{ matrix.pgsql }}-py${{ matrix.python }}
          path: ${{ env.ASSETS_TEST_LOGS }}/*.log
          if-no-files-found: warn
```
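The workflow drives the test suite entirely through environment variables, so the same tests can be pointed at an in-memory SQLite database, a file-backed one, or the PostgreSQL service container. A minimal sketch of how a harness might consume those variables locally (the helper names and the defaults are assumptions for illustration, not code from this PR):

```python
import os

# Hypothetical helpers that resolve test settings the way the CI matrix implies.
# ASSETS_TEST_DB_URL and ASSETS_TEST_BASE_DIR are set by the workflow above;
# the fallbacks here (in-memory SQLite, a tmp dir) are assumed local defaults.
def resolve_test_db_url() -> str:
    return os.environ.get("ASSETS_TEST_DB_URL", "sqlite+aiosqlite:///:memory:")

def resolve_test_base_dir() -> str:
    return os.environ.get("ASSETS_TEST_BASE_DIR", "/tmp/comfyui-assets-tests")

# Example: point the suite at the CI PostgreSQL service container.
os.environ["ASSETS_TEST_DB_URL"] = "postgresql+psycopg://postgres:postgres@localhost:5432/assets"
print(resolve_test_db_url())
```

Running `python -m pytest tests-assets` with these variables exported reproduces a single matrix entry of the CI run.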
New file: initial assets schema Alembic migration (revision `0001_assets`)

```python
"""initial assets schema

Revision ID: 0001_assets
Revises:
Create Date: 2025-08-20 00:00:00
"""

from alembic import op
import sqlalchemy as sa
from sqlalchemy.dialects import postgresql

revision = "0001_assets"
down_revision = None
branch_labels = None
depends_on = None


def upgrade() -> None:
    # ASSETS: content identity
    op.create_table(
        "assets",
        sa.Column("id", sa.String(length=36), primary_key=True),
        sa.Column("hash", sa.String(length=256), nullable=True),
        sa.Column("size_bytes", sa.BigInteger(), nullable=False, server_default="0"),
        sa.Column("mime_type", sa.String(length=255), nullable=True),
        sa.Column("created_at", sa.DateTime(timezone=False), nullable=False),
        sa.CheckConstraint("size_bytes >= 0", name="ck_assets_size_nonneg"),
    )
    if op.get_bind().dialect.name == "postgresql":
        op.create_index(
            "uq_assets_hash_not_null",
            "assets",
            ["hash"],
            unique=True,
            postgresql_where=sa.text("hash IS NOT NULL"),
        )
    else:
        op.create_index("uq_assets_hash", "assets", ["hash"], unique=True)
    op.create_index("ix_assets_mime_type", "assets", ["mime_type"])

    # ASSETS_INFO: user-visible references
    op.create_table(
        "assets_info",
        sa.Column("id", sa.String(length=36), primary_key=True),
        sa.Column("owner_id", sa.String(length=128), nullable=False, server_default=""),
        sa.Column("name", sa.String(length=512), nullable=False),
        sa.Column("asset_id", sa.String(length=36), sa.ForeignKey("assets.id", ondelete="RESTRICT"), nullable=False),
        sa.Column("preview_id", sa.String(length=36), sa.ForeignKey("assets.id", ondelete="SET NULL"), nullable=True),
        sa.Column("user_metadata", sa.JSON(), nullable=True),
        sa.Column("created_at", sa.DateTime(timezone=False), nullable=False),
        sa.Column("updated_at", sa.DateTime(timezone=False), nullable=False),
        sa.Column("last_access_time", sa.DateTime(timezone=False), nullable=False),
        sa.UniqueConstraint("asset_id", "owner_id", "name", name="uq_assets_info_asset_owner_name"),
    )
    op.create_index("ix_assets_info_owner_id", "assets_info", ["owner_id"])
    op.create_index("ix_assets_info_asset_id", "assets_info", ["asset_id"])
    op.create_index("ix_assets_info_name", "assets_info", ["name"])
    op.create_index("ix_assets_info_created_at", "assets_info", ["created_at"])
    op.create_index("ix_assets_info_last_access_time", "assets_info", ["last_access_time"])
    op.create_index("ix_assets_info_owner_name", "assets_info", ["owner_id", "name"])

    # TAGS: normalized tag vocabulary
    op.create_table(
        "tags",
        sa.Column("name", sa.String(length=512), primary_key=True),
        sa.Column("tag_type", sa.String(length=32), nullable=False, server_default="user"),
        sa.CheckConstraint("name = lower(name)", name="ck_tags_lowercase"),
    )
    op.create_index("ix_tags_tag_type", "tags", ["tag_type"])

    # ASSET_INFO_TAGS: many-to-many for tags on AssetInfo
    op.create_table(
        "asset_info_tags",
        sa.Column("asset_info_id", sa.String(length=36), sa.ForeignKey("assets_info.id", ondelete="CASCADE"), nullable=False),
        sa.Column("tag_name", sa.String(length=512), sa.ForeignKey("tags.name", ondelete="RESTRICT"), nullable=False),
        sa.Column("origin", sa.String(length=32), nullable=False, server_default="manual"),
        sa.Column("added_at", sa.DateTime(timezone=False), nullable=False),
        sa.PrimaryKeyConstraint("asset_info_id", "tag_name", name="pk_asset_info_tags"),
    )
    op.create_index("ix_asset_info_tags_tag_name", "asset_info_tags", ["tag_name"])
    op.create_index("ix_asset_info_tags_asset_info_id", "asset_info_tags", ["asset_info_id"])

    # ASSET_CACHE_STATE: N:1 local cache rows per Asset
    op.create_table(
        "asset_cache_state",
        sa.Column("id", sa.Integer(), primary_key=True, autoincrement=True),
        sa.Column("asset_id", sa.String(length=36), sa.ForeignKey("assets.id", ondelete="CASCADE"), nullable=False),
        sa.Column("file_path", sa.Text(), nullable=False),  # absolute local path to cached file
        sa.Column("mtime_ns", sa.BigInteger(), nullable=True),
        sa.Column("needs_verify", sa.Boolean(), nullable=False, server_default=sa.text("false")),
        sa.CheckConstraint("(mtime_ns IS NULL) OR (mtime_ns >= 0)", name="ck_acs_mtime_nonneg"),
        sa.UniqueConstraint("file_path", name="uq_asset_cache_state_file_path"),
    )
    op.create_index("ix_asset_cache_state_file_path", "asset_cache_state", ["file_path"])
    op.create_index("ix_asset_cache_state_asset_id", "asset_cache_state", ["asset_id"])

    # ASSET_INFO_META: typed KV projection of user_metadata for filtering/sorting
    op.create_table(
        "asset_info_meta",
        sa.Column("asset_info_id", sa.String(length=36), sa.ForeignKey("assets_info.id", ondelete="CASCADE"), nullable=False),
        sa.Column("key", sa.String(length=256), nullable=False),
        sa.Column("ordinal", sa.Integer(), nullable=False, server_default="0"),
        sa.Column("val_str", sa.String(length=2048), nullable=True),
        sa.Column("val_num", sa.Numeric(38, 10), nullable=True),
        sa.Column("val_bool", sa.Boolean(), nullable=True),
        sa.Column("val_json", sa.JSON().with_variant(postgresql.JSONB(), "postgresql"), nullable=True),
        sa.PrimaryKeyConstraint("asset_info_id", "key", "ordinal", name="pk_asset_info_meta"),
    )
    op.create_index("ix_asset_info_meta_key", "asset_info_meta", ["key"])
    op.create_index("ix_asset_info_meta_key_val_str", "asset_info_meta", ["key", "val_str"])
    op.create_index("ix_asset_info_meta_key_val_num", "asset_info_meta", ["key", "val_num"])
    op.create_index("ix_asset_info_meta_key_val_bool", "asset_info_meta", ["key", "val_bool"])

    # Tags vocabulary
    tags_table = sa.table(
        "tags",
        sa.column("name", sa.String(length=512)),
        sa.column("tag_type", sa.String()),
    )
    op.bulk_insert(
        tags_table,
        [
            {"name": "models", "tag_type": "system"},
            {"name": "input", "tag_type": "system"},
            {"name": "output", "tag_type": "system"},

            {"name": "configs", "tag_type": "system"},
            {"name": "checkpoints", "tag_type": "system"},
            {"name": "loras", "tag_type": "system"},
            {"name": "vae", "tag_type": "system"},
            {"name": "text_encoders", "tag_type": "system"},
            {"name": "diffusion_models", "tag_type": "system"},
            {"name": "clip_vision", "tag_type": "system"},
            {"name": "style_models", "tag_type": "system"},
            {"name": "embeddings", "tag_type": "system"},
            {"name": "diffusers", "tag_type": "system"},
            {"name": "vae_approx", "tag_type": "system"},
            {"name": "controlnet", "tag_type": "system"},
            {"name": "gligen", "tag_type": "system"},
            {"name": "upscale_models", "tag_type": "system"},
            {"name": "hypernetworks", "tag_type": "system"},
            {"name": "photomaker", "tag_type": "system"},
            {"name": "classifiers", "tag_type": "system"},

            {"name": "encoder", "tag_type": "system"},
            {"name": "decoder", "tag_type": "system"},

            {"name": "missing", "tag_type": "system"},
            {"name": "rescan", "tag_type": "system"},
        ],
    )


def downgrade() -> None:
    op.drop_index("ix_asset_info_meta_key_val_bool", table_name="asset_info_meta")
    op.drop_index("ix_asset_info_meta_key_val_num", table_name="asset_info_meta")
    op.drop_index("ix_asset_info_meta_key_val_str", table_name="asset_info_meta")
    op.drop_index("ix_asset_info_meta_key", table_name="asset_info_meta")
    op.drop_table("asset_info_meta")

    op.drop_index("ix_asset_cache_state_asset_id", table_name="asset_cache_state")
    op.drop_index("ix_asset_cache_state_file_path", table_name="asset_cache_state")
    op.drop_constraint("uq_asset_cache_state_file_path", table_name="asset_cache_state")
    op.drop_table("asset_cache_state")

    op.drop_index("ix_asset_info_tags_asset_info_id", table_name="asset_info_tags")
    op.drop_index("ix_asset_info_tags_tag_name", table_name="asset_info_tags")
    op.drop_table("asset_info_tags")

    op.drop_index("ix_tags_tag_type", table_name="tags")
    op.drop_table("tags")

    op.drop_constraint("uq_assets_info_asset_owner_name", table_name="assets_info")
    op.drop_index("ix_assets_info_owner_name", table_name="assets_info")
    op.drop_index("ix_assets_info_last_access_time", table_name="assets_info")
    op.drop_index("ix_assets_info_created_at", table_name="assets_info")
    op.drop_index("ix_assets_info_name", table_name="assets_info")
    op.drop_index("ix_assets_info_asset_id", table_name="assets_info")
    op.drop_index("ix_assets_info_owner_id", table_name="assets_info")
    op.drop_table("assets_info")

    if op.get_bind().dialect.name == "postgresql":
        op.drop_index("uq_assets_hash_not_null", table_name="assets")
    else:
        op.drop_index("uq_assets_hash", table_name="assets")
    op.drop_index("ix_assets_mime_type", table_name="assets")
    op.drop_table("assets")
```
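One detail worth noting in the migration: `hash` is nullable (assets can exist before their hash is computed), so the PostgreSQL branch creates an explicit partial unique index with `WHERE hash IS NOT NULL`, while the SQLite branch uses a plain unique index. Both allow any number of unhashed rows, because NULLs compare as distinct inside a unique index. A stdlib `sqlite3` sketch of that semantics (illustrative only, not the project's code):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE assets (id TEXT PRIMARY KEY, hash TEXT)")
# SQLite treats NULLs as distinct in unique indexes, so a plain unique
# index already permits many unhashed assets.
conn.execute("CREATE UNIQUE INDEX uq_assets_hash ON assets(hash)")

conn.execute("INSERT INTO assets VALUES ('a', NULL)")
conn.execute("INSERT INTO assets VALUES ('b', NULL)")  # OK: NULL is not equal to NULL
conn.execute("INSERT INTO assets VALUES ('c', 'blake3:abc')")
try:
    conn.execute("INSERT INTO assets VALUES ('d', 'blake3:abc')")
except sqlite3.IntegrityError as exc:
    print("duplicate hash rejected:", exc)
```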
So, playing around locally:

Lora model asset:

```json
{
  "id": "9afecf35-9af9-46b9-a68e-2aff1714115d",
  "name": "blindbox_v1_mix.safetensors",
  "asset_hash": "blake3:78195c5cff4c6d83a61b0d6b836d59f85efee66fbb953758e4b2af0389f8a802",
  "size": 151108831,
  "mime_type": null,
  "tags": ["loras", "models"],
  "user_metadata": {"filename": "blindbox_v1_mix.safetensors"},
  "preview_id": null,
  "created_at": "2025-09-16T06:02:41.494068",
  "last_access_time": "2025-09-16T06:02:41.494068"
}
```

Checkpoint model asset:

```json
{
  "id": "837a544e-bf0f-427d-b586-2da83ae0c353",
  "name": "v1-5-pruned-emaonly-fp16.safetensors",
  "asset_hash": "blake3:4c50ebc6e2a5cb19e8d19626d5ede1fb64755562085ce7383d86c72d1d03eb7e",
  "size": 2132696762,
  "mime_type": null,
  "tags": ["checkpoints", "models"],
  "user_metadata": {"filename": "v1-5-pruned-emaonly-fp16.safetensors"},
  "preview_id": null,
  "created_at": "2025-09-16T06:02:41.456527",
  "last_access_time": "2025-09-16T06:02:41.456527"
}
```

@bigcat88 or @guill, can you confirm that `user_metadata.filename` is what we will be using for backwards compatibility?
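Assuming the answer is yes, a consumer could recover a legacy-style relative path from records like the two above. This is only a sketch: the helper name is hypothetical, and the "first non-root tag is the model folder" rule is my reading of the examples, not a confirmed API contract.

```python
# Assumed top-level system tags that denote roots rather than folders.
SYSTEM_ROOT_TAGS = {"models", "input", "output"}

def legacy_path(asset: dict) -> str:
    """Hypothetical helper: derive 'folder/filename' from an asset record,
    using user_metadata.filename for backwards compatibility."""
    filename = asset.get("user_metadata", {}).get("filename", asset["name"])
    folders = [t for t in asset.get("tags", []) if t not in SYSTEM_ROOT_TAGS]
    return f"{folders[0]}/{filename}" if folders else filename

lora = {
    "name": "blindbox_v1_mix.safetensors",
    "tags": ["loras", "models"],
    "user_metadata": {"filename": "blindbox_v1_mix.safetensors"},
}
print(legacy_path(lora))  # loras/blindbox_v1_mix.safetensors
```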
Also, for our purposes I am having to include frontend logic to parse and categorize the different types of models. For example: [screenshot omitted]

Is this intended? I could see it being fragile if the hardcoded tags are changed or get removed. I guess the search endpoint in the future will fix this?
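The categorization logic described above might look roughly like this. The category list mirrors a subset of the system tags seeded by the migration, which is exactly the fragility being raised: if those hardcoded names change, this breaks.

```python
# Known model-folder tags, mirroring part of the seeded system tag vocabulary.
MODEL_FOLDER_TAGS = [
    "checkpoints", "loras", "vae", "text_encoders", "diffusion_models",
    "clip_vision", "style_models", "embeddings", "controlnet",
    "upscale_models", "hypernetworks",
]

def categorize(asset: dict) -> str:
    """Hypothetical frontend-style categorizer: map an asset's tags to a
    model folder, falling back to 'other' for unrecognized tag sets."""
    tags = set(asset.get("tags", []))
    for folder in MODEL_FOLDER_TAGS:
        if folder in tags:
            return folder
    return "other"

print(categorize({"tags": ["checkpoints", "models"]}))  # checkpoints
```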
Okay, I forgot one more thing... is `name` ever going to be user-friendly from your end? Using `name` right now, most people will see something like this: [screenshot omitted]