Merged
Changes from all commits
44 commits
d0cf9b2
Configure NVIDIA GPU resources in docker-compose
NeptuneHub Dec 15, 2025
101962b
updated version 0.8.3 in config.py
NeptuneHub Dec 15, 2025
9619a8a
Initial plan
Copilot Dec 15, 2025
76ef0e0
Remove unused imports and environment variables
Copilot Dec 15, 2025
f971362
Merge pull request #229 from NeptuneHub/copilot/remove-unused-functio…
NeptuneHub Dec 15, 2025
140daa7
Text-Search: Removed auto-load of clap model on Flask container star…
NeptuneHub Dec 15, 2025
d47a6d0
Fix hardcoded parameter - #231
NeptuneHub Dec 15, 2025
f4da5eb
text search: warmup
NeptuneHub Dec 16, 2025
c84f92e
text search improvement
NeptuneHub Dec 16, 2025
4f9fe4b
GPU check improvement
NeptuneHub Dec 16, 2025
232b876
workflow update for devel-nvidia image
NeptuneHub Dec 16, 2025
1dcdea4
MULAN first implementation
NeptuneHub Dec 17, 2025
dec4d44
fixed requirement
NeptuneHub Dec 17, 2025
c296a12
disable clap + MULAN on 30 sec only
NeptuneHub Dec 18, 2025
25caf4b
MULAN converter - incomplete
NeptuneHub Dec 18, 2025
0a432b2
update gitignore
NeptuneHub Dec 18, 2025
9a759dc
MULAN from pytorch to onnx
NeptuneHub Dec 18, 2025
0ba5995
ONNX MULAN fix
NeptuneHub Dec 18, 2025
7d6c83f
mulan analysis flag
NeptuneHub Dec 19, 2025
3d6ba49
MULAN 3 fix
NeptuneHub Dec 19, 2025
006f35d
MULAN index reload
NeptuneHub Dec 19, 2025
08b8803
clamp3 test
NeptuneHub Dec 19, 2025
124b8ff
gitignore
NeptuneHub Dec 19, 2025
41f0a76
LOWER CPU USAGE
NeptuneHub Dec 19, 2025
342b911
ONNX multithreading commented
NeptuneHub Dec 19, 2025
66de025
INDEX RELOAD LOGGING IMPROVEMENT
NeptuneHub Dec 19, 2025
f5c21a8
ONNX CPU CAP for CLAP and MULAN
NeptuneHub Dec 20, 2025
4d7849b
ALL model load improvement per album
NeptuneHub Dec 20, 2025
4c6f5b6
REBUILD TASK ADD FLASK CONTEXT
NeptuneHub Dec 21, 2025
d75c6b3
GITHUB WORKFLOW: mulan instead of onnx
NeptuneHub Dec 21, 2025
58dd00a
readme.md update
NeptuneHub Dec 21, 2025
fad49aa
worker local with cpu
NeptuneHub Dec 21, 2025
c653b16
Old tensorflow code removed + redis connection fix
NeptuneHub Dec 22, 2025
ae66537
docker compose local cpu
NeptuneHub Dec 22, 2025
7a3410b
MULAN DEFAULT DISABLED
NeptuneHub Dec 22, 2025
8fd25b3
Merge branch 'devel' into mulan
NeptuneHub Dec 22, 2025
3cdf640
Merge pull request #240 from NeptuneHub/mulan
NeptuneHub Dec 22, 2025
3fe8c45
configurable timeout - #237
NeptuneHub Dec 22, 2025
ac47037
better exception handling - #237
NeptuneHub Dec 22, 2025
3f24691
MULAN DISABLED FROM DOCKERFILE
NeptuneHub Dec 22, 2025
2d7b52b
CLAP MODEL SPLIT
NeptuneHub Dec 22, 2025
1db8c87
added CLAP_ENABLED env var to all the deployment example
NeptuneHub Dec 22, 2025
66a8d13
Merge branch 'main' into devel
NeptuneHub Dec 22, 2025
7d91e1c
TEST FIX for CLAP SPLIT MODEL
NeptuneHub Dec 22, 2025
14 changes: 14 additions & 0 deletions .dockerignore
@@ -39,3 +39,17 @@ screenshot/
htmlcov/
.coverage
*.log

# Large model files and caches (these are downloaded inside the container)
model/
*.onnx
*.pt
*.pth
*.bin
*.h5
*.pb
CLAMP3/weights_*.pth
CLAMP3/*.npy
mulan_model_export/
.cache/
*.tar.gz
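As a quick sanity check of which paths the new `.dockerignore` entries would exclude from the build context, shell case-globbing approximates these simple patterns (this is an illustration, not Docker's exact Go-style matcher):

```shell
# Rough approximation of the .dockerignore patterns added above.
is_excluded() {
  case "$1" in
    model/*|*.onnx|*.pt|*.pth|*.bin|*.h5|*.pb|CLAMP3/weights_*.pth|CLAMP3/*.npy|mulan_model_export/*|.cache/*|*.tar.gz)
      return 0 ;;  # matched: excluded from the build context
    *)
      return 1 ;;  # not matched: sent to the Docker daemon
  esac
}
```
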
12 changes: 6 additions & 6 deletions .github/workflows/build-arm-intel.yml
@@ -1,20 +1,20 @@
# This workflow builds and tests a multi-architecture Docker image for both AMD64 and ARM64.
# It triggers on pushes to the main, devel, and onnx branches, as well as on version tags.
# It triggers on pushes to the main, devel, and mulan branches, as well as on version tags.
# Tags are automatically generated based on the triggering event:
# - main branch -> 'latest'
# - devel branch -> 'devel'
# - onnx branch -> 'onnx'
# - mulan branch -> 'mulan'
# - version tags (e.g., v1.2.3) -> version number (e.g., '1.2.3')

name: Build, Test, and Push AudioMuse AI Docker Image INTEL and ARM

on:
push:
# We avoid triggering this workflow on direct pushes to `main` to prevent
# accidental builds from routine commits. Keep `devel` and `onnx` as before.
# accidental builds from routine commits. Keep `devel` and `mulan` as before.
branches:
- devel # Trigger for 'devel' tag
- onnx # New branch also triggers for 'onnx' tag
- mulan # New branch also triggers for 'mulan' tag
tags:
- 'v*.*.*' # Trigger for version tags like v1.0.0
# Allow manual runs for debugging or manual pushes
@@ -88,8 +88,8 @@ jobs:
type=raw,value=latest,enable=${{ startsWith(github.ref, 'refs/tags/v') }}
# Set 'devel' tag only for the devel branch
type=raw,value=devel,enable=${{ github.ref == 'refs/heads/devel' }}
# Set 'onnx' tag only for the onnx branch
type=raw,value=onnx,enable=${{ github.ref == 'refs/heads/onnx' }}
# Set 'mulan' tag only for the mulan branch
type=raw,value=mulan,enable=${{ github.ref == 'refs/heads/mulan' }}
# Use the version number for Git tags (e.g., v1.2.3 -> 1.2.3)
type=semver,pattern={{version}}

12 changes: 10 additions & 2 deletions .github/workflows/build-nvidia.yml
@@ -2,9 +2,12 @@ name: Build and Push AudioMuse AI Docker Image with NVIDIA support

on:
push:
# Only trigger on version tags to avoid automatic runs on direct pushes to main.
# Trigger on version tags for stable releases
tags:
- 'v*.*.*' # Trigger for version tags like v1.0.0
# Trigger on pushes to devel branch for development releases
branches:
- devel
# Allow manual runs from the Actions UI
workflow_dispatch:
inputs:
@@ -60,6 +63,11 @@ jobs:
ALL_TAGS="${VERSIONED_TAG},${LATEST_TAG}"
echo "Building versioned tag: $VERSIONED_TAG"
echo "Also tagging as latest: $LATEST_TAG"
elif [[ "${GITHUB_REF}" == refs/heads/devel ]]; then
# When building from the devel branch, create the devel-nvidia tag
DEVEL_TAG="ghcr.io/$REPO_NAME_LOWER:devel-nvidia"
ALL_TAGS="${DEVEL_TAG}"
echo "Building devel tag: $DEVEL_TAG"
elif [[ "${{ github.event_name }}" == "workflow_dispatch" ]]; then
# Manual run with version input
VERSION_TAG=$(echo "${{ inputs.version_tag }}" | sed -e 's|^v||g')
@@ -69,7 +77,7 @@ jobs:
echo "Building manual version tag: $VERSIONED_TAG"
echo "Also tagging as latest: $LATEST_TAG"
else
echo "This workflow is intended to run on tag pushes (refs/tags/v*) or manual dispatch. Exiting."
echo "This workflow is intended to run on tag pushes (refs/tags/v*), devel branch, or manual dispatch. Exiting."
exit 1
fi

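The branch-to-tag selection this hunk adds can be exercised locally as a standalone function. Only the `devel-nvidia` tag appears verbatim in the diff; the release-tag format (`<version>-nvidia`, `latest-nvidia`) is an assumption, since the diff doesn't show how `VERSIONED_TAG` is built:

```shell
# Sketch of the workflow's tag derivation, outside of GitHub Actions.
REPO="ghcr.io/neptunehub/audiomuse-ai"

derive_tags() {
  ref="$1"
  case "$ref" in
    refs/tags/v*)
      # assumed release-tag format; not shown verbatim in the diff
      version="${ref#refs/tags/v}"
      echo "${REPO}:${version}-nvidia,${REPO}:latest-nvidia"
      ;;
    refs/heads/devel)
      # matches the devel-nvidia tag added in this PR
      echo "${REPO}:devel-nvidia"
      ;;
    *)
      return 1  # any other ref exits, as in the workflow's else branch
      ;;
  esac
}
```
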
13 changes: 7 additions & 6 deletions .github/workflows/test.yml
@@ -28,15 +28,16 @@ jobs:
python -m pip install --upgrade pip
pip install -r test/requirements.txt

- name: Download CLAP model
- name: Download CLAP models
run: |
base_url="https://github.com/NeptuneHub/AudioMuse-AI/releases/download/v3.0.0-model"
clap_model="clap_model.onnx"
mkdir -p test/models
echo "Downloading CLAP model..."
curl -L "${base_url}/${clap_model}" -o "test/models/${clap_model}"
echo "CLAP model downloaded successfully"
ls -lh test/models/${clap_model}
echo "Downloading CLAP audio model..."
curl -L "${base_url}/clap_audio_model.onnx" -o "test/models/clap_audio_model.onnx"
echo "Downloading CLAP text model..."
curl -L "${base_url}/clap_text_model.onnx" -o "test/models/clap_text_model.onnx"
echo "CLAP models downloaded successfully"
ls -lh test/models/clap_*.onnx

- name: Verify test files exist
run: |
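The test workflow now fetches the two split CLAP models one by one. The same step can be expressed as a loop over the model names (URLs and destination directory are taken from the diff; the function echoes the commands as a dry run rather than hitting the network):

```shell
# Dry-run sketch of the split-model download step above.
base_url="https://github.com/NeptuneHub/AudioMuse-AI/releases/download/v3.0.0-model"
dest="test/models"
models="clap_audio_model.onnx clap_text_model.onnx"

download_cmds() {
  # Emit one curl command per model; pipe to sh to actually download.
  for m in $models; do
    echo "curl -L ${base_url}/${m} -o ${dest}/${m}"
  done
}
```
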
15 changes: 14 additions & 1 deletion .gitignore
@@ -7,6 +7,7 @@ __pycache__/
*.pyd
venv/
.venv/
.venv-macos/
env/

# Local environment variables
@@ -33,4 +34,16 @@ htmlcov/

# Large model files in query folder
/query/*.pt
/query/*.onnx
/query/*.onnx


# CLAMP3 weight files (large model files)
/CLAMP3/weights_*.pth
/CLAMP3/*.pth


# Downloaded ONNX models (from run_worker_macos.sh or similar)
/model/

# Exported MuLan models
mulan_model_export/