
Conversation

@NeptuneHub
Owner

This contains multiple CLAP fixes and improvements.

Comment on lines +137 to +140
return jsonify({
    'error': str(e),
    'loaded': False
}), 500

Check warning

Code scanning / CodeQL

Information exposure through an exception (Medium)

Stack trace information flows to this location and may be exposed to an external user.

Copilot Autofix

AI 22 days ago

In general, the fix is to avoid returning the raw exception string to the client and instead return a generic error message, while logging the detailed exception on the server. This aligns with the pattern shown in the background “GOOD” example.

Concretely, in warmup_model_api (lines 132–140), we should keep logging e but change the JSON response so that the "error" field contains a generic message like "Model warmup failed" instead of str(e). This preserves existing behavior in terms of HTTP status code (500) and the "loaded": False flag, but prevents any sensitive error details from leaking.

Only this region in app_clap_search.py needs modification:

  • Around line 135: keep logger.error(f"Model warmup failed: {e}") as-is.
  • Around lines 137–139: replace 'error': str(e) with a generic, non-sensitive message. No new imports or helper methods are required.
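Applied in context, the corrected handler would look roughly like the self-contained sketch below. The route path and the warmup_model helper are illustrative stand-ins (the real ones in app_clap_search.py are not shown in this thread); only the except block mirrors the suggested change in the patch that follows.

import logging

from flask import Flask, jsonify

app = Flask(__name__)
logger = logging.getLogger(__name__)

def warmup_model():
    # Hypothetical stand-in for the real warmup logic in app_clap_search.py.
    return {'loaded': True}

@app.route('/api/warmup', methods=['POST'])  # illustrative route path
def warmup_model_api():
    try:
        return jsonify(warmup_model())
    except Exception as e:
        # Full failure detail stays in the server log only.
        logger.error(f"Model warmup failed: {e}")
        # The client receives a generic message plus the existing status flag.
        return jsonify({
            'error': 'Model warmup failed due to an internal error.',
            'loaded': False
        }), 500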
Suggested changeset 1
app_clap_search.py

Autofix patch
Run the following command in your local git repository to apply this patch
cat << 'EOF' | git apply
diff --git a/app_clap_search.py b/app_clap_search.py
--- a/app_clap_search.py
+++ b/app_clap_search.py
@@ -135,7 +135,7 @@
     except Exception as e:
         logger.error(f"Model warmup failed: {e}")
         return jsonify({
-            'error': str(e),
+            'error': 'Model warmup failed due to an internal error.',
             'loaded': False
         }), 500
 
EOF

Comment on lines +137 to +140
return jsonify({
    'error': str(e),
    'loaded': False
}), 500

Check warning

Code scanning / CodeQL

Information exposure through an exception (Medium)

Stack trace information flows to this location and may be exposed to an external user.

Copilot Autofix

AI 22 days ago

In general, the fix is to avoid returning raw exception messages or stack traces to clients. Instead, log the detailed exception on the server (with stack trace) and respond with a stable, generic error message and, optionally, a non-sensitive error code that the client can use.

For this specific code, the minimal change is inside warmup_model_api in app_mulan_search.py:

  • Keep logging the failure, but enrich the logging with the full stack trace so developers still have diagnostics (e.g., logger.exception(...) or logger.error(..., exc_info=True)).
  • Replace the JSON 'error': str(e) field with a generic message such as 'error': 'MuLan model warmup failed', which does not depend on the exception content.
  • Preserve the existing HTTP status code (500) and the 'loaded': False flag so external behavior besides the message content remains the same.

Concretely:

  • Modify the except Exception as e: block around lines 135–140:
    • Change the logging call to include traceback information.
    • Change the JSON payload to use a constant generic message instead of str(e).

No new imports are required if we use logger.exception, since logging is already imported; otherwise, we could pass exc_info=True to logger.error without new imports.
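That equivalence is easy to check in isolation with a short, self-contained sketch; the RuntimeError below is only a stand-in for a real warmup failure and is not taken from the repository.

import logging

logging.basicConfig(level=logging.ERROR)
logger = logging.getLogger(__name__)

try:
    raise RuntimeError("checkpoint not found")  # stand-in for a warmup failure
except Exception:
    # logger.exception logs at ERROR level and appends the full traceback...
    logger.exception("MuLan model warmup failed")
    # ...and logger.error with exc_info=True produces the same kind of record.
    logger.error("MuLan model warmup failed", exc_info=True)

Both calls emit the message together with the traceback, so either form keeps full diagnostics on the server while the HTTP response stays generic.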

Suggested changeset 1
app_mulan_search.py

Autofix patch
Run the following command in your local git repository to apply this patch
cat << 'EOF' | git apply
diff --git a/app_mulan_search.py b/app_mulan_search.py
--- a/app_mulan_search.py
+++ b/app_mulan_search.py
@@ -133,9 +133,9 @@
         status = warmup_text_search_model()
         return jsonify(status)
     except Exception as e:
-        logger.error(f"MuLan model warmup failed: {e}")
+        logger.exception("MuLan model warmup failed")
         return jsonify({
-            'error': str(e),
+            'error': 'MuLan model warmup failed',
             'loaded': False
         }), 500
 
EOF

@NeptuneHub merged commit 0e675db into main on Dec 22, 2025
8 checks passed
@NeptuneHub deleted the devel branch on December 22, 2025 at 18:41