
Conversation


@CTY-git CTY-git commented Mar 20, 2025

PR Checklist

  • The commit message follows our guidelines: Code of conduct
  • Tests for the changes have been added (for bug fixes / features)
  • Docs have been added / updated (for bug fixes / features)
  • Does this PR introduce a breaking change?
  • Include PR in release notes?

PR Type

  • Bugfix
  • Feature
  • Refactoring
  • Build / CI
  • Documentation
  • Others

What is the current behavior?

Issue Number: N/A

What is the new behavior?

Other information

@CTY-git CTY-git requested a review from whoisarpit March 20, 2025 13:40
@patched-admin

File Changed: patchwork/common/client/llm/google_.py

Details: Violation of Rule 1 - Potential bug introduced by changing error handling return value from -1 to 1. This change could mask errors and lead to incorrect token count validation, as a positive return value typically indicates available token space.

Affected Code Snippet:

except Exception as e:
    logger.debug(f"Error during token count at GoogleLlmClient: {e}")
    return 1

Start Line: 231

End Line: 233


Details: Violation of Rule 2 - The broad exception handling with only debug-level logging could hide security-critical failures or API-related issues. Error messages should be properly sanitized and logged at appropriate levels.

Affected Code Snippet:

except Exception as e:
    logger.debug(f"Error during token count at GoogleLlmClient: {e}")
    return 1

Start Line: 231

End Line: 233
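
One way to address both flagged concerns is to log at a more visible level and propagate the failure instead of returning a positive sentinel that callers may read as available token space. This is only a sketch of that idea, not the fix adopted in the PR; the `TokenCountError` exception and `count_tokens_safely` wrapper are hypothetical names, and the logger message is borrowed from the snippet above.

```python
import logging

logger = logging.getLogger(__name__)


class TokenCountError(Exception):
    """Raised when a token count cannot be determined (hypothetical type)."""


def count_tokens_safely(count_fn, *args):
    """Invoke a token-counting callable, surfacing failures instead of
    masking them with a misleading positive return value."""
    try:
        return count_fn(*args)
    except Exception as e:
        # Warning level keeps the failure visible in normal logs,
        # unlike debug-level logging which is usually filtered out.
        logger.warning("Error during token count at GoogleLlmClient: %s", e)
        raise TokenCountError(str(e)) from e
```

With this shape, a caller that compares the result against a token budget can never mistake an error for "1 token of space remaining"; it must handle `TokenCountError` explicitly.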

@whoisarpit whoisarpit merged commit b230bdd into main Mar 20, 2025
3 of 4 checks passed
@whoisarpit whoisarpit deleted the fix-gemini-new-model-token-counting branch March 20, 2025 13:50
