⚡️ Speed up function `_transform_prompt` by 6% (#159)
📄 6% (0.06x) speedup for `_transform_prompt` in `litellm/llms/openai/completion/utils.py`

⏱️ Runtime: 1.13 milliseconds → 1.07 milliseconds (best of 79 runs)

📝 Explanation and details
The optimized code achieves a ~6% speedup through three key improvements:

1. Optimized the `is_tokens_or_list_of_tokens` function: added early checks (`not isinstance(value, list)` and `not value`) to short-circuit before the expensive `all()` scans run on invalid inputs.
2. Replaced string concatenation with list joining in `convert_content_list_to_str`: switched from repeated `texts += text_content` (O(n²) complexity, since each `+=` can copy the accumulated string) to collecting parts in a list and calling `"".join(text_parts)` once (O(n) complexity).
3. Used a list comprehension instead of a manual loop in `_transform_prompt`: replaced the `for` loop with `prompt_str_list = [convert_content_list_to_str(...) for m in messages]` and dropped a `try/except` block that merely re-raised exceptions.
The optimizations particularly excel when processing structured content (lists of text dictionaries) and large message volumes, which are common in LLM applications.
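The `+=` vs `join` difference is easy to measure in isolation. The micro-benchmark below is illustrative only (not from the PR); note that CPython sometimes optimizes in-place `str +=` when the string has a single reference, so the gap varies by interpreter and workload.

```python
import timeit

parts = ["chunk"] * 5000  # stand-in for many extracted text blocks


def concat_plus():
    # Repeated concatenation: worst case copies the accumulated
    # string on every append, giving O(n^2) behavior.
    s = ""
    for p in parts:
        s += p
    return s


def concat_join():
    # Single join pass over all parts: O(n).
    return "".join(parts)


if __name__ == "__main__":
    assert concat_plus() == concat_join()
    t_plus = timeit.timeit(concat_plus, number=100)
    t_join = timeit.timeit(concat_join, number=100)
    print(f"+=   : {t_plus:.4f}s")
    print(f"join : {t_join:.4f}s")
```

The same shape of win applies inside `convert_content_list_to_str` when a message carries many structured text blocks.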
✅ Correctness verification report:
🌀 Generated Regression Tests and Runtime
🔎 Concolic Coverage Tests and Runtime
`codeflash_concolic_kt42dg31/tmpj9r77li5/test_concolic_coverage.py::test__transform_prompt`

To edit these changes, run `git checkout codeflash/optimize-_transform_prompt-mhde6s6c` and push.