# refine context length (#1813)
### What problem does this PR solve?

#1594
### Type of change

- [x] Performance Improvement
KevinHuSh authored Aug 5, 2024
Parent: 5b013da · Commit: 5650442
Showing 1 changed file with 1 addition and 1 deletion.
graphrag/index.py (1 addition, 1 deletion):

```diff
@@ -68,7 +68,7 @@ def build_knowlege_graph_chunks(tenant_id: str, chunks: List[str], callback, ent
     llm_bdl = LLMBundle(tenant_id, LLMType.CHAT)
     ext = GraphExtractor(llm_bdl)
     left_token_count = llm_bdl.max_length - ext.prompt_token_count - 1024
-    left_token_count = llm_bdl.max_length * 0.4
+    left_token_count = max(llm_bdl.max_length * 0.8, left_token_count)
 
     assert left_token_count > 0, f"The LLM context length({llm_bdl.max_length}) is smaller than prompt({ext.prompt_token_count})"
```
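Reading the diff this way, the effect of the change is that the token budget computed from `max_length - prompt_token_count - 1024` is now floored at 80% of the model's context window, rather than being unconditionally overwritten with 40% of it, so each extraction batch gets more room. For example, with `max_length = 8192` and `prompt_token_count = 2000`, the subtraction yields 8192 - 2000 - 1024 = 5168 tokens, and the new floor raises the budget to max(8192 * 0.8, 5168) = 6553.6.

Below is a minimal sketch of how a budget like `left_token_count` can drive chunk batching before each LLM call. This is not the repository's actual batching code, and `count_tokens` is a hypothetical stand-in for whatever token counter the project uses.

```python
# Minimal sketch (assumptions noted): greedy packing of text chunks under a
# token budget such as left_token_count. Not ragflow's actual code.
from typing import List


def count_tokens(text: str) -> int:
    # Hypothetical stand-in: rough whitespace-based token estimate.
    return len(text.split())


def pack_chunks(chunks: List[str], left_token_count: float) -> List[List[str]]:
    """Greedily group chunks so each batch stays within the token budget."""
    batches: List[List[str]] = []
    current: List[str] = []
    used = 0
    for chunk in chunks:
        n = count_tokens(chunk)
        if current and used + n > left_token_count:
            # Budget exhausted: close the current batch and start a new one.
            batches.append(current)
            current, used = [], 0
        # A single oversized chunk still gets its own batch.
        current.append(chunk)
        used += n
    if current:
        batches.append(current)
    return batches
```

A call like `pack_chunks(chunks, left_token_count)` would then yield the chunk groups that a function such as `build_knowlege_graph_chunks` could hand to the extractor one batch at a time; a float budget is fine here because it is only ever compared against integer token counts.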
