
[Question]: Still unable to restrict rate limit when generating knowledge graph #5942

Open
@Joey233qwq

Description


Self Checks

  • I have searched for existing issues, including closed ones.
  • I confirm that I am using English to submit this report (Language Policy).
  • Non-English title submissions will be closed directly (Language Policy).
  • Please do not modify this template :) and fill in all the required fields.

Describe your problem

I have updated to the nightly version and added "MAX_CONCURRENT_CHATS=10" to my docker/.env (or should I add "export MAX_CONCURRENT_CHATS=10" instead?). However, during extraction, a large number of tokens is still consumed within a single minute:

...
22:02:11 Page(1164): Embedding chunks (40.25s)
22:04:00 Page(1164): Indexing done (108.99s). Task done (573.69s)
22:06:49 Entities extraction of chunk 368 277/800 done, 6 nodes, 0 edges, 8425 tokens.
22:06:49 Entities extraction of chunk 355 278/800 done, 0 nodes, 0 edges, 7128 tokens.
22:06:50 Entities extraction of chunk 358 279/800 done, 9 nodes, 0 edges, 7882 tokens.
22:06:50 Entities extraction of chunk 351 280/800 done, 8 nodes, 0 edges, 7669 tokens.
22:06:50 Entities extraction of chunk 345 281/800 done, 8 nodes, 8 edges, 9349 tokens.
...
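
My understanding (just a sketch of how such a setting usually works, not RAGFlow's actual code) is that a value like MAX_CONCURRENT_CHATS typically caps how many LLM calls run at the same time, e.g. via a semaphore, rather than throttling tokens per minute, which might explain why token usage still spikes even with the setting in place. All names below are illustrative assumptions:

import asyncio
import os

# Assumption: a concurrency cap bounds in-flight LLM calls, not tokens/minute.
MAX_CONCURRENT_CHATS = int(os.environ.get("MAX_CONCURRENT_CHATS", "10"))
_chat_slots = asyncio.Semaphore(MAX_CONCURRENT_CHATS)

async def extract_entities(chunk: str) -> str:
    """Placeholder for one entity-extraction LLM call on a single chunk."""
    async with _chat_slots:       # at most MAX_CONCURRENT_CHATS calls in flight
        await asyncio.sleep(0.1)  # stands in for the real LLM request
        return f"entities for: {chunk[:20]}"

async def main() -> None:
    chunks = [f"chunk {i}" for i in range(800)]
    # All 800 tasks are scheduled, but the semaphore only limits parallelism;
    # tokens per minute can still be high if each call returns quickly.
    results = await asyncio.gather(*(extract_entities(c) for c in chunks))
    print(len(results), "chunks processed")

if __name__ == "__main__":
    asyncio.run(main())

If that is indeed how it behaves, what is the recommended way to restrict the actual token consumption rate during knowledge graph extraction?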
