[Bug]: After restarting, it takes unusually long before Milvus can be used normally #37654
Comments
After waiting for 7 hours, it works normally now, but the CPU cost is still high.
I guess Milvus is building the index; that's why the CPU cost stays high at the moment. If you have Milvus metrics, you can check and confirm it. Before that we can see there are
/assign @fire717
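For reference, a minimal client-side sketch (not from this issue) of checking whether index building and collection loading are still in progress with pymilvus; the connection details and collection name below are placeholders:

from pymilvus import connections, utility

# Placeholder connection details; adjust to your deployment.
connections.connect(alias="default", host="127.0.0.1", port="19530")

collection_name = "your_collection"  # hypothetical name, replace with the real one

# How many rows have been indexed so far versus the total row count.
progress = utility.index_building_progress(collection_name)
print(f"indexed {progress['indexed_rows']} / {progress['total_rows']} rows")

# How far the collection has been loaded onto the query nodes (e.g. '100%').
print("loading:", utility.loading_progress(collection_name))

If indexed_rows keeps growing while the CPU stays busy, that would support the index-building explanation.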
My total data is around 10 GB, and when I did the restart I'm sure I didn't send any insert or delete requests, so why is the CPU cost so high? And is it normal that it takes this many hours? After restarting, the log didn't show any errors, but if I try to search or delete, it shows the error below.
Is there an existing issue for this?
Environment
Current Behavior
After restarting, Milvus cannot work; the search or delete API fails.
The top command shows Milvus CPU usage between 100% and 400%.
Expected Behavior
Works normally.
Steps To Reproduce
No response
Milvus Log
ERROR:sanic.error:Exception occurred while handling uri: 'http://10.89.134.52:8777/api/local_doc_qa/delete_files'
Traceback (most recent call last):
File "handle_request", line 97, in handle_request
File "/workspace/qanything_local/qanything_kernel/qanything_server/handler.py", line 273, in delete_docs
milvus_kb.delete_files(file_ids)
File "/workspace/qanything_local/qanything_kernel/connector/database/milvus/milvus_client.py", line 284, in delete_files
self.sess.delete(expr=f"file_id in {files_id}")
File "/usr/local/lib/python3.10/dist-packages/pymilvus/orm/collection.py", line 563, in delete
res = conn.delete(self._name, expr, partition_name, timeout=timeout, **kwargs)
File "/usr/local/lib/python3.10/dist-packages/pymilvus/decorators.py", line 129, in handler
raise e from e
File "/usr/local/lib/python3.10/dist-packages/pymilvus/decorators.py", line 125, in handler
return func(*args, **kwargs)
File "/usr/local/lib/python3.10/dist-packages/pymilvus/decorators.py", line 164, in handler
return func(self, *args, **kwargs)
File "/usr/local/lib/python3.10/dist-packages/pymilvus/decorators.py", line 104, in handler
raise e from e
File "/usr/local/lib/python3.10/dist-packages/pymilvus/decorators.py", line 68, in handler
return func(*args, **kwargs)
File "/usr/local/lib/python3.10/dist-packages/pymilvus/client/grpc_handler.py", line 586, in delete
raise err from err
File "/usr/local/lib/python3.10/dist-packages/pymilvus/client/grpc_handler.py", line 580, in delete
check_status(response.status)
File "/usr/local/lib/python3.10/dist-packages/pymilvus/client/utils.py", line 54, in check_status
raise MilvusException(status.code, status.reason, status.error_code)
pymilvus.exceptions.MilvusException: <MilvusException: (code=65535, message=failed to search/query delegator 31 for channel by-dev-rootcoord-dml_0_450679063477500466v0: Timestamp lag too large)>
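The "Timestamp lag too large" message indicates the delegator for that DML channel has not caught up yet after the restart, so delete/search requests are rejected. A hedged sketch of a client-side guard (placeholder names, not taken from the QAnything code above) is to wait for the collection to finish loading and retry the delete a few times before giving up:

import time
from pymilvus import Collection, connections, utility
from pymilvus.exceptions import MilvusException

connections.connect(alias="default", host="127.0.0.1", port="19530")  # placeholder

collection = Collection("your_collection")   # hypothetical collection name
file_ids = ["file_a", "file_b"]              # hypothetical ids to delete

# Block until the collection is fully loaded on the query nodes (or time out).
utility.wait_for_loading_complete(collection.name, timeout=600)

# Retry the delete a few times, since the delegator may still be catching up.
for attempt in range(5):
    try:
        collection.delete(expr=f"file_id in {file_ids}")
        break
    except MilvusException as exc:
        print(f"delete attempt {attempt + 1} failed: {exc}")
        time.sleep(30)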
Anything else?
No response