Support IVF_PQ on GPU when using metric_type: IP #2585
Closed
Description
Describe the bug
14:21:33 2020-06-17:06:21:33,506 INFO [client.py:213] Building index start, collection_name: collection_zqwbeex0, index_type: IVF_PQ
14:21:33 2020-06-17:06:21:33,507 INFO [client.py:215] {'nlist': 2048, 'm': 32}
14:21:33 2020-06-17:06:21:33,509 ERROR [client.py:99] PQ not support IP in GPU version!
14:21:33 2020-06-17:06:21:33,509 ERROR [main.py:70] Status not ok
14:21:33 2020-06-17:06:21:33,510 ERROR [main.py:71] Traceback (most recent call last):
14:21:33 File "main.py", line 68, in queue_worker
14:21:33 runner.run(run_type, collection)
14:21:33 File "/home/jenkins/agent/workspace/milvus-benchmark-0.10.0/milvus_benchmark/k8s_runner.py", line 764, in run
14:21:33 milvus_instance.create_index(index_type, index_param=index_param)
14:21:33 File "/home/jenkins/agent/workspace/milvus-benchmark-0.10.0/milvus_benchmark/client.py", line 44, in wrapper
14:21:33 result = func(*args, **kwargs)
14:21:33 File "/home/jenkins/agent/workspace/milvus-benchmark-0.10.0/milvus_benchmark/client.py", line 217, in create_index
14:21:33 self.check_status(status)
14:21:33 File "/home/jenkins/agent/workspace/milvus-benchmark-0.10.0/milvus_benchmark/client.py", line 100, in check_status
14:21:33 raise Exception("Status not ok")
14:21:33 Exception: Status not ok
14:21:33
14:21:33 2020-06-17:06:21:33,596 DEBUG [k8s_runner.py:68] Start clean up: milvus-benchmark-test-gs765zjz
The error comes from this server-side check in Milvus, which rejects PQ indexes on collections with IP metric when built with GPU support:

if (s.ok() && adapter_index_type == (int)engine::EngineType::FAISS_PQ &&
    collection_info.metric_type_ == (int)engine::MetricType::IP) {
    return Status(SERVER_UNEXPECTED_ERROR, "PQ not support IP in GPU version!");
}
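Until GPU IVF_PQ supports IP directly, a common workaround (not an official Milvus recommendation; sketched here under the assumption that all vectors can be L2-normalized) is to normalize vectors to unit length and build the index with metric_type L2. For unit vectors, ||a - b||^2 = 2 - 2*<a, b>, so the nearest neighbor by L2 distance is exactly the nearest neighbor by inner product:

```python
import numpy as np

def normalize(v: np.ndarray) -> np.ndarray:
    """Scale each row vector to unit L2 norm."""
    return v / np.linalg.norm(v, axis=1, keepdims=True)

rng = np.random.default_rng(0)
base = normalize(rng.standard_normal((100, 32)).astype(np.float32))
query = normalize(rng.standard_normal((1, 32)).astype(np.float32))

# Inner-product similarity vs. L2 distance against the same base set.
ip = (base @ query.T).ravel()
l2 = np.linalg.norm(base - query, axis=1)

# For unit vectors, max-IP and min-L2 pick the same neighbor.
assert np.argmax(ip) == np.argmin(l2)
```

With this transformation the collection can be created with MetricType.L2 and an IVF_PQ index built on GPU, while search results keep the same ranking as IP would give.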
Steps/Code to reproduce behavior
On a GPU-enabled Milvus 0.10.0 build, create a collection with metric_type IP, then build an IVF_PQ index on it (the benchmark above used {'nlist': 2048, 'm': 32}). Index creation fails with "PQ not support IP in GPU version!".
Expected behavior
IVF_PQ index creation should succeed on GPU when the collection uses metric_type IP, rather than being rejected by the server.
Environment details
Milvus version: 0.10.0 / master