Replies: 2 comments 2 replies
-
Turn on "use multi TPU". That was the issue for me. The Coral implementation is a little clunky at the moment, so that setting doesn't seem to do what you'd think it should, but it is needed.
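A minimal sketch only, assuming the "use multi TPU" toggle maps to the CPAI_CORAL_MULTI_TPU environment variable that shows up in the module status further down; the usual way to change it is the toggle in the CodeProject.AI dashboard, and it only takes effect for module processes started after it is set:
import os
# Assumption: the dashboard toggle corresponds to this variable; it only affects
# processes launched after this point, so set it before the module starts.
os.environ["CPAI_CORAL_MULTI_TPU"] = "True"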
-
I am seeing the same issue, but the low timing on the "response received" value leads me to believe that the Coral TPU is still processing, even though the status page shows it changed to CPU. Is there a way to verify whether it is actually still using the TPU?
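One rough check, a minimal sketch assuming the pycoral package is available in the module's Python 3.9 environment; it only confirms the Edge TPU is still visible to the runtime, not what the module is actually dispatching inference to:
from pycoral.utils.edgetpu import list_edge_tpus

# Lists every Edge TPU the runtime can see, e.g. [{'type': 'pci', 'path': '/dev/apex_0'}]
tpus = list_edge_tpus()
print("Edge TPUs visible to the runtime:", tpus)
if not tpus:
    print("No Edge TPU found - inference is almost certainly falling back to CPU")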
-
On start I get a message that the TPU is detected and used:
16:44:59:objectdetection_coral_adapter.py: TPU detected
16:44:59:objectdetection_coral_adapter.py: Using Edge TPU
A few seconds after the object detection module starts, it switches from TPU to CPU: CPU (TF-Lite).
I tried restarting the service (the host is Windows) and reinstalling the module, but to no avail. Any tips on troubleshooting? I am using the Coral Mini PCIe TPU. I tried different models and sizes (YOLOv8 and MobileNet).
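A quick way to rule out the driver/runtime side, sketched here assuming tflite_runtime and the Edge TPU runtime (edgetpu.dll on Windows) are installed; if load_delegate() fails, the problem sits below the Object Detection (Coral) module itself:
from tflite_runtime.interpreter import load_delegate

try:
    # On Windows the Edge TPU runtime library is edgetpu.dll; use "libedgetpu.so.1" on Linux.
    delegate = load_delegate("edgetpu.dll")
    print("Edge TPU delegate loaded:", delegate)
except (ValueError, OSError) as err:
    print("Could not load the Edge TPU delegate:", err)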
Module 'Object Detection (Coral)' 2.4.0 (ID: ObjectDetectionCoral)
Valid: True
Module Path: \modules\ObjectDetectionCoral
Module Location: Internal
AutoStart: True
Queue: objectdetection_queue
Runtime: python3.9
Runtime Location: Local
FilePath: objectdetection_coral_adapter.py
Start pause: 1 sec
Parallelism: 16
LogVerbosity:
Platforms: all
GPU Libraries: installed if available
GPU: use if supported
Accelerator:
Half Precision: enable
Environment Variables
CPAI_CORAL_MODEL_NAME = MobileNet SSD
CPAI_CORAL_MULTI_TPU = False
MODELS_DIR = \modules\ObjectDetectionCoral\assets
MODEL_SIZE = medium
Status Data: {
"inferenceDevice": null,
"inferenceLibrary": "TF-Lite",
"canUseGPU": "false",
"successfulInferences": 148,
"failedInferences": 0,
"numInferences": 148,
"averageInferenceMs": 10.675675675675675