
FastDeploy RKNPU2 Memory Leak #1858

@MikeLud

Description

Environment

FastDeploy version: latest code in develop branch
OS Platform: Linux (Linux 5.10.110-rockchip-rk3588 #23.02.2 SMP Fri Feb 17 23:59:20 UTC 2023)
Hardware: Orange Pi 5 (Rockchip RK3588S 8-core 64-bit processor)
Programming language: Python 3.9

Problem description

After running about 225 inferences, I get the errors below:

E RKNN: [16:11:55.647] failed to allocate handle, ret: -1, errno: 14, errstr: Bad address, sleep one second and try again!
E RKNN: [16:11:56.656] failed to allocate handle, ret: -1, errno: 14, errstr: Bad address
E RKNN: [16:11:56.656] failed to malloc npu memory!, size: 7397955, flags: 0x2
E RKNN: [16:11:56.656] rknn_init, load model failed!
[ERROR] fastdeploy/runtime/backends/rknpu2/rknpu2_backend.cc(180)::LoadModel    The function(rknn_init) failed! ret=-6
[ERROR] fastdeploy/runtime/backends/rknpu2/rknpu2_backend.cc(123)::Init Load model failed
[ERROR] fastdeploy/runtime/runtime.cc(328)::CreateRKNPU2Backend Failed to initialize RKNPU2 backend.
Aborted
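The failure is triggered simply by calling the function below in a loop. A minimal driver sketch (the image path and iteration count are illustrative, not taken from the deployed service):

from PIL import Image

img = Image.open("test.jpg")
for i in range(300):
    # Each call re-creates the model; the rknn_init failure above
    # appears after roughly 225 iterations.
    print(i, do_detect(img)["count"])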
The function itself (model_file and extract_label_from_file are defined elsewhere in the service):

import fastdeploy as fd
import numpy as np
import time
from PIL import Image

def do_detect(img: Image, score_threshold: float = 0.3):

    # Configure runtime, load model
    runtime_option = fd.RuntimeOption()
    runtime_option.use_rknpu2()

    model = fd.vision.detection.RKYOLOV7(
        model_file,
        runtime_option=runtime_option,
        model_format=fd.ModelFormat.RKNN)
    
    # Predicting Image Results
    im = np.array(img)
    start_inference_time = time.perf_counter()
    result = model.predict(im, conf_threshold=score_threshold, nms_iou_threshold=0.5)
    inferenceMs = int((time.perf_counter() - start_inference_time) * 1000)

    """
    with open("log.txt", "a") as text_file:
        text_file.write(str(result) + "\n")    
    """
    result = str(result)
    lines = result.strip().split("\n")

    outputs = []

    for line in lines[1:]:
        # Split the line by comma to get a list of values
        values = line.split(",")
        values = [x.strip(' ') for x in values]
        
        """
        with open("values.txt", "a") as text_file:
            text_file.write(str(values) + "\n")
        """

        # Convert the values to appropriate data types
        xmin = float(values[0])
        ymin = float(values[1])
        xmax = float(values[2])
        ymax = float(values[3])
        score = float(values[4])
        label_id = int(values[5])

        # if score >= score_threshold:
        detection = {
            "confidence": score,
            "label": str(extract_label_from_file(label_id)),
            "x_min": int(xmin),
            "y_min": int(ymin),
            "x_max": int(xmax),
            "y_max": int(ymax),
        }

        outputs.append(detection)
    
    """
    with open("outputs.txt", "a") as text_file:
        text_file.write(str(outputs) + "\n")
    """
    
    return {
        "success"     : True,
        "count"       : len(outputs),
        "predictions" : outputs,
        "inferenceMs" : inferenceMs
    }
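
Each call to do_detect() constructs a new RKYOLOV7 model, so every request runs rknn_init and allocates another NPU buffer, and the log suggests those allocations are never released when the old model objects are discarded. A possible workaround sketch that sidesteps the repeated loads, under that assumption: create the runtime option and model once, at module scope, and reuse them for every call. The sketch also reads the DetectionResult fields (boxes, scores, label_ids) directly instead of parsing str(result), assuming the Python binding exposes those attributes as recent FastDeploy releases do.

import fastdeploy as fd
import numpy as np
import time
from PIL import Image

# model_file and extract_label_from_file are the same names used above.
# Configure the runtime and load the model a single time, so rknn_init is
# called once for the lifetime of the process instead of once per request.
runtime_option = fd.RuntimeOption()
runtime_option.use_rknpu2()
model = fd.vision.detection.RKYOLOV7(
    model_file,
    runtime_option=runtime_option,
    model_format=fd.ModelFormat.RKNN)

def do_detect(img: Image, score_threshold: float = 0.3):
    im = np.array(img)
    start_inference_time = time.perf_counter()
    result = model.predict(im, conf_threshold=score_threshold, nms_iou_threshold=0.5)
    inferenceMs = int((time.perf_counter() - start_inference_time) * 1000)

    # Read the detection fields directly from the result object.
    outputs = []
    for box, score, label_id in zip(result.boxes, result.scores, result.label_ids):
        xmin, ymin, xmax, ymax = box
        outputs.append({
            "confidence": float(score),
            "label": str(extract_label_from_file(label_id)),
            "x_min": int(xmin),
            "y_min": int(ymin),
            "x_max": int(xmax),
            "y_max": int(ymax),
        })

    return {
        "success"     : True,
        "count"       : len(outputs),
        "predictions" : outputs,
        "inferenceMs" : inferenceMs
    }

If the same rknn_init errors still appear with the model held at module scope, the leak is more likely inside the RKNPU2 backend itself rather than in the calling code.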
