The committed block count cannot exceed the maximum limit of 50,000 blocks. #324

Closed

@nidhijwt

Description

Overview

I am using the CamelAzurestorageblobSinkConnector to archive data from my Kafka topics to Azure Blob Storage.

The problem

The data I send is written to an append blob in Azure Blob Storage. Every record creates a new block, which is appended to the blob. Azure caps an append blob at a maximum of 50,000 committed blocks, so the blob fills up after 50,000 records, which in my case takes only a few minutes. An append blob block can hold up to 4 MB, but that capacity is going unused because only one record is stored per block, and my messages are really small (say 1 KB each). That means each blob tops out at about 50,000 × 1 KB ≈ 50 MB instead of the roughly 195 GB it could hold with full 4 MB blocks. Once the append blob is full, the next write fails with:

com.azure.storage.blob.models.BlobStorageException: Status code 409, "BlockCountExceedsLimit
The committed block count cannot exceed the maximum limit of 50,000 blocks.
RequestId:b112f700-301e-0022-476f-5ea523000000
Time:2020-07-20T08:23:40.1508526Z"
	at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
	at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
	at

Ask

I do not see a feature that lets me buffer a number of records and then write them in a single append through the connector. Please suggest such a feature if it exists, as this seems like a basic requirement when archiving with connectors.

If such a feature does not exist, what workaround do other people use? A sketch of the kind of buffering I have in mind follows.
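To frame the question, here is a minimal sketch of the buffering I mean, written directly against the Azure Storage Blob v12 Java SDK rather than through the connector. The connection string, container name, blob name, and the flush threshold are all placeholder assumptions, not anything the connector exposes today:

```java
import com.azure.storage.blob.specialized.AppendBlobClient;
import com.azure.storage.blob.specialized.SpecializedBlobClientBuilder;

import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;

public class BufferedAppender {
    // Azure limits: 50,000 blocks per append blob, 4 MB per appended block.
    private static final int MAX_BLOCK_BYTES = 4 * 1024 * 1024;
    private static final int FLUSH_RECORD_COUNT = 1000; // arbitrary batch size

    private final AppendBlobClient blob;
    private final ByteArrayOutputStream buffer = new ByteArrayOutputStream();
    private int buffered = 0;

    public BufferedAppender(String connectionString) {
        this.blob = new SpecializedBlobClientBuilder()
                .connectionString(connectionString)
                .containerName("kafka-archive")   // placeholder
                .blobName("topic-archive.log")    // placeholder
                .buildAppendBlobClient();
        if (!blob.exists()) {
            blob.create();
        }
    }

    // Buffer one record; flush early if adding it would push the
    // pending batch past the 4 MB per-block limit. (Records larger
    // than 4 MB are not handled in this sketch.)
    public synchronized void append(byte[] record) {
        if (buffer.size() + record.length > MAX_BLOCK_BYTES) {
            flush();
        }
        buffer.write(record, 0, record.length);
        buffered++;
        if (buffered >= FLUSH_RECORD_COUNT) {
            flush();
        }
    }

    // One appendBlock call commits exactly one block, so a batch of
    // ~1,000 records now consumes one block instead of 1,000.
    public synchronized void flush() {
        if (buffered == 0) {
            return;
        }
        byte[] batch = buffer.toByteArray();
        blob.appendBlock(new ByteArrayInputStream(batch), batch.length);
        buffer.reset();
        buffered = 0;
    }
}
```

With ~1 KB records and batches of 1,000, one blob would absorb roughly 50 million records before hitting the 50,000-block limit, instead of 50,000.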
