Getting duplicate events on Kafka connector restart #1682

Open
@shubham-maheshwari-tech

Description

Hi,
I am using the Apache Camel AWS S3 source connector with the Strimzi operator. Whenever the Kafka connector is restarted after a config update, duplicate events appear in Kafka: older payloads that are still present in the S3 bucket are delivered again.
Below is the full connector configuration:
CamelAwss3sourceSourceConnectorConfig values:
    camel.aggregation.size = 10
    camel.aggregation.timeout = 500
    camel.beans.aggregate = null
    camel.error.handler = default
    camel.error.handler.max.redeliveries = 0
    camel.error.handler.redelivery.delay = 1000
    camel.idempotency.enabled = true
    camel.idempotency.expression.header = null
    camel.idempotency.expression.type = body
    camel.idempotency.kafka.bootstrap.servers = localhost:9092
    camel.idempotency.kafka.max.cache.size = 1000
    camel.idempotency.kafka.poll.duration.ms = 100
    camel.idempotency.kafka.topic = kafka_idempotent_repository
    camel.idempotency.memory.dimension = 100
    camel.idempotency.repository.type = memory
    camel.kamelet.aws-s3-source.accessKey = [hidden]
    camel.kamelet.aws-s3-source.autoCreateBucket = false
    camel.kamelet.aws-s3-source.bucketNameOrArn = test-bucket
    camel.kamelet.aws-s3-source.delay = 500
    camel.kamelet.aws-s3-source.deleteAfterRead = false
    camel.kamelet.aws-s3-source.forcePathStyle = false
    camel.kamelet.aws-s3-source.ignoreBody = false
    camel.kamelet.aws-s3-source.maxMessagesPerPoll = 10
    camel.kamelet.aws-s3-source.overrideEndpoint = false
    camel.kamelet.aws-s3-source.prefix = schema/feature
    camel.kamelet.aws-s3-source.region = ap-southeast-1
    camel.kamelet.aws-s3-source.secretKey = [hidden]
    camel.kamelet.aws-s3-source.uriEndpointOverride = null
    camel.kamelet.aws-s3-source.useDefaultCredentialsProvider = false
    camel.map.headers = true
    camel.map.properties = true
    camel.remove.headers.pattern =
    camel.source.camelMessageHeaderKey = CamelAwsS3Key
    camel.source.component = null
    camel.source.contentLogLevel = OFF
    camel.source.marshal = null
    camel.source.maxBatchPollSize = 1000
    camel.source.maxNotCommittedRecords = 1024
    camel.source.maxPollDuration = 1000
    camel.source.pollingConsumerBlockTimeout = 0
    camel.source.pollingConsumerBlockWhenFull = true
    camel.source.pollingConsumerQueueSize = 1000
    camel.source.unmarshal = null
    camel.source.url = null
    topics = commerce-test
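Since the connector is deployed through the Strimzi operator, here is a minimal sketch of how a subset of this configuration is wired into a KafkaConnector resource, in case that context matters. The resource name and the strimzi.io/cluster label are placeholders, not our actual manifest; the connector class is inferred from the CamelAwss3sourceSourceConnectorConfig name above.

```yaml
apiVersion: kafka.strimzi.io/v1beta2
kind: KafkaConnector
metadata:
  name: camel-aws-s3-source            # hypothetical resource name
  labels:
    strimzi.io/cluster: my-connect     # placeholder: must match the KafkaConnect cluster name
spec:
  class: org.apache.camel.kafkaconnector.awss3source.CamelAwss3sourceSourceConnector
  tasksMax: 1
  config:
    topics: commerce-test
    camel.kamelet.aws-s3-source.bucketNameOrArn: test-bucket
    camel.kamelet.aws-s3-source.region: ap-southeast-1
    camel.kamelet.aws-s3-source.prefix: schema/feature
    camel.kamelet.aws-s3-source.deleteAfterRead: false
    camel.idempotency.enabled: true
    camel.idempotency.repository.type: memory
```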

Even though camel.idempotency.enabled = true is set, duplicate events still arrive on every Kafka connector restart.
We are using version 4.4.2 of the Apache Camel libraries.
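For what it's worth, my (possibly wrong) understanding is that with camel.idempotency.repository.type = memory the de-duplication cache lives only in the connector JVM, so it starts empty after every restart, whereas the kafka repository type keeps the seen keys in a Kafka topic. Below is a hedged sketch of the config block I would expect to test this with; the expression.type/expression.header values are assumptions based on the dump above, not something we have verified.

```yaml
# Sketch only: switch the idempotent repository from the in-JVM cache to the
# Kafka-backed one so that already-processed S3 keys survive a connector restart.
config:
  camel.idempotency.enabled: true
  camel.idempotency.repository.type: kafka
  camel.idempotency.kafka.bootstrap.servers: localhost:9092
  camel.idempotency.kafka.topic: kafka_idempotent_repository
  # Assumption: de-duplicate on the S3 object key header instead of the message body.
  camel.idempotency.expression.type: header
  camel.idempotency.expression.header: CamelAwsS3Key
```

Is this the intended behaviour of the memory repository, or should duplicates be suppressed across restarts even with the configuration shown above?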
