
apache kafka - duplicate events by consumer

We observed that one of our consumers picks up the same events multiple times from a Kafka topic. The consumer application has the following settings: spring.kafka.consumer.enable-auto-commit=false and spring.kafka.consumer.auto-offset-reset=earliest. How can we avoid the duplicates on the consumer side? Do we need to fine-tune the configuration above so the consumer stops picking up the same events from the topic multiple times?

Question from: https://stackoverflow.com/questions/65640985/duplicate-events-by-consumer


1 Answer


Since you've disabled auto-commits, you do need to decide when you actually commit each record's offset yourself: commit only after the record has been successfully processed. Even then, this gives you at-least-once processing, so duplicates can still show up after a rebalance or restart unless your processing is idempotent.
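
For example, with Spring Kafka you can switch the listener container to manual acknowledgment and commit only after the work is done. This is a minimal sketch, assuming spring.kafka.listener.ack-mode=manual is set and that the topic name, group id, and class names below are placeholders rather than anything from your setup:

import org.springframework.kafka.annotation.KafkaListener;
import org.springframework.kafka.support.Acknowledgment;
import org.springframework.stereotype.Component;

@Component
public class OrderEventsListener {

    // The offset is committed only after processing succeeds, so a crash before
    // acknowledge() causes redelivery (at-least-once) rather than silent data loss.
    @KafkaListener(topics = "orders", groupId = "order-consumer")
    public void onMessage(String event, Acknowledgment ack) {
        process(event);
        ack.acknowledge();
    }

    private void process(String event) {
        // business logic; keep it idempotent so redeliveries are harmless
    }
}

Note that manual commits alone still mean at-least-once delivery; making the processing itself idempotent (for example keyed upserts) is what actually neutralizes the duplicates.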

You could also look into Kafka's exactly-once processing capabilities, which use transactions and idempotent producers.
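
As a rough sketch of that approach with the plain kafka-clients producer API (the broker address, topic, and transactional.id below are made-up placeholders):

import java.util.Properties;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.kafka.common.serialization.StringSerializer;

public class ExactlyOnceProducer {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092");
        props.put("key.serializer", StringSerializer.class.getName());
        props.put("value.serializer", StringSerializer.class.getName());
        props.put("enable.idempotence", "true");            // broker de-duplicates producer retries
        props.put("transactional.id", "order-producer-1");  // required to use transactions

        try (KafkaProducer<String, String> producer = new KafkaProducer<>(props)) {
            producer.initTransactions();
            producer.beginTransaction();
            producer.send(new ProducerRecord<>("orders", "key", "value"));
            producer.commitTransaction();
        }
    }
}

For the consumer side to benefit, it also has to read with isolation.level=read_committed, otherwise it will still see records from aborted or in-flight transactions.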

auto.offset.reset only applies when your consumer group has no committed offsets, either because the group is new or its offsets have been removed (and you're not committing anything). In that case, with earliest you're always going to read from the beginning of the topic.
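
To make that concrete, here is a sketch with the plain kafka-clients consumer; the broker address, group id, and topic are invented for illustration:

import java.time.Duration;
import java.util.List;
import java.util.Properties;
import org.apache.kafka.clients.consumer.ConsumerConfig;
import org.apache.kafka.clients.consumer.KafkaConsumer;
import org.apache.kafka.common.serialization.StringDeserializer;

public class OffsetResetDemo {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        props.put(ConsumerConfig.GROUP_ID_CONFIG, "brand-new-group");
        props.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class.getName());
        props.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class.getName());
        props.put(ConsumerConfig.ENABLE_AUTO_COMMIT_CONFIG, "false");
        // "earliest" is only consulted because "brand-new-group" has no committed offsets yet;
        // once offsets are committed, the consumer resumes from them and this setting is ignored.
        props.put(ConsumerConfig.AUTO_OFFSET_RESET_CONFIG, "earliest");

        try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props)) {
            consumer.subscribe(List.of("orders"));
            System.out.println("fetched " + consumer.poll(Duration.ofSeconds(1)).count() + " records");
        }
    }
}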

