When, for whatever reason, the DLQ contains two records for the same original Kafka record (that is, the same key, value, Kafka timestamp, and original offset), it doesn't make sense to put duplicate records on the retry topic while replaying.
So, when replaying, deduplicate first. This could be either the default behavior or an opt-in API parameter, e.g. ...?unique=true.
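A minimal sketch of what that deduplication step could look like, assuming records are compared on key, value, timestamp, and original offset (the `DlqRecord` type and all other names below are hypothetical illustrations, not this project's actual API):

```java
import java.nio.charset.StandardCharsets;
import java.util.ArrayList;
import java.util.HashSet;
import java.util.List;
import java.util.Set;

// Hypothetical representation of a record read from the DLQ.
record DlqRecord(byte[] key, byte[] value, long timestamp, long originalOffset) {

    // Identity of the original Kafka record: key + value + timestamp + offset.
    // Byte arrays are decoded to strings so equals/hashCode compare content,
    // not array references.
    List<Object> identity() {
        return List.of(
                key == null ? "" : new String(key, StandardCharsets.ISO_8859_1),
                value == null ? "" : new String(value, StandardCharsets.ISO_8859_1),
                timestamp,
                originalOffset);
    }
}

class ReplayDeduplicator {

    // Keeps only the first occurrence of each original record, preserving order.
    static List<DlqRecord> deduplicate(List<DlqRecord> records) {
        Set<List<Object>> seen = new HashSet<>();
        List<DlqRecord> unique = new ArrayList<>();
        for (DlqRecord r : records) {
            if (seen.add(r.identity())) {
                unique.add(r);
            }
        }
        return unique;
    }
}
```

With ?unique=true as an opt-in parameter, the replay endpoint would run a filter like this before producing anything to the retry topic.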
In the response of the replay endpoint, it might be good to show that certain messages were not sent because they were duplicates, e.g.:
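For illustration, a hypothetical response body (none of these field names are taken from the actual endpoint):

```json
{
  "replayed": 12,
  "skipped": 2,
  "skippedRecords": [
    { "originalOffset": 1337, "reason": "duplicate" },
    { "originalOffset": 1338, "reason": "duplicate" }
  ]
}
```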
MitchelNijdam-Rockstars changed the title from "When replay all, only send unique records to retry topic" to "When replaying multiple, only send unique records to retry topic" on Nov 15, 2019.