Problem Description

If the destination topic does not exist and the client is configured with the `kgo.AllowAutoTopicCreation()` and `kgo.ConsumeRegex()` options, `client.ProduceSync` hangs with the following logs:
[DEBUG] wrote Fetch v16 broker: 1 bytes_written: 145 write_wait: 0s time_to_write: 0s err: <nil>
[INFO] producing to a new topic for the first time, fetching metadata to learn its partitions topic: result.topic.b
[INFO] immediate metadata update triggered why: forced load because we are producing to a topic for the first time
[DEBUG] wrote Metadata v12 broker: 1 bytes_written: 22 write_wait: 332µs time_to_write: 152.5µs err: <nil>
[INFO] producing to a new topic for the first time, fetching metadata to learn its partitions topic: result.topic.a
[DEBUG] read Metadata v12 broker: 1 bytes_read: 1997 read_wait: 1.0064ms time_to_read: 2.803ms err: <nil>
[DEBUG] metadata refresh has identical topic partition data topic: topic.a partition: 0 leader: 1 leader_epoch: 0
[DEBUG] metadata refresh has identical topic partition data topic: topic.b partition: 0 leader: 1 leader_epoch: 0
[DEBUG] immediate metadata update had inner errors, re-updating errors: topic_missing{} update_after: 250ms
.
.
.
[DEBUG] wrote Metadata v12 broker: 1 bytes_written: 22 write_wait: 0s time_to_write: 276.8µs err: <nil>
[DEBUG] read Metadata v12 broker: 1 bytes_read: 1997 read_wait: 2.0099ms time_to_read: 2.7625ms err: <nil>
[DEBUG] metadata refresh has identical topic partition data topic: topic.b partition: 0 leader: 1 leader_epoch: 0
[DEBUG] metadata refresh has identical topic partition data topic: topic.a partition: 0 leader: 1 leader_epoch: 0
[DEBUG] immediate metadata update had inner errors, re-updating errors: topic_missing{} update_after: 250ms
How to reproduce
Start Kafka in Docker with a docker-compose.yml like this:
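A minimal single-broker compose file along these lines should do; the images, versions, container name, and listener settings below are illustrative assumptions, not the exact file from this report:

```yaml
# Illustrative single-broker setup (images, versions, and listeners are assumptions).
services:
  zookeeper:
    image: confluentinc/cp-zookeeper:7.5.0
    environment:
      ZOOKEEPER_CLIENT_PORT: 2181
      ZOOKEEPER_TICK_TIME: 2000

  kafka:
    image: confluentinc/cp-kafka:7.5.0
    container_name: kafka
    depends_on:
      - zookeeper
    ports:
      - "9092:9092"
    environment:
      KAFKA_BROKER_ID: 1
      KAFKA_ZOOKEEPER_CONNECT: zookeeper:2181
      KAFKA_LISTENER_SECURITY_PROTOCOL_MAP: PLAINTEXT:PLAINTEXT,PLAINTEXT_HOST:PLAINTEXT
      KAFKA_LISTENERS: PLAINTEXT://0.0.0.0:29092,PLAINTEXT_HOST://0.0.0.0:9092
      KAFKA_ADVERTISED_LISTENERS: PLAINTEXT://kafka:29092,PLAINTEXT_HOST://localhost:9092
      KAFKA_INTER_BROKER_LISTENER_NAME: PLAINTEXT
      KAFKA_OFFSETS_TOPIC_REPLICATION_FACTOR: 1
      KAFKA_AUTO_CREATE_TOPICS_ENABLE: "true"
```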
Execute this in the container:
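Presumably something like the following, to create the source topics seen in the logs above (topic.a and topic.b) and feed each one a test record; the container name and broker address are assumptions:

```sh
# Assumed commands: create the source topics and publish one test record to each.
docker exec -it kafka kafka-topics --bootstrap-server localhost:9092 \
  --create --topic topic.a --partitions 1 --replication-factor 1
docker exec -it kafka kafka-topics --bootstrap-server localhost:9092 \
  --create --topic topic.b --partitions 1 --replication-factor 1
echo 'hello' | docker exec -i kafka kafka-console-producer --bootstrap-server localhost:9092 --topic topic.a
echo 'hello' | docker exec -i kafka kafka-console-producer --bootstrap-server localhost:9092 --topic topic.b
```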
To check the available topics in Kafka:
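Likely the standard listing command (container name assumed, as above):

```sh
docker exec -it kafka kafka-topics --bootstrap-server localhost:9092 --list
```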
To delete a topic:
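Likewise (the topic name here is just an example):

```sh
docker exec -it kafka kafka-topics --bootstrap-server localhost:9092 --delete --topic result.topic.a
```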
Start the application:
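The application is, roughly, a consume-and-produce bridge. Below is a minimal sketch of that shape, assuming a broker on localhost:9092; the group name, regex, and logger setup are illustrative, and the option combination is what matters:

```go
package main

import (
	"context"
	"log"
	"os"

	"github.com/twmb/franz-go/pkg/kgo"
)

func main() {
	client, err := kgo.NewClient(
		kgo.SeedBrokers("localhost:9092"),
		kgo.ConsumerGroup("repro-group"), // illustrative group name
		kgo.ConsumeTopics(`^topic\..*`),  // with ConsumeRegex, these strings are treated as regexes
		kgo.ConsumeRegex(),               // dropping this (and listing topics explicitly) avoids the hang
		kgo.AllowAutoTopicCreation(),
		kgo.WithLogger(kgo.BasicLogger(os.Stderr, kgo.LogLevelDebug, nil)),
	)
	if err != nil {
		log.Fatal(err)
	}
	defer client.Close()

	ctx := context.Background()
	for {
		fetches := client.PollFetches(ctx)
		if errs := fetches.Errors(); len(errs) > 0 {
			log.Printf("fetch errors: %v", errs)
		}
		fetches.EachRecord(func(rec *kgo.Record) {
			// result.topic.a / result.topic.b do not exist yet and should be
			// auto-created on first produce, but ProduceSync hangs here instead.
			out := &kgo.Record{Topic: "result." + rec.Topic, Value: rec.Value}
			if err := client.ProduceSync(ctx, out).FirstErr(); err != nil {
				log.Printf("produce error: %v", err)
			}
		})
	}
}
```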
When using a list of topics without a regexp (see comment) and removing the `kgo.ConsumeRegex()` option in `kgo.NewClient`, everything works: the topics `result.topic.a` and `result.topic.b` are created successfully.
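For comparison, the working variant would replace the client construction in the sketch above with something like this (topic names taken from the logs, everything else illustrative):

```go
// Working variant: explicit topic list, no kgo.ConsumeRegex().
client, err := kgo.NewClient(
	kgo.SeedBrokers("localhost:9092"),
	kgo.ConsumerGroup("repro-group"),
	kgo.ConsumeTopics("topic.a", "topic.b"),
	kgo.AllowAutoTopicCreation(),
)
```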