Environment Information
node-rdkafka Configuration Settings
{
  "producerConfig": {
    "bootstrap.servers": "MY_BROKER_DETAILS",
    "client.id": "localhost",
    "debug": "consumer,cgrp,fetch,msg,eos",
    "event_cb": true,
    "message.max.bytes": 1048576,
    "max.in.flight.requests.per.connection": 1,
    "sasl.mechanism": "GSSAPI",
    "sasl.kerberos.principal": "KAFKA_PRINCIPAL_ID",
    "sasl.kerberos.service.name": "kafka",
    "sasl.kerberos.keytab": "MY_KRB_KEYTAB",
    "sasl.kerberos.kinit.cmd": "kinit -V -R -t \"%{sasl.kerberos.keytab}\" -k %{sasl.kerberos.principal} || kinit -V -t \"%{sasl.kerberos.keytab}\" -k %{sasl.kerberos.principal}",
    "sasl.kerberos.min.time.before.relogin": 90000,
    "security.protocol": "sasl_plaintext",
    "compression.codec": "lz4",
    "delivery.timeout.ms": 2147483647,
    "dr_msg_cb": true,
    "enable.idempotence": false,
    "linger.ms": 100,
    "message.send.max.retries": 3,
    "request.timeout.ms": 180000,
    "retry.backoff.ms": 200
  },
  "topicConfig": {
    "acks": -1,
    "compression.type": "lz4"
  }
}
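For reference, the producer is wired up roughly like this (a minimal sketch in TypeScript; "my-topic", the payload, and the trimmed-down config objects are placeholders for the full settings listed above):

import * as Kafka from 'node-rdkafka';

// The two objects above map onto the two constructor arguments:
// producerConfig -> global librdkafka settings, topicConfig -> default topic settings.
const producer = new Kafka.Producer(
  {
    'bootstrap.servers': 'MY_BROKER_DETAILS',
    'dr_msg_cb': true,
    // ...plus the SASL/GSSAPI and other settings listed above...
  },
  { 'acks': -1, 'compression.type': 'lz4' }
);

producer.on('ready', () => {
  // partition -1 lets librdkafka pick the partition
  producer.produce('my-topic', -1, Buffer.from('payload'), 'message-key');
});

producer.on('delivery-report', (err, report) => {
  // emitted because dr_msg_cb is enabled
  console.log('delivery report:', err ?? report);
});

producer.setPollInterval(100); // poll regularly so delivery reports are dispatched
producer.connect();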
Additional context
The Node API uses the Kafka producer settings above with KAFKA_PRINCIPAL_ID. On the same machine we also run a kinit command with a different principal, CACHE_PRINCIPAL_ID, to obtain a Kerberos ticket for another purpose.
With the configuration above, messages are usually published to the topic successfully, but sometimes the producer tries to authenticate as CACHE_PRINCIPAL_ID instead of KAFKA_PRINCIPAL_ID and the publish is denied.
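One thing we are considering (an untested sketch; it assumes MIT Kerberos, where GSSAPI resolves the default ticket cache from KRB5CCNAME, and the cache path below is made up) is pinning the producer process to its own credential cache so the manual kinit for CACHE_PRINCIPAL_ID cannot overwrite the ticket the producer uses:

import * as Kafka from 'node-rdkafka';

// Assumption: with MIT Kerberos, KRB5CCNAME overrides the default ticket cache both
// for the GSSAPI handshake and for the kinit spawned via sasl.kerberos.kinit.cmd.
// '/tmp/krb5cc_kafka_producer' is a hypothetical path.
process.env.KRB5CCNAME = 'FILE:/tmp/krb5cc_kafka_producer';

// The producer is then created with the same producerConfig/topicConfig as above.
const producer = new Kafka.Producer({ 'bootstrap.servers': 'MY_BROKER_DETAILS' });
producer.connect();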
I would appreciate any help on this.