Added SCRAM-SHA-512 support to dataflow kafka-to-bigquery #2181
base: main
Conversation
@an2x As discussed, I have created this PR to support SCRAM-SHA-512 in the Dataflow Kafka-to-BigQuery template. Please review and share your comments. Thank you.
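For context, this is a minimal sketch (not code from this PR) of the Kafka client settings that SASL/SCRAM-SHA-512 authentication ultimately requires; how the template resolves the username and password from Secret Manager secret IDs is an assumption here and not shown.

```java
import java.util.HashMap;
import java.util.Map;

/** Illustrative only: the Kafka client settings SASL/SCRAM-SHA-512 authentication requires. */
final class ScramConfigSketch {

  // Username and password would be resolved from the Secret Manager secret IDs
  // supplied through the new template parameters (resolution not shown here).
  static Map<String, Object> scramSha512Properties(String username, String password) {
    Map<String, Object> props = new HashMap<>();
    props.put("security.protocol", "SASL_SSL");
    props.put("sasl.mechanism", "SCRAM-SHA-512");
    props.put(
        "sasl.jaas.config",
        "org.apache.kafka.common.security.scram.ScramLoginModule required"
            + " username=\"" + username + "\""
            + " password=\"" + password + "\";");
    return props;
  }
}
```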
@@ -215,4 +217,62 @@ final class Offset {
  String getKafkaReadKeyPasswordSecretId();

  void setKafkaReadKeyPasswordSecretId(String sourceKeyPasswordSecretId);
Looks good, but can you please rename the new parameters from `getKafkaReadSASLScram512*` to `getKafkaReadSaslScram*` (e.g. `getKafkaReadSASLScram512UsernameSecretId` to `getKafkaReadSaslScramUsernameSecretId`), and rename the corresponding variables in `KafkaConfig` accordingly?
The reason for the "SASL" -> "Sasl" change is that, by convention, in all Java identifiers in this project only the first letter of each word is capitalized, even when the word is an abbreviation.
The reason for removing "512" is to allow extending this to support SASL SCRAM 256 auth in the future (if needed) without having to introduce another set of `getKafkaReadSaslScram256*` parameters (we'll just re-use these).
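For illustration, a sketch of what the renamed accessors could look like, following the getter/setter pattern visible in the quoted diff; the interface name and the password accessor are assumptions for this sketch, not taken from the PR.

```java
// Sketch only: renamed per the review suggestion ("Sasl" casing, no "512" suffix).
// The password accessor is assumed by analogy with the username one; it is not in the quoted diff.
interface KafkaReadSaslScramOptions {

  String getKafkaReadSaslScramUsernameSecretId();

  void setKafkaReadSaslScramUsernameSecretId(String saslScramUsernameSecretId);

  String getKafkaReadSaslScramPasswordSecretId();

  void setKafkaReadSaslScramPasswordSecretId(String saslScramPasswordSecretId);
}
```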
@an2x Thank you for reviewing the PR and sharing your feedback. My understanding is that the parameter prompts in the UI are based on the parent name and parentTriggerValues. For example, if the chosen authentication mode is KafkaAuthenticationMethod.SASL_PLAIN, then only two input fields (username and password) are displayed in the UI. To accommodate any differences in inputs between SASL SCRAM 256 and SASL SCRAM 512, I thought of keeping them separate. If you think that is not required, I will rename the parameters as recommended.
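To illustrate the gating described above, a hypothetical parameter definition that would only be prompted in the UI when the parent authentication-mode parameter takes a matching value; the annotation attribute set and the enum constant name are assumptions based on this comment, not copied from the PR.

```java
// Hypothetical sketch: with metadata like this, the UI would only show the username field
// when the parent "kafkaReadAuthenticationMode" parameter is set to the trigger value.
// The attribute names and the "SASL_SCRAM_512" constant are assumptions for illustration.
@TemplateParameter.Text(
    order = 25,
    optional = true,
    parentName = "kafkaReadAuthenticationMode",
    parentTriggerValues = {"SASL_SCRAM_512"},
    description = "Secret Manager secret ID for the Kafka SASL/SCRAM username")
String getKafkaReadSaslScramUsernameSecretId();
```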
@vgnanasekaran It looks like one change is still missing: renaming "SASL" -> "Sasl" in the parameter names. Could you please change that as well?
Sorry for nitpicking, but the parameter names are case-sensitive, and this casing is inconsistent with pretty much every other parameter and may confuse users.
@an2x not a problem at all. I would definitely stick to the standards followed by your team. Sorry for having missed the parameter case issue. I was more focused on the 512 name change. I have addressed the issue now.
Note: the checks are failing because of some recent changes in GitHub Actions; this should have been fixed in 023b53e. Please also update the PR to pick up the most recent changes from the main branch.
…ed for both sasl_scram_256 and sasl_scram_512
@an2x Per your feedback, I have updated the parameter names and merged the recent changes from the main branch.
Codecov Report
Attention: patch coverage on the added lines is low; per the coverage diff below, 13 of the 16 added lines are misses.
Additional details and impacted files:

@@              Coverage Diff               @@
##               main     #2181       +/-  ##
=============================================
- Coverage     47.05%    47.04%    -0.02%
- Complexity     4049      4375      +326
=============================================
  Files           876       876
  Lines         52230     52246       +16
  Branches       5507      5508        +1
=============================================
+ Hits          24578     24580        +2
- Misses        25892     25905       +13
- Partials       1760      1761        +1