
[Feature]: Propose the addition of an optional flag (azureAPIType) to support the AzureOpenAI backend #1247

Open
lawrencelo8 opened this issue Sep 16, 2024 · 3 comments

Comments

@lawrencelo8

Checklist

  • I've searched for similar issues and couldn't find anything matching
  • I've discussed this feature request in the K8sGPT Slack and got positive feedback

Is this feature request related to a problem?

No

Problem Description

I’d like to propose adding an optional flag to the AzureOpenAI backend. Here’s why:
Currently, the k8sgpt AzureOpenAI backend uses DefaultAzureConfig, which defaults to APITypeAzure (APIType = "AZURE").
• Reference: k8sgpt AzureOpenAI backend
• Reference: Azure API configuration in go-openai
However, go-openai defines two other API type options (see the sketch just below):
• APITypeAzureAD (APIType = "AZURE_AD")
• APITypeCloudflareAzure (APIType = "CLOUDFLARE_AZURE")
• Reference: Additional API Types in go-openai
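
Roughly, those constants in go-openai's config.go look like this (a paraphrased sketch of the linked source, not a verbatim copy):

type APIType string

const (
	APITypeOpenAI          APIType = "OPEN_AI"
	APITypeAzure           APIType = "AZURE"            // what DefaultAzureConfig selects today
	APITypeAzureAD         APIType = "AZURE_AD"
	APITypeCloudflareAzure APIType = "CLOUDFLARE_AZURE"
)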

Solution Description

By introducing an optional arg/flag (azureAPIType), we could let users select one of these alternative API types. Here’s how it could look in the k8sgpt CLI (a rough flag-registration sketch follows the examples):

Default behavior (still uses APIType = "AZURE")

k8sgpt auth add -b azureopenai ...

Using the AZURE_AD APIType

k8sgpt auth add -b azureopenai --azureAPIType AZURE_AD ...

Using the CLOUDFLARE_AZURE APIType

k8sgpt auth add -b azureopenai --azureAPIType CLOUDFLARE_AZURE ...
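
A minimal sketch of how the flag could be registered, assuming the cobra-style flag setup that k8sgpt's auth add command already uses; the variable name and command variable below are illustrative, not the actual k8sgpt source:

package auth

import "github.com/spf13/cobra"

// azureAPIType would hold the value of the proposed --azureAPIType flag
// (illustrative variable name).
var azureAPIType string

// addCmd stands in for k8sgpt's existing "auth add" command.
var addCmd = &cobra.Command{Use: "add"}

func init() {
	// An empty default keeps today's behaviour: DefaultAzureConfig -> APIType "AZURE".
	addCmd.Flags().StringVar(&azureAPIType, "azureAPIType", "",
		"Azure OpenAI API type: AZURE (default), AZURE_AD or CLOUDFLARE_AZURE")
}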

Why implement this as an optional flag (CLI arg) instead of an environment variable?

1. All parameters for AI providers are currently taken in as CLI args; adding this one as an environment variable would make it an outlier. Keeping it a CLI arg keeps things consistent across the board.
2. Once the configuration parameters are taken in via CLI args, k8sgpt stores them in the AI providers’ configuration YAML file, from which it saves and restores settings. Even if this flag were an environment variable, its value would still need to be remembered in that same YAML file as part of the AI provider’s configuration.
3. Similarly, k8sgpt-operator takes its configuration via the k8sgpt CRD YAML, not through environment variables.

Benefits

1. An additional APIType choice becomes available in the Azure OpenAI configuration.
2. Because the APIType is part of Azure OpenAI’s configuration, exposing it as a CLI argument keeps it consistent with the other provider settings.

Potential Drawbacks

No response

Additional Information

No response

@AlexsJones
Member

I am not against this - what are the code changes required here? Is there a flag or parameter we would need to pass in with that azureAPIType? https://github.com/k8sgpt-ai/k8sgpt/blob/main/pkg/ai/azureopenai.go#L30

@lawrencelo8
Author

@AlexsJones

Thanks for checking back.

I plan to add azureAPIType := config.GetAzureAPIType() to read in string values such as "AZURE", "AZURE_AD", and "CLOUDFLARE_AZURE", and then verify that the value read matches one of the constants defined in this section of the go-openai code: https://github.com/sashabaranov/go-openai/blob/master/config.go#L20-L22

Based on that, I will set defaultConfig.APIType accordingly. For example, if the input string is "AZURE_AD", then defaultConfig.APIType = openai.APITypeAzureAD. If the input string doesn’t match a predefined value, defaultConfig.APIType remains unchanged, i.e. the default "AZURE". Roughly, the mapping would look like the sketch below.
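
A minimal sketch of that mapping inside the AzureOpenAI backend's Configure step, assuming the existing defaultConfig := openai.DefaultAzureConfig(token, baseURL) setup; config.GetAzureAPIType() is the hypothetical getter this proposal would add, not an existing k8sgpt function:

// Sketch only; not the actual k8sgpt code.
defaultConfig := openai.DefaultAzureConfig(token, baseURL) // APIType defaults to "AZURE"

switch config.GetAzureAPIType() { // hypothetical getter proposed above
case "AZURE_AD":
	defaultConfig.APIType = openai.APITypeAzureAD
case "CLOUDFLARE_AZURE":
	defaultConfig.APIType = openai.APITypeCloudflareAzure
default:
	// Unrecognised or empty value: keep the default APIType ("AZURE").
}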

Does that make sense to you?

I’ve been tied up, but I haven’t forgotten about this feature proposal. I’ll create a PR for you to review when it’s ready.

@AlexsJones
Member

Understood; in that case it will be important to decide where that flag needs to live.
If we go heavily down the route of provider-specific flags, we will need to think about our long-term design.
Likewise, it will influence the operator => k8sgpt deployment schema: https://github.com/k8sgpt-ai/schemas/blob/main/protobuf/schema/v1/server_analyzer_service.proto#L10
