Conversation
Adds Google Gemini support via its OpenAI-compatible endpoint:

- Add gemini to the PROVIDERS map with the OpenAI-compatible endpoint (https://generativelanguage.googleapis.com/v1beta/openai)
- Add gemini to PROVIDER_ENV_VARS (reads GEMINI_API_KEY from the environment)
- Add gemini URL detection in inferProviderFromBaseUrl()
- Add gemini to the AIBrain.BASE_URLS fallback map
- Fix localProvider not being propagated in resolveApiConfig() return values
- Fix visionModel/textModel to fall back to the pipeline config layer3/layer2 models when resolvedApi provides no model name (prevents an empty model string being sent to the API)

No API keys are stored in source files. Configure via env:

GEMINI_API_KEY=your_key
AI_BASE_URL=https://generativelanguage.googleapis.com/v1beta/openai
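A minimal sketch of what the provider additions described above might look like. The shapes of PROVIDERS and PROVIDER_ENV_VARS are assumptions for illustration, not the project's real types; only the endpoint URL, the GEMINI_API_KEY variable, and the inferProviderFromBaseUrl() name come from this PR:

```typescript
// Sketch only: the map shapes below are assumed, not taken from the codebase.
const PROVIDERS: Record<string, { baseUrl: string }> = {
  gemini: {
    // Gemini's OpenAI-compatible endpoint, as used in this PR
    baseUrl: "https://generativelanguage.googleapis.com/v1beta/openai",
  },
};

const PROVIDER_ENV_VARS: Record<string, string> = {
  // API key is read from the environment, never stored in source files
  gemini: "GEMINI_API_KEY",
};

// URL-based provider detection, mirroring the inferProviderFromBaseUrl() change
function inferProviderFromBaseUrl(baseUrl: string): string | undefined {
  if (baseUrl.includes("generativelanguage.googleapis.com")) return "gemini";
  return undefined; // checks for other providers omitted
}
```

With this shape, passing only AI_BASE_URL is enough for the app to infer the provider and pick the matching env var for the key.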
Owner
Hey @do420, these problems have been addressed in the V0.7.0 update on the branch, check it out. Also, what OS are you operating on?
Author
Thanks! Any plans on merging v0.7.0 to main? I am operating on Windows. |
Summary
Adds first-class Google Gemini support as an AI provider using Gemini's OpenAI-compatible endpoint.
Changes
src/providers.ts
- Add gemini to the PROVIDERS map with the OpenAI-compatible endpoint, to PROVIDER_ENV_VARS (reads GEMINI_API_KEY), and to the URL detection in inferProviderFromBaseUrl()
src/openclaw-credentials.ts
- Fix resolveApiConfig() to propagate localProvider in both return paths (it was being read to find the right env key but then dropped from the returned object)
src/ai-brain.ts
- Add gemini to the AIBrain.BASE_URLS fallback map
src/index.ts
- Fall back to the pipeline config layer2/layer3 models when resolveApiConfig() returns no model; prevents an empty model string being sent to the API and causing silent failures in the vision loop
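The two fixes above can be sketched as follows. The interfaces and field names (localProvider, layer2Model, layer3Model) are assumptions based on the PR description, not the real types from src/index.ts:

```typescript
// Illustrative only: field names are guesses from the PR description.
interface ResolvedApi {
  baseUrl: string;
  apiKey?: string;
  model?: string;
  localProvider?: string; // previously read but dropped from the returned object
}

interface PipelineConfig {
  layer2Model: string; // text model fallback
  layer3Model: string; // vision model fallback
}

// Fall back to the pipeline layer models whenever the resolved API config
// carries no model name, so an empty string never reaches the API.
function pickModels(api: ResolvedApi, pipeline: PipelineConfig) {
  return {
    textModel: api.model || pipeline.layer2Model,
    visionModel: api.model || pipeline.layer3Model,
  };
}
```

Using `||` rather than `??` here also catches the empty-string case, which is exactly the value that was silently failing in the vision loop.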
Configuration
Set these env vars (or add to .env):
GEMINI_API_KEY=your_key_here
AI_BASE_URL=https://generativelanguage.googleapis.com/v1beta/openai
AI_TEXT_MODEL=gemini-2.0-flash
AI_VISION_MODEL=gemini-2.0-flash

Or start with explicit flags:
clawdcursor start --provider gemini --api-key your_key_here
Testing
Tested end-to-end on Windows with gemini-2.0-flash: