
Conversation

@T-Atlas commented Apr 21, 2025

feat: Add support for custom OpenAI-compatible API endpoints.

By introducing the `BASE_URL` environment variable, users can specify custom OpenAI-compatible API endpoints, which makes it possible to use local LLM servers or other compatible services.
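For context, here is a minimal sketch of how such a `BASE_URL` override could be consumed, assuming the project builds its client with the openai v1 Python SDK; the wiring shown here is illustrative, not taken from the diff:

```python
import os

from openai import OpenAI

# Illustrative only: read an optional BASE_URL override from the environment,
# e.g. BASE_URL="http://localhost:11434/v1" for a local OpenAI-compatible server.
base_url = os.environ.get("BASE_URL")

client = OpenAI(
    api_key=os.environ["OPENAI_API_KEY"],
    base_url=base_url,  # None falls back to the SDK's default endpoint
)
```

With the variable unset, the sketch falls back to the SDK default, so existing setups would be unaffected.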
@BradKML commented May 20, 2025

Cross-reference from #33

@robinsonkwame

What if you have different OpenAI-compatible endpoints that you want to use for model_writeup, model_citation, model_review, and model_agg_plots? For example, using Replicate, a user could stand up a different OpenAI-compatible proxy for each.
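One way this could be handled (purely a sketch; the `*_BASE_URL` variable names and the `client_for` helper are hypothetical and not part of this PR) is a per-stage override that falls back to the shared `BASE_URL`:

```python
import os

from openai import OpenAI

# Hypothetical per-stage overrides, falling back to the shared BASE_URL from this PR.
DEFAULT_BASE_URL = os.environ.get("BASE_URL")

def client_for(stage: str) -> OpenAI:
    """Return a client for a stage such as 'writeup', 'citation', 'review', or 'agg_plots',
    honouring e.g. WRITEUP_BASE_URL if set, otherwise BASE_URL, otherwise the SDK default."""
    base_url = os.environ.get(f"{stage.upper()}_BASE_URL", DEFAULT_BASE_URL)
    return OpenAI(api_key=os.environ["OPENAI_API_KEY"], base_url=base_url)

# Usage: each pipeline stage gets its own endpoint without affecting the others.
writeup_client = client_for("writeup")  # WRITEUP_BASE_URL or BASE_URL
review_client = client_for("review")    # REVIEW_BASE_URL or BASE_URL
```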
