
Conversation


@BossyT BossyT commented Aug 16, 2025

Includes "Closes #7" so the issue will be closed automatically after merging.

BossyT added 19 commits August 15, 2025 06:41
Commit new file
Add minimal OpenAI client to lib/ai/openai.ts which reads environment variables and provides a streamChat function for streaming chat completions.
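A minimal sketch of what such a client could look like. `streamChat` and the environment variables are named in the commit message; the `buildChatRequest` helper and the exact shapes are assumptions for illustration, not the PR's actual code.

```typescript
// lib/ai/openai.ts — hypothetical sketch of the minimal client described above.
const OPENAI_BASE_URL = process.env.OPENAI_BASE_URL ?? "https://api.openai.com/v1";
const OPENAI_MODEL = process.env.OPENAI_MODEL ?? "gpt-4o-mini";

export interface ChatMessage {
  role: "system" | "user" | "assistant";
  content: string;
}

// Builds the JSON body for a streaming chat-completion request
// (helper name is assumed; split out here so it can be tested in isolation).
export function buildChatRequest(messages: ChatMessage[], model: string = OPENAI_MODEL) {
  return { model, messages, stream: true };
}

// Calls the OpenAI chat-completions endpoint and returns the raw SSE body stream.
export async function streamChat(messages: ChatMessage[]): Promise<ReadableStream<Uint8Array>> {
  const res = await fetch(`${OPENAI_BASE_URL}/chat/completions`, {
    method: "POST",
    headers: {
      Authorization: `Bearer ${process.env.OPENAI_API_KEY}`,
      "Content-Type": "application/json",
    },
    body: JSON.stringify(buildChatRequest(messages)),
  });
  if (!res.ok || !res.body) throw new Error(`OpenAI request failed: ${res.status}`);
  return res.body;
}
```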
Implement POST /api/ai/chat with edge runtime for streaming OpenAI chat completions. Validates messages and provider, calls streamChat and returns SSE as raw text.
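A sketch of how that edge route could be laid out, assuming a Next.js App Router handler. The route path and behavior come from the commit message; `validatePayload` is a hypothetical helper added here so the validation step is explicit, and the upstream call is inlined to keep the sketch self-contained.

```typescript
// app/api/ai/chat/route.ts — hypothetical sketch of the edge route described above.
export const runtime = "edge";

// Rejects payloads with a non-OpenAI provider or missing/empty messages.
// Returns an error string, or null when the payload is acceptable.
export function validatePayload(body: any): string | null {
  if (body == null || typeof body !== "object") return "body must be a JSON object";
  if (body.provider && body.provider !== "openai") return "unsupported provider";
  if (!Array.isArray(body.messages) || body.messages.length === 0)
    return "messages must be a non-empty array";
  return null;
}

export async function POST(req: Request): Promise<Response> {
  const body = await req.json().catch(() => null);
  const error = validatePayload(body);
  if (error) return new Response(JSON.stringify({ error }), { status: 400 });

  // Forward to OpenAI with stream: true and pass the SSE body through as raw text.
  const base = process.env.OPENAI_BASE_URL ?? "https://api.openai.com/v1";
  const upstream = await fetch(`${base}/chat/completions`, {
    method: "POST",
    headers: {
      Authorization: `Bearer ${process.env.OPENAI_API_KEY}`,
      "Content-Type": "application/json",
    },
    body: JSON.stringify({
      model: process.env.OPENAI_MODEL ?? "gpt-4o-mini",
      messages: body.messages,
      stream: true,
    }),
  });
  if (!upstream.ok || !upstream.body)
    return new Response(JSON.stringify({ error: `upstream ${upstream.status}` }), { status: 502 });
  return new Response(upstream.body, {
    headers: { "Content-Type": "text/event-stream" },
  });
}
```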
Adds a new Edge runtime GET endpoint at `/api/health` which performs a health check for the OpenAI provider. It reads `AI_PROVIDER`, `OPENAI_API_KEY`, `OPENAI_MODEL`, and `OPENAI_BASE_URL` from the environment, pings the OpenAI `/models` endpoint to determine service status, and returns a JSON payload: `{ status, services: { openai: { ok, model, error? }}}`. Missing keys or API failures are reported gracefully without crashing the app.
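The health check described above could be sketched as follows. The environment variables and the `{ status, services: { openai: { ok, model, error? }}}` payload shape are taken from the commit message; the `"degraded"` status string and control flow are assumptions.

```typescript
// app/api/health/route.ts — hypothetical sketch of the health-check endpoint.
export const runtime = "edge";

export async function GET(): Promise<Response> {
  const key = process.env.OPENAI_API_KEY;
  const model = process.env.OPENAI_MODEL ?? "gpt-4o-mini";
  const base = process.env.OPENAI_BASE_URL ?? "https://api.openai.com/v1";

  let openai: { ok: boolean; model: string; error?: string };
  if (!key) {
    // Missing key is reported in the payload rather than thrown.
    openai = { ok: false, model, error: "OPENAI_API_KEY is not set" };
  } else {
    try {
      // Ping the /models endpoint to confirm the key and base URL work.
      const res = await fetch(`${base}/models`, {
        headers: { Authorization: `Bearer ${key}` },
      });
      openai = res.ok ? { ok: true, model } : { ok: false, model, error: `HTTP ${res.status}` };
    } catch (e) {
      openai = { ok: false, model, error: String(e) };
    }
  }

  const status = openai.ok ? "ok" : "degraded";
  return new Response(JSON.stringify({ status, services: { openai } }), {
    headers: { "Content-Type": "application/json" },
  });
}
```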
Adds an "AI Provider configuration" section to the README. This section documents the default OpenAI provider (`AI_PROVIDER=openai`) and lists the required environment variables: `OPENAI_API_KEY`, `OPENAI_MODEL` (default `gpt-4o-mini`), and `OPENAI_BASE_URL` (defaults to `https://api.openai.com/v1`). It also groups alternative provider keys (Anthropic, Gemini, Groq) under an optional section.
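Based on that README section, a local configuration might look like this (values here are placeholders, not real credentials):

```shell
# Example .env.local matching the documented variables.
AI_PROVIDER=openai
OPENAI_API_KEY=your-api-key-here
OPENAI_MODEL=gpt-4o-mini                       # default per the README
OPENAI_BASE_URL=https://api.openai.com/v1      # default per the README

# Optional alternative providers (Anthropic, Gemini, Groq) are grouped
# separately in the README and can be left unset.
```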
feat: switch default AI provider to OpenAI (Issue A)
Add CI workflow
feat: use GPT-5 as default model


Development

Successfully merging this pull request may close these issues.

React hydration error
