codex-llm-adapter

The LLM adapter for codex-cli

Background

The OpenAI Codex project will discontinue support for the chat/completions interface in February 2026. This change affects users who rely on open-source models or other LLMs that do not support the Responses API, leaving them unable to keep using Codex with those models. This project was created to bridge that gap, so that such users can continue running Codex against the chat/completions API.

Project Overview

This project is a lightweight LLM proxy service. It exposes a Responses-style API to external clients while internally forwarding each request to a chat/completions API, which keeps Codex working seamlessly with chat/completions-only backends. For details on the transformation logic, refer to the documentation below.
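The core of such a proxy is the request translation. The sketch below illustrates one plausible mapping from a Responses-style request body to a chat/completions-style one; it is a minimal illustration assuming field names from OpenAI's public API shapes (`instructions`, `input`, `messages`), not this project's actual implementation.

```python
def responses_to_chat(payload: dict) -> dict:
    """Translate a Responses-API-style request body into a
    chat/completions-style body. Field names follow OpenAI's public
    API docs; the adapter's real mapping may differ."""
    messages = []

    # The Responses API's "instructions" field maps naturally onto
    # the chat/completions system message.
    if payload.get("instructions"):
        messages.append({"role": "system", "content": payload["instructions"]})

    # "input" may be a bare string or a list of role-tagged items.
    inp = payload.get("input", [])
    if isinstance(inp, str):
        messages.append({"role": "user", "content": inp})
    else:
        for item in inp:
            content = item.get("content", "")
            if isinstance(content, list):
                # Flatten structured text parts into a single string.
                content = "".join(part.get("text", "") for part in content)
            messages.append({"role": item.get("role", "user"),
                             "content": content})

    return {
        "model": payload["model"],
        "messages": messages,
        "stream": payload.get("stream", False),
    }
```

A full proxy would also translate the chat/completions response (and its streaming chunks) back into the Responses event format before returning it to Codex.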
