Summary

oxy.HttpLLM gains an llm_request_modifier hook, which allows the headers, url, and payload to be customized before the request is sent to the LLM. Example uses: adding a traceId to the headers to aid error troubleshooting, or adding a stream parameter to the payload to request streaming responses. The hook is currently used when calling certain LLM gateways, to adapt to each gateway's customized integration requirements.

Related Issue(s)

Changes

In http_llm.py, the HttpLLM class declares an llm_request_modifier field that can be supplied at instantiation as a custom extension point. Before httpx.AsyncClient sends the request, llm_request_modifier is called with the url, headers, and payload, and it must return the complete (possibly modified) url, headers, and payload (see the sketch below);
Existing behavior is unaffected: when llm_request_modifier is not configured, the original headers, url, and payload are used unchanged;
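
For reference, a minimal sketch of how this wiring could look. The class name HttpLLMSketch, the method name _send_request, and the Authorization header are illustrative assumptions, not the actual http_llm.py code; only the hook invocation mirrors the change described above.

from typing import Callable, Optional, Tuple

import httpx

# Hypothetical type alias: the modifier receives url, headers, payload and
# returns the (possibly modified) url, headers, payload.
RequestModifier = Callable[[str, dict, dict], Tuple[str, dict, dict]]


class HttpLLMSketch:
    """Illustrative stand-in for oxy.HttpLLM; only the hook wiring is shown."""

    def __init__(self, base_url: str, api_key: str,
                 llm_request_modifier: Optional[RequestModifier] = None):
        self.base_url = base_url
        self.api_key = api_key
        self.llm_request_modifier = llm_request_modifier

    async def _send_request(self, payload: dict) -> dict:
        url = self.base_url
        headers = {"Authorization": f"Bearer {self.api_key}"}

        # Run the hook, if configured, right before the request goes out.
        # Without a configured hook, url/headers/payload pass through unchanged.
        if self.llm_request_modifier is not None:
            url, headers, payload = self.llm_request_modifier(url, headers, payload)

        async with httpx.AsyncClient() as client:
            response = await client.post(url, headers=headers, json=payload)
            response.raise_for_status()
            return response.json()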

Example

from uuid import uuid4


def modifier_request_with_trace_id(url, headers, payload):
    # Attach a trace id for troubleshooting and request a streaming response.
    headers['trace-id'] = str(uuid4())
    payload['stream'] = True
    print('---llm request---', headers['trace-id'])
    return url, headers, payload

oxy.HttpLLM(
    name="XXXXXX",
    api_key='XXXXX',
    base_url='http://XXXXXXX/v1/chat/completions',
    model_name='XXXXX',
    llm_request_modifier=modifier_request_with_trace_id,
)

Checklist

This PR has:
  • been self-reviewed.
    • Code is formatted
    • Tests added/updated
    • All CI checks passed
    • Documentation updated
  • added documentation for new or modified features or behaviors.
  • added new features, such as new agents, new flows, etc.
  • added or updated version, license, or notice information.
  • added or updated demo, integration tests, unit tests, or others.
  • added or updated ui, or had changes in frontend.
