wrap_openai
langsmith.wrappers._openai.wrap_openai(client: C, *, tracing_extra: TracingExtra | None = None, chat_name: str = 'ChatOpenAI', completions_name: str = 'OpenAI') → C
Patch the OpenAI client to make it traceable.

Supports:
- Chat and Responses APIs
- Synchronous and asynchronous OpenAI clients
- create() and parse() methods
- Streaming and non-streaming requests (a usage sketch follows this list)
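The asynchronous client and streaming calls are handled by the same wrapper. A minimal sketch, assuming the standard AsyncOpenAI streaming interface of the OpenAI SDK (the model name is illustrative):

```python
import asyncio

import openai
from langsmith import wrappers

# The wrapper accepts AsyncOpenAI as well as OpenAI clients.
async_client = wrappers.wrap_openai(openai.AsyncOpenAI())


async def main() -> None:
    # Streaming calls are traced too; chunks arrive incrementally.
    stream = await async_client.chat.completions.create(
        model="gpt-4o-mini",
        messages=[{"role": "user", "content": "Say hello in three languages."}],
        stream=True,
    )
    async for chunk in stream:
        delta = chunk.choices[0].delta.content
        if delta:
            print(delta, end="", flush=True)


asyncio.run(main())
```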
Parameters:
client (Union[OpenAI, AsyncOpenAI]) – The client to patch.
tracing_extra (Optional[TracingExtra], optional) – Extra tracing information. Defaults to None.
chat_name (str, optional) – The run name for the chat completions endpoint. Defaults to "ChatOpenAI".
completions_name (str, optional) – The run name for the completions endpoint. Defaults to "OpenAI".
Returns:
The patched client.
Return type:
Union[OpenAI, AsyncOpenAI]
Example
```python
import openai

from langsmith import wrappers

# Use the OpenAI client the same as you normally would.
client = wrappers.wrap_openai(openai.OpenAI())

# Chat API:
messages = [
    {"role": "system", "content": "You are a helpful assistant."},
    {
        "role": "user",
        "content": "What physics breakthroughs do you predict will happen by 2300?",
    },
]

completion = client.chat.completions.create(
    model="gpt-4o-mini", messages=messages
)
print(completion.choices[0].message.content)

# Responses API (takes `input` rather than `messages`):
response = client.responses.create(
    model="gpt-4o-mini",
    input=messages,
)
print(response.output_text)
```
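The remaining keyword arguments control how traced runs are recorded in LangSmith. A minimal sketch, assuming `TracingExtra` accepts `metadata` and `tags` keys (verify against the `TracingExtra` definition in your installed `langsmith` version):

```python
import openai

from langsmith import wrappers

# Attach metadata and tags to every traced call and rename the chat runs.
# The TracingExtra keys below ("metadata", "tags") are an assumption; check
# langsmith.wrappers for the exact fields in your version.
client = wrappers.wrap_openai(
    openai.OpenAI(),
    tracing_extra={
        "metadata": {"app": "docs-demo", "env": "dev"},
        "tags": ["example"],
    },
    chat_name="MyChatRun",
)

completion = client.chat.completions.create(
    model="gpt-4o-mini",
    messages=[{"role": "user", "content": "Ping?"}],
)
print(completion.choices[0].message.content)
```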
Changed in version 0.3.16: Added support for the Responses API.