wrap_openai

langsmith.wrappers._openai.wrap_openai(client: C, *, tracing_extra: TracingExtra | None = None, chat_name: str = 'ChatOpenAI', completions_name: str = 'OpenAI') → C

Patch the OpenAI client to make it traceable.

Supports:
  • Chat and Responses APIs

  • Sync and async OpenAI clients (see the async/streaming sketch after this list)

  • create() and parse() methods

  • With and without streaming
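
The async client is wrapped the same way as the sync one, and streaming calls are traced as well. A minimal sketch (the model name and prompt are placeholders):

import asyncio

import openai
from langsmith import wrappers

# Wrap the async client exactly like the sync one.
async_client = wrappers.wrap_openai(openai.AsyncOpenAI())

async def main() -> None:
    # Streaming chat completions are traced chunk by chunk.
    stream = await async_client.chat.completions.create(
        model="gpt-4o-mini",
        messages=[{"role": "user", "content": "Say hello."}],
        stream=True,
    )
    async for chunk in stream:
        if chunk.choices and chunk.choices[0].delta.content:
            print(chunk.choices[0].delta.content, end="")

asyncio.run(main())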

Parameters:
  • client (Union[OpenAI, AsyncOpenAI]) – The client to patch.

  • tracing_extra (Optional[TracingExtra], optional) – Extra tracing information. Defaults to None. (See the sketch after this list.)

  • chat_name (str, optional) – The run name for the chat completions endpoint. Defaults to "ChatOpenAI".

  • completions_name (str, optional) – The run name for the completions endpoint. Defaults to "OpenAI".
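
A sketch of passing tracing_extra; this assumes TracingExtra accepts "metadata" and "tags" keys (verify against your installed langsmith version):

import openai
from langsmith import wrappers

# Assumption: TracingExtra accepts "metadata" and "tags" keys.
client = wrappers.wrap_openai(
    openai.OpenAI(),
    tracing_extra={"metadata": {"env": "staging"}, "tags": ["wrap-openai-demo"]},
    chat_name="MyChatRun",
)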

Returns:

The patched client.

Return type:

Union[OpenAI, AsyncOpenAI]

Examples

import openai
from langsmith import wrappers

# Use the OpenAI client the same as you normally would.
client = wrappers.wrap_openai(openai.OpenAI())

# Chat API:
messages = [
    {"role": "system", "content": "You are a helpful assistant."},
    {
        "role": "user",
        "content": "What physics breakthroughs do you predict will happen by 2300?",
    },
]
completion = client.chat.completions.create(
    model="gpt-4o-mini", messages=messages
)
print(completion.choices[0].message.content)

# Responses API:
response = client.responses.create(
    model="gpt-4o-mini",
    input=messages,
)
print(response.output_text)
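
The wrapped client's parse() method for structured outputs is traced as well. A minimal sketch, assuming an openai SDK version that exposes client.beta.chat.completions.parse (newer releases also expose client.chat.completions.parse) and using a hypothetical Answer schema; it reuses the client and messages defined above:

from pydantic import BaseModel

class Answer(BaseModel):
    # Hypothetical schema for the structured response.
    prediction: str
    year: int

parsed = client.beta.chat.completions.parse(
    model="gpt-4o-mini",
    messages=messages,
    response_format=Answer,
)
print(parsed.choices[0].message.parsed)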

Changed in version 0.3.16: Added support for the Responses API.