Trace without setting environment variables
As mentioned in other guides, the following environment variables let you configure whether tracing is enabled, the API endpoint, the API key, and the tracing project:
- LANGSMITH_TRACING
- LANGSMITH_API_KEY
- LANGSMITH_ENDPOINT
- LANGSMITH_PROJECT
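For reference, a minimal sketch of what these variables typically hold, written as Python for concreteness (all values below are placeholders):
import os

# Equivalent of exporting the variables in your shell; values are placeholders.
os.environ["LANGSMITH_TRACING"] = "true"                              # enable/disable tracing
os.environ["LANGSMITH_API_KEY"] = "YOUR_LANGSMITH_API_KEY"            # LangSmith API key
os.environ["LANGSMITH_ENDPOINT"] = "https://api.smith.langchain.com"  # API endpoint
os.environ["LANGSMITH_PROJECT"] = "my-project"                        # tracing project name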
In some environments, it is not possible to set environment variables. In those cases, you can set the tracing configuration programmatically.
Recent behavior change
Because of a large number of requests for more fine-grained control over tracing via the trace context manager, in version 0.1.95 of the Python SDK we **changed the behavior of with trace** to respect the LANGSMITH_TRACING environment variable. You can find more details in the release notes. The recommended way to disable/enable tracing without setting environment variables is to use the with tracing_context context manager, as shown in the example below.
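A minimal sketch of the difference, assuming Python SDK 0.1.95 or later (the run name and inputs are illustrative, and the API key and endpoint still need to be supplied somehow, e.g. via a custom client as in the full example below):
from langsmith import trace, tracing_context

# With LANGSMITH_TRACING unset (or set to "false"), this run is not recorded.
with trace(name="My Chain", run_type="chain", inputs={"question": "hi"}) as run:
    run.end(outputs={"answer": "hello"})

# Wrapping the same code in tracing_context(enabled=True) records it
# regardless of the environment variable.
with tracing_context(enabled=True):
    with trace(name="My Chain", run_type="chain", inputs={"question": "hi"}) as run:
        run.end(outputs={"answer": "hello"})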
- Python
- TypeScript
In Python, the recommended approach is to use the tracing_context context manager. This works both for code annotated with traceable and for code inside the trace context manager.
import openai
from langsmith import Client, tracing_context, traceable
from langsmith.wrappers import wrap_openai

langsmith_client = Client(
    api_key="YOUR_LANGSMITH_API_KEY",  # This can be retrieved from a secrets manager
    api_url="https://api.smith.langchain.com",  # Update appropriately for self-hosted installations or the EU region
)

client = wrap_openai(openai.Client())

@traceable(run_type="tool", name="Retrieve Context")
def my_tool(question: str) -> str:
    return "During this morning's meeting, we solved all world conflict."

@traceable
def chat_pipeline(question: str):
    context = my_tool(question)
    messages = [
        { "role": "system", "content": "You are a helpful assistant. Please respond to the user's request only based on the given context." },
        { "role": "user", "content": f"Question: {question}\nContext: {context}" }
    ]
    chat_completion = client.chat.completions.create(
        model="gpt-4o-mini", messages=messages
    )
    return chat_completion.choices[0].message.content

# Can set to False to disable tracing here without changing code structure
with tracing_context(enabled=True):
    # Use langsmith_extra to pass in a custom client
    chat_pipeline("Can you summarize this morning's meetings?", langsmith_extra={"client": langsmith_client})
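The same langsmith_extra mechanism can also stand in for LANGSMITH_PROJECT: traceable accepts a project_name entry, so the target project can be chosen per call. A minimal sketch ("my-project" is a placeholder name):
with tracing_context(enabled=True):
    # project_name routes this trace to a specific project without LANGSMITH_PROJECT
    chat_pipeline(
        "Can you summarize this morning's meetings?",
        langsmith_extra={"client": langsmith_client, "project_name": "my-project"},
    )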
In TypeScript, you can pass both the client and the tracingEnabled flag to the traceable wrapper.
import { Client } from "langsmith";
import { traceable } from "langsmith/traceable";
import { wrapOpenAI } from "langsmith/wrappers";
import { OpenAI } from "openai";

const client = new Client({
  apiKey: "YOUR_API_KEY", // This can be retrieved from a secrets manager
  apiUrl: "https://api.smith.langchain.com", // Update appropriately for self-hosted installations or the EU region
});

const openai = wrapOpenAI(new OpenAI());

const tool = traceable((question: string) => {
  return "During this morning's meeting, we solved all world conflict.";
}, { name: "Retrieve Context", runType: "tool" });

const pipeline = traceable(
  async (question: string) => {
    const context = await tool(question);
    const completion = await openai.chat.completions.create({
      model: "gpt-4o-mini",
      messages: [
        { role: "system" as const, content: "You are a helpful assistant. Please respond to the user's request only based on the given context." },
        { role: "user" as const, content: `Question: ${question}\nContext: ${context}` }
      ]
    });
    return completion.choices[0].message.content;
  },
  { name: "Chat", client, tracingEnabled: true }
);

await pipeline("Can you summarize this morning's meetings?");
If you prefer a video tutorial, check out the Alternative Ways to Trace video from the Introduction to LangSmith course.