
Trace with the Vercel AI SDK (JS/TS only)

You can trace runs from the Vercel AI SDK with LangSmith using our built-in AISDKExporter OpenTelemetry trace exporter. This guide will walk through an example.

Note

The AISDKExporter class is only available in langsmith JS SDK versions >=0.2.1.

0. Installation

Install the Vercel AI SDK. We use its OpenAI integration in the snippets below, but you can use any of its other options as well.

yarn add ai @ai-sdk/openai zod

1. Configure your environment

export LANGSMITH_TRACING=true
export LANGSMITH_API_KEY=<your-api-key>
# This example uses OpenAI, but you can use any LLM provider of choice
export OPENAI_API_KEY=<your-openai-api-key>

2. Log a trace

Next.js

First, create an instrumentation.js file in your project root. Learn how to set up OpenTelemetry instrumentation in your Next.js app here.

import { registerOTel } from "@vercel/otel";
import { AISDKExporter } from "langsmith/vercel";

export function register() {
  registerOTel({
    serviceName: "langsmith-vercel-ai-sdk-example",
    traceExporter: new AISDKExporter(),
  });
}

Then add the experimental_telemetry argument to the AI SDK calls that you want to trace. For convenience, we've included the AISDKExporter.getSettings() method, which attaches additional metadata for LangSmith.

import { AISDKExporter } from "langsmith/vercel";
import { streamText } from "ai";
import { openai } from "@ai-sdk/openai";

await streamText({
  model: openai("gpt-4o-mini"),
  prompt: "Write a vegetarian lasagna recipe for 4 people.",
  experimental_telemetry: AISDKExporter.getSettings(),
});

You should see a trace in your LangSmith dashboard like this one.

You can also trace runs with tool calls:

import { AISDKExporter } from "langsmith/vercel";
import { generateText, tool } from "ai";
import { openai } from "@ai-sdk/openai";
import { z } from "zod";

await generateText({
  model: openai("gpt-4o-mini"),
  messages: [
    {
      role: "user",
      content: "What are my orders and where are they? My user ID is 123",
    },
  ],
  tools: {
    listOrders: tool({
      description: "list all orders",
      parameters: z.object({ userId: z.string() }),
      execute: async ({ userId }) =>
        `User ${userId} has the following orders: 1`,
    }),
    viewTrackingInformation: tool({
      description: "view tracking information for a specific order",
      parameters: z.object({ orderId: z.string() }),
      execute: async ({ orderId }) =>
        `Here is the tracking information for ${orderId}`,
    }),
  },
  experimental_telemetry: AISDKExporter.getSettings(),
  maxSteps: 10,
});

Which results in a trace like this one.

Node.js

Note

The official @opentelemetry/sdk-node client SDK is currently experimental and introduces breaking changes in minor version releases.

@opentelemetry/sdk-node@0.200.0 is fully supported only if you have langsmith>=0.3.22 installed.

If you are using an older version of langsmith, install the previous versions instead: @opentelemetry/sdk-node@0.57.2 and @opentelemetry/auto-instrumentations-node@0.57.1.
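For example, the pinned fallback versions named above can be installed in one step (shown with yarn to match the installation step earlier in this guide):

```shell
# Pin the last pre-0.200.0 versions of the OTEL Node packages
yarn add @opentelemetry/sdk-node@0.57.2 @opentelemetry/auto-instrumentations-node@0.57.1
```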

First, learn how to set up OpenTelemetry instrumentation in your Node.js application here.

In particular, you will need to make sure that your OTEL setup and configuration run before your application logic. Common tools for this task are Node's --require or --import flags.
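As a sketch, assuming your OTEL setup lives in a file named instrumentation.mjs (a hypothetical filename for this example), the app could be started like this so the instrumentation loads first:

```shell
# --import loads instrumentation.mjs before any application code runs
node --import ./instrumentation.mjs ./index.mjs
```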

Add the AISDKExporter as the trace exporter in your OpenTelemetry setup.

import { AISDKExporter } from "langsmith/vercel";

import { NodeSDK } from "@opentelemetry/sdk-node";
import { getNodeAutoInstrumentations } from "@opentelemetry/auto-instrumentations-node";

const sdk = new NodeSDK({
  traceExporter: new AISDKExporter(),
  instrumentations: [getNodeAutoInstrumentations()],
});

sdk.start();

Then add the experimental_telemetry argument to the AI SDK calls that you want to trace.

Info

Make sure to call await sdk.shutdown() before your application shuts down in order to flush any remaining traces to LangSmith.

import { generateText } from "ai";
import { openai } from "@ai-sdk/openai";
import { AISDKExporter } from "langsmith/vercel";

const result = await generateText({
  model: openai("gpt-4o-mini"),
  prompt: "Write a vegetarian lasagna recipe for 4 people.",
  experimental_telemetry: AISDKExporter.getSettings(),
});

await sdk.shutdown();

Sentry

If you're using Sentry, you can attach the LangSmith trace exporter to Sentry's default OpenTelemetry instrumentation as follows:

import * as Sentry from "@sentry/node";
import { BatchSpanProcessor } from "@opentelemetry/sdk-trace-base";
import { AISDKExporter } from "langsmith/vercel";

const client = Sentry.init({
  dsn: "[Sentry DSN]",
  tracesSampleRate: 1.0,
});

client?.traceProvider?.addSpanProcessor(
  new BatchSpanProcessor(new AISDKExporter())
);

Alternatively, you can use your existing OpenTelemetry setup by setting skipOpenTelemetrySetup: true in your Sentry.init() call. In this case, we recommend following the official Sentry OpenTelemetry Setup documentation.
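A rough sketch of that configuration, reusing the NodeSDK setup shown earlier in this guide (consult the Sentry docs for the additional span processors Sentry itself requires in this mode):

```typescript
import * as Sentry from "@sentry/node";
import { NodeSDK } from "@opentelemetry/sdk-node";
import { AISDKExporter } from "langsmith/vercel";

// Tell Sentry not to set up OpenTelemetry on its own...
Sentry.init({
  dsn: "[Sentry DSN]",
  skipOpenTelemetrySetup: true,
});

// ...and register the LangSmith exporter on your own OTEL setup instead.
const sdk = new NodeSDK({
  traceExporter: new AISDKExporter(),
});
sdk.start();
```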

Cloudflare Workers

To instrument AI SDK calls within Cloudflare Workers, you can use the AISDKExporter together with @microlabs/otel-cf-workers. See the documentation for otel-cf-workers here.

import { Client } from "langsmith";
import { instrument } from "@microlabs/otel-cf-workers";
import { AISDKExporter } from "langsmith/vercel";

import { createOpenAI } from "@ai-sdk/openai";
import { generateText } from "ai";

interface Env {
  OPENAI_API_KEY: string;
  LANGSMITH_TRACING: string;
  LANGSMITH_ENDPOINT: string;
  LANGSMITH_API_KEY: string;
}

const handler = {
  async fetch(request, env): Promise<Response> {
    const openai = createOpenAI({ apiKey: env.OPENAI_API_KEY });
    const model = openai("gpt-4o-mini");

    const response = await generateText({
      model,
      prompt: "Tell me a joke",
      experimental_telemetry: AISDKExporter.getSettings({
        // As `process.env.LANGSMITH_TRACING` is undefined in Cloudflare Workers,
        // we need to check the environment variable directly.
        isEnabled: env.LANGSMITH_TRACING === "true",
      }),
    });

    return new Response(response.text);
  },
} satisfies ExportedHandler<Env>;

export default instrument<Env, unknown, unknown>(handler, (env) => ({
  exporter: new AISDKExporter({
    client: new Client({
      // Batching is handled by OTEL by default, we need to
      // disable LangSmith batch tracing to avoid losing traces
      autoBatchTracing: false,
      apiKey: env.LANGSMITH_API_KEY,
      apiUrl: env.LANGSMITH_ENDPOINT,
    }),
  }),
  service: { name: "ai-sdk-service" },
}));

You should see a trace in your LangSmith dashboard like this one.

Customize run name

You can customize the run name by passing the runName argument to the AISDKExporter.getSettings() method.

import { AISDKExporter } from "langsmith/vercel";

await generateText({
  model: openai("gpt-4o-mini"),
  prompt: "Write a vegetarian lasagna recipe for 4 people.",
  experimental_telemetry: AISDKExporter.getSettings({
    runName: "my-custom-run-name",
  }),
});

Customize run ID

You can customize the run ID by passing the runId argument to the AISDKExporter.getSettings() method. This is especially useful if you want to know the run ID before the run completes. Note that the run ID must be a valid UUID.

import { AISDKExporter } from "langsmith/vercel";
import { v4 as uuidv4 } from "uuid";

await generateText({
  model: openai("gpt-4o-mini"),
  prompt: "Write a vegetarian lasagna recipe for 4 people.",
  experimental_telemetry: AISDKExporter.getSettings({
    runId: uuidv4(),
  }),
});

Nesting runs

You can also nest runs within other traced functions to create a hierarchy of related runs. Here's an example using the traceable method:

import { AISDKExporter } from "langsmith/vercel";
import { openai } from "@ai-sdk/openai";
import { generateText } from "ai";

import { traceable } from "langsmith/traceable";

const wrappedGenerateText = traceable(
  async (content: string) => {
    const { text } = await generateText({
      model: openai("gpt-4o-mini"),
      messages: [{ role: "user", content }],
      experimental_telemetry: AISDKExporter.getSettings(),
    });

    const reverseText = traceable(
      async (text: string) => {
        return text.split("").reverse().join("");
      },
      { name: "reverseText" }
    );

    const reversedText = await reverseText(text);
    return { text, reversedText };
  },
  { name: "parentTraceable" }
);

const result = await wrappedGenerateText(
  "What color is the sky? Respond with one word."
);

The resulting trace will look like this.

Custom LangSmith client

You can also pass a LangSmith client instance into the AISDKExporter constructor:

import { AISDKExporter } from "langsmith/vercel";
import { Client } from "langsmith";

import { NodeSDK } from "@opentelemetry/sdk-node";
import { getNodeAutoInstrumentations } from "@opentelemetry/auto-instrumentations-node";

const langsmithClient = new Client({});

const sdk = new NodeSDK({
  traceExporter: new AISDKExporter({ client: langsmithClient }),
  instrumentations: [getNodeAutoInstrumentations()],
});

sdk.start();

await generateText({
  model: openai("gpt-4o-mini"),
  prompt: "Write a vegetarian lasagna recipe for 4 people.",
  experimental_telemetry: AISDKExporter.getSettings(),
});

Debugging exporter

You can enable debug logs for the AISDKExporter by passing the debug argument to the constructor.

const traceExporter = new AISDKExporter({ debug: true });

Alternatively, you can set the OTEL_LOG_LEVEL=DEBUG environment variable to enable debug logs for the exporter as well as the rest of the OpenTelemetry stack.

Adding metadata

You can add metadata to your traces to help organize and filter them in the LangSmith UI:

import { AISDKExporter } from "langsmith/vercel";
import { generateText } from "ai";
import { openai } from "@ai-sdk/openai";

await generateText({
  model: openai("gpt-4o-mini"),
  prompt: "Write a vegetarian lasagna recipe for 4 people.",
  experimental_telemetry: AISDKExporter.getSettings({
    metadata: { userId: "123", language: "english" },
  }),
});

Metadata will be visible in your LangSmith dashboard and can be used to filter and search for specific traces.

wrapAISDKModel (deprecated)

Note

The wrapAISDKModel method is deprecated and will be removed in a future release.

The wrapAISDKModel method wraps the Vercel model wrapper and intercepts model invocations in order to send traces to LangSmith. It is useful if you are using an older version of LangSmith, or if you are using streamUI / Vercel AI RSC, which does not currently support experimental_telemetry.

import { wrapAISDKModel } from "langsmith/wrappers/vercel";
import { openai } from "@ai-sdk/openai";
import { generateText } from "ai";

const vercelModel = openai("gpt-4o-mini");

const modelWithTracing = wrapAISDKModel(vercelModel);

await generateText({
  model: modelWithTracing,
  prompt: "Write a vegetarian lasagna recipe for 4 people.",
});
