Implement distributed tracing
Sometimes you need to trace a request across multiple services.
LangSmith supports distributed tracing out of the box, linking runs within a trace across services using context-propagation headers (langsmith-trace and optional baggage for metadata/tags).
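Concretely, the trace context travels as ordinary HTTP headers derived from the current run tree. If you want to see exactly what would be propagated, a minimal sketch like the following can print them (this assumes tracing is enabled in your environment; the file and function names are only illustrative and are not part of the setup below):

# inspect_headers.py — an illustrative sketch, not part of the setup below
from langsmith.run_helpers import get_current_run_tree, traceable

@traceable
def inspect_headers():
    # With tracing configured, to_headers() returns a plain dict that typically
    # carries "langsmith-trace" and, when metadata/tags are set, "baggage".
    if run_tree := get_current_run_tree():
        print(run_tree.to_headers())

inspect_headers()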
Example client-server setup
- Tracing begins on the client
- It continues on the server
Distributed tracing in Python
# client.py
from langsmith.run_helpers import get_current_run_tree, traceable
import httpx

@traceable
async def my_client_function():
    headers = {}
    async with httpx.AsyncClient(base_url="...") as client:
        if run_tree := get_current_run_tree():
            # add the langsmith-trace (and optional baggage) headers
            headers.update(run_tree.to_headers())
        return await client.post("/my-route", headers=headers)
The server (or other service) can then continue the trace by handling the headers appropriately. If you are using an ASGI app such as Starlette or FastAPI, you can hook up distributed tracing with LangSmith's TracingMiddleware.
Info
The TracingMiddleware class was added in langsmith==0.1.133.
Example with FastAPI
from langsmith import traceable
from langsmith.middleware import TracingMiddleware
from fastapi import FastAPI, Request

app = FastAPI()  # or any other Starlette-based (ASGI) app
app.add_middleware(TracingMiddleware)

@traceable
async def some_function():
    ...

@app.post("/my-route")
async def fake_route(request: Request):
    return await some_function()
Or with Starlette:
from starlette.applications import Starlette
from starlette.middleware import Middleware
from langsmith.middleware import TracingMiddleware

routes = ...
middleware = [
    Middleware(TracingMiddleware),
]
app = Starlette(..., middleware=middleware)
If you are using another server framework, you can always "receive" distributed traces by passing the incoming headers in yourself, for example with the tracing_context context manager:
# server.py
from langsmith import traceable
from langsmith.run_helpers import tracing_context
from fastapi import FastAPI, Request

@traceable
async def my_application():
    ...

app = FastAPI()  # Or Flask, Django, or any other framework

@app.post("/my-route")
async def fake_route(request: Request):
    # request.headers: {"langsmith-trace": "..."}
    # as well as optional metadata/tags in `baggage`
    with tracing_context(parent=request.headers):
        return await my_application()
The example above uses the tracing_context context manager. You can also specify the parent run context directly in the langsmith_extra parameter of a method wrapped with @traceable.
from langsmith.run_helpers import traceable, trace
# ... same as above

@app.post("/my-route")
async def fake_route(request: Request):
    # request.headers: {"langsmith-trace": "..."}
    return await my_application(langsmith_extra={"parent": request.headers})
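Neither approach is limited to ASGI frameworks. As a rough sketch of receiving a distributed trace in a synchronous framework (Flask is used here purely as an illustration; it is not covered by the examples above, and the route and names are assumptions):

# flask_server.py — an illustrative sketch, assuming Flask
from flask import Flask, request
from langsmith import traceable
from langsmith.run_helpers import tracing_context

@traceable
def my_application():
    ...

app = Flask(__name__)

@app.post("/my-route")
def fake_route():
    # request.headers is a case-insensitive mapping containing
    # "langsmith-trace" (and optionally "baggage"); passing it as the parent
    # lets the trace started on the client continue here.
    with tracing_context(parent=request.headers):
        return my_application()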
Distributed tracing in TypeScript
Note
Distributed tracing in TypeScript requires langsmith version >=0.1.31.
First, on the client we grab the current run tree and convert it into the langsmith-trace and baggage header values that we can pass to the server:
// client.mts
import { getCurrentRunTree, traceable } from "langsmith/traceable";

const client = traceable(
  async () => {
    const runTree = getCurrentRunTree();
    return await fetch("...", {
      method: "POST",
      headers: runTree.toHeaders(),
    }).then((a) => a.text());
  },
  { name: "client" }
);

await client();
The server then converts the headers back into a run tree and uses it to continue the trace.
To pass the newly created run tree into traceable functions, we can use the withRunTree helper, which ensures the run tree is propagated within the traceable calls.
With Express.js:
// server.mts
import { RunTree } from "langsmith";
import { traceable, withRunTree } from "langsmith/traceable";
import express from "express";
import bodyParser from "body-parser";

const server = traceable(
  (text: string) => `Hello from the server! Received "${text}"`,
  { name: "server" }
);

const app = express();
app.use(bodyParser.text());

app.post("/", async (req, res) => {
  const runTree = RunTree.fromHeaders(req.headers);
  const result = await withRunTree(runTree, () => server(req.body));
  res.send(result);
});
With Hono:

// server.mts
import { RunTree } from "langsmith";
import { traceable, withRunTree } from "langsmith/traceable";
import { Hono } from "hono";

const server = traceable(
  (text: string) => `Hello from the server! Received "${text}"`,
  { name: "server" }
);

const app = new Hono();

app.post("/", async (c) => {
  const body = await c.req.text();
  const runTree = RunTree.fromHeaders(c.req.raw.headers);
  const result = await withRunTree(runTree, () => server(body));
  return c.body(result);
});