Implement distributed tracing
Sometimes you need to trace a request across multiple services. LangSmith supports distributed tracing out of the box, linking runs across services through context-propagation headers (langsmith-trace, plus an optional baggage header for metadata/tags).
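To see what the propagation mechanism looks like concretely, the following sketch prints the header keys that are forwarded between services. It assumes tracing is enabled in your environment; the function name is illustrative.

# Sketch: inspect the context-propagation headers for the current traced run.
from langsmith.run_helpers import get_current_run_tree, traceable

@traceable
def show_propagation_headers():  # illustrative name
    if run_tree := get_current_run_tree():
        headers = run_tree.to_headers()
        # Expect a "langsmith-trace" entry and, when metadata/tags are present,
        # a "baggage" entry; the values are opaque strings.
        print(list(headers))

show_propagation_headers()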
Example client-server setup:
- The trace starts on the client
- It continues on the server
Distributed tracing in Python
# client.py
from langsmith.run_helpers import get_current_run_tree, traceable
import httpx

@traceable
async def my_client_function():
    headers = {}
    async with httpx.AsyncClient(base_url="...") as client:
        if run_tree := get_current_run_tree():
            # add langsmith-trace (and baggage) headers to the outgoing request
            headers.update(run_tree.to_headers())
        return await client.post("/my-route", headers=headers)
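Because to_headers() returns a plain mapping of header values, the same pattern works with any HTTP client, not just httpx. A minimal synchronous sketch using requests (the URL is hypothetical):

# sync_client.py -- synchronous variant of the client above (sketch)
import requests
from langsmith.run_helpers import get_current_run_tree, traceable

@traceable
def my_sync_client_function():
    headers = {}
    if run_tree := get_current_run_tree():
        # forward the langsmith-trace / baggage headers to the server
        headers.update(run_tree.to_headers())
    return requests.post("http://localhost:8000/my-route", headers=headers)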
The server (or another service) can then continue the trace by handling those headers appropriately. If you are using an ASGI application such as Starlette or FastAPI, you can wire up distributed tracing with LangSmith's TracingMiddleware.
Info
The TracingMiddleware class was added in langsmith==0.1.133.
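If you are unsure which version you have installed, you can check it before relying on the middleware; a minimal sketch using the standard library:

# Check the installed langsmith version (needs to be 0.1.133 or newer
# for TracingMiddleware).
from importlib.metadata import version

print(version("langsmith"))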
Example with FastAPI:
from langsmith import traceable
from langsmith.middleware import TracingMiddleware
from fastapi import FastAPI, Request

app = FastAPI()  # Or Flask, Django, or any other framework
app.add_middleware(TracingMiddleware)

@traceable
async def some_function():
    ...

@app.post("/my-route")
async def fake_route(request: Request):
    return await some_function()
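Putting the two pieces together: the sketch below assumes the FastAPI app above is served at http://localhost:8000 (a hypothetical address) and that the base_url in client.py points at it. Because the client forwards the langsmith-trace header, the server-side runs appear under the same trace as the client-side run.

# end_to_end.py -- sketch tying client.py to the FastAPI server above
import asyncio

from client import my_client_function  # the @traceable client from client.py

async def main():
    response = await my_client_function()
    print(response.status_code)

asyncio.run(main())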
Or with Starlette:
from starlette.applications import Starlette
from starlette.middleware import Middleware
from langsmith.middleware import TracingMiddleware

routes = ...

middleware = [
    Middleware(TracingMiddleware),
]

app = Starlette(..., middleware=middleware)
If you are using a different server framework, you can always "receive" a distributed trace by passing the incoming headers to LangSmith yourself, either with the tracing_context context manager (shown next) or with the langsmith_extra argument (shown after that):
# server.py
from langsmith import traceable
from langsmith.run_helpers import tracing_context
from fastapi import FastAPI, Request

@traceable
async def my_application():
    ...

app = FastAPI()  # Or Flask, Django, or any other framework

@app.post("/my-route")
async def fake_route(request: Request):
    # request.headers: {"langsmith-trace": "..."}
    # as well as optional metadata/tags in `baggage`
    with tracing_context(parent=request.headers):
        return await my_application()
The example above uses the tracing_context context manager. You can also pass the parent run context directly through the langsmith_extra argument of a method wrapped with @traceable:
from langsmith.run_helpers import traceable

# ... same as above

@app.post("/my-route")
async def fake_route(request: Request):
    # request.headers: {"langsmith-trace": "..."}
    return await my_application(langsmith_extra={"parent": request.headers})
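The same "receive the headers yourself" pattern also works in non-ASGI frameworks. A minimal sketch with Flask (the filename, route, and function names are illustrative, and the handler is synchronous):

# flask_server.py -- hypothetical Flask variant of the server above
from flask import Flask, request
from langsmith import traceable
from langsmith.run_helpers import tracing_context

@traceable
def my_sync_application():
    ...

app = Flask(__name__)

@app.post("/my-route")
def my_route():
    # Lowercase the header names so the langsmith-trace / baggage keys match
    # a plain-dict lookup, then continue the trace started by the client.
    headers = {k.lower(): v for k, v in request.headers.items()}
    with tracing_context(parent=headers):
        my_sync_application()
    return "ok"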
Distributed tracing in TypeScript
Note
Distributed tracing in TypeScript requires langsmith version 0.1.31 or later.
First, on the client we get the current run tree and convert it into langsmith-trace and baggage header values, which can then be passed to the server:
// client.mts
import { getCurrentRunTree, traceable } from "langsmith/traceable";

const client = traceable(
  async () => {
    const runTree = getCurrentRunTree();
    return await fetch("...", {
      method: "POST",
      headers: runTree.toHeaders(),
    }).then((a) => a.text());
  },
  { name: "client" }
);

await client();
The server then converts the headers back into a run tree and uses it to continue the trace. To hand the newly created run tree to traceable functions, we can use the withRunTree helper, which ensures the run tree is propagated within traceable calls.
With Express.js:
// server.mts
import { RunTree } from "langsmith";
import { traceable, withRunTree } from "langsmith/traceable";

import express from "express";
import bodyParser from "body-parser";

const server = traceable(
  (text: string) => `Hello from the server! Received "${text}"`,
  { name: "server" }
);

const app = express();
app.use(bodyParser.text());

app.post("/", async (req, res) => {
  const runTree = RunTree.fromHeaders(req.headers);
  const result = await withRunTree(runTree, () => server(req.body));
  res.send(result);
});
With Hono:

// server.mts
import { RunTree } from "langsmith";
import { traceable, withRunTree } from "langsmith/traceable";

import { Hono } from "hono";

const server = traceable(
  (text: string) => `Hello from the server! Received "${text}"`,
  { name: "server" }
);

const app = new Hono();

app.post("/", async (c) => {
  const body = await c.req.text();
  const runTree = RunTree.fromHeaders(c.req.raw.headers);
  const result = await withRunTree(runTree, () => server(body));
  return c.body(result);
});