Framework-agnostic tracing
Debug AI agents with a clean, visual trace.
Trace LLM calls, tool calls, latency, logs, and failures. Send raw events from any stack—TraceLens turns them into an explorable timeline + graph.
Works with OpenAI, Anthropic, Gemini, LangChain, custom stacks, and more.
Live trace preview
timeline • metrics • logs
LLM • Tools • Tokens • Latency • Logs
Built for tracing and debugging agentic workflows
When runs branch, retry, and call tools, logs aren’t enough. TraceLens gives you a clean graph and the details you need to ship.
Timeline-first UI
See steps laid out by time. Fit-to-screen by default. Click any step to inspect inputs, outputs, tokens, and metadata.
Tool-agnostic
HTTP calls, function calls, and agent events are captured as raw events. The backend turns them into consistent steps for the UI.
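As an illustration of the raw-event model, the payloads below show what an LLM call and a tool call might look like before the backend normalizes them into steps. The field names here are assumptions for the sketch, not the documented TraceLens ingest schema.

```python
# Hypothetical raw events (field names are illustrative, not the
# documented TraceLens ingest schema). Each event carries a trace id,
# a span id, and free-form attributes; the backend links spans into steps.
llm_event = {
    "type": "llm_call",
    "trace_id": "tr_123",
    "span_id": "sp_456",
    "duration_ms": 842,
    "attributes": {
        "model": "gpt-4o",
        "prompt_tokens": 512,
        "completion_tokens": 128,
    },
}

tool_event = {
    "type": "tool_call",
    "trace_id": "tr_123",
    "span_id": "sp_789",
    "parent_span_id": "sp_456",  # links the tool call under the LLM step
    "attributes": {"tool": "search", "status": "ok"},
}
```

Because events reference each other only by trace and span ids, any stack that can emit JSON like this can be traced, regardless of framework.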
Multi-tenant workspaces
Keep traces separated by workspace. Invite teammates with roles. Add tags, annotations, and share read-only links.
Secure ingest
Per-workspace API keys, strict payload validation, rate limiting, audit logging, and org-scoped trace access.
Exports & sharing
Export a trace as JSON or a graph as PNG. Create a share link for external review without giving account access.
Usage visibility
Org usage dashboard, monthly limits, and top users—built-in primitives for SaaS plan enforcement.
How it works
01
Create a workspace
Sign up, verify email, and generate an ingest API key for dev/prod.
02
Instrument your agent
Wrap your entrypoint with trace_agent. Your code stays generic; the SDK collects events automatically.
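As a sketch of the wrapping pattern, the decorator below is a hypothetical stand-in for the SDK's trace_agent: the real API may differ, but the idea is the same, since your entrypoint stays generic while events are collected around it.

```python
import functools
import time

COLLECTED_EVENTS = []  # stand-in for the SDK's internal event buffer


def trace_agent(fn):
    """Hypothetical stand-in for the TraceLens SDK decorator.

    Records a start event, runs the wrapped entrypoint unchanged,
    and records an end event with the elapsed time.
    """
    @functools.wraps(fn)
    def wrapper(*args, **kwargs):
        start = time.monotonic()
        COLLECTED_EVENTS.append({"type": "agent_start", "name": fn.__name__})
        try:
            return fn(*args, **kwargs)
        finally:
            COLLECTED_EVENTS.append({
                "type": "agent_end",
                "name": fn.__name__,
                "duration_ms": int((time.monotonic() - start) * 1000),
            })
    return wrapper


@trace_agent
def run_agent(question: str) -> str:
    # Your agent logic is unchanged; tracing happens around it.
    return f"answer to: {question}"
```

The key property of this pattern is that removing the decorator leaves a working agent, so instrumentation never couples your code to the tracing backend.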
03
Debug in TraceLens
Browse traces in timeline or graph view. Inspect steps, logs, and metrics. Share a link when needed.
Start tracing in minutes
Create a workspace, generate an ingest key, and wrap your agent with trace_agent. TraceLens handles the rest.