TypeScript SDK
- OpenAI
- Anthropic
- Vercel AI SDK
- OpenRouter
Wrap your OpenAI client for automatic tracing:
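The original code sample is not shown here, so the sketch below is an illustrative assumption, not the SDK's real API: a minimal Proxy-based wrapper that times each method call and records it, which is the core idea behind wrapping a client for automatic tracing. The real client's methods are async; a synchronous stub keeps the sketch self-contained.

```typescript
// Illustrative sketch only — all names below are assumptions, not the SDK's API.
type TraceRecord = { method: string; durationMs: number; status: "OK" | "ERROR" };
const traces: TraceRecord[] = [];

// Stand-in for an OpenAI-like client (synchronous stub for the sketch).
const fakeClient = {
  chat: {
    completions: {
      create: (req: { model: string }) => ({ model: req.model, content: "hello" }),
    },
  },
};

function withTracing<T extends object>(client: T): T {
  // Recursively wrap nested objects so every method call is timed and logged.
  const wrap = (obj: any, path: string[]): any =>
    new Proxy(obj, {
      get(target, prop) {
        const key = String(prop);
        const value = target[key];
        if (typeof value === "function") {
          return (...args: any[]) => {
            const start = Date.now();
            try {
              const result = value.apply(target, args);
              traces.push({ method: [...path, key].join("."), durationMs: Date.now() - start, status: "OK" });
              return result;
            } catch (err) {
              traces.push({ method: [...path, key].join("."), durationMs: Date.now() - start, status: "ERROR" });
              throw err;
            }
          };
        }
        return value !== null && typeof value === "object" ? wrap(value, [...path, key]) : value;
      },
    });
  return wrap(client, []);
}

const traced = withTracing(fakeClient);
const reply = traced.chat.completions.create({ model: "gpt-4o" });
```

Because the wrapper forwards every call unchanged, existing application code keeps working; only the tracing side effect is added.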
Python SDK
- OpenAI
- Anthropic
- Google AI
- OpenRouter
Session Context
Sessions group related LLM calls together (e.g., a conversation or agent run):
| Parameter | Description |
|---|---|
| configKey | Your experiment/config identifier (e.g., "summarizer") |
| sessionId | Unique ID for this session (e.g., a conversation ID) |
| customerId | Optional user identifier for per-user analytics |
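The three parameters above can be sketched as a context that every subsequent call record inherits. `startSession` and `recordCall` below are hypothetical names chosen for illustration, not the SDK's real API:

```typescript
// Hypothetical sketch of session context propagation — names are assumptions.
type SessionContext = { configKey: string; sessionId: string; customerId?: string };
type CallRecord = { model: string; session: SessionContext | null };

let currentSession: SessionContext | null = null;

function startSession(ctx: SessionContext): void {
  currentSession = ctx;
}

function recordCall(model: string): CallRecord {
  // Every LLM call made while a session is active inherits its context.
  return { model, session: currentSession };
}

startSession({ configKey: "summarizer", sessionId: "conv-123", customerId: "user-42" });
const rec = recordCall("gpt-4o");
```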
What Gets Captured
Every LLM call automatically includes:

| Field | Description |
|---|---|
| Model | The model used (e.g., gpt-4o, claude-3-opus) |
| Duration | Total request time in milliseconds |
| Time to First Token | TTFT for streaming requests |
| Tokens | Input, output, and cached token counts |
| Cost | Calculated from token usage + model pricing |
| Prompts | Full input messages |
| Completions | Model responses |
| Session | Config key, session ID, customer ID |
| Status | OK or ERROR |
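The Cost row above says cost is calculated from token usage plus model pricing. A minimal sketch of that arithmetic, with placeholder per-million-token prices (not real model pricing), assuming cached input tokens are billed at a discounted rate:

```typescript
// Placeholder pricing table — these numbers are illustrative assumptions.
type Usage = { inputTokens: number; outputTokens: number; cachedTokens: number };

const pricePerMillion: Record<string, { input: number; cachedInput: number; output: number }> = {
  "example-model": { input: 2.5, cachedInput: 1.25, output: 10 }, // USD per 1M tokens
};

function estimateCost(model: string, usage: Usage): number {
  const p = pricePerMillion[model];
  // Cached tokens are a subset of input tokens, billed at the cached rate.
  const uncachedInput = usage.inputTokens - usage.cachedTokens;
  return (
    (uncachedInput * p.input + usage.cachedTokens * p.cachedInput + usage.outputTokens * p.output) /
    1_000_000
  );
}

const cost = estimateCost("example-model", { inputTokens: 1000, outputTokens: 500, cachedTokens: 200 });
// → (800 * 2.5 + 200 * 1.25 + 500 * 10) / 1e6 = 0.00725
```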
Multimodal (Images)
Images in prompts are automatically handled:

- URL images - Stored as-is
- Base64 images - Uploaded to secure storage and replaced with a URL
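The two cases above can be sketched as a normalization step. `uploadToStorage` is a hypothetical stand-in for the SDK's storage upload; the distinguishing check (a `data:` URL prefix for base64 images) is an assumption based on the common message format:

```typescript
// Illustrative sketch — uploadToStorage and the URL format are assumptions.
type ImagePart = { type: "image_url"; image_url: { url: string } };

function uploadToStorage(dataUrl: string): string {
  // Placeholder: pretend the base64 payload was uploaded and a URL returned.
  return `https://storage.example.com/img-${dataUrl.length}.png`;
}

function normalizeImage(part: ImagePart): ImagePart {
  const url = part.image_url.url;
  // Base64 images arrive as data: URLs; http(s) URLs pass through unchanged.
  if (url.startsWith("data:")) {
    return { type: "image_url", image_url: { url: uploadToStorage(url) } };
  }
  return part;
}

const urlImage = normalizeImage({ type: "image_url", image_url: { url: "https://example.com/cat.png" } });
const b64Image = normalizeImage({ type: "image_url", image_url: { url: "data:image/png;base64,AAAA" } });
```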
Configuration