# Mastra Integration
Mastra is a TypeScript-first AI agent framework. Wrap your agent for automatic tracing.

## Quick Start
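The wrap pattern might look like the sketch below. The `fallom` object here is a minimal inline stand-in, not the real SDK: only `session.wrapMastraAgent()` is confirmed by this page, and the rest of the client shape (plus the synchronous `generate()`) is a simplification.

```typescript
// Minimal sketch of the wrap pattern. `fallom` is an inline stand-in:
// only session.wrapMastraAgent() is confirmed by this page; everything
// else about the SDK shape is an assumption.
type AgentLike = { generate(input: string): string };

const traces: { sessionId: string; input: string; output: string }[] = [];

const fallom = {
  init(_opts: { apiKey?: string }) {},
  session(ctx: { sessionId: string }) {
    return {
      // Wrap the agent so every generate() call is recorded as a trace.
      wrapMastraAgent(agent: AgentLike): AgentLike {
        return {
          generate(input) {
            const output = agent.generate(input);
            traces.push({ sessionId: ctx.sessionId, input, output });
            return output;
          },
        };
      },
    };
  },
};

// Dummy agent standing in for a real Mastra Agent.
const supportAgent: AgentLike = { generate: (input) => `answer: ${input}` };

fallom.init({ apiKey: "YOUR_API_KEY" }); // placeholder key
const traced = fallom
  .session({ sessionId: "conv_123" })
  .wrapMastraAgent(supportAgent);

traced.generate("How do I reset my password?");
console.log(traces.length); // 1
```

The wrapped agent behaves exactly like the original; the trace capture is a side effect of each call.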
## Session & User Tracking
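Concretely, one session per conversation with a user ID attached might look like this sketch (the `session()` signature is an assumption, and the wrapper is an inline stand-in):

```typescript
// Inline stand-in illustrating per-conversation sessions (SDK shape assumed).
type Ctx = { sessionId: string; userId: string };
const spans: (Ctx & { input: string })[] = [];

function session(ctx: Ctx) {
  return {
    wrapMastraAgent(agent: { generate(q: string): string }) {
      return {
        generate(q: string) {
          spans.push({ ...ctx, input: q }); // session + user attached to every span
          return agent.generate(q);
        },
      };
    },
  };
}

const agent = { generate: (q: string) => `reply: ${q}` };

// One session per conversation; the user ID ties traces to a customer.
const convA = session({ sessionId: "conv_a", userId: "user_1" }).wrapMastraAgent(agent);
const convB = session({ sessionId: "conv_b", userId: "user_2" }).wrapMastraAgent(agent);
convA.generate("hi");
convB.generate("hello");
console.log(spans.map((s) => s.sessionId)); // [ 'conv_a', 'conv_b' ]
```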
Sessions are created per conversation/request.

## With Prompt Management
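A sketch of the pull-then-use flow, with a stand-in `getPrompt()` in place of the real Fallom call (the function name, return shape, and `{{var}}` template syntax are all assumptions):

```typescript
// Stand-in for pulling a versioned prompt from Fallom (API shape assumed).
const promptStore: Record<string, string> = {
  "support-agent": "You are a support agent for {{product}}. Be concise.",
};

function getPrompt(key: string, vars: Record<string, string>): string {
  const template = promptStore[key] ?? "";
  // Simple {{var}} substitution; the real template syntax is an assumption.
  return template.replace(/\{\{(\w+)\}\}/g, (_, name) => vars[name] ?? "");
}

// Use the pulled prompt as the agent's instructions.
const instructions = getPrompt("support-agent", { product: "Acme" });
const agent = {
  instructions,
  generate: (q: string) => `${instructions} Q: ${q}`,
};
console.log(agent.instructions); // You are a support agent for Acme. Be concise.
```

Pulling at request time means prompt edits in Fallom take effect without redeploying the agent.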
Pull prompts from Fallom and use them with your agent.

## With Prompt A/B Testing
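One way an A/B assignment could work is sketched below: each user is bucketed deterministically so they always see the same variant. The `getPromptVariant()` name and the hashing scheme are assumptions, not the real Fallom API.

```typescript
// Stand-in for prompt A/B testing: deterministically assign each user to a
// variant. Function name and hashing scheme are assumptions.
const variants: Record<string, string> = {
  A: "You are a formal support agent.",
  B: "You are a friendly, casual support agent.",
};

function getPromptVariant(experiment: string, userId: string) {
  // Simple stable hash over experiment + user, then bucket into A or B.
  let h = 0;
  for (const c of experiment + userId) h = (h * 31 + c.charCodeAt(0)) >>> 0;
  const name = h % 2 === 0 ? "A" : "B";
  return { name, prompt: variants[name] ?? "" };
}

const v1 = getPromptVariant("support-tone", "user_1");
const v2 = getPromptVariant("support-tone", "user_1");
console.log(v1.name === v2.name); // true (assignment is stable per user)
```

Stable assignment matters because a user who saw variant A mid-conversation should not be switched to variant B on the next message.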
Run A/B tests on your agent prompts.

## With Tools
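The capture mechanism can be pictured as a thin wrapper around each tool: the call arguments and result are recorded alongside the trace. All names and shapes below are illustrative stand-ins, not the Fallom API.

```typescript
// Sketch of automatic tool-call capture: each tool invocation and its
// result land in the trace. Shapes are assumptions.
type ToolCall = { tool: string; args: unknown; result: unknown };
const toolCalls: ToolCall[] = [];

const tools = {
  getWeather: (city: string) => ({ city, tempC: 21 }),
};

// Wrap a tool so calls and results are recorded.
function traceTool<T extends (...a: any[]) => any>(name: string, fn: T): T {
  return ((...args: any[]) => {
    const result = fn(...args);
    toolCalls.push({ tool: name, args, result });
    return result;
  }) as T;
}

const tracedTools = { getWeather: traceTool("getWeather", tools.getWeather) };

// When the agent invokes a tool during generation, the call is captured.
tracedTools.getWeather("Paris");
console.log(toolCalls[0].tool); // getWeather
```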
Tool calls are automatically captured in traces.

## With Mastra Instance
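With several agents registered on one instance, each agent can be wrapped individually; a sketch with inline stand-ins (the `Mastra` instance shape and the wrapper below are assumptions):

```typescript
// Sketch: wrap every agent on a multi-agent instance. Shapes are assumptions.
type AgentLike = { name: string; generate(q: string): string };
const tracedAgents: string[] = [];

function wrapMastraAgent(agent: AgentLike): AgentLike {
  return {
    ...agent,
    generate(q) {
      tracedAgents.push(agent.name); // record which agent handled the call
      return agent.generate(q);
    },
  };
}

// Stand-in for a Mastra instance holding multiple agents.
const mastra = {
  agents: {
    support: { name: "support", generate: (q: string) => `support: ${q}` },
    billing: { name: "billing", generate: (q: string) => `billing: ${q}` },
  },
};

// Wrap each registered agent.
const wrapped = Object.fromEntries(
  Object.entries(mastra.agents).map(([k, a]) => [k, wrapMastraAgent(a)]),
);
wrapped.support.generate("hi");
wrapped.billing.generate("invoice?");
console.log(tracedAgents); // [ 'support', 'billing' ]
```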
If you're using a Mastra instance with multiple agents, the same wrapping applies to each agent.

## What Gets Traced
| Field | Description |
|---|---|
| Model | The LLM model used |
| Tokens | Input and output token counts |
| Latency | Request duration in ms |
| Tool Calls | Tools invoked by the agent |
| Tool Results | Responses from tool executions |
| Messages | Full input/output conversation |
| Session ID | Groups related messages |
| User ID | Identifies the specific customer |
| Prompt Key | Which prompt was used |
| Prompt Variant | A/B test variant if applicable |
## Alternative: OTLP with `mastra dev`
Using Mastra's native telemetry (for mastra dev only)
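For example, the standard OpenTelemetry exporter variables could be set before starting `mastra dev`. The endpoint URL and header value below are placeholders, not Fallom's real collector address; check your Fallom dashboard for the actual values.

```bash
# Standard OTLP exporter variables (endpoint and key are placeholders).
export OTEL_EXPORTER_OTLP_ENDPOINT="https://your-fallom-collector.example.com"
export OTEL_EXPORTER_OTLP_HEADERS="authorization=Bearer YOUR_API_KEY"
mastra dev
```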
If you're running with `mastra dev`, you can use OTLP environment variables. Note that this only works when running via `mastra dev`; for standalone deployments, use `session.wrapMastraAgent()`.

## Related
- **Prompt Management**: Manage and version your prompts
- **Vercel AI SDK**: Use the Vercel AI SDK directly
- **OpenRouter**: Route to multiple providers
- **Session Tracking**: Learn more about session context

