Mastra Integration

Mastra is a TypeScript-first AI agent framework. Wrap your agent with session.wrapMastraAgent() to get automatic tracing of every call.

Quick Start

import fallom from "@fallom/trace";
import { Agent } from "@mastra/core";
import { createOpenAI } from "@ai-sdk/openai";

// 1. Initialize Fallom once at app startup
await fallom.init({ apiKey: process.env.FALLOM_API_KEY });

// 2. Create a session for this conversation/request
const session = fallom.session({
  configKey: "my-app",
  sessionId: "session-abc-123",
  customerId: "user-456",
});

// 3. Create your agent
const openrouter = createOpenAI({
  apiKey: process.env.OPENROUTER_API_KEY,
  baseURL: "https://openrouter.ai/api/v1",
});

const agent = new Agent({
  name: "SupportAgent",
  instructions: "You are a helpful customer support agent.",
  model: openrouter("openai/gpt-4o-mini"),
});

// 4. Wrap the agent for automatic tracing
const tracedAgent = session.wrapMastraAgent(agent);

// 5. Use as normal - automatically traced!
const result = await tracedAgent.generate([
  { role: "user", content: "I need help with my order" },
]);

console.log(result.text);

Session & User Tracking

Sessions are created per conversation/request:

import fallom from "@fallom/trace";

await fallom.init({ apiKey: process.env.FALLOM_API_KEY });

// Create a session for this conversation
const session = fallom.session({
  configKey: "support-chat",  // Identifies your app/feature
  sessionId: "session-abc-123",  // Groups related messages
  customerId: "user-456",  // Tracks specific customer
});

const tracedAgent = session.wrapMastraAgent(agent);

// All generate() calls use this session context
const result = await tracedAgent.generate([
  { role: "user", content: "What's my order status?" },
]);
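
In a server, you'd typically create a fresh session inside each request handler so every conversation gets its own session context. Below is a minimal sketch using Express; the route path, request body fields, and port are assumptions for illustration, and the agent is the same one built in the Quick Start above.

import express from "express";
import fallom from "@fallom/trace";
import { Agent } from "@mastra/core";
import { createOpenAI } from "@ai-sdk/openai";

await fallom.init({ apiKey: process.env.FALLOM_API_KEY });

const openrouter = createOpenAI({
  apiKey: process.env.OPENROUTER_API_KEY,
  baseURL: "https://openrouter.ai/api/v1",
});

const agent = new Agent({
  name: "SupportAgent",
  instructions: "You are a helpful customer support agent.",
  model: openrouter("openai/gpt-4o-mini"),
});

const app = express();
app.use(express.json());

app.post("/chat", async (req, res) => {
  // One Fallom session per incoming conversation/request
  const session = fallom.session({
    configKey: "support-chat",
    sessionId: req.body.sessionId, // assumed to be supplied by your client
    customerId: req.body.userId,
  });

  const tracedAgent = session.wrapMastraAgent(agent);

  const result = await tracedAgent.generate([
    { role: "user", content: req.body.message },
  ]);

  res.json({ reply: result.text });
});

app.listen(3000);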

With Prompt Management

Pull prompts from Fallom and use them with your agent:

import fallom, { prompts } from "@fallom/trace";
import { Agent } from "@mastra/core";
import { createOpenAI } from "@ai-sdk/openai";

await fallom.init({ apiKey: process.env.FALLOM_API_KEY });

// Get managed prompt with variables
const prompt = await prompts.get("support-agent-instructions", {
  variables: {
    companyName: "Acme Corp",
    supportEmail: "help@acme.com",
  },
});

const sessionId = "session-abc-123"; // e.g. from your request context
const userId = "user-456";

const session = fallom.session({
  configKey: "support-chat",
  sessionId,
  customerId: userId,
});

const openrouter = createOpenAI({
  apiKey: process.env.OPENROUTER_API_KEY,
  baseURL: "https://openrouter.ai/api/v1",
});

// Use prompt content as agent instructions
const agent = new Agent({
  name: "SupportAgent",
  instructions: prompt.system || prompt.user, // Use system prompt if available
  model: openrouter("openai/gpt-4o-mini"),
});

const tracedAgent = session.wrapMastraAgent(agent);

const result = await tracedAgent.generate([
  { role: "user", content: "How do I reset my password?" },
]);
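
If the prompt can't be fetched at request time (network issue, missing key), you may want a local fallback so the agent can still start. A minimal sketch that reuses the imports and openrouter client from the example above; the fallback instruction text is just an example:

let instructions = "You are a helpful customer support agent."; // local fallback
try {
  const prompt = await prompts.get("support-agent-instructions", {
    variables: { companyName: "Acme Corp", supportEmail: "help@acme.com" },
  });
  instructions = prompt.system || prompt.user;
} catch {
  // Keep the local fallback if the managed prompt is unavailable
}

const agent = new Agent({
  name: "SupportAgent",
  instructions,
  model: openrouter("openai/gpt-4o-mini"),
});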

With Prompt A/B Testing

Run A/B tests on your agent prompts:

import fallom, { prompts } from "@fallom/trace";
import { Agent } from "@mastra/core";
import { createOpenAI } from "@ai-sdk/openai";

await fallom.init({ apiKey: process.env.FALLOM_API_KEY });

const sessionId = "user-session-123";
const userId = "user-456";

// Get A/B test variant - consistent per session
const prompt = await prompts.getAB("support-agent-test", sessionId);
// Variant is automatically tracked and linked to traces

const session = fallom.session({
  configKey: "support-chat",
  sessionId,
  customerId: userId,
});

const openrouter = createOpenAI({
  apiKey: process.env.OPENROUTER_API_KEY,
  baseURL: "https://openrouter.ai/api/v1",
});

const agent = new Agent({
  name: "SupportAgent",
  instructions: prompt.system,
  model: openrouter("openai/gpt-4o-mini"),
});

const tracedAgent = session.wrapMastraAgent(agent);

const result = await tracedAgent.generate([
  { role: "user", content: "I have a billing question" },
]);
// Trace automatically includes which prompt variant was used
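
Because the variant is keyed on the session ID, repeated calls with the same sessionId return the same variant, while a different session may be assigned the other arm. A small sketch using the same prompt key as above:

// Same session -> same variant on every call
const again = await prompts.getAB("support-agent-test", sessionId);

// A different session may land on the other variant
const other = await prompts.getAB("support-agent-test", "user-session-999");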

With Tools

Tool calls are automatically captured in traces:

import fallom from "@fallom/trace";
import { Agent, createTool } from "@mastra/core";
import { createOpenAI } from "@ai-sdk/openai";
import { z } from "zod";

await fallom.init({ apiKey: process.env.FALLOM_API_KEY });

// Define tools
const getOrderStatus = createTool({
  id: "get_order_status",
  description: "Get the status of a customer order",
  inputSchema: z.object({
    orderId: z.string().describe("The order ID"),
  }),
  execute: async ({ context }) => {
    // Your order lookup logic
    return {
      orderId: context.orderId,
      status: "shipped",
      eta: "2024-01-15",
    };
  },
});

const cancelOrder = createTool({
  id: "cancel_order",
  description: "Cancel a customer order",
  inputSchema: z.object({
    orderId: z.string(),
    reason: z.string(),
  }),
  execute: async ({ context }) => {
    return { success: true, message: "Order cancelled" };
  },
});

const session = fallom.session({
  configKey: "order-support",
  sessionId: "session-123",
  customerId: "user-456",
});

const openrouter = createOpenAI({
  apiKey: process.env.OPENROUTER_API_KEY,
  baseURL: "https://openrouter.ai/api/v1",
});

const agent = new Agent({
  name: "OrderAgent",
  instructions: "Help customers with their orders.",
  model: openrouter("openai/gpt-4o-mini"),
  tools: {
    get_order_status: getOrderStatus,
    cancel_order: cancelOrder,
  },
});

const tracedAgent = session.wrapMastraAgent(agent);

const result = await tracedAgent.generate([
  { role: "user", content: "What's the status of order #12345?" },
]);
// Tool calls and results are captured in the trace

With Mastra Instance

If you're using a Mastra instance with multiple agents:

import fallom from "@fallom/trace";
import { Mastra, Agent } from "@mastra/core";

await fallom.init({ apiKey: process.env.FALLOM_API_KEY });

const supportAgent = new Agent({ name: "SupportAgent" /* ... */ });
const salesAgent = new Agent({ name: "SalesAgent" /* ... */ });

const mastra = new Mastra({
  agents: { supportAgent, salesAgent },
});

const session = fallom.session({
  configKey: "my-app",
  sessionId: "session-123",
  customerId: "user-456",
});

// Get and wrap specific agent
const agent = mastra.getAgent("supportAgent");
const tracedAgent = session.wrapMastraAgent(agent);

const result = await tracedAgent.generate([
  { role: "user", content: "Hello!" },
]);
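
If one request involves several agents, you can wrap each of them with the same session so all of their calls are grouped under one session ID. A short sketch based on the instance above:

const tracedSupport = session.wrapMastraAgent(mastra.getAgent("supportAgent"));
const tracedSales = session.wrapMastraAgent(mastra.getAgent("salesAgent"));

// Both agents' generate() calls now share sessionId "session-123"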

What Gets Traced

Model: The LLM model used
Tokens: Input and output token counts
Latency: Request duration in ms
Tool Calls: Tools invoked by the agent
Tool Results: Responses from tool executions
Messages: Full input/output conversation
Session ID: Groups related messages
User ID: Tracks the specific customer
Prompt Key: Which prompt was used
Prompt Variant: A/B test variant, if applicable
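
As a rough mental model, each traced call can be pictured as a record along these lines. This is an illustrative TypeScript shape, not Fallom's actual schema; the field names are assumptions:

interface TracedCall {
  model: string;           // LLM model used, e.g. "openai/gpt-4o-mini"
  inputTokens: number;     // input token count
  outputTokens: number;    // output token count
  latencyMs: number;       // request duration in milliseconds
  toolCalls: Array<{ name: string; args: unknown; result: unknown }>;
  messages: Array<{ role: string; content: string }>; // full conversation
  sessionId: string;       // groups related messages
  customerId: string;      // the specific customer
  promptKey?: string;      // which managed prompt was used
  promptVariant?: string;  // A/B test variant, if applicable
}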

Alternative: OTLP with mastra dev

If you're running with mastra dev, you can use OTLP environment variables:

OTEL_EXPORTER_OTLP_ENDPOINT=https://traces.fallom.com/v1/traces
OTEL_EXPORTER_OTLP_HEADERS="Authorization=Bearer YOUR_FALLOM_API_KEY"

Then enable telemetry in your Mastra config:

const mastra = new Mastra({
  agents: { myAgent },
  telemetry: {
    serviceName: "my-app",
    enabled: true,
    export: { type: "otlp" },
  },
});

Note: This only works when running via mastra dev. For standalone deployments, use session.wrapMastraAgent().

Related

Prompt Management: Manage and version your prompts
Vercel AI SDK: Use Vercel AI SDK directly
OpenRouter: Route to multiple providers
Session Tracking: Learn more about session context