Mastra Integration

Mastra is a TypeScript-first AI agent framework. Wrap your Mastra agent with Fallom to trace every generate() call automatically.

Quick Start

import { trace } from "@fallom/trace";
import { Agent } from "@mastra/core";
import { createOpenAI } from "@ai-sdk/openai";

// 1. Initialize Fallom
await trace.init({ apiKey: process.env.FALLOM_API_KEY });

// 2. Create your agent
const openrouter = createOpenAI({
  apiKey: process.env.OPENROUTER_API_KEY,
  baseURL: "https://openrouter.ai/api/v1",
});

const agent = new Agent({
  name: "SupportAgent",
  instructions: "You are a helpful customer support agent.",
  model: openrouter("openai/gpt-4o-mini"),
});

// 3. Wrap the agent for automatic tracing
const tracedAgent = trace.wrapMastraAgent(agent);

// 4. Set session context (configKey, sessionId, userId)
trace.setSession("my-app", "session-abc-123", "user-456");

// 5. Use as normal - automatically traced!
const result = await tracedAgent.generate([
  { role: "user", content: "I need help with my order" },
]);

console.log(result.text);

Session & User Tracking

Always set session context before calling the agent:

import { trace } from "@fallom/trace";

await trace.init({ apiKey: process.env.FALLOM_API_KEY });

const tracedAgent = trace.wrapMastraAgent(agent);

// Set session with user ID for customer tracking
trace.setSession(
  "support-chat", // configKey - identifies your app/feature
  "session-abc-123", // sessionId - groups related messages
  "user-456" // userId - tracks specific customer
);

// All subsequent generate() calls will include this context
const result = await tracedAgent.generate([
  { role: "user", content: "What's my order status?" },
]);

// Update session for a different user/conversation
trace.setSession("support-chat", "session-xyz-789", "user-999");

const result2 = await tracedAgent.generate([
  { role: "user", content: "Cancel my subscription" },
]);
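
Each new conversation needs its own session ID. A minimal sketch of a helper for minting one (makeSessionId is a hypothetical name, not part of the Fallom SDK):

```typescript
import { randomUUID } from "node:crypto";

// Hypothetical helper: mint a unique session ID for each new conversation.
// The prefix keeps IDs readable in the dashboard; the UUID guarantees uniqueness.
function makeSessionId(prefix: string): string {
  return `${prefix}-${randomUUID()}`;
}
```

Then start each conversation with trace.setSession("support-chat", makeSessionId("session"), userId).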

With Prompt Management

Pull prompts from Fallom and use them with your agent:

import { trace, prompts } from "@fallom/trace";
import { Agent } from "@mastra/core";
import { createOpenAI } from "@ai-sdk/openai";

// Initialize both trace and prompts
await trace.init({ apiKey: process.env.FALLOM_API_KEY });
await prompts.init({ apiKey: process.env.FALLOM_API_KEY });

// Get managed prompt with variables
const prompt = await prompts.get("support-agent-instructions", {
  variables: {
    companyName: "Acme Corp",
    supportEmail: "help@acme.com",
  },
});

const openrouter = createOpenAI({
  apiKey: process.env.OPENROUTER_API_KEY,
  baseURL: "https://openrouter.ai/api/v1",
});

// Use prompt content as agent instructions
const agent = new Agent({
  name: "SupportAgent",
  instructions: prompt.system || prompt.user, // Use system prompt if available
  model: openrouter("openai/gpt-4o-mini"),
});

const tracedAgent = trace.wrapMastraAgent(agent);
trace.setSession("support-chat", sessionId, userId);

const result = await tracedAgent.generate([
  { role: "user", content: "How do I reset my password?" },
]);
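
The variables option substitutes placeholders in the stored prompt. Conceptually this is simple template interpolation; a sketch of the idea, assuming a {{variable}} placeholder syntax (renderTemplate is illustrative, not the SDK's actual implementation):

```typescript
// Illustrative only: how {{variable}} interpolation works conceptually.
// Unknown placeholders are left untouched rather than replaced with "undefined".
function renderTemplate(
  template: string,
  variables: Record<string, string>,
): string {
  return template.replace(/\{\{(\w+)\}\}/g, (match, key) =>
    key in variables ? variables[key] : match,
  );
}
```

With the variables above, "Contact {{supportEmail}}" renders as "Contact help@acme.com".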

With Prompt A/B Testing

Run A/B tests on your agent prompts:

import { trace, prompts } from "@fallom/trace";
import { Agent } from "@mastra/core";
import { createOpenAI } from "@ai-sdk/openai";

await trace.init({ apiKey: process.env.FALLOM_API_KEY });
await prompts.init({ apiKey: process.env.FALLOM_API_KEY });

const openrouter = createOpenAI({
  apiKey: process.env.OPENROUTER_API_KEY,
  baseURL: "https://openrouter.ai/api/v1",
});

// Get A/B test variant - consistent per session
const sessionId = "user-session-123";
const userId = "user-456";

const prompt = await prompts.getAB("support-agent-test", sessionId);
// Variant is automatically tracked and linked to traces

const agent = new Agent({
  name: "SupportAgent",
  instructions: prompt.system,
  model: openrouter("openai/gpt-4o-mini"),
});

const tracedAgent = trace.wrapMastraAgent(agent);
trace.setSession("support-chat", sessionId, userId);

const result = await tracedAgent.generate([
  { role: "user", content: "I have a billing question" },
]);
// Trace automatically includes which prompt variant was used
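
Consistency per session means the same sessionId always receives the same variant. Conceptually this works by hashing the session ID onto the variant list; a sketch of the idea (not Fallom's actual assignment logic):

```typescript
// Conceptual sketch of deterministic variant assignment: the same session ID
// always hashes to the same variant, so a user never flips mid-conversation.
function pickVariant(sessionId: string, variants: string[]): string {
  let hash = 0;
  for (const ch of sessionId) {
    hash = (hash * 31 + ch.charCodeAt(0)) >>> 0;
  }
  return variants[hash % variants.length];
}
```

Because assignment is a pure function of the session ID, no variant state needs to be stored between requests.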

With Tools

Tool calls are automatically captured in traces:

import { trace } from "@fallom/trace";
import { Agent } from "@mastra/core";
import { createTool } from "@mastra/core/tools";
import { createOpenAI } from "@ai-sdk/openai";
import { z } from "zod";

await trace.init({ apiKey: process.env.FALLOM_API_KEY });

const openrouter = createOpenAI({
  apiKey: process.env.OPENROUTER_API_KEY,
  baseURL: "https://openrouter.ai/api/v1",
});

// Define tools
const getOrderStatus = createTool({
  id: "get_order_status",
  description: "Get the status of a customer order",
  inputSchema: z.object({
    orderId: z.string().describe("The order ID"),
  }),
  execute: async ({ context }) => {
    // Your order lookup logic
    return {
      orderId: context.orderId,
      status: "shipped",
      eta: "2024-01-15",
    };
  },
});

const cancelOrder = createTool({
  id: "cancel_order",
  description: "Cancel a customer order",
  inputSchema: z.object({
    orderId: z.string(),
    reason: z.string(),
  }),
  execute: async ({ context }) => {
    return { success: true, message: "Order cancelled" };
  },
});

const agent = new Agent({
  name: "OrderAgent",
  instructions: "Help customers with their orders.",
  model: openrouter("openai/gpt-4o-mini"),
  tools: {
    get_order_status: getOrderStatus,
    cancel_order: cancelOrder,
  },
});

const tracedAgent = trace.wrapMastraAgent(agent);
trace.setSession("order-support", "session-123", "user-456");

const result = await tracedAgent.generate([
  { role: "user", content: "What's the status of order #12345?" },
]);
// Tool calls and results are captured in the trace
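
When a lookup can fail (unknown order, upstream timeout), returning a structured result keeps the failure visible to both the model and the trace instead of throwing. A sketch of that pattern, with a hypothetical in-memory store standing in for your data layer:

```typescript
// Hypothetical in-memory store; replace with your real order data source.
const orders: Record<string, { status: string; eta: string }> = {
  "12345": { status: "shipped", eta: "2024-01-15" },
};

// Either the order data or a structured error the agent can relay.
type OrderLookup =
  | { found: true; orderId: string; status: string; eta: string }
  | { found: false; orderId: string; error: string };

function lookupOrder(orderId: string): OrderLookup {
  const order = orders[orderId];
  if (!order) {
    return { found: false, orderId, error: `No order found with ID ${orderId}` };
  }
  return { found: true, orderId, ...order };
}
```

In the tool above, execute would then become async ({ context }) => lookupOrder(context.orderId).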

With Mastra Instance

If using a Mastra instance with multiple agents:

import { trace } from "@fallom/trace";
import { Mastra, Agent } from "@mastra/core";

await trace.init({ apiKey: process.env.FALLOM_API_KEY });

const supportAgent = new Agent({ name: "SupportAgent" /* ... */ });
const salesAgent = new Agent({ name: "SalesAgent" /* ... */ });

const mastra = new Mastra({
  agents: { supportAgent, salesAgent },
});

// Get and wrap specific agent
const agent = mastra.getAgent("supportAgent");
const tracedAgent = trace.wrapMastraAgent(agent);

trace.setSession("my-app", "session-123", "user-456");

const result = await tracedAgent.generate([
  { role: "user", content: "Hello!" },
]);
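
If you want tracing on every agent in the instance, a small helper can apply the wrapper uniformly (wrapAll is a hypothetical utility, not part of the SDK):

```typescript
// Hypothetical helper: apply one wrapper (e.g. trace.wrapMastraAgent)
// to every agent in a record, preserving the keys.
function wrapAll<T extends Record<string, unknown>>(
  agents: T,
  wrap: (agent: T[keyof T]) => T[keyof T],
): T {
  const wrapped: Record<string, unknown> = {};
  for (const [name, agent] of Object.entries(agents)) {
    wrapped[name] = wrap(agent as T[keyof T]);
  }
  return wrapped as T;
}
```

Usage: const traced = wrapAll({ supportAgent, salesAgent }, trace.wrapMastraAgent);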

What Gets Traced

Model: The LLM model used
Tokens: Input and output token counts
Latency: Request duration in ms
Tool Calls: Tools invoked by the agent
Tool Results: Responses from tool executions
Messages: Full input/output conversation
Session ID: Groups related messages
User ID: Tracks the specific customer
Prompt Key: Which prompt was used
Prompt Variant: A/B test variant, if applicable

Alternative: OTLP with mastra dev

If you’re running with mastra dev, you can export traces over OTLP instead. Set these environment variables:

OTEL_EXPORTER_OTLP_ENDPOINT=https://traces.fallom.com/v1/traces
OTEL_EXPORTER_OTLP_HEADERS="Authorization=Bearer YOUR_FALLOM_API_KEY"

Then enable OTLP telemetry in your Mastra config:

const mastra = new Mastra({
  agents: { myAgent },
  telemetry: {
    serviceName: "my-app",
    enabled: true,
    export: { type: "otlp" },
  },
});

Note: This only works when running via mastra dev. For standalone deployments, use wrapMastraAgent().