
Installation

pip install fallom

# With auto-instrumentation for your LLM provider:
pip install fallom opentelemetry-instrumentation-openai
pip install fallom opentelemetry-instrumentation-anthropic

Quick Start

Import order matters! You must import and initialize Fallom before importing OpenAI or other LLM libraries.

import fallom
fallom.init(api_key="your-api-key")

# NOW import OpenAI (after instrumentation is set up)
from openai import OpenAI
client = OpenAI()

# Set default session context for tracing
session_id = "unique-id-for-this-conversation"  # any unique string per conversation
fallom.trace.set_session("my-agent", session_id)

# All LLM calls are now automatically traced!
response = client.chat.completions.create(
    model="gpt-4o",
    messages=[{"role": "user", "content": "Hello!"}]
)
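The quick start passes a `session_id` string but does not say how to produce one. A minimal sketch, assuming only that each conversation needs a unique string ID (the `new_session_id` helper is our own, not part of Fallom's API):

```python
import uuid

# Hypothetical helper: the quick start only requires that session_id be a
# unique string per conversation; a random UUID4 satisfies that.
def new_session_id() -> str:
    return str(uuid.uuid4())

session_id = new_session_id()
# fallom.trace.set_session("my-agent", session_id)  # as in the quick start
```

Generate one ID when a conversation starts and reuse it for every LLM call in that conversation so the calls group into a single session.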

What Gets Traced?

Every LLM call is automatically captured with:
| Field | Description |
| --- | --- |
| Model | The LLM model used (e.g., gpt-4o, claude-3-opus) |
| Tokens | Input and output token counts |
| Latency | Request duration in milliseconds |
| Prompts | Full prompt messages sent to the model |
| Completions | Model responses |
| Session | Your config key and session ID for grouping |
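As an illustration of how these fields can be used downstream, here is a sketch that estimates per-call cost from token counts. The dict keys and the per-million-token prices are assumptions for the example, not Fallom's actual trace schema or real pricing:

```python
# Illustrative record: field names mirror the table above, not Fallom's schema.
trace = {
    "model": "gpt-4o",
    "input_tokens": 12,
    "output_tokens": 9,
    "latency_ms": 840,
}

# Hypothetical per-million-token prices, for the arithmetic only.
PRICES = {"gpt-4o": {"input": 2.50, "output": 10.00}}

def estimate_cost_usd(t: dict) -> float:
    # cost = input_tokens * input_price + output_tokens * output_price,
    # with prices quoted per 1M tokens.
    p = PRICES[t["model"]]
    return (t["input_tokens"] * p["input"] + t["output_tokens"] * p["output"]) / 1_000_000

cost = estimate_cost_usd(trace)
```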

Get Your API Key

1. Sign up: Create a free account at bench.fallom.com
2. Create a project: Set up your first project in the dashboard
3. Copy your API key: Find your API key in project settings
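Once you have a key, avoid hard-coding it in source. A minimal sketch that reads it from an environment variable; the variable name `FALLOM_API_KEY` is our own convention, not one documented by Fallom:

```python
import os

def load_api_key() -> str:
    # Falls back to an empty string so a missing key surfaces at init time
    # rather than crashing here. FALLOM_API_KEY is an assumed variable name.
    return os.environ.get("FALLOM_API_KEY", "")

# fallom.init(api_key=load_api_key())  # pass the key at startup
```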

Open the dashboard to get your API key and view your traces.

Next Steps