Documentation Index
Fetch the complete documentation index at: https://docs.fallom.com/llms.txt
Use this file to discover all available pages before exploring further.
Initialize
fallom.init(api_key=None, base_url=None, capture_content=True)
Initialize the SDK. Call this before importing LLM libraries for auto-instrumentation.

| Parameter | Type | Description |
|---|---|---|
| api_key | str | Your Fallom API key (or use the FALLOM_API_KEY env var) |
| base_url | str | API endpoint (default: https://spans.fallom.com) |
| capture_content | bool | Whether to capture prompt/completion text (default: True) |
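A minimal setup sketch (assuming the package is installed and imported as `fallom`, and using OpenAI as an example provider). The key point is ordering: `fallom.init()` runs before the LLM library is imported so auto-instrumentation can patch it:

```python
import fallom

# Initialize first so auto-instrumentation can patch the LLM SDK on import.
fallom.init(api_key="YOUR_API_KEY")  # or rely on the FALLOM_API_KEY env var

from openai import OpenAI  # imported after init(), so its calls are traced

client = OpenAI()
```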
Get Model
fallom.models.get(config_key, session_id, version=None, fallback=None) -> str
Get model assignment for a session.

| Parameter | Type | Description |
|---|---|---|
| config_key | str | Your config name from the dashboard |
| session_id | str | Unique session/conversation ID (sticky assignment) |
| version | int | Pin to specific version (default: latest) |
| fallback | str | Model to return if anything fails |
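A sketch of fetching a sticky model assignment with a fallback (the config key and model names are illustrative):

```python
import fallom

fallom.init()

# The same session_id always receives the same assignment (sticky).
model = fallom.models.get(
    "chat-model-test",       # config_key from the dashboard (example name)
    session_id="conv-1234",
    fallback="gpt-4o-mini",  # returned if the lookup fails for any reason
)
```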
Get Prompt
fallom.prompts.get(prompt_key, variables=None, version=None) -> PromptResult
Get a managed prompt.

| Parameter | Type | Description |
|---|---|---|
| prompt_key | str | Your prompt key from the dashboard |
| variables | dict | Template variables (e.g., {"user_name": "John"}) |
| version | int | Pin to specific version (default: latest) |

Returns: PromptResult with key, version, system, user
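A sketch of fetching a managed prompt and building a chat message list from the returned `system` and `user` fields (the prompt key is illustrative):

```python
import fallom

fallom.init()

result = fallom.prompts.get(
    "support-agent",                  # prompt_key (example name)
    variables={"user_name": "John"},  # substituted into the templates
)

messages = [
    {"role": "system", "content": result.system},
    {"role": "user", "content": result.user},
]
```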
Get Prompt A/B Test
fallom.prompts.get_ab(ab_test_key, session_id, variables=None) -> PromptResult
Get a prompt from an A/B test (sticky assignment).

| Parameter | Type | Description |
|---|---|---|
| ab_test_key | str | Your A/B test key from the dashboard |
| session_id | str | Unique session/conversation ID (for sticky assignment) |
| variables | dict | Template variables |

Returns: PromptResult with key, version, system, user, ab_test_key, variant_index
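A sketch of fetching a prompt variant from an A/B test (the test key is illustrative); the same session ID always receives the same variant:

```python
import fallom

fallom.init()

result = fallom.prompts.get_ab(
    "greeting-test",         # ab_test_key (example name)
    session_id="conv-1234",  # sticky: this session always gets the same variant
    variables={"user_name": "John"},
)

# variant_index identifies which variant this session was assigned.
print(result.variant_index)
```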
Set Session
fallom.trace.set_session(config_key, session_id, customer_id=None)
Set the trace context. All subsequent LLM calls will be tagged with this config_key and session_id. Optionally pass customer_id to enable per-user analytics.
Clear Session
fallom.trace.clear_session()
Clear trace context.
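A sketch combining the two calls above, assuming OpenAI auto-instrumentation is active (config key and session ID are illustrative):

```python
import fallom

fallom.init()  # before importing the LLM SDK

from openai import OpenAI

client = OpenAI()

fallom.trace.set_session("chat-model-test", "conv-1234", customer_id="user-42")
# Calls made here are tagged with the config_key and session_id above.
client.chat.completions.create(
    model="gpt-4o-mini",
    messages=[{"role": "user", "content": "Hello!"}],
)
fallom.trace.clear_session()  # subsequent calls are no longer tagged
```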
Record Custom Metrics
fallom.trace.span(data, config_key=None, session_id=None)
Record custom business metrics.

| Parameter | Type | Description |
|---|---|---|
| data | dict | Metrics to record |
| config_key | str | Optional if set_session() was called |
| session_id | str | Optional if set_session() was called |
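A sketch of recording metrics both inside and outside a session context (the metric names are illustrative):

```python
import fallom

fallom.init()

# With set_session() active, config_key/session_id can be omitted.
fallom.trace.set_session("chat-model-test", "conv-1234")
fallom.trace.span({"ticket_resolved": True, "csat_score": 5})

# Without a session context, pass them explicitly.
fallom.trace.span(
    {"ticket_resolved": True},
    config_key="chat-model-test",
    session_id="conv-1234",
)
```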
Supported LLM Providers
Auto-instrumentation available for:
- OpenAI (+ OpenAI-compatible APIs: OpenRouter, LiteLLM, vLLM, Ollama, etc.)
- Anthropic
- Cohere
- AWS Bedrock
- Google Generative AI
- Mistral AI
- LangChain
- Replicate
- Vertex AI
Install the corresponding opentelemetry-instrumentation-* package for your provider.

You must use the official SDK for your provider. Raw HTTP requests (e.g., requests.post()) will not be traced. For OpenAI-compatible APIs, use the OpenAI SDK with a custom base_url.
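For OpenAI-compatible providers, a sketch of pointing the OpenAI SDK at a custom endpoint (OpenRouter's URL is shown as an example) so that calls stay traceable:

```python
import fallom

fallom.init()  # before importing the OpenAI SDK

from openai import OpenAI

# Traced, because requests go through the official OpenAI SDK.
client = OpenAI(
    base_url="https://openrouter.ai/api/v1",
    api_key="YOUR_OPENROUTER_KEY",
)
```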
Initialize
fallom.init({ apiKey, baseUrl, captureContent })
Initialize the SDK. Call once at app startup.

| Option | Type | Description |
|---|---|---|
| apiKey | string | Your Fallom API key (or use the FALLOM_API_KEY env var) |
| baseUrl | string | API endpoint (default: https://spans.fallom.com) |
| captureContent | boolean | Whether to capture prompt/completion text (default: true) |
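A minimal setup sketch, assuming the package is published as `fallom`:

```typescript
import * as fallom from "fallom"; // assumed package name

// Call once at app startup; apiKey may also come from FALLOM_API_KEY.
fallom.init({ apiKey: process.env.FALLOM_API_KEY });
```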
Create Session
fallom.session(options): FallomSession
Create a session-scoped tracer. Returns a FallomSession instance.

| Option | Type | Description |
|---|---|---|
| configKey | string | Your config name from the dashboard |
| sessionId | string | Unique session/conversation ID |
| customerId | string | Optional user identifier for per-user analytics |
FallomSession Methods
const session = fallom.session({ configKey, sessionId, customerId });
| Method | Description |
|---|---|
| wrapOpenAI(client) | Wrap OpenAI client for automatic tracing |
| wrapAnthropic(client) | Wrap Anthropic client for automatic tracing |
| wrapGoogleAI(model) | Wrap Google AI model for automatic tracing |
| wrapAISDK(ai) | Wrap Vercel AI SDK module |
| wrapMastraAgent(agent) | Wrap Mastra agent for automatic tracing |
| traceModel(model) | Wrap a Vercel AI SDK model directly (PostHog style) |
| getModel(options?) | Get model assignment for A/B testing |
| getContext() | Get the session context object |
Wrap OpenAI
session.wrapOpenAI(client)
Wrap OpenAI client for automatic tracing. Works with any OpenAI-compatible API.
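A sketch of wrapping the OpenAI client inside a session (package name, config key, and session ID are illustrative):

```typescript
import * as fallom from "fallom"; // assumed package name
import OpenAI from "openai";

fallom.init({ apiKey: process.env.FALLOM_API_KEY });

const session = fallom.session({
  configKey: "chat-model-test", // example name
  sessionId: "conv-1234",
});

// Calls through the wrapped client are traced with this session's context.
const client = session.wrapOpenAI(new OpenAI());

const res = await client.chat.completions.create({
  model: "gpt-4o-mini",
  messages: [{ role: "user", content: "Hello!" }],
});
```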
Wrap Anthropic
session.wrapAnthropic(client)
Wrap Anthropic client for automatic tracing.
Wrap Google AI
session.wrapGoogleAI(model)
Wrap Google AI model for automatic tracing.
Wrap Vercel AI SDK
session.wrapAISDK(ai)
Wrap the Vercel AI SDK module. Returns { generateText, streamText, generateObject, streamObject }.
Trace Model (PostHog Style)
session.traceModel(model)
Wrap a Vercel AI SDK model directly for automatic tracing. Returns a traced model that can be used with the original AI SDK functions.
const model = session.traceModel(createOpenAI()("gpt-4o"));
await generateText({ model, prompt: "Hello!" });
Get Model (Session-scoped)
session.getModel(options?): Promise<string>
Get model assignment for A/B testing using the session's config key.

| Option | Type | Description |
|---|---|---|
| fallback | string | Model to return if anything fails |
| version | number | Pin to specific config version |
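A sketch of a session-scoped lookup (illustrative names); the config key comes from the session itself:

```typescript
import * as fallom from "fallom"; // assumed package name

fallom.init({ apiKey: process.env.FALLOM_API_KEY });

const session = fallom.session({
  configKey: "chat-model-test", // example name
  sessionId: "conv-1234",
});

// Uses the session's configKey; falls back if the lookup fails.
const model = await session.getModel({ fallback: "gpt-4o-mini" });
```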
Get Model (Standalone)
fallom.models.get(configKey, sessionId, options?): Promise<string>
Get model assignment for A/B testing without a session.

| Parameter | Type | Description |
|---|---|---|
| configKey | string | Your config name from the dashboard |
| sessionId | string | Unique session/conversation ID |
| options.fallback | string | Model to return if anything fails |
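The standalone form can be sketched as (illustrative names again):

```typescript
import * as fallom from "fallom"; // assumed package name

fallom.init({ apiKey: process.env.FALLOM_API_KEY });

// No session object needed; pass config key and session ID directly.
const model = await fallom.models.get("chat-model-test", "conv-1234", {
  fallback: "gpt-4o-mini",
});
```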
Get Prompt
fallom.prompts.get(promptKey, options?): Promise<PromptResult>
Get a managed prompt.

| Parameter | Type | Description |
|---|---|---|
| promptKey | string | Your prompt key from the dashboard |
| options.variables | object | Template variables (e.g., { userName: "John" }) |
| options.version | number | Pin to specific version (default: latest) |
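A sketch of fetching a managed prompt and reading its rendered fields (the prompt key is illustrative; `system`/`user` fields mirror the Python PromptResult):

```typescript
import * as fallom from "fallom"; // assumed package name

fallom.init({ apiKey: process.env.FALLOM_API_KEY });

const prompt = await fallom.prompts.get("support-agent", {
  variables: { userName: "John" },
});

// prompt.system and prompt.user hold the rendered templates.
const messages = [
  { role: "system", content: prompt.system },
  { role: "user", content: prompt.user },
];
```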
Get Prompt A/B Test
fallom.prompts.getAB(abTestKey, sessionId, options?): Promise<PromptResult>
Get a prompt from an A/B test.

| Parameter | Type | Description |
|---|---|---|
| abTestKey | string | Your A/B test key from the dashboard |
| sessionId | string | Session ID for sticky assignment |
| options.variables | object | Template variables |
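A sketch of the A/B variant (test key and session ID are illustrative; the exact fields on the returned PromptResult are assumed to mirror the Python SDK's):

```typescript
import * as fallom from "fallom"; // assumed package name

fallom.init({ apiKey: process.env.FALLOM_API_KEY });

// Sticky: the same sessionId always receives the same variant.
const prompt = await fallom.prompts.getAB("greeting-test", "conv-1234", {
  variables: { userName: "John" },
});
```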