Initialize

fallom.init(api_key=None, base_url=None, capture_content=True)
Initialize the SDK. Call this before importing LLM libraries so auto-instrumentation can patch them at import time.
Parameter        Type  Description
api_key          str   Your Fallom API key (or use FALLOM_API_KEY env var)
base_url         str   API endpoint (default: https://spans.fallom.com)
capture_content  bool  Whether to capture prompt/completion text (default: True)
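
A minimal sketch of the required ordering; the API key value is a placeholder:

    import fallom

    # Initialize first so auto-instrumentation can patch LLM client
    # libraries when they are imported.
    fallom.init(api_key="YOUR_API_KEY")  # placeholder; or rely on FALLOM_API_KEY

    from openai import OpenAI  # imported after init, so calls are traced

    client = OpenAI()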

Get Model

fallom.models.get(config_key, session_id, version=None, fallback=None) -> str
Get model assignment for a session.
Parameter   Type  Description
config_key  str   Your config name from the dashboard
session_id  str   Unique session/conversation ID (sticky assignment)
version     int   Pin to specific version (default: latest)
fallback    str   Model to return if anything fails
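
A sketch of typical usage; the config key, session ID, and fallback model are placeholders:

    import fallom

    fallom.init()

    # Sticky assignment: the same session_id always resolves to the same
    # model, so a conversation never switches models midway.
    model = fallom.models.get(
        "chat-model",              # placeholder config key
        session_id="session-123",  # placeholder session/conversation ID
        fallback="gpt-4o-mini",    # returned if the lookup fails
    )
    # Pass `model` to your LLM client as usual.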

Get Prompt

fallom.prompts.get(prompt_key, variables=None, version=None) -> PromptResult
Get a managed prompt.
Parameter   Type  Description
prompt_key  str   Your prompt key from the dashboard
variables   dict  Template variables (e.g., {"user_name": "John"})
version     int   Pin to specific version (default: latest)
Returns: a PromptResult with key, version, system, and user
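
For example (placeholder prompt key and variables; attribute access on PromptResult is assumed), the returned fields map directly onto chat messages:

    import fallom

    fallom.init()

    prompt = fallom.prompts.get(
        "welcome-message",                # placeholder prompt key
        variables={"user_name": "John"},  # fills the prompt's template slots
    )

    # Assuming PromptResult exposes its fields as attributes:
    messages = [
        {"role": "system", "content": prompt.system},
        {"role": "user", "content": prompt.user},
    ]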

Get Prompt A/B Test

fallom.prompts.get_ab(ab_test_key, session_id, variables=None) -> PromptResult
Get a prompt from an A/B test (sticky assignment).
Parameter    Type  Description
ab_test_key  str   Your A/B test key from the dashboard
session_id   str   Unique session/conversation ID (for sticky assignment)
variables    dict  Template variables
Returns: a PromptResult with key, version, system, user, ab_test_key, and variant_index
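
A sketch with a placeholder test key and session ID; attribute access on PromptResult is assumed:

    import fallom

    fallom.init()

    # The same session_id is always served the same variant, so one
    # conversation never mixes prompt variants.
    prompt = fallom.prompts.get_ab(
        "greeting-ab-test",        # placeholder A/B test key
        session_id="session-123",  # placeholder session ID
        variables={"user_name": "John"},
    )

    # variant_index identifies which arm this session was assigned.
    print(prompt.variant_index, prompt.version)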

Set Session

fallom.trace.set_session(config_key, session_id)
Set trace context. All subsequent LLM calls will be tagged with this config_key and session_id (see the combined example under Clear Session below).

Clear Session

fallom.trace.clear_session()
Clear trace context.
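
Together, set_session() and clear_session() bracket a request. A minimal sketch, assuming the OpenAI SDK; the config key, session ID, and model are placeholders:

    import fallom

    fallom.init()  # before importing the LLM SDK

    from openai import OpenAI

    client = OpenAI()

    fallom.trace.set_session("chat-model", "session-123")  # placeholders
    try:
        # Auto-instrumented and tagged with the config_key/session_id above.
        client.chat.completions.create(
            model="gpt-4o-mini",
            messages=[{"role": "user", "content": "Hello!"}],
        )
    finally:
        fallom.trace.clear_session()  # don't leak context into the next request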

Record Custom Metrics

fallom.trace.span(data, config_key=None, session_id=None)
Record custom business metrics.
Parameter   Type  Description
data        dict  Metrics to record
config_key  str   Optional if set_session() was called
session_id  str   Optional if set_session() was called
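
A sketch of both calling styles; the config key, session ID, and metric names are placeholders:

    import fallom

    fallom.init()

    # With an active session, only the metrics dict is needed:
    fallom.trace.set_session("chat-model", "session-123")
    fallom.trace.span({"thumbs_up": True, "resolution_time_ms": 4200})

    # Without set_session(), pass the context explicitly:
    fallom.trace.span(
        {"converted": True},
        config_key="chat-model",
        session_id="session-123",
    )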

Supported LLM Providers

Auto-instrumentation is available for:
  • OpenAI (+ OpenAI-compatible APIs: OpenRouter, LiteLLM, vLLM, Ollama, etc.)
  • Anthropic
  • Cohere
  • AWS Bedrock
  • Google Generative AI
  • Mistral AI
  • LangChain
  • Replicate
  • Vertex AI
Install the corresponding opentelemetry-instrumentation-* package for your provider.
You must use the official SDK for your provider. Raw HTTP requests (e.g., requests.post()) will not be traced. For OpenAI-compatible APIs, use the OpenAI SDK with a custom base_url.
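
For instance, routing an OpenAI-compatible provider (here OpenRouter; the environment variable name is a placeholder) through the official OpenAI SDK:

    import os

    import fallom

    fallom.init()  # must run before the OpenAI SDK is imported

    from openai import OpenAI

    # The official OpenAI SDK pointed at an OpenAI-compatible endpoint:
    client = OpenAI(
        base_url="https://openrouter.ai/api/v1",
        api_key=os.environ["OPENROUTER_API_KEY"],  # placeholder env var
    )
    # Calls through this client are traced like native OpenAI calls.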