## Get Your API Key

Sign up: create a free account at app.fallom.com to get your API key.
## Install the SDK

The SDK is available for TypeScript and Python.
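Assuming the SDK is published as `fallom` on npm and PyPI (an assumption — check the install instructions in your dashboard for the exact package names), installation would look like:

```shell
# TypeScript (package name assumed)
npm install fallom

# Python (package name assumed)
pip install fallom
```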
## Add 3 Lines of Code

Initialize the SDK and wrap your LLM client in your TypeScript or Python app.
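The exact three lines depend on the SDK, but the underlying idea is a wrap-and-capture pattern. The sketch below is illustrative only — the decorator and field names are hypothetical, not Fallom's actual API:

```python
import time
from functools import wraps

def trace_llm_call(fn):
    """Capture the fields a tracing SDK records around each LLM call.
    Hypothetical sketch -- not Fallom's actual API."""
    @wraps(fn)
    def wrapper(*args, **kwargs):
        start = time.perf_counter()
        response = fn(*args, **kwargs)
        # A real SDK would ship this trace to a backend; here we just keep it.
        wrapper.last_trace = {
            "model": kwargs.get("model"),
            "input_tokens": response["usage"]["input_tokens"],
            "output_tokens": response["usage"]["output_tokens"],
            "latency_ms": (time.perf_counter() - start) * 1000,
        }
        return response
    return wrapper

@trace_llm_call
def fake_completion(model, prompt):
    # Stand-in for a real provider call (OpenAI, Anthropic, ...)
    return {"usage": {"input_tokens": 12, "output_tokens": 34}, "text": "ok"}

fake_completion(model="gpt-4o", prompt="hi")
print(fake_completion.last_trace["model"])  # gpt-4o
```

Because the wrapper sits between your code and the provider client, every call is captured without touching the call sites themselves.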
## What You Get

Every LLM call is automatically captured:

| Field | Description |
|---|---|
| Model | Which model was used |
| Tokens | Input, output, and cached counts |
| Latency | Request duration + time to first token |
| Cost | Calculated from token usage |
| Prompts | Full messages sent |
| Completions | Model responses |
| Session | Grouped by user/conversation |
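To illustrate the Cost field: once input and output token counts are captured, cost is a straightforward lookup against per-token prices. The prices below are made-up placeholders, not real model pricing:

```python
# Hypothetical per-million-token prices in USD; real prices vary by
# model and provider -- these numbers are placeholders.
PRICES = {"gpt-4o": {"input": 2.50, "output": 10.00}}

def estimate_cost(model: str, input_tokens: int, output_tokens: int) -> float:
    """Turn the token counts captured in a trace into a dollar cost."""
    p = PRICES[model]
    return (input_tokens * p["input"] + output_tokens * p["output"]) / 1_000_000

print(estimate_cost("gpt-4o", input_tokens=1_000, output_tokens=500))  # 0.0075
```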
## View Your Traces

Open the dashboard to see your LLM calls in real time.
## Next Steps

- **Tracing**: custom spans, metadata, and advanced tracing
- **Model A/B Testing**: test different models in production
- **Evals**: run evaluations on your outputs
- **Integrations**: Anthropic, OpenRouter, Vercel AI SDK, and more

