## What Gets Traced?

Every LLM call is automatically captured with the following fields:

| Field | Description |
|---|---|
| Model | The LLM model used (e.g., gpt-4o, claude-3-opus) |
| Tokens | Input and output token counts |
| Latency | Request duration in milliseconds |
| Prompts | Full prompt messages sent to the model |
| Completions | Model responses |
| Session | Your config key and session ID for grouping |
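
To make these fields concrete, here is a minimal sketch of what a single trace record might look like. The field names and structure below are illustrative assumptions based on the table above, not Fallom's actual schema.

```python
# Illustrative shape of one traced LLM call (hypothetical field names,
# mirroring the table above -- not Fallom's actual wire format).
example_trace = {
    "model": "gpt-4o",                          # LLM model used
    "tokens": {"input": 512, "output": 128},    # input and output token counts
    "latency_ms": 842,                          # request duration in milliseconds
    "prompts": [
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "Summarize this document."},
    ],
    "completions": [
        {"role": "assistant", "content": "Here is a summary..."},
    ],
    "session": {
        "config_key": "my-config",              # config key used for grouping
        "session_id": "3f2a9c1e",               # session ID used for grouping
    },
}
```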
## Get Your API Key

1. **Sign up**: Create a free account at bench.fallom.com.
2. **Create a project**: Set up your first project in the dashboard.
3. **Copy your API key**: Find your API key in your project settings.
Open the dashboard to get your API key and view your traces.
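
Once you have the key, a common pattern is to keep it out of source code and load it from the environment at startup. This is a minimal sketch; the variable name `FALLOM_API_KEY` is an assumption for illustration, so check the dashboard or SDK docs for the name Fallom actually expects.

```python
import os

# Hypothetical environment variable name -- confirm the exact name in the docs.
api_key = os.environ.get("FALLOM_API_KEY")
if not api_key:
    raise RuntimeError("Set FALLOM_API_KEY before starting the app")

# Pass `api_key` to the tracing SDK's setup call (see the SDK docs for the
# exact initialization function) rather than hard-coding the key in source.
```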

