# Langfuse
Langfuse is an open-source LLM observability platform. auxilia ships with a built-in Langfuse callback that traces every agent run — model calls, tool invocations, token counts, latency, and cost — without any code changes.
## What you get
- Traces — per-message traces showing LLM calls, MCP tool calls, latencies, and token usage
- Cost tracking — token and cost breakdowns per agent, user, or thread
- Evaluation — score and annotate agent responses in the Langfuse UI
- Prompt versioning — manage and compare system prompts across agents
## Enable it
Set these three environment variables on the backend:
```shell
LANGFUSE_PUBLIC_KEY=pk-lf-...
LANGFUSE_SECRET_KEY=sk-lf-...
LANGFUSE_BASE_URL=https://cloud.langfuse.com  # or your self-hosted URL
```

On startup, the backend instantiates a shared Langfuse client and attaches a CallbackHandler to every LangGraph invocation. If any of the three variables is missing, the integration is silently disabled.
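The all-or-nothing check described above can be sketched in a few lines. This is an illustrative sketch, not auxilia's actual code; the function and constant names are hypothetical.

```python
import os

# The three variables the integration requires (per the docs above).
REQUIRED_VARS = ("LANGFUSE_PUBLIC_KEY", "LANGFUSE_SECRET_KEY", "LANGFUSE_BASE_URL")

def langfuse_enabled(env=os.environ) -> bool:
    """Return True only when all three Langfuse variables are set and non-empty."""
    return all(env.get(var) for var in REQUIRED_VARS)

# With all three present, tracing is attached; otherwise it is silently skipped.
```

Checking for non-empty values (rather than mere presence) means a variable set to an empty string also disables the integration, which matches the fail-quiet behavior described above.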
See `app/integrations/langfuse/callback.py` for the wiring.
## Self-hosted Langfuse
Langfuse is itself open source. You can run it alongside auxilia for full data ownership — follow the Langfuse self-hosting guide and point LANGFUSE_BASE_URL at your instance.
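For example, switching to a self-hosted deployment only changes the base URL; the hostname below is a placeholder for your own instance:

```shell
LANGFUSE_PUBLIC_KEY=pk-lf-...
LANGFUSE_SECRET_KEY=sk-lf-...
# Point at your own Langfuse deployment instead of Langfuse Cloud
LANGFUSE_BASE_URL=https://langfuse.internal.example.com
```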