
Integrations

auxilia ships with two first-party integrations plus a pluggable LLM provider layer.

First-party integrations

| Integration | Purpose |
| --- | --- |
| Slack | Invoke agents, approve tool calls, and stream responses inside Slack threads |
| Langfuse | Trace LLM calls, tool calls, token costs, and agent runs |

Both integrations are activated by setting environment variables — there is nothing else to install.
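As an illustration of env-var-driven activation, here is a minimal sketch of how a deployment might detect which integrations are configured. The variable names (SLACK_BOT_TOKEN, LANGFUSE_PUBLIC_KEY, LANGFUSE_SECRET_KEY) follow the common conventions for each service but are assumptions, not names confirmed by auxilia:

```python
import os

# Hypothetical variable names -- check the deployment docs for the real ones.
SLACK_VARS = ("SLACK_BOT_TOKEN",)
LANGFUSE_VARS = ("LANGFUSE_PUBLIC_KEY", "LANGFUSE_SECRET_KEY")

def integration_enabled(required: tuple[str, ...]) -> bool:
    """An integration is active when all of its variables are set and non-empty."""
    return all(os.environ.get(name) for name in required)

# Simulate a deployment with Slack configured but Langfuse left unset.
os.environ["SLACK_BOT_TOKEN"] = "xoxb-example"
for name in LANGFUSE_VARS:
    os.environ.pop(name, None)
```

With this shape, enabling an integration really is just setting its variables; there is no separate install or registration step.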

LLM providers

auxilia supports multiple providers out of the box. At least one key is required; every configured provider shows up in the model picker.

| Provider | Environment variable | Models |
| --- | --- | --- |
| Anthropic | ANTHROPIC_API_KEY | claude-haiku-4-5, claude-sonnet-4-6, claude-opus-4-6 |
| OpenAI | OPENAI_API_KEY | gpt-4o-mini |
| Google | GOOGLE_API_KEY | gemini-3-flash-preview, gemini-3-pro-preview |
| DeepSeek | DEEPSEEK_API_KEY | deepseek-chat, deepseek-reasoner |

Anthropic models run with thinking enabled by default; Google models run with include_thoughts=True. Change the defaults in app/model_providers/catalog.py.
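To make the "every configured provider shows up in the model picker" behavior concrete, the catalog's shape might look like the sketch below. The dataclass, field names, and helper are illustrative only; the real definitions live in app/model_providers/catalog.py and may differ:

```python
import os
from dataclasses import dataclass, field

@dataclass(frozen=True)
class Provider:
    env_var: str
    models: tuple[str, ...]
    defaults: dict = field(default_factory=dict)  # per-provider generation defaults

# Illustrative catalog mirroring the table above (not the actual file contents).
CATALOG = {
    "anthropic": Provider("ANTHROPIC_API_KEY",
                          ("claude-haiku-4-5", "claude-sonnet-4-6", "claude-opus-4-6"),
                          {"thinking": True}),
    "openai": Provider("OPENAI_API_KEY", ("gpt-4o-mini",)),
    "google": Provider("GOOGLE_API_KEY",
                       ("gemini-3-flash-preview", "gemini-3-pro-preview"),
                       {"include_thoughts": True}),
    "deepseek": Provider("DEEPSEEK_API_KEY", ("deepseek-chat", "deepseek-reasoner")),
}

def model_picker_entries() -> list[str]:
    """Only providers whose API key is set contribute models to the picker."""
    return [m for p in CATALOG.values()
            if os.environ.get(p.env_var)
            for m in p.models]
```

Overriding a default then means editing the corresponding provider entry rather than touching call sites.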

Authentication providers

| Provider | Environment variables |
| --- | --- |
| Email/password | Built-in |
| Google OAuth | GOOGLE_CLIENT_ID, GOOGLE_CLIENT_SECRET, GOOGLE_REDIRECT_URI |

Set AUTH_GOOGLE_EXCLUSIVE=true if you want to disable the email/password path and force Google sign-in.
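The gate can be pictured with a small sketch; the accepted truthy spellings here are an assumption (the docs only show AUTH_GOOGLE_EXCLUSIVE=true), and the helper names are hypothetical:

```python
import os

def google_exclusive() -> bool:
    """Assumed parsing: common truthy spellings enable the gate;
    anything else (including unset) keeps email/password available."""
    return os.environ.get("AUTH_GOOGLE_EXCLUSIVE", "").strip().lower() in {"1", "true", "yes"}

def enabled_auth_methods() -> list[str]:
    methods = ["google_oauth"]
    if not google_exclusive():
        methods.insert(0, "email_password")
    return methods
```

When the flag is unset or false, both sign-in paths are offered; when true, only Google OAuth remains.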

Personal access tokens

For headless usage (scripts, CI, external services), workspace admins can mint personal access tokens from Settings → Tokens. PATs authenticate API calls as the creating admin; they’re hashed at rest and shown once at creation time.
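The "hashed at rest, shown once" lifecycle follows a standard pattern, sketched below. This is a generic illustration, not auxilia's actual implementation; the hash algorithm and token format are assumptions:

```python
import hashlib
import secrets

def mint_token() -> tuple[str, str]:
    """Return (plaintext, stored_hash). The plaintext is displayed to the
    admin exactly once; only the SHA-256 digest is persisted."""
    plaintext = "pat_" + secrets.token_urlsafe(32)
    return plaintext, hashlib.sha256(plaintext.encode()).hexdigest()

def authenticate(presented: str, stored_hash: str) -> bool:
    """Each API call hashes the presented token and compares it against the
    stored digest in constant time; the plaintext is never stored."""
    digest = hashlib.sha256(presented.encode()).hexdigest()
    return secrets.compare_digest(digest, stored_hash)
```

Because only the digest is kept, a database leak does not expose usable tokens, and a lost plaintext token can only be replaced, not recovered.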