API Reference

LLM provider keys

TokenSaver still needs a vendor secret to call the model. You can pass it per request (SDK) or store it encrypted in Settings.

Governance remains on the TokenSaver key. The provider secret only authorises the outbound LLM call. Ephemeral provider_api_key on POST /pipelines/run overrides organisation keys for that run and is never stored.
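The per-run override can be sketched as a raw HTTP call. This is a minimal sketch: the POST /pipelines/run path and the provider_api_key field come from the text above, but the base URL, the Authorization header shape, and the remaining body fields are assumptions about your deployment.

```python
import json
import os
import urllib.request

BASE_URL = "https://api.tokensaver.example"  # hypothetical; use your API URL

def run_pipeline(ts_key: str, body: dict) -> dict:
    """POST /pipelines/run, authenticated with the TokenSaver key."""
    req = urllib.request.Request(
        BASE_URL + "/pipelines/run",
        data=json.dumps(body).encode(),
        headers={
            "Authorization": f"Bearer {ts_key}",  # governance stays on this key
            "Content-Type": "application/json",
        },
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)

body = {
    "provider": "openai",
    # Overrides any stored organisation key for this run only; never stored.
    "provider_api_key": os.environ.get("OPENAI_API_KEY", "sk-..."),
    # ...remaining pipeline fields per your deployment...
}
```

The vendor secret travels only in the request body, so it lives no longer than the run itself.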

Ephemeral (SDK)

Pass provider_api_key on the client or on each ask().

Stored (console)

Settings → LLM provider keys: encrypted at rest, reused until rotated.

Hosted SaaS — supported providers

On the default public API URL, the Python SDK accepts openai, anthropic, google, mistral, groq, and deepseek as provider values (the same set as HOSTED_SAAS_LLM_PROVIDERS in the SDK). Configure the matching vendor key in Settings (OpenAI, Anthropic, Google Gemini, Mistral, Groq, DeepSeek) or pass provider_api_key for that run. For a self-hosted TokenSaver API, set base_url to your stack; the SDK may accept additional provider codes if the backend enables them.

```python
import os
from tokensaver_sdk import TokenSaver

# OpenAI (ephemeral key from env)
ts = TokenSaver(api_key="ts_...", provider_api_key=os.environ["OPENAI_API_KEY"])
ts.ask("Hello", provider="openai", model="gpt-4o")

# Anthropic: use ANTHROPIC_API_KEY or a stored org key
ts.ask(
    "Hello",
    provider="anthropic",
    model="claude-sonnet-4-6",
    provider_api_key=os.environ.get("ANTHROPIC_API_KEY"),
)
```
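On the hosted SaaS endpoint, an unsupported provider code fails server-side; you can also check it client-side before sending a request. The helper below is a hypothetical sketch, not SDK code, though the set mirrors HOSTED_SAAS_LLM_PROVIDERS as listed above.

```python
# Provider codes accepted on the default public API URL (per the SDK's
# HOSTED_SAAS_LLM_PROVIDERS); a self-hosted backend may enable more.
HOSTED_SAAS_LLM_PROVIDERS = {"openai", "anthropic", "google", "mistral", "groq", "deepseek"}

def check_provider(provider: str, hosted: bool = True) -> str:
    """Normalise a provider code and fail fast if the hosted SaaS rejects it."""
    code = provider.lower()
    if hosted and code not in HOSTED_SAAS_LLM_PROVIDERS:
        raise ValueError(f"unsupported provider on hosted SaaS: {provider!r}")
    return code
```

For a self-hosted stack, call it with hosted=False (or extend the set) to match whatever codes your backend enables.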