## Overview
BYOK (bring your own key) lets you supply your own API keys for LLM providers. a21e still handles prompt engineering and execution orchestration; inference calls go directly to your provider account.
## Supported providers
| Provider | Key prefix | Models available |
|---|---|---|
| Anthropic | sk-ant- | Claude Sonnet 4, Claude Opus 4 |
| OpenAI | sk- | GPT-4o, GPT-4.1 |
| Google | AIza | Gemini 2.5 Pro, Gemini 2.5 Flash |
| xAI | xai- | Grok 3 |
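Before submitting a key, the prefixes in the table above can be checked client-side to catch copy-paste mistakes early. A minimal POSIX-shell sketch (the `check_prefix` function name is ours, not part of any a21e tooling):

```shell
#!/bin/sh
# Check that a key's prefix matches its provider,
# using the prefixes from the table above.
check_prefix() {
  provider="$1"; key="$2"
  case "$provider" in
    anthropic) prefix="sk-ant-" ;;
    openai)    prefix="sk-" ;;
    google)    prefix="AIza" ;;
    xai)       prefix="xai-" ;;
    *) echo "unknown provider: $provider"; return 2 ;;
  esac
  case "$key" in
    "$prefix"*) echo "ok" ;;
    *) echo "prefix mismatch for $provider"; return 1 ;;
  esac
}

check_prefix google "sk-wrong" || true   # prints: prefix mismatch for google
check_prefix anthropic "sk-ant-example"  # prints: ok
```

Note that the check is advisory only: a key with the right prefix can still be invalid, which the provider will report at call time.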
## Adding a provider key
1. Go to Settings > Provider Keys in the dashboard
2. Click Add provider key
3. Select the provider and paste your API key

The key is encrypted at rest and never logged.
Or via the API:

```shell
curl -X POST https://api.a21e.com/v1/provider-keys \
  -H "Authorization: Bearer YOUR_API_KEY" \
  -H "Content-Type: application/json" \
  -d '{
    "provider": "anthropic",
    "api_key": "sk-ant-..."
  }'
```
## How billing works with BYOK
| Mode | a21e credits | LLM cost |
|---|---|---|
| Managed (no BYOK key) | 1 credit per enhancement (covers prompt engineering + inference) | Included |
| BYOK (your key) | 1 credit per enhancement (covers prompt engineering only) | Billed by provider directly |
Credits are still required in BYOK mode: they cover prompt synthesis, technique selection, quality scoring, and memory. Only the LLM inference cost shifts to your provider account.
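To make the split concrete, here is a back-of-envelope estimate. The per-credit price and per-call inference cost below are hypothetical placeholders, not published a21e or provider pricing:

```shell
#!/bin/sh
# Hypothetical cost split for 100 enhancements in BYOK mode.
# Both unit prices are made-up illustrative numbers.
enhancements=100
credit_cost_usd=0.01      # hypothetical price per a21e credit
inference_cost_usd=0.003  # hypothetical provider cost per inference call

a21e_total=$(awk "BEGIN { printf \"%.2f\", $enhancements * $credit_cost_usd }")
provider_total=$(awk "BEGIN { printf \"%.2f\", $enhancements * $inference_cost_usd }")

echo "a21e credits: \$$a21e_total, provider bill: \$$provider_total"
# prints: a21e credits: $1.00, provider bill: $0.30
```

In managed mode the same 100 enhancements would appear as a single a21e charge, since inference is included in the credit.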
## Key security
- Keys are encrypted with AES-256 before storage
- Keys are never included in logs, error messages, or API responses
- Keys are only decrypted at the moment of an LLM call, in memory
- You can rotate or delete keys at any time
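The encrypt-before-storage, decrypt-at-call-time pattern described above can be sketched with `openssl`. This is an illustration of the pattern, not a21e's actual implementation, and the passphrase here stands in for a key that would really come from a secrets manager:

```shell
#!/bin/sh
# Illustrative only: encrypt a provider key with AES-256 before it
# touches disk, and decrypt it in memory only at the moment of use.
secret="sk-ant-example-key"
passphrase="storage-master-key"   # stand-in; use a KMS, not a literal

# Encrypt before storage
printf '%s' "$secret" | openssl enc -aes-256-cbc -pbkdf2 \
  -pass "pass:$passphrase" -out stored_key.enc

# Decrypt only when an LLM call needs the key
recovered=$(openssl enc -aes-256-cbc -pbkdf2 -d \
  -pass "pass:$passphrase" -in stored_key.enc)

[ "$recovered" = "$secret" ] && echo "roundtrip ok"
rm -f stored_key.enc
```

The on-disk file never contains the plaintext key, which is what makes "never included in logs or responses" enforceable: only the decryption path, exercised at call time, ever sees it.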