LLMVault stores LLM credentials with envelope encryption, mints scoped tokens for sandboxes, and proxies requests to any provider. Your code never sees a plaintext key.
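Envelope encryption is the pattern doing the work here: each credential is sealed with its own data key (DEK), and that data key is itself sealed ("wrapped") with a master key-encryption key (KEK) that stays in the KMS. Below is a toy sketch of the pattern only; the HMAC-based stream cipher stands in for a real AEAD cipher such as AES-GCM, and nothing here reflects LLMVault's actual implementation:

```python
import hashlib
import hmac
import os

def keystream(key: bytes, nonce: bytes, length: int) -> bytes:
    # Toy pseudorandom stream: HMAC-SHA256 in counter mode.
    # Stands in for a real AEAD cipher; do not use in production.
    out = b""
    counter = 0
    while len(out) < length:
        out += hmac.new(key, nonce + counter.to_bytes(4, "big"),
                        hashlib.sha256).digest()
        counter += 1
    return out[:length]

def seal(data: bytes, key: bytes, nonce: bytes) -> bytes:
    # XOR with the keystream; the same call both encrypts and decrypts.
    return bytes(a ^ b for a, b in zip(data, keystream(key, nonce, len(data))))

kek = os.urandom(32)   # key-encryption key: lives only in the KMS/HSM
dek = os.urandom(32)   # data key: generated fresh for each stored credential
n1, n2 = os.urandom(16), os.urandom(16)

api_key = b"sk-ant-example"
sealed_credential = seal(api_key, dek, n1)  # credential encrypted with the DEK
wrapped_dek = seal(dek, kek, n2)            # DEK wrapped with the KEK
# Only (sealed_credential, wrapped_dek, n1, n2) are stored;
# the plaintext DEK is discarded after use.

# Decrypt path: unwrap the DEK with the KEK, then unseal the credential.
recovered = seal(sealed_credential, seal(wrapped_dek, kek, n2), n1)
assert recovered == api_key
```

The point of the indirection: compromising the database yields only ciphertexts, and rotating the KEK means re-wrapping small DEKs, not re-encrypting every credential.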
Store a provider key:

```shell
curl -X POST https://api.llmvault.dev/v1/credentials \
  -H "Authorization: Bearer $ORG_TOKEN" \
  -d '{"provider":"anthropic","api_key":"sk-ant-..."}'
```

Mint a scoped token:

```shell
curl -X POST https://api.llmvault.dev/v1/tokens \
  -d '{"credential_id":"cred_8x7k","ttl":"1h"}'
# → {"token":"ptok_eyJhbG..."}
```

Proxy a request with the scoped token:

```shell
curl https://api.llmvault.dev/v1/proxy/v1/messages \
  -H "X-Proxy-Token: ptok_eyJhbG..." \
  -d '{"model":"claude-sonnet-4-20250514","messages":[...]}'
# → streams response, key never exposed
```

When your customers ask how their keys are protected, you'll have specific, technical answers — not hand-waving.
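The same three calls are easy to wrap in application code. A minimal Python sketch against the endpoints shown above; the helper names are illustrative, not an official SDK:

```python
import json
import urllib.request

BASE = "https://api.llmvault.dev"

def _request(path: str, headers: dict, body: dict) -> urllib.request.Request:
    # Build a JSON POST against the LLMVault API.
    return urllib.request.Request(
        BASE + path,
        data=json.dumps(body).encode(),
        headers={"Content-Type": "application/json", **headers},
        method="POST",
    )

def store_credential(org_token: str, provider: str, api_key: str):
    return _request("/v1/credentials",
                    {"Authorization": f"Bearer {org_token}"},
                    {"provider": provider, "api_key": api_key})

def mint_token(org_token: str, credential_id: str, ttl: str = "1h"):
    return _request("/v1/tokens",
                    {"Authorization": f"Bearer {org_token}"},
                    {"credential_id": credential_id, "ttl": ttl})

def proxy_message(proxy_token: str, payload: dict):
    # The sandbox only ever holds the short-lived proxy token.
    return _request("/v1/proxy/v1/messages",
                    {"X-Proxy-Token": proxy_token},
                    payload)

req = proxy_message("ptok_example",
                    {"model": "claude-sonnet-4-20250514",
                     "messages": [{"role": "user", "content": "hi"}]})
# urllib.request.urlopen(req) would send it; omitted to keep the sketch offline.
```

Note that the org token never ships to the sandbox: it is used once to mint the proxy token, and only the proxy token crosses the trust boundary.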
The same flow, as bare requests:

```
POST /v1/credentials
{
  "provider": "anthropic",
  "api_key": "sk-ant-..."
}
```

```
POST /v1/tokens
{
  "credential_id": "cred_8x7k",
  "ttl": "1h"
}
```

```
POST /v1/proxy/v1/messages
X-Proxy-Token: ptok_eyJhbG...
{
  "model": "claude-sonnet-4-20250514"
}
```

Free tier. 10 credentials. 10,000 proxy requests. No credit card.
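One detail worth noticing in the proxy endpoint above: everything after `/v1/proxy` looks like the provider's own path, so `/v1/proxy/v1/messages` would forward to Anthropic's `/v1/messages`. A one-function sketch of that mapping (an inference from the example, not documented server behavior):

```python
def provider_path(proxy_path: str) -> str:
    # Strip the proxy prefix; the remainder is the upstream provider path.
    prefix = "/v1/proxy"
    if not proxy_path.startswith(prefix + "/"):
        raise ValueError("not a proxy path: " + proxy_path)
    return proxy_path[len(prefix):]

print(provider_path("/v1/proxy/v1/messages"))  # → /v1/messages
```

If this holds, switching providers means changing only the suffix, with the same token and base URL.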