A Wallet for Your AI Keys
Your AI Keys. Your Rules.
Like MetaMask for AI. Store your API keys in a secure browser wallet, then let any web app use AI models through it — OpenAI, Anthropic, OpenRouter, or local WASM. Apps request capabilities, you approve. Keys never leave your browser.
npm install @byokapi/client
Think of it as...
A familiar pattern, applied to AI
Like MetaMask, but for AI
MetaMask is a wallet that holds your crypto keys and lets dApps request transactions. BYOK API is a wallet that holds your AI keys and lets apps request inference. Same pattern: your keys, your browser, your consent.
Like a password manager for APIs
1Password stores credentials and auto-fills them into websites. BYOK stores API keys and auto-routes AI calls. Apps never see the keys — they just get the results.
Like Sign in with Google, but you keep control
OAuth delegates identity. BYOK delegates AI access. But unlike OAuth, there's no central server — everything runs in your browser. You grant specific capabilities (chat, image, TTS), not blanket access.
How it works
Install the client
Add @byokapi/client to your app. It's a lightweight SDK that wraps the AI SDK v6 provider interface.
npm install @byokapi/client
User connects their wallet
The client opens a hidden iframe to the user's BYOK bridge — their personal AI key wallet. No keys ever leave the wallet.
await client.connect()
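The connection above rides on cross-frame messaging between your page and the wallet's iframe. Here is a minimal sketch of how a postMessage-style bridge can pair responses with in-flight requests by correlation id — the type and function names are illustrative, not the actual `@byokapi/client` wire format:

```typescript
// Hypothetical envelope for a postMessage bridge (not the real BYOK protocol).
type BridgeRequest = { id: string; method: string; params?: unknown };
type BridgeResponse = { id: string; result?: unknown; error?: string };

// Pair an incoming response with its pending request by correlation id,
// so several concurrent calls over one iframe channel can't get mixed up.
function matchResponse(
  pending: Map<string, (r: BridgeResponse) => void>,
  res: BridgeResponse
): boolean {
  const resolve = pending.get(res.id);
  if (!resolve) return false; // unknown or already-settled id: ignore
  pending.delete(res.id);     // each id settles exactly once
  resolve(res);
  return true;
}
```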
App requests capabilities
Like a dApp requesting a transaction, your app requests specific capabilities. The user approves via a consent popup.
await client.requestGrant({ capabilities: ["language"] })
Use any AI model
Get a standard AI SDK provider. Calls are routed through the wallet to the user's chosen provider — OpenAI, Anthropic, or local WASM.
generateText({ model: provider("gpt-4o"), prompt: "Hello" })
Built for developers
No server. No keys. No infrastructure. Just AI.
No server required
No backend, no API routes, no secure key storage, no encryption at rest. Add AI to a static site or SPA. The bridge handles all the security — your app is just a client.
Few lines to add AI
Add an LLM to your task manager in 5 lines. Add voice to your app in 5 more. No infrastructure to set up, no keys to manage, no billing to configure. Just install, connect, use.
Zero vendor lock-in
Your code uses standard AI SDK v6. Need to move to server-side later? Swap ByokClient for a direct AI SDK provider — same interface, zero code changes. BYOK is a deployment detail, not an architecture decision.
Zero cost to you
Users pay their own providers or use free local WASM models — either way, it costs you nothing. No API budget, no billing infrastructure, no payment processing.
Free WASM models
Offer local AI models that run entirely in the user's browser. No server, no API calls, no cost to anyone. Perfect for demos, offline use, or privacy-sensitive apps.
Everything is transparent
Users see exactly which model runs, how many tokens they used, and what it costs. No black box. They trust your app more because they control every aspect of AI access.
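That transparency is simple arithmetic over the token counts returned with each call. A sketch of the cost math a wallet dashboard can surface — the per-million-token rates in the usage note are made-up placeholders, not any provider's real pricing:

```typescript
// Token usage as reported after an inference call.
type Usage = { inputTokens: number; outputTokens: number };

// Estimate the dollar cost of a call from per-million-token rates.
// Rates are supplied by the caller; they are NOT live provider pricing.
function estimateCostUSD(
  usage: Usage,
  ratesPerMTok: { input: number; output: number }
): number {
  const cost =
    (usage.inputTokens / 1_000_000) * ratesPerMTok.input +
    (usage.outputTokens / 1_000_000) * ratesPerMTok.output;
  return Math.round(cost * 1e6) / 1e6; // keep micro-dollar precision
}
```

For example, 1,000 input and 500 output tokens at hypothetical rates of $2.50 / $10 per million tokens comes to $0.0075.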
See it in action
Add AI to real apps in minutes — no backend, no key headaches
Add AI to a task manager
Summarize tasks, generate subtasks, or prioritize with AI — no backend needed.
import { ByokClient } from "@byokapi/client"
import { generateText } from "ai"
const client = new ByokClient({ bridgeUrl, appName: "TaskFlow" })
await client.connect()
await client.requestGrant({ capabilities: ["language"] })
// That's it. Now use AI anywhere in your app:
const { text } = await generateText({
  model: client.getProvider()("gpt-4o"),
  prompt: `Summarize these tasks: ${tasks.join(", ")}`,
})
Add voice to any app
Text-to-speech and speech-to-text — one grant, a few lines of code.
await client.requestGrant({ capabilities: ["speech"] })
// Generate speech from text
const { audio } = await client.generateSpeech({
  grantId, text: "Hello from your app!", voice: "nova",
})
new Audio("data:audio/mp3;base64," + audio).play()
// Or use a local WASM model — zero API cost
// User just picks "Local (Kokoro)" in the consent popup
Ready to scale? Just swap the provider
When you outgrow BYOK, migration is one line. Same AI SDK, same code.
// Before: BYOK client-side (no server needed)
const model = client.getProvider()("gpt-4o")
// After: direct AI SDK server-side (same interface)
import { openai } from "@ai-sdk/openai"
const model = openai("gpt-4o")
// Your generateText / streamText / generateObject calls
// don't change at all. Zero refactoring.
Designed for users
Your keys, your budget, your rules — zero trust required
Keys never leave your browser
Your API keys are stored in an isolated browser wallet. Apps only get inference results — they can't see, copy, or leak your keys.
You control your tokens
Your API tokens, your budget. Set usage limits per app, monitor every request in your wallet dashboard, and revoke access instantly.
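Conceptually, the limit check the wallet runs before each call is small. A sketch under assumed field names — this is not the wallet's actual schema, just the shape of the idea:

```typescript
// Hypothetical per-app caps a user might configure (illustrative names).
type AppLimit = { maxRequests: number; maxTokens: number };
// Running totals the wallet tracks for that app.
type AppUsage = { requests: number; tokens: number };

// The wallet checks every inference call against the user's limits and
// rejects it (before any key is touched) once either cap is reached.
function withinLimit(usage: AppUsage, limit: AppLimit): boolean {
  return usage.requests < limit.maxRequests && usage.tokens < limit.maxTokens;
}
```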
Pay your own rates
No markup from app developers. You pay OpenAI/Anthropic directly at their published rates. Full cost transparency.
Grant specific capabilities
Approve only what each app needs — chat, image, TTS, or transcription. Not blanket access, just the capabilities requested.
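The consent model above boils down to a grant record plus a membership check. The types below are an illustrative model, not the wallet's actual storage format:

```typescript
// Capabilities an app can request (mirrors the kinds named in the copy).
type Capability = "language" | "image" | "speech" | "transcription";
// A user-approved grant: one app, a specific set of capabilities.
type Grant = { appName: string; capabilities: Capability[] };

// A call is allowed only if the user granted exactly that capability;
// there is no "all access" wildcard in this model.
function isAllowed(grant: Grant, cap: Capability): boolean {
  return grant.capabilities.includes(cap);
}
```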