One endpoint for every model. Smart routing picks the cheapest one. Sponsored content earns you free credits. Drop-in OpenAI-compatible.
// Cursor, VS Code, Continue.dev
baseUrl: "https://adllm.vercel.app/api/<token>/v1"
apiKey: "anything"
✓ Routed to gemini-2.0-flash (saved 80%)
✓ +$0.005 credits from sponsored content
Bring your own OpenAI, Anthropic, or Google keys. They're encrypted with AES-256 and never leave the server.
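A minimal sketch of what encrypting a provider key at rest with AES-256 could look like, using Node's built-in `crypto` module in GCM mode. This is illustrative only; the function names and storage format are assumptions, not the service's actual implementation.

```typescript
import { createCipheriv, createDecipheriv, randomBytes } from "node:crypto";

// Hypothetical sketch: AES-256-GCM encryption of a provider API key at rest.
// Layout of the stored blob (an assumption): [12-byte IV][16-byte tag][ciphertext].
function encryptKey(plaintext: string, masterKey: Buffer): string {
  const iv = randomBytes(12); // unique per encryption
  const cipher = createCipheriv("aes-256-gcm", masterKey, iv);
  const ct = Buffer.concat([cipher.update(plaintext, "utf8"), cipher.final()]);
  return Buffer.concat([iv, cipher.getAuthTag(), ct]).toString("base64");
}

function decryptKey(blob: string, masterKey: Buffer): string {
  const buf = Buffer.from(blob, "base64");
  const iv = buf.subarray(0, 12);
  const tag = buf.subarray(12, 28);
  const ct = buf.subarray(28);
  const decipher = createDecipheriv("aes-256-gcm", masterKey, iv);
  decipher.setAuthTag(tag); // GCM authenticates as well as encrypts
  return Buffer.concat([decipher.update(ct), decipher.final()]).toString("utf8");
}
```

GCM is a reasonable choice here because tampering with the stored blob makes decryption fail loudly instead of yielding a corrupted key.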
Paste the proxy URL as your OpenAI base URL. Works with Cursor, VS Code, Continue.dev, and any OpenAI-compatible tool.
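For tools configured in code, the same setup looks like this with the official `openai` npm package; the `<token>` placeholder is your own proxy token, and the API key can be any string since the proxy supplies the real provider keys.

```typescript
import OpenAI from "openai";

// Point the standard OpenAI client at the proxy instead of api.openai.com.
const client = new OpenAI({
  baseURL: "https://adllm.vercel.app/api/<token>/v1",
  apiKey: "anything", // ignored by the proxy; real keys live server-side
});

const completion = await client.chat.completions.create({
  model: "gpt-4o", // the router may substitute a cheaper model
  messages: [{ role: "user", content: "Hello!" }],
});
```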
Every response includes a small sponsor message. Each one earns credits that automatically pay for future requests.
No config files, no CLI tools. Just a URL that makes your AI usage smarter and cheaper.
An AI classifier routes simple prompts to cheap models and complex ones to powerful models. Save 60-80% automatically.
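The real service uses an AI classifier; this keyword-and-length heuristic is only a sketch of the routing idea, and the model names and thresholds are illustrative assumptions.

```typescript
type Route = { model: string; tier: "cheap" | "powerful" };

// Hypothetical routing sketch: long or obviously hard prompts go to a
// powerful model, everything else to a cheap one.
function routePrompt(prompt: string): Route {
  const looksComplex =
    prompt.length > 400 ||
    /refactor|architect|debug|prove|design/i.test(prompt);
  return looksComplex
    ? { model: "gpt-4o", tier: "powerful" }          // illustrative name
    : { model: "gemini-2.0-flash", tier: "cheap" };  // illustrative name
}
```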
Ask the same question twice and pay $0.00 the second time. Cache hits are instant.
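Conceptually, caching like this can key responses on a hash of the model and message list, so an identical repeat request never reaches a paid provider. A minimal sketch, assuming an in-memory map; the actual cache design is not documented here.

```typescript
import { createHash } from "node:crypto";

type Msg = { role: string; content: string };

// Identical (model, messages) pairs hash to the same key.
function cacheKey(model: string, messages: Msg[]): string {
  return createHash("sha256")
    .update(JSON.stringify({ model, messages }))
    .digest("hex");
}

const cache = new Map<string, string>();

function cachedResponse(model: string, messages: Msg[], compute: () => string): string {
  const key = cacheKey(model, messages);
  if (!cache.has(key)) cache.set(key, compute()); // miss: pay for the request
  return cache.get(key)!;                         // hit: $0.00, instant
}
```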
Set your project context once. Every request gets it as a system prompt — no manual setup per IDE.
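One plausible way to inject a stored project context, sketched below: the proxy prepends it as a system message unless the client already sent its own. The function name and the override rule are assumptions.

```typescript
type ChatMsg = { role: "system" | "user" | "assistant"; content: string };

// Hypothetical sketch: prepend the stored project context as a system
// prompt; a client-supplied system message takes precedence.
function injectContext(messages: ChatMsg[], projectContext: string): ChatMsg[] {
  if (messages.some((m) => m.role === "system")) return messages;
  return [{ role: "system", content: projectContext }, ...messages];
}
```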
A small sponsor note at the end of each response earns you credits. Use them to make free requests.
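The credit mechanism can be sketched as appending the note and crediting the account per view; the $0.005 figure matches the example earlier on this page, but the function shape and numbers here are illustrative assumptions.

```typescript
// Hypothetical sketch of sponsor-credit accrual. creditPerView ($0.005)
// mirrors the "+$0.005 credits" example shown above; it is not a quoted rate.
function addSponsorNote(
  responseText: string,
  sponsorNote: string,
  balance: number,
  creditPerView = 0.005,
): { text: string; balance: number } {
  return {
    text: `${responseText}\n\n${sponsorNote}`,
    balance: +(balance + creditPerView).toFixed(3), // avoid float drift
  };
}
```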
Works with any OpenAI-compatible tool