TokenRouter Quickstart Guide - Get Started in 5 Minutes
Get Up and Running in 5 Minutes
This quickstart guide will help you make your first TokenRouter API request. We’ll use the TypeScript SDK, but you can also use Python or cURL.
Prerequisites
- A TokenRouter account (sign up here)
- Node.js 18+ or Python 3.8+ installed
- At least one AI provider API key (OpenAI, Anthropic, Google, etc.)
Step 1: Create a TokenRouter API Key
- Log in to tokenrouter.io
- Navigate to Console → API Keys
- Click Create API Key
- Name it “Quickstart” and copy the key
Step 2: Add Provider Keys
You need at least one AI provider key to route requests. Let’s add an OpenAI key:
- Go to Console → Providers
- Click Add Provider → OpenAI
- Paste your OpenAI API key (get one from platform.openai.com)
- Click Save
Step 3: Install the SDK
TypeScript:

```sh
npm install tokenrouter
```

Python:

```sh
pip install tokenrouter
```

cURL: No installation needed! It is pre-installed on most systems.
Step 4: Make Your First Request
Create a file named index.ts:
```ts
import Tokenrouter from 'tokenrouter';

const client = new Tokenrouter({
  apiKey: 'tr_your_api_key_here', // Replace with your actual key
});

async function main() {
  const response = await client.responses.create({
    model: 'auto:balance',
    input: 'What is the capital of France?'
  });

  console.log(response.output[0].content[0].text);
  console.log('\nUsage:', response.usage);
  console.log('Provider:', response.metadata?.provider);
}

main();
```

Run it:

```sh
npx tsx index.ts
# or with ts-node:
npx ts-node index.ts
```

Create a file named main.py:
```py
from tokenrouter import Tokenrouter

client = Tokenrouter(
    api_key="tr_your_api_key_here",  # Replace with your actual key
)

response = client.responses.create(
    model="auto:balance",
    input="What is the capital of France?"
)

print(response.output[0].content[0].text)
print(f"\nUsage: {response.usage}")
print(f"Provider: {response.metadata.get('provider')}")
```

Run it:

```sh
python main.py
```

cURL:

```sh
curl https://api.tokenrouter.io/v1/responses \
  -H "Content-Type: application/json" \
  -H "Authorization: Bearer tr_your_api_key_here" \
  -d '{
    "model": "auto:balance",
    "input": "What is the capital of France?"
  }'
```

Expected Response
```json
{
  "id": "resp_...",
  "object": "response",
  "created_at": 1704067200,
  "status": "completed",
  "model": "gpt-4o-2024-11-20",
  "output": [
    {
      "role": "assistant",
      "content": [
        { "type": "text", "text": "The capital of France is Paris." }
      ]
    }
  ],
  "usage": {
    "input_tokens": 12,
    "output_tokens": 8,
    "total_tokens": 20
  },
  "metadata": {
    "provider": "openai",
    "routing_mode": "balance",
    "routing_confidence": 0.95
  }
}
```

Step 5: Try Different Routing Modes
Now that you have a working setup, try different routing modes:
```ts
const response = await client.responses.create({
  model: 'auto:cost',
  input: 'Summarize quantum computing in one sentence'
});
```

Routes to the most cost-effective provider for this request.

```ts
const response = await client.responses.create({
  model: 'auto:quality',
  input: 'Write a detailed business plan for a tech startup'
});
```

Routes to the highest-quality model available.

```ts
const response = await client.responses.create({
  model: 'auto:latency',
  input: 'What is 2+2?'
});
```

Routes to the fastest provider for quick responses.

```ts
const response = await client.responses.create({
  model: 'claude-3-7-sonnet-latest:balance',
  input: 'Explain machine learning'
});
```

Uses Claude 3.7 Sonnet with balanced routing.
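As the examples above suggest, the model parameter is a colon-separated pair: a model name (or auto) followed by a routing mode. A minimal sketch of how such a string could be split client-side (the parseModel helper and its default-to-balance behavior are illustrative assumptions, not part of the SDK):

```typescript
// Hypothetical helper: split a "model:mode" string such as
// "auto:cost" or "claude-3-7-sonnet-latest:balance".
type RoutingMode = 'cost' | 'quality' | 'latency' | 'balance';

interface ModelSpec {
  model: string;     // model name, or 'auto' for automatic selection
  mode: RoutingMode; // routing mode
}

function parseModel(spec: string): ModelSpec {
  // The mode is the final colon-separated segment, so use lastIndexOf
  // to keep colons inside the model name (if any) intact.
  const idx = spec.lastIndexOf(':');
  if (idx === -1) {
    // No explicit mode: this sketch assumes balanced routing as default.
    return { model: spec, mode: 'balance' };
  }
  return {
    model: spec.slice(0, idx),
    mode: spec.slice(idx + 1) as RoutingMode,
  };
}

console.log(parseModel('auto:cost'));
console.log(parseModel('claude-3-7-sonnet-latest:balance'));
```

This is only a mental model for the string format; the server performs the actual parsing and routing.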
Step 6: Enable Streaming
Get real-time responses with Server-Sent Events:
TypeScript:

```ts
const stream = await client.responses.create({
  model: 'auto:balance',
  input: 'Write a short poem about coding',
  stream: true
});

for await (const chunk of stream) {
  if (chunk.event === 'content.delta') {
    process.stdout.write(chunk.delta.text);
  }
}
```

Python:

```py
stream = client.responses.create(
    model="auto:balance",
    input="Write a short poem about coding",
    stream=True
)

for chunk in stream:
    if chunk.event == 'content.delta':
        print(chunk.delta.text, end='', flush=True)
```

cURL:

```sh
curl https://api.tokenrouter.io/v1/responses \
  -H "Content-Type: application/json" \
  -H "Authorization: Bearer tr_your_api_key_here" \
  -d '{
    "model": "auto:balance",
    "input": "Write a short poem about coding",
    "stream": true
  }'
```

Common Use Cases
1. Chat Application
```ts
const response = await client.responses.create({
  model: 'auto:balance',
  input: userMessage,
  instructions: 'You are a helpful assistant',
  stream: true
});
```

2. Code Generation

```ts
const response = await client.responses.create({
  model: 'claude-3-7-sonnet-latest:quality',
  input: 'Write a Python function to calculate Fibonacci numbers',
  temperature: 0.3
});
```

3. Data Analysis

```ts
const response = await client.responses.create({
  model: 'gpt-4o:balance',
  input: 'Analyze this dataset and provide insights',
  max_output_tokens: 2000
});
```

4. Content Summarization

```ts
const response = await client.responses.create({
  model: 'auto:cost',
  input: `Summarize this article: ${articleText}`,
  max_output_tokens: 200
});
```

Using Environment Variables
For security, always use environment variables for API keys:

.env:

```sh
TOKENROUTER_API_KEY=tr_your_api_key_here
```

TypeScript:

```ts
import 'dotenv/config';
import Tokenrouter from 'tokenrouter';

const client = new Tokenrouter({
  apiKey: process.env.TOKENROUTER_API_KEY
});
```

Python:

```py
import os
from dotenv import load_dotenv
from tokenrouter import Tokenrouter

load_dotenv()

client = Tokenrouter(
    api_key=os.getenv("TOKENROUTER_API_KEY")
)
```

Inline Provider Keys (Optional)
Instead of configuring provider keys in the console, you can pass them directly in requests using custom headers:
TypeScript:

```ts
const response = await client.responses.create({
  model: 'gpt-4o',
  input: 'Hello!',
}, {
  headers: {
    'X-OpenAI-Key': process.env.OPENAI_API_KEY
  }
});
```

Python:

```py
response = client.responses.create(
    model="gpt-4o",
    input="Hello!",
    extra_headers={
        "X-OpenAI-Key": os.getenv("OPENAI_API_KEY")
    }
)
```

cURL:

```sh
curl https://api.tokenrouter.io/v1/responses \
  -H "Authorization: Bearer tr_your_api_key_here" \
  -H "X-OpenAI-Key: sk-proj-..." \
  -d '{"model": "gpt-4o", "input": "Hello!"}'
```

Supported headers:

- X-OpenAI-Key - OpenAI API key
- X-Anthropic-Key - Anthropic API key
- X-Gemini-Key - Google Gemini API key
- X-Mistral-Key - Mistral API key
- X-DeepSeek-Key - DeepSeek API key
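If you hold keys for several providers, the headers above can be assembled programmatically from whichever keys are set. A minimal sketch (the providerHeaders helper is illustrative, not part of the SDK; only the header names come from the list above):

```typescript
// Hypothetical helper: build the inline provider-key headers listed above,
// skipping any provider whose key is not configured.
const HEADER_NAMES: Record<string, string> = {
  openai: 'X-OpenAI-Key',
  anthropic: 'X-Anthropic-Key',
  gemini: 'X-Gemini-Key',
  mistral: 'X-Mistral-Key',
  deepseek: 'X-DeepSeek-Key',
};

function providerHeaders(
  keys: Record<string, string | undefined>
): Record<string, string> {
  const headers: Record<string, string> = {};
  for (const [provider, header] of Object.entries(HEADER_NAMES)) {
    const key = keys[provider];
    if (key) headers[header] = key; // omit unset providers entirely
  }
  return headers;
}

console.log(providerHeaders({ openai: 'sk-proj-example', mistral: undefined }));
```

The resulting object can then be passed as the headers (TypeScript) or extra_headers (Python) option shown above.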
Next Steps
Congratulations! You’ve made your first TokenRouter request. Here’s what to explore next:
- Provider Keys - Add more AI providers
- Auto Routing - Learn how intelligent routing works
- Streaming - Real-time response streaming
- Tool Calls - Function calling and tools
- Routing Rules - Custom routing logic
- Firewall - Content filtering and security
Troubleshooting
“No API key provided”
Make sure you’re passing the Authorization header with your TokenRouter API key.
“No provider keys configured”
You need to add at least one provider key in the Console → Providers.
“Invalid API key”
Double-check your API key and make sure it’s active in Console → API Keys.
Network errors
Ensure you’re making requests over HTTPS to https://api.tokenrouter.io/v1/responses.
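Transient network failures can also be smoothed over with retries. A generic exponential-backoff sketch (the withRetries helper is illustrative and not part of the SDK; check whether your SDK version already retries before layering your own):

```typescript
// Generic retry-with-backoff sketch. `fn` would wrap a request such as
// () => client.responses.create(...). The delay doubles after each failure.
async function withRetries<T>(
  fn: () => Promise<T>,
  attempts = 3,
  baseDelayMs = 500,
): Promise<T> {
  let lastError: unknown;
  for (let i = 0; i < attempts; i++) {
    try {
      return await fn();
    } catch (err) {
      lastError = err;
      if (i < attempts - 1) {
        // Exponential backoff: baseDelayMs, 2x, 4x, ...
        await new Promise((resolve) => setTimeout(resolve, baseDelayMs * 2 ** i));
      }
    }
  }
  throw lastError;
}

// Example: a flaky operation that succeeds on the second attempt.
let attempt = 0;
withRetries(async () => {
  attempt++;
  if (attempt < 2) throw new Error('transient network error');
  return 'ok';
}, 3, 10).then((result) => console.log(result));
```

Reserve retries for transient errors (timeouts, connection resets); retrying an “Invalid API key” response will never succeed.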
Need help? Contact support@tokenrouter.io.