# Lab Lecture Workbook: Introduction to OpenRouter

**Course:** AML 3304 - Advanced Machine Learning
**Workbook Title:** Exploring OpenRouter for Unified AI Model Access
**Duration:** 1-1.5 hours (lecture + hands-on setup)
**Date:** July 17, 2025
### Workbook Overview
This workbook serves as a guided lecture and hands-on resource for understanding OpenRouter, a powerful platform for accessing AI models.
We'll cover its fundamentals, practical benefits, and step-by-step setup process.
By the end, you'll have your own OpenRouter API key and be ready to integrate it into projects like the previous lab on DeepSeek-V2.
**Key Concepts Covered:**
- What OpenRouter is and how it functions.
- Why it's useful in machine learning workflows.
- Hands-on setup, including signing up and obtaining an API key.
- Basic usage examples in code.

**Materials Needed:**
- A web browser.
- A code editor or Google Colab for testing.
- An email address for sign-up (no credit card required for the free tier).

**Learning Objectives:**
- Explain OpenRouter's role as a unified API for LLMs.
- Articulate the benefits of using OpenRouter over direct provider APIs.
- Successfully sign up and generate an OpenRouter API key.
- Implement a simple API call using OpenRouter in Python.
---
### Section 1: What is OpenRouter?
OpenRouter is a unified API platform that provides access to hundreds of AI models from various providers through a single endpoint.
It acts as a "router" for large language models (LLMs), handling requests intelligently by selecting the most cost-effective or available model, managing fallbacks (e.g., if one model is down), and ensuring compatibility with popular SDKs like OpenAI's.
#### Key Features:
- **Unified Endpoint:** One API to rule them all: access models from providers like OpenAI, Anthropic, Google, DeepSeek, and more without switching credentials or endpoints.
- **Model Variety:** Supports hundreds of models, including specialized ones like DeepSeek-Coder-V2 (used in our previous lab), GPT-4o, Claude-3, Gemini, and open-source options.
- **Intelligent Routing:** Automatically chooses the best model based on cost, speed, or availability. It also supports fallbacks to prevent downtime.
- **Compatibility:** Works seamlessly with the OpenAI SDK format, making it easy to drop into existing codebases.
- **Additional Tools:** Includes features like leaderboards for model rankings (based on usage from your site), streaming responses, and community integrations with frameworks.
- **Pricing Model:** Pay-per-use with no upfront costs. The free tier offers limited daily credits (enough for roughly 10-20 test API calls). Costs are based on tokens processed and are often cheaper than going direct to providers, thanks to routing optimization. For exact pricing, check the models page on the OpenRouter site.
#### How It Works:
OpenRouter sits between your application and the AI providers.
When you make an API call (e.g., for chat completion), it routes the request to the specified model or an optimal alternative.
This abstraction simplifies development, especially for projects involving multiple models, like knowledge distillation in our DeepSeek lab.
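To make the routing idea concrete, here is a minimal sketch of the OpenAI-compatible request body that OpenRouter receives. The model slug shown is illustrative (check the models page on openrouter.ai for current identifiers), and the full setup walkthrough follows in Section 3.

```
# Conceptual sketch: the payload your application sends to OpenRouter's chat
# completions endpoint. The "model" field (a provider/model slug) is what the
# router uses to decide which provider to forward the request to. The slug
# below is illustrative; confirm exact model IDs on the OpenRouter models page.
request_payload = {
    "model": "deepseek/deepseek-coder",
    "messages": [
        {"role": "user", "content": "Summarize knowledge distillation in one sentence."}
    ],
}
# OpenRouter forwards this to the matching provider (or a fallback) and returns
# the response in the same OpenAI-compatible format your code already expects.
```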
**Quick Fact:** As of 2025, OpenRouter powers applications in coding assistance, chatbots, and research by democratizing access to cutting-edge LLMs without lock-in to a single provider.

**Exercise 1: Reflection Question**
Write a short paragraph (2-3 sentences) explaining OpenRouter in your own words.
How does it differ from directly using an API from OpenAI or DeepSeek?
---
### Section 2: Why Use OpenRouter?
In machine learning and AI development, accessing models efficiently is crucial.
OpenRouter addresses common pain points, making it an ideal tool for students, researchers, and developers.
#### Benefits:
- **Cost Savings:** Routes to the cheapest available model that meets your needs, reducing expenses compared to premium providers.
- **Reliability:** Built-in fallbacks ensure your application doesn't fail if a model is rate-limited or offline.
- **Ease of Integration:** Compatible with OpenAI's SDK, so you can switch models with one line of code (see the sketch after this list). This is perfect for experimentation, like testing DeepSeek-V2 alongside GPT-4o.
- **Scalability:** Handles high-volume requests and supports streaming for real-time applications (e.g., chatbots).
- **Privacy and Control:** No data is stored unless specified, and you can add custom headers for tracking (e.g., for leaderboards).
- **Free Access to Advanced Models:** Ideal for educational labs—get started with free credits to prototype without budgets.
- **Community and Leaderboards:** Contribute to model rankings by including site metadata in headers, helping the community evaluate models.
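To illustrate the "one line of code" point above, here is a small sketch that sends the same prompt to two different providers by changing only the model string. It assumes the `openai` package is installed and that `OPENROUTER_API_KEY` is set in your environment (Step 2 in Section 3 covers generating the key); the model IDs are the examples used later in this workbook.

```
import os

from openai import OpenAI  # pip install openai

# One client, one base URL, many providers behind it.
client = OpenAI(
    base_url="https://openrouter.ai/api/v1",
    api_key=os.environ["OPENROUTER_API_KEY"],  # assumes the key is exported in your shell
)

def ask(model: str, prompt: str) -> str:
    """Send the same prompt to any model OpenRouter exposes."""
    completion = client.chat.completions.create(
        model=model,
        messages=[{"role": "user", "content": prompt}],
    )
    return completion.choices[0].message.content

# Swapping providers is a one-string change (model IDs as used elsewhere in this workbook).
print(ask("openai/gpt-4o", "Explain knowledge distillation in two sentences."))
print(ask("anthropic/claude-3-opus", "Explain knowledge distillation in two sentences."))
```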

#### Real-World Applications in AML 3304:
- In our DeepSeek-V2 lab, we used OpenRouter to access DeepSeek models via a unified API, avoiding the need for separate setups.
- For projects: Build chatbots, generate code, or distill knowledge from large models without managing multiple accounts.
- Why over alternatives? Unlike direct APIs (e.g., OpenAI's), OpenRouter offers multi-provider access, cost optimization, and no vendor lock-in.
**Exercise 2: Discussion Prompt**
List 3 scenarios where OpenRouter would be preferable to a single-provider API (e.g., for a startup, researcher, or student project). Be specific to machine learning tasks.
---
### Section 3: How to Get Set Up with OpenRouter
Setting up is straightforward and takes about 5-10 minutes.
We'll walk through signing up, creating an API key, and testing it.
#### Step 1: Signing Up
1. Open your web browser and navigate to https://openrouter.ai.
2. Click the "Sign Up" button (usually in the top-right corner or hero section of the homepage).
3. Choose your sign-up method:
   - Email and password (recommended for simplicity).
   - GitHub or Google for quick authentication.
4. Enter your details and verify your email if prompted (check your inbox/spam for a confirmation link).
5. Once signed in, you'll land on the dashboard. New users get free credits to start experimenting.
**Tip:** No credit card is required for the free tier, but you can add payment methods later for heavier usage.

#### Step 2: Creating and Managing API Keys

1. From the dashboard, navigate to the "Keys" section (or directly visit https://openrouter.ai/keys).
2. Click "Create Key" or "Generate API Key."
3. Give the key a name (e.g., "AML3304-Lab-Key") for easy identification.
4. The key will be generated once; copy it immediately and store it securely (e.g., in an environment variable or .env file). It won't be shown again.
5. Manage keys: you can view, revoke, or regenerate keys from this page. Set usage limits if needed for cost control.
**Security Note:** Treat API keys like passwords—never share them in code repositories or public forums. Use environment variables in your projects.
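One common way to follow the environment-variable advice above, sketched under the assumption that you keep the key in a local `.env` file (the `python-dotenv` package is an optional convenience, installed with `pip install python-dotenv`):

```
import os

from dotenv import load_dotenv  # optional helper: pip install python-dotenv

# Reads KEY=value pairs from a .env file in the working directory into the
# environment. Your .env would contain one line: OPENROUTER_API_KEY=<your key>
# Keep .env out of version control (add it to .gitignore).
load_dotenv()

api_key = os.environ.get("OPENROUTER_API_KEY")
if not api_key:
    raise RuntimeError("OPENROUTER_API_KEY is not set; export it or add it to .env")
```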

#### Step 3: Testing Your Setup
Once you have your key, test it with a simple API call. We'll use Python and the OpenAI SDK for compatibility.
1. Install the OpenAI library (if not already installed): in your terminal or Colab, run `pip install openai`.
2. Create a script or Colab cell with the following code (replace `<OPENROUTER_API_KEY>` with your key):

```
from openai import OpenAI

client = OpenAI(
    base_url="https://openrouter.ai/api/v1",
    api_key="<OPENROUTER_API_KEY>",
)

completion = client.chat.completions.create(
    extra_headers={
        "HTTP-Referer": "https://example.com",  # Optional: your site URL for leaderboards
        "X-Title": "AML 3304 Lab",  # Optional: your app name
    },
    model="openai/gpt-4o",  # Example model; try "deepseek-ai/deepseek-coder-v2" for coding tasks
    messages=[
        {"role": "user", "content": "Hello! What is OpenRouter?"}
    ],
)

print(completion.choices[0].message.content)
```

3. Run the code. If successful, you'll see a response explaining OpenRouter (generated by the model).
4. Experiment: change the model to another (e.g., "anthropic/claude-3-opus") and rerun.
For direct HTTP requests without the SDK:

```
import requests
import json

response = requests.post(
    url="https://openrouter.ai/api/v1/chat/completions",
    headers={
        "Authorization": "Bearer <OPENROUTER_API_KEY>",
        "HTTP-Referer": "https://example.com",  # Optional
        "X-Title": "AML 3304 Lab",  # Optional
    },
    data=json.dumps({
        "model": "openai/gpt-4o",
        "messages": [{"role": "user", "content": "Hello! What is OpenRouter?"}]
    }),
)

print(response.json()["choices"][0]["message"]["content"])
```
**Troubleshooting:**
- Rate limits: the free tier has daily caps; monitor usage on your dashboard.
- Errors: check for invalid keys or model names. Refer to https://openrouter.ai/docs for the API reference.
- Streaming: add `stream=True` for real-time responses in interactive apps (see the sketch below).
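A minimal streaming sketch for the last troubleshooting point, assuming `client` is the OpenRouter-backed OpenAI client from Step 3. With the OpenAI SDK, `stream=True` yields chunks whose `delta.content` can be empty, so only non-empty pieces are printed:

```
# Stream tokens as they arrive instead of waiting for the full completion.
stream = client.chat.completions.create(
    model="openai/gpt-4o",  # example model from Step 3
    messages=[{"role": "user", "content": "In one paragraph, what is OpenRouter?"}],
    stream=True,
)

for chunk in stream:
    delta = chunk.choices[0].delta.content
    if delta:  # some chunks carry no text (e.g., role or finish markers)
        print(delta, end="", flush=True)
print()  # final newline after the streamed text
```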
**Exercise 3: Hands-On Activity**
1. Sign up for OpenRouter and generate an API key.
2. Run the test code above with at least two different models.
3. Modify the prompt to ask a question related to our DeepSeek-V2 lab (e.g., "How can I use DeepSeek-Coder-V2 for code generation?").
4. Document your results: What was the response? Did routing work seamlessly?
---
### Assessment and Next Steps
- **Quiz Questions:**
  1. What is the base URL for OpenRouter's API? (Answer: https://openrouter.ai/api/v1)
  2. Name two benefits of using OpenRouter.
  3. True/False: OpenRouter requires a credit card for sign-up. (False)
- **Submission:** Share a screenshot of your successful API response and answers to exercises via the course portal.
- **Extensions:** Integrate OpenRouter into your DeepSeek-V2 chatbot from the previous lab. Explore the docs for advanced features like model fallbacks (a hedged sketch follows below) or fine-tuning proxies.
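For the model-fallback extension, OpenRouter's documentation describes listing alternate models in the request body so the router can retry if the primary model is unavailable. The sketch below passes such a list through the OpenAI SDK's `extra_body` parameter; the exact field name and behavior are an assumption to verify against https://openrouter.ai/docs, and `client` is the client from Step 3.

```
# Assumption: per OpenRouter's routing docs, a "models" list in the request body
# lets the router fall back to later entries if the first model is unavailable.
# Verify the field name and semantics at https://openrouter.ai/docs before relying on it.
completion = client.chat.completions.create(
    model="openai/gpt-4o",  # primary model
    extra_body={
        "models": ["openai/gpt-4o", "anthropic/claude-3-opus"],  # fallback order (assumed)
    },
    messages=[{"role": "user", "content": "Hello! What is OpenRouter?"}],
)
print(completion.choices[0].message.content)
```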
This workbook equips you with the basics of OpenRouter—leveraging it will streamline your AI experiments in AML 3304. For more, visit https://openrouter.ai/docs or the community forums!