231 changes: 231 additions & 0 deletions docs/adapters/openrouter.md
@@ -0,0 +1,231 @@
---
title: OpenRouter Adapter
id: openrouter-adapter
---

The OpenRouter adapter provides access to 300+ AI models through a single unified API, covering providers such as OpenAI, Anthropic, Google, Meta, Mistral, and many more.

## Installation

```bash
npm install @tanstack/ai-openrouter
```

## Basic Usage

```typescript
import { chat } from "@tanstack/ai";
import { openrouter } from "@tanstack/ai-openrouter";

const adapter = openrouter();

const stream = chat({
  adapter,
  messages: [{ role: "user", content: "Hello!" }],
  model: "openai/gpt-4o",
});
```

## Configuration

```typescript
import { createOpenRouter, type OpenRouterConfig } from "@tanstack/ai-openrouter";

const config: Omit<OpenRouterConfig, "apiKey"> = {
  baseURL: "https://openrouter.ai/api/v1", // Optional
  httpReferer: "https://your-app.com", // Optional, for rankings
  xTitle: "Your App Name", // Optional, for rankings
};

const adapter = createOpenRouter(process.env.OPENROUTER_API_KEY!, config);
```
Comment on lines +31 to +42

⚠️ Potential issue | 🟠 Major

🧩 Analysis chain

🏁 Script executed:

```bash
#!/bin/bash
# Find the createOpenRouter function definition and exports
find packages/typescript/ai-openrouter -name "*.ts" -o -name "*.tsx" | head -20
```

Repository: TanStack/ai

Length of output: 809


🏁 Script executed:

```bash
#!/bin/bash
# Search for createOpenRouter function definition
rg -n "export.*createOpenRouter|function createOpenRouter" packages/typescript/ai-openrouter -A 5
```

Repository: TanStack/ai

Length of output: 600


🏁 Script executed:

```bash
#!/bin/bash
# Search for createOpenRouter usage in the codebase
rg -n "createOpenRouter\(" --type=ts -B 2 -A 2 | head -100
```

Repository: TanStack/ai

Length of output: 3031


Fix the `createOpenRouter` call in the example: remove `apiKey` from the config object.

The function signature is `createOpenRouter(apiKey: string, config?: Omit<OpenRouterConfig, 'apiKey'>)`. The second parameter must exclude `apiKey`, so either:

  • Pass the config without `apiKey`: `createOpenRouter(config.apiKey, { baseURL: "...", httpReferer: "...", xTitle: "..." })`
  • Or simplify to: `createOpenRouter(process.env.OPENROUTER_API_KEY!, { baseURL: "...", httpReferer: "...", xTitle: "..." })`

The current example passes the full config (with `apiKey`), which violates the type signature.

🤖 Prompt for AI Agents
In docs/adapters/openrouter.md around lines 31 to 42, the example calls
createOpenRouter with a config object that still contains apiKey which violates
the function signature; update the call to pass the API key as the first
argument and pass a config object that omits apiKey (e.g. call
createOpenRouter(process.env.OPENROUTER_API_KEY!, { baseURL: "...", httpReferer:
"...", xTitle: "..." }) or extract apiKey then call
createOpenRouter(config.apiKey, { baseURL: "...", httpReferer: "...", xTitle:
"..." })).


## Available Models

OpenRouter provides access to 300+ models from various providers. Models use the format `provider/model-name`:

```typescript
model: "openai/gpt-5.1"
model: "anthropic/claude-sonnet-4.5"
model: "google/gemini-3-pro-preview"
model: "meta-llama/llama-4-maverick"
model: "deepseek/deepseek-v3.2"
```

See the full list at [openrouter.ai/models](https://openrouter.ai/models).

## Example: Chat Completion

```typescript
import { chat, toStreamResponse } from "@tanstack/ai";
import { openrouter } from "@tanstack/ai-openrouter";

const adapter = openrouter();

export async function POST(request: Request) {
  const { messages } = await request.json();

  const stream = chat({
    adapter,
    messages,
    model: "openai/gpt-4o",
  });

  return toStreamResponse(stream);
}
```

## Example: With Tools

```typescript
import { chat, toolDefinition } from "@tanstack/ai";
import { openrouter } from "@tanstack/ai-openrouter";
import { z } from "zod";

const adapter = openrouter();

const getWeatherDef = toolDefinition({
  name: "get_weather",
  description: "Get the current weather",
  inputSchema: z.object({
    location: z.string(),
  }),
});

const getWeather = getWeatherDef.server(async ({ location }) => {
  return { temperature: 72, conditions: "sunny" };
});

const stream = chat({
  adapter,
  messages,
  model: "openai/gpt-4o",
  tools: [getWeather],
});
```

## Web Search

OpenRouter supports web search through the `plugins` configuration. This enables real-time web search capabilities for any model:

```typescript
const stream = chat({
  adapter,
  messages: [{ role: "user", content: "What's the latest AI news?" }],
  model: "openai/gpt-4o-mini",
  providerOptions: {
    plugins: [
      {
        id: "web",
        engine: "exa", // "native" or "exa"
        max_results: 5, // default: 5
      },
    ],
  },
});
```

Alternatively, use the `:online` model suffix:

```typescript
const stream = chat({
  adapter,
  messages,
  model: "openai/gpt-4o-mini:online",
});
```

## Provider Options

OpenRouter accepts an extensive set of options via `providerOptions`, including sampling parameters, response formatting, and routing preferences:

```typescript
const stream = chat({
  adapter,
  messages,
  model: "openai/gpt-4o",
  providerOptions: {
    temperature: 0.7,
    max_tokens: 1000,
    top_p: 0.9,
    top_k: 40,
    frequency_penalty: 0.5,
    presence_penalty: 0.5,
    repetition_penalty: 1.1,
    seed: 42,
    tool_choice: "auto",
    response_format: { type: "json_object" },
    // Routing options
    models: ["openai/gpt-4o", "anthropic/claude-3.5-sonnet"], // Fallback models
    route: "fallback",
    // Provider preferences
    provider: {
      order: ["OpenAI", "Anthropic"],
      allow_fallbacks: true,
    },
  },
});
```

## Environment Variables

Set your API key in environment variables:

```bash
OPENROUTER_API_KEY=sk-or-...
```
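
With the variable set, `openrouter()` picks the key up automatically (see the API Reference below); `createOpenRouter` is the explicit alternative. A minimal sketch of both factory calls, assuming the key is exported in your shell or `.env`:

```typescript
import { createOpenRouter, openrouter } from "@tanstack/ai-openrouter";

// Key is detected automatically from OPENROUTER_API_KEY.
const adapter = openrouter();

// Equivalent, with the key passed explicitly.
const explicitAdapter = createOpenRouter(process.env.OPENROUTER_API_KEY!);
```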

## Model Routing

OpenRouter can automatically route requests to the best available provider:

```typescript
const stream = chat({
  adapter,
  messages,
  model: "openrouter/auto", // Automatic model selection
  providerOptions: {
    models: [
      "openai/gpt-4o",
      "anthropic/claude-3.5-sonnet",
      "google/gemini-pro",
    ],
    route: "fallback", // Use fallback if primary fails
  },
});
```

## API Reference

### `openrouter(config?)`

Creates an OpenRouter adapter with automatic API key detection from `OPENROUTER_API_KEY`.

**Parameters:**

- `config.baseURL?` - Custom base URL (optional)
- `config.httpReferer?` - HTTP Referer header for rankings (optional)
- `config.xTitle?` - X-Title header for rankings (optional)

**Returns:** An OpenRouter adapter instance.
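
For example, attaching the optional ranking headers while still relying on `OPENROUTER_API_KEY` for the key (a sketch using only the parameters listed above; the header values are placeholders):

```typescript
import { openrouter } from "@tanstack/ai-openrouter";

// API key is detected from OPENROUTER_API_KEY; only the optional headers are set here.
const adapter = openrouter({
  httpReferer: "https://your-app.com",
  xTitle: "Your App Name",
});
```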

### `createOpenRouter(apiKey, config?)`

Creates an OpenRouter adapter with an explicit API key.

**Parameters:**

- `apiKey` - OpenRouter API key (required)
- `config.baseURL?` - Custom base URL (optional)
- `config.httpReferer?` - HTTP Referer header (optional)
- `config.xTitle?` - X-Title header (optional)

**Returns:** An OpenRouter adapter instance.
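
A sketch for when the key does not live in `OPENROUTER_API_KEY`, for example when it comes from your own configuration layer (the variable name below is only an assumption):

```typescript
import { createOpenRouter } from "@tanstack/ai-openrouter";

// Hypothetical: key sourced from a differently named variable or a secrets store.
const apiKey = process.env.MY_OPENROUTER_KEY;
if (!apiKey) throw new Error("Missing OpenRouter API key");

const adapter = createOpenRouter(apiKey, {
  httpReferer: "https://your-app.com", // optional, for rankings
});
```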

## Next Steps

- [Getting Started](../getting-started/quick-start) - Learn the basics
- [Tools Guide](../guides/tools) - Learn about tools
- [Other Adapters](./openai) - Explore other providers

4 changes: 4 additions & 0 deletions docs/config.json
@@ -100,6 +100,10 @@
"label": "OpenAI",
"to": "adapters/openai"
},
{
"label": "OpenRouter",
"to": "adapters/openrouter"
},
{
"label": "Anthropic",
"to": "adapters/anthropic"
packages/typescript/ai-openrouter/README.md
@@ -38,12 +38,39 @@
A powerful, type-safe AI SDK for building AI-powered applications.

- Provider-agnostic adapters (OpenAI, Anthropic, Gemini, Ollama, etc.)
- **Multimodal content support** - Send images, audio, video, and documents
- Chat completion, streaming, and agent loop strategies
- Headless chat state management with adapters (SSE, HTTP stream, custom)
- Type-safe tools with server/client execution
- Isomorphic type-safe tools with server/client execution
- **Enhanced integration with TanStack Start** - Share implementations between AI tools and server functions

## <a href="https://tanstack.com/ai"><b>Read the docs →</b></a>

## Bonus: TanStack Start Integration

TanStack AI works with **any** framework (Next.js, Express, Remix, etc.).

**With TanStack Start**, you get a bonus: share implementations between AI tools and server functions with `createServerFnTool`:

```typescript
import { createServerFnTool } from '@tanstack/ai-react'
import { z } from 'zod'

// Define once, get AI tool AND server function (TanStack Start only)
const getProducts = createServerFnTool({
  name: 'getProducts',
  inputSchema: z.object({ query: z.string() }),
  execute: async ({ query }) => db.products.search(query),
})

// Use in AI chat
chat({ tools: [getProducts.server] })

// Call directly from components (no API endpoint needed!)
const products = await getProducts.serverFn({ query: 'laptop' })
```

No duplicate logic, full type safety, automatic validation. The `serverFn` feature requires TanStack Start. See [docs](https://tanstack.com/ai) for details.

Comment on lines +49 to +73

⚠️ Potential issue | 🟡 Minor

Fix markdown heading level + code sample missing z import.

  • Markdownlint is right: after `# TanStack AI`, the next heading level should be `##` (or drop the heading styling for the “Read the docs” line).
  • The snippet uses `z.object(...)` but doesn’t import `z`.

````diff
-### <a href="https://tanstack.com/ai">Read the docs →</b></a>
+## <a href="https://tanstack.com/ai">Read the docs →</b></a>
@@
 ```typescript
 import { createServerFnTool } from '@tanstack/ai-react'
+import { z } from 'zod'
````
🤖 Prompt for AI Agents
In packages/typescript/ai-openrouter/README.md around lines 49 to 73, adjust the
markdown heading to ensure it is a second-level heading (##) immediately after
the top-level "# TanStack AI" header (or remove heading styling for the “Read
the docs” line) and update the code example to include the missing zod import by
adding an import for z (import { z } from 'zod') alongside the existing
createServerFnTool import so the z.object usage resolves.

## Get Involved

- We welcome issues and pull requests!
@@ -88,7 +115,7 @@ We're looking for TanStack AI Partners to join our mission! Partner with us to p

- <a href="https://github.com/tanstack/config"><b>TanStack Config</b></a> – Tooling for JS/TS packages
- <a href="https://github.com/tanstack/db"><b>TanStack DB</b></a> – Reactive sync client store
- <a href="https://github.com/tanstack/devtools">TanStack Devtools</a> – Unified devtools panel
- <a href="https://github.com/tanstack/devtools"><b>TanStack Devtools</b></a> – Unified devtools panel
- <a href="https://github.com/tanstack/form"><b>TanStack Form</b></a> – Type‑safe form state
- <a href="https://github.com/tanstack/pacer"><b>TanStack Pacer</b></a> – Debouncing, throttling, batching
- <a href="https://github.com/tanstack/query"><b>TanStack Query</b></a> – Async state & caching