Use @ai-stats/ai-sdk-provider to connect the Vercel AI SDK to the AI Stats Gateway.

Install

npm install @ai-stats/ai-sdk-provider ai@^6
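
The provider needs an AI Stats API key. The examples below assume one is already available to the default aiStats export; if you need to supply it explicitly (for example from the AI_STATS_API_KEY environment variable), see the custom configuration section at the end of this page.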

Basic text generation

import { aiStats } from "@ai-stats/ai-sdk-provider";
import { generateText } from "ai";

const result = await generateText({
  model: aiStats("openai/gpt-5-nano"),
  prompt: "Summarise the benefits of vector search.",
});

console.log(result.text);
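
The result object carries more than the text. A minimal sketch, assuming the usage and finishReason fields exposed by recent AI SDK releases (exact field names can differ between major versions):

import { aiStats } from "@ai-stats/ai-sdk-provider";
import { generateText } from "ai";

const result = await generateText({
  model: aiStats("openai/gpt-5-nano"),
  prompt: "Summarise the benefits of vector search.",
});

// Token accounting and the reason generation stopped; treat the field
// names as illustrative, since they vary between AI SDK major versions.
console.log(result.usage.totalTokens);
console.log(result.finishReason);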

Streaming

import { aiStats } from "@ai-stats/ai-sdk-provider";
import { streamText } from "ai";

const { textStream } = await streamText({
  model: aiStats("openai/gpt-5-nano"),
  prompt: "Stream a short greeting.",
});

for await (const chunk of textStream) {
  process.stdout.write(chunk);
}
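
If you also want the assembled response once streaming finishes, the streamText result exposes it as a promise. A minimal sketch, assuming the text promise available on recent AI SDK releases:

import { aiStats } from "@ai-stats/ai-sdk-provider";
import { streamText } from "ai";

const result = await streamText({
  model: aiStats("openai/gpt-5-nano"),
  prompt: "Stream a short greeting.",
});

// Forward chunks as they arrive...
for await (const chunk of result.textStream) {
  process.stdout.write(chunk);
}

// ...then read the full text after the stream has completed.
console.log(await result.text);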

Embeddings

import { aiStats } from "@ai-stats/ai-sdk-provider";
import { embed } from "ai";

const { embedding } = await embed({
  model: aiStats.textEmbeddingModel("google/gemini-embedding-001"),
  value: "Vector search uses embeddings to find similar items.",
});

console.log(embedding.length);
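
To embed several values at once, the AI SDK also ships embedMany, plus a cosineSimilarity helper for comparing vectors. A sketch, assuming those helpers from the ai package work unchanged with the AI Stats provider:

import { aiStats } from "@ai-stats/ai-sdk-provider";
import { cosineSimilarity, embedMany } from "ai";

const { embeddings } = await embedMany({
  model: aiStats.textEmbeddingModel("google/gemini-embedding-001"),
  values: [
    "Vector search uses embeddings to find similar items.",
    "Embeddings place text as points in a shared vector space.",
  ],
});

// Values closer to 1 indicate more similar meaning.
console.log(cosineSimilarity(embeddings[0], embeddings[1]));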

Custom configuration

import { createAIStats } from "@ai-stats/ai-sdk-provider";
import { generateText } from "ai";

const aiStats = createAIStats({
  apiKey: process.env.AI_STATS_API_KEY,
  baseURL: process.env.AI_STATS_BASE_URL,
});

const result = await generateText({
  model: aiStats("openai/gpt-5-nano"),
  prompt: "Hello from a custom gateway.",
});

console.log(result.text);
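
createAIStats is only needed when the defaults do not fit, for example to point at a different gateway deployment via baseURL; reading both values from environment variables, as above, keeps credentials out of source control.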