# Headless SDK

Use Satori with existing OpenAI-compatible clients.
## Overview
If you already use OpenAI-compatible clients, you can point them at Satori by changing only environment variables; no additional dependencies are required.
## Env
```bash
export OPENAI_BASE_URL="https://api.satori.sh/v1"
export OPENAI_API_KEY=<openai-api-key>
export SATORI_API_KEY=<satori-key>
export MODEL=<model name, e.g. 'gpt-4o'>
```

For local dev:

```bash
export OPENAI_BASE_URL="http://localhost:8000/v1"
```

If you want memory scoping, or you already have a memory layer you're using, set an ID and pass it through request metadata:

```bash
export SATORI_MEMORY_ID=<some-memory-id>
```

## OpenAI SDK example (unchanged client)
```ts
import OpenAI from "openai";

const client = new OpenAI();

const response = await client.responses.create({
  model: process.env.MODEL ?? "gpt-4o",
  input: "Write a two-sentence bedtime story about a friendly dragon.",
  metadata: {
    memory_id: process.env.SATORI_MEMORY_ID,
  },
});
```

## AI SDK example (OpenAI provider)
```ts
import { generateText } from "ai";
import { openai } from "@ai-sdk/openai";

const result = await generateText({
  model: openai(process.env.MODEL ?? "gpt-4o"),
  prompt: "Write a two-sentence bedtime story about a friendly dragon.",
});
```