-
-
-
-
-
-
-#### Videos
-
-Use the HTML video element for self-hosted video content:
-
-
-
-Embed YouTube videos using iframe elements:
-
-
-
-#### Tooltips
-
-Example of tooltip usage:
-
-
-
-
-## Code formatting
-
-We suggest using extensions in your IDE to recognize and format MDX. If you're a VSCode user, consider the [MDX VSCode extension](https://marketplace.visualstudio.com/items?itemName=unifiedjs.vscode-mdx) for syntax highlighting, and [Prettier](https://marketplace.visualstudio.com/items?itemName=esbenp.prettier-vscode) for code formatting.
-
-## Troubleshooting
-
-
-
-## Image
-
-### Using Markdown
-
-The [markdown syntax](https://www.markdownguide.org/basic-syntax/#images) lets you add images using the following code:
-
-```md
-
-```
-
-Note that the image file size must be less than 5MB. If your image is larger, we recommend hosting it on a service like [Cloudinary](https://cloudinary.com/) or [S3](https://aws.amazon.com/s3/) and embedding that URL instead.
-
-### Using embeds
-
-For more control over how images are displayed, you can also use [embeds](/writing-content/embed) to add images:
-
-```html
-
-```
-
-## Embeds and HTML elements
-
-
-
-... snippet content ...
-Agent[] |
+
+#### Examples
+
+```javascript
+// Write a log entry
+await client.app.log({
+ body: {
+ service: "my-app",
+ level: "info",
+ message: "Operation completed",
+ },
+})
+
+// List available agents
+const agents = await client.app.agents()
+```
+
+### Project
+
+| Method | Description | Response |
+| :------------------ | :------------------ | :-------------------------------------------- |
+| `project.list()` | List all projects | Project[] |
+| `project.current()` | Get current project | Project |
+
+#### Examples
+
+```javascript
+// List all projects
+const projects = await client.project.list()
+
+// Get current project
+const currentProject = await client.project.current()
+```
+
+### Path
+
+| Method | Description | Response |
+| :----------- | :--------------- | :--------------------------------------- |
+| `path.get()` | Get current path | Path |
+
+#### Examples
+
+```javascript
+// Get current path information
+const pathInfo = await client.path.get()
+```
+
+### Config
+
+| Method | Description | Response |
+| :------------------- | :-------------------------------- | :---------------------------------------------------------------------------------------------------- |
+| `config.get()` | Get config info | Config |
+| `config.providers()` | List providers and default models | `{ providers: Provider[], default: { [key: string]: string } }` |
+
+#### Examples
+
+```javascript
+const config = await client.config.get()
+
+const { providers, default: defaults } = await client.config.providers()
+```
+
+### Sessions
+
+| Method | Description | Notes |
+| :--------------------------------------------------------- | :--------------------------------- | :--------------------------------------------------------------------------------------------------------------------------------------------- |
+| `session.list()` | List sessions | Returns Session[] |
+| `session.get({ path })` | Get session | Returns Session |
+| `session.children({ path })` | List child sessions | Returns Session[] |
+| `session.create({ body })` | Create session | Returns Session |
+| `session.delete({ path })` | Delete session | Returns `boolean` |
+| `session.update({ path, body })` | Update session properties | Returns Session |
+| `session.init({ path, body })` | Analyze app and create `AGENTS.md` | Returns `boolean` |
+| `session.abort({ path })` | Abort a running session | Returns `boolean` |
+| `session.share({ path })` | Share session | Returns Session |
+| `session.unshare({ path })` | Unshare session | Returns Session |
+| `session.summarize({ path, body })` | Summarize session | Returns `boolean` |
+| `session.messages({ path })` | List messages in a session | Returns `{ info: Message, parts: Part[] }[]` |
+| `session.message({ path })` | Get message details | Returns `{ info: Message, parts: Part[] }` |
+| `session.prompt({ path, body })` | Send prompt message | `body.noReply: true` returns UserMessage (context only). Default returns AssistantMessage with AI response |
+| `session.command({ path, body })` | Send command to session | Returns `{ info: AssistantMessage, parts: Part[] }` |
+| `session.shell({ path, body })` | Run a shell command | Returns AssistantMessage |
+| `session.revert({ path, body })` | Revert a message | Returns Session |
+| `session.unrevert({ path })` | Restore reverted messages | Returns Session |
+| `postSessionByIdPermissionsByPermissionId({ path, body })` | Respond to a permission request | Returns `boolean` |
+
+#### Examples
+
+```javascript expandable
+// Create and manage sessions
+const session = await client.session.create({
+ body: { title: "My session" },
+})
+
+const sessions = await client.session.list()
+
+// Send a prompt message
+const result = await client.session.prompt({
+ path: { id: session.id },
+ body: {
+ model: { providerID: "anthropic", modelID: "claude-3-5-sonnet-20241022" },
+ parts: [{ type: "text", text: "Hello!" }],
+ },
+})
+
+// Inject context without triggering AI response (useful for plugins)
+await client.session.prompt({
+ path: { id: session.id },
+ body: {
+ noReply: true,
+ parts: [{ type: "text", text: "You are a helpful assistant." }],
+ },
+})
+```
+
+### Files
+
+| Method | Description | Response |
+| :------------------------ | :--------------------------- | :------------------------------------------------------------------------------------------ |
+| `find.text({ query })` | Search for text in files | Array of match objects with `path`, `lines`, `line_number`, `absolute_offset`, `submatches` |
+| `find.files({ query })` | Find files by name | `string[]` (file paths) |
+| `find.symbols({ query })` | Find workspace symbols | Symbol[] |
+| `file.read({ query })` | Read a file | `{ type: "raw" \| "patch", content: string }` |
+| `file.status({ query? })` | Get status for tracked files | File[] |
+
+#### Examples
+
+```javascript
+// Search and read files
+const textResults = await client.find.text({
+ query: { pattern: "function.*opencode" },
+})
+
+const files = await client.find.files({
+ query: { query: "*.ts" },
+})
+
+const content = await client.file.read({
+ query: { path: "src/index.ts" },
+})
+```
+
+### TUI
+
+| Method | Description | Response |
+| :----------------------------- | :------------------------ | :-------- |
+| `tui.appendPrompt({ body })` | Append text to the prompt | `boolean` |
+| `tui.openHelp()` | Open the help dialog | `boolean` |
+| `tui.openSessions()` | Open the session selector | `boolean` |
+| `tui.openThemes()` | Open the theme selector | `boolean` |
+| `tui.openModels()` | Open the model selector | `boolean` |
+| `tui.submitPrompt()` | Submit the current prompt | `boolean` |
+| `tui.clearPrompt()` | Clear the prompt | `boolean` |
+| `tui.executeCommand({ body })` | Execute a command | `boolean` |
+| `tui.showToast({ body })` | Show toast notification | `boolean` |
+
+#### Examples
+
+```javascript
+// Control TUI interface
+await client.tui.appendPrompt({
+ body: { text: "Add this to prompt" },
+})
+
+await client.tui.showToast({
+ body: { message: "Task completed", variant: "success" },
+})
+```
+
+### Auth
+
+| Method | Description | Response |
+| :------------------ | :----------------------------- | :-------- |
+| `auth.set({ ... })` | Set authentication credentials | `boolean` |
+
+#### Examples
+
+```javascript
+await client.auth.set({
+ path: { id: "anthropic" },
+ body: { type: "api", key: "your-api-key" },
+})
+```
+
+### Events
+
+| Method | Description | Response |
+| :------------------ | :------------------------ | :------------------------ |
+| `event.subscribe()` | Subscribe to the event stream | Server-sent events stream |
+
+#### Examples
+
+```javascript
+// Listen to real-time events
+const events = await client.event.subscribe()
+for await (const event of events.stream) {
+ console.log("Event:", event.type, event.properties)
+}
+```
diff --git a/packages/docs/server.mdx b/packages/docs/server.mdx
new file mode 100644
index 00000000000..2c440bd8032
--- /dev/null
+++ b/packages/docs/server.mdx
@@ -0,0 +1,213 @@
+---
+title: Server
+description: Interact with the opencode server over HTTP.
+---
+
+The `opencode serve` command runs a headless HTTP server that exposes an OpenAPI endpoint that an opencode client can use.
+
+### Usage
+
+```bash
+opencode serve [--port <port>]
+```
+
+### Project
+
+| Method | Path | Description | Response |
+| :----- | :----------------- | :---------------------- | :-------- |
+| `GET` | `/project` | List all projects | Project[] |
+| `GET` | `/project/current` | Get the current project | Project |
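+
+As a quick sketch, assuming the server is listening on localhost port 4096 (substitute whatever host and `--port` you started `opencode serve` with), you can exercise these endpoints with `curl`:
+
+```bash
+# List all projects known to the server
+curl http://localhost:4096/project
+
+# Get the project for the current working directory
+curl http://localhost:4096/project/current
+```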
+
+
+### Path & VCS
+
+| Method | Path | Description | Response |
+| :----- | :------ | :----------------------------------- | :------------------------------------------ |
+| `GET` | `/path` | Get the current path | Path |
+| `GET` | `/vcs` | Get VCS info for the current project | VcsInfo |
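+
+Continuing the same localhost assumption, a minimal check of both endpoints looks like:
+
+```bash
+# Get the current path info
+curl http://localhost:4096/path
+
+# Get VCS info for the current project
+curl http://localhost:4096/vcs
+```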
+
+### Instance
+
+| Method | Path | Description | Response |
+| :----- | :------------------ | :--------------------------- | :-------- |
+| `POST` | `/instance/dispose` | Dispose the current instance | `boolean` |
+
+### Config
+
+| Method | Path | Description | Response |
+| :------ | :------------------ | :-------------------------------- | :--------------------------------------------------------------------------------------- |
+| `GET` | `/config` | Get config info | Config |
+| `PATCH` | `/config` | Update config | Config |
+| `GET` | `/config/providers` | List providers and default models | `{ providers: Provider[], default: { [key: string]: string } }` |
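+
+As a sketch of reading and updating the config over HTTP; the `PATCH` body must match the Config schema, and `theme` is only an illustrative field:
+
+```bash
+# Read the merged config
+curl http://localhost:4096/config
+
+# Update a single config field
+curl -X PATCH http://localhost:4096/config \
+  -H "Content-Type: application/json" \
+  -d '{"theme": "tokyonight"}'
+
+# List providers and default models
+curl http://localhost:4096/config/providers
+```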
+
+### Provider
+
+| Method | Path | Description | Response |
+| :----- | :------------------------------- | :----------------------------------- | :---------------------------------------------------------------------------------- |
+| `GET` | `/provider` | List all providers | `{ all: Provider[], default: {...}, connected: string[] }` |
+| `GET` | `/provider/auth` | Get provider authentication methods | `{ [providerID: string]: ProviderAuthMethod[] }` |
+| `POST` | `/provider/{id}/oauth/authorize` | Authorize a provider using OAuth | ProviderAuthAuthorization |
+| `POST` | `/provider/{id}/oauth/callback` | Handle OAuth callback for a provider | `boolean` |
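+
+The read-only provider endpoints can be probed the same way; the OAuth routes are omitted here since their request bodies are provider specific:
+
+```bash
+# List all providers, defaults, and connected providers
+curl http://localhost:4096/provider
+
+# List the authentication methods each provider supports
+curl http://localhost:4096/provider/auth
+```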
+
+### Sessions
+
+| Method | Path | Description | Notes |
+| :------- | :--------------------------------------- | :------------------------------------ | :--------------------------------------------------------------------------------- |
+| `GET` | `/session` | List all sessions | Returns Session[] |
+| `POST` | `/session` | Create a new session | body: `{ parentID?, title? }`, returns Session |
+| `GET` | `/session/status` | Get session status for all sessions | Returns `{ [sessionID: string]: SessionStatus }` |
+| `GET` | `/session/:id` | Get session details | Returns Session |
+| `DELETE` | `/session/:id` | Delete a session and all its data | Returns `boolean` |
+| `PATCH` | `/session/:id` | Update session properties | body: `{ title? }`, returns Session |
+| `GET` | `/session/:id/children` | Get a session's child sessions | Returns Session[] |
+| `GET` | `/session/:id/todo` | Get the todo list for a session | Returns Todo[] |
+| `POST` | `/session/:id/init` | Analyze app and create `AGENTS.md` | body: `{ messageID, providerID, modelID }`, returns `boolean` |
+| `POST` | `/session/:id/fork` | Fork an existing session at a message | body: `{ messageID? }`, returns Session |
+| `POST` | `/session/:id/abort` | Abort a running session | Returns `boolean` |
+| `POST` | `/session/:id/share` | Share a session | Returns Session |
+| `DELETE` | `/session/:id/share` | Unshare a session | Returns Session |
+| `GET` | `/session/:id/diff` | Get the diff for this session | query: `messageID?`, returns FileDiff[] |
+| `POST` | `/session/:id/summarize` | Summarize the session | body: `{ providerID, modelID }`, returns `boolean` |
+| `POST` | `/session/:id/revert` | Revert a message | body: `{ messageID, partID? }`, returns `boolean` |
+| `POST` | `/session/:id/unrevert` | Restore all reverted messages | Returns `boolean` |
+| `POST` | `/session/:id/permissions/:permissionID` | Respond to a permission request | body: `{ response, remember? }`, returns `boolean` |
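+
+Here's a rough end-to-end sketch, where `ses_123` stands in for a real session ID returned by the create call:
+
+```bash
+# Create a session
+curl -X POST http://localhost:4096/session \
+  -H "Content-Type: application/json" \
+  -d '{"title": "My session"}'
+
+# List all sessions
+curl http://localhost:4096/session
+
+# Share, then unshare, the session
+curl -X POST http://localhost:4096/session/ses_123/share
+curl -X DELETE http://localhost:4096/session/ses_123/share
+
+# Abort the session if it's still running
+curl -X POST http://localhost:4096/session/ses_123/abort
+```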
+
+### Messages
+
+| Method | Path | Description | Notes |
+| :----- | :--------------------------------- | :-------------------------------------- | :-------------------------------------------------------------------------------------------------------------------------------------------------------------------- |
+| `GET` | `/session/:id/message` | List messages in a session | query: `limit?`, returns `{ info: Message, parts: Part[] }[]` |
+| `POST` | `/session/:id/message` | Send a message and wait for response | body: `{ messageID?, model?, agent?, noReply?, system?, tools?, parts }`, returns `{ info: Message, parts: Part[] }` |
+| `GET` | `/session/:id/message/:messageID` | Get message details | Returns `{ info: Message, parts: Part[] }` |
+| `POST` | `/session/:id/prompt_async` | Send a message asynchronously (no wait) | body: same as `/session/:id/message`, returns `204 No Content` |
+| `POST` | `/session/:id/command` | Execute a slash command | body: `{ messageID?, agent?, model?, command, arguments }`, returns `{ info: Message, parts: Part[] }` |
+| `POST` | `/session/:id/shell` | Run a shell command | body: `{ agent, model?, command }`, returns `{ info: Message, parts: Part[] }` |
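+
+For example, sending a prompt and waiting for the reply could look like the sketch below; the session ID and model are placeholders, and the body fields follow the table above:
+
+```bash
+# Send a message and wait for the assistant's response
+curl -X POST http://localhost:4096/session/ses_123/message \
+  -H "Content-Type: application/json" \
+  -d '{
+    "model": { "providerID": "anthropic", "modelID": "claude-3-5-sonnet-20241022" },
+    "parts": [{ "type": "text", "text": "Hello!" }]
+  }'
+
+# List the messages in the session
+curl http://localhost:4096/session/ses_123/message
+```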
+
+### Commands
+
+| Method | Path | Description | Response |
+| :----- | :--------- | :---------------- | :-------------------------------------------- |
+| `GET` | `/command` | List all commands | Command[] |
+
+### Files
+
+| Method | Path | Description | Response |
+| :----- | :----------------------- | :--------------------------- | :------------------------------------------------------------------------------------------ |
+| `GET` | `/find?pattern=` | Find files by name | `string[]` (file paths) |
+| `GET` | `/find/symbol?query=` | Find workspace symbols | Symbol[] |
+| `GET` | `/file?path=` | List files and directories | FileNode[] |
+| `GET` | `/file/content?path=` | Read a file | FileContent |
+| `GET` | `/file/status` | Get status for tracked files | File[] |
+
+### Tools (Experimental)
+
+| Method | Path | Description | Response |
+| :----- | :--- | :---------- | :------- |
+| `GET` | `/experimental/tool/ids` | List all tool IDs | ToolIDs |
+| `GET` | `/experimental/tool?provider=&model=` | List tools with JSON schemas for a model | ToolList |
+
+### LSP, Formatters & MCP
+
+| Method | Path | Description | Response |
+| :----- | :--- | :---------- | :------- |
+| `GET` | `/lsp` | Get LSP server status | LSPStatus[] |
+| `GET` | `/formatter` | Get formatter status | FormatterStatus[] |
+| `GET` | `/mcp` | Get MCP server status | `{ [name: string]: MCPStatus }` |
+| `POST` | `/mcp` | Add MCP server dynamically | body: `{ name, config }`, returns MCP status object |
+
+### Agents
+
+| Method | Path | Description | Response |
+| :----- | :--- | :---------- | :------- |
+| `GET` | `/agent` | List all available agents | Agent[] |
+
+### Logging
+
+| Method | Path | Description | Response |
+| :----- | :--- | :---------- | :------- |
+| `POST` | `/log` | Write log entry. Body: `{ service, level, message, extra? }` | `boolean` |
+
+### TUI
+
+| Method | Path | Description | Response |
+| :----- | :--- | :---------- | :------- |
+| `POST` | `/tui/append-prompt` | Append text to the prompt | `boolean` |
+| `POST` | `/tui/open-help` | Open the help dialog | `boolean` |
+| `POST` | `/tui/open-sessions` | Open the session selector | `boolean` |
+| `POST` | `/tui/open-themes` | Open the theme selector | `boolean` |
+| `POST` | `/tui/open-models` | Open the model selector | `boolean` |
+| `POST` | `/tui/submit-prompt` | Submit the current prompt | `boolean` |
+| `POST` | `/tui/clear-prompt` | Clear the prompt | `boolean` |
+| `POST` | `/tui/execute-command` | Execute a command (`{ command }`) | `boolean` |
+| `POST` | `/tui/show-toast` | Show toast (`{ title?, message, variant }`) | `boolean` |
+| `GET` | `/tui/control/next` | Wait for the next control request | Control request object |
+| `POST` | `/tui/control/response` | Respond to a control request (`{ body }`) | `boolean` |
+
+### Auth
+
+| Method | Path | Description | Response |
+| :----- | :--- | :---------- | :------- |
+| `PUT` | `/auth/:id` | Set authentication credentials. Body must match provider schema | `boolean` |
+
+### Events
+
+| Method | Path | Description | Response |
+| :----- | :--- | :---------- | :------- |
+| `GET` | `/event` | Server-sent events stream. First event is `server.connected`, then bus events | Server-sent events stream |
+
+### Docs
+
+| Method | Path | Description | Response |
+| :----- | :--- | :---------- | :------- |
+| `GET` | `/doc` | OpenAPI 3.1 specification | HTML page with OpenAPI spec |
diff --git a/packages/docs/share.mdx b/packages/docs/share.mdx
new file mode 100644
index 00000000000..4cf6a39f487
--- /dev/null
+++ b/packages/docs/share.mdx
@@ -0,0 +1,110 @@
+---
+title: Share
+description: Share your OpenCode conversations.
+---
+
+OpenCode's share feature allows you to create public links to your OpenCode conversations, so you can collaborate with teammates or get help from others.
+
+**NOTE**
+
+Shared conversations are publicly accessible to anyone with the link.
+
+## How it works
+
+When you share a conversation, OpenCode:
+
+1. Creates a unique public URL for your session
+2. Syncs your conversation history to our servers
+3.
Makes the conversation accessible via the shareable link — `opncd.ai/s/` + +## Sharing + +OpenCode supports three sharing modes that control how conversations are shared: + +### Manual (default) + +By default, OpenCode uses manual sharing mode. Sessions are not shared automatically, but you can manually share them using the `/share` command: + +``` +/share +``` + +This will generate a unique URL that'll be copied to your clipboard. + +To explicitly set manual mode in your [config file](/config): + +```json opencode.json +{ + "$schema": "https://opncd.ai/config.json", + "share": "manual" +} +``` + +### Auto-share + +You can enable automatic sharing for all new conversations by setting the `share` option to `"auto"` in your [config file](/config): + +```json opencode.json +{ + "$schema": "https://opncd.ai/config.json", + "share": "auto" +} +``` + +With auto-share enabled, every new conversation will automatically be shared and a link will be generated. + +### Disabled + +You can disable sharing entirely by setting the `share` option to `"disabled"` in your [config file](/config): + +```json opencode.json +{ + "$schema": "https://opncd.ai/config.json", + "share": "disabled" +} +``` + +To enforce this across your team for a given project, add it to the `opencode.json` in your project and check into Git. + +## Un-sharing + +To stop sharing a conversation and remove it from public access: + +```bash +/unshare +``` + +This will remove the share link and delete the data related to the conversation. + +## Privacy + +There are a few things to keep in mind when sharing a conversation. + +### Data retention + +Shared conversations remain accessible until you explicitly unshare them. This +includes: + +- Full conversation history +- All messages and responses +- Session metadata + +### Recommendations + +- Only share conversations that don't contain sensitive information. +- Review conversation content before sharing. +- Unshare conversations when collaboration is complete. +- Avoid sharing conversations with proprietary code or confidential data. +- For sensitive projects, disable sharing entirely. + +## For enterprises + +For enterprise deployments, the share feature can be: + +- **Disabled** entirely for security compliance +- **Restricted** to users authenticated through SSO only +- **Self-hosted** on your own infrastructure + +[Learn more](/enterprise) about using opencode in your organization. diff --git a/packages/docs/snippets/snippet-intro.mdx b/packages/docs/snippets/snippet-intro.mdx deleted file mode 100644 index e20fbb6fc93..00000000000 --- a/packages/docs/snippets/snippet-intro.mdx +++ /dev/null @@ -1,4 +0,0 @@ -One of the core principles of software development is DRY (Don't Repeat -Yourself). This is a principle that applies to documentation as -well. If you find yourself repeating the same content in multiple places, you -should consider creating a custom snippet to keep your content in sync. diff --git a/packages/docs/themes.mdx b/packages/docs/themes.mdx new file mode 100644 index 00000000000..0ab731e3e52 --- /dev/null +++ b/packages/docs/themes.mdx @@ -0,0 +1,349 @@ +--- +title: Themes +description: Select a built-in theme or define your own. +--- + +With OpenCode you can select from one of several built-in themes, use a theme that adapts to your terminal theme, or define your own custom theme. + +By default, OpenCode uses our own `opencode` theme. 
+ +## Terminal requirements + +For themes to display correctly with their full color palette, your terminal must support **truecolor** (24-bit color). Most modern terminals support this by default, but you may need to enable it: + +- **Check support**: Run `echo $COLORTERM` - it should output `truecolor` or `24bit` +- **Enable truecolor**: Set the environment variable `COLORTERM=truecolor` in your shell profile +- **Terminal compatibility**: Ensure your terminal emulator supports 24-bit color (most modern terminals like iTerm2, Alacritty, Kitty, Windows Terminal, and recent versions of GNOME Terminal do) + +Without truecolor support, themes may appear with reduced color accuracy or fall back to the nearest 256-color approximation. + +## Built-in themes + +OpenCode comes with several built-in themes. + +| Name | Description | +| :--------------------- | :--------------------------------------------------------------------------- | +| `system` | Adapts to your terminal’s background color | +| `tokyonight` | Based on the [Tokyonight](https://github.com/folke/tokyonight.nvim) theme | +| `everforest` | Based on the [Everforest](https://github.com/sainnhe/everforest) theme | +| `ayu` | Based on the [Ayu](https://github.com/ayu-theme) dark theme | +| `catppuccin` | Based on the [Catppuccin](https://github.com/catppuccin) theme | +| `catppuccin-macchiato` | Based on the [Catppuccin](https://github.com/catppuccin) theme | +| `gruvbox` | Based on the [Gruvbox](https://github.com/morhetz/gruvbox) theme | +| `kanagawa` | Based on the [Kanagawa](https://github.com/rebelot/kanagawa.nvim) theme | +| `nord` | Based on the [Nord](https://github.com/nordtheme/nord) theme | +| `matrix` | Hacker-style green on black theme | +| `one-dark` | Based on the [Atom One](https://github.com/Th3Whit3Wolf/one-nvim) Dark theme | + +And more, we are constantly adding new themes. + +## System theme + +The `system` theme is designed to automatically adapt to your terminal's color scheme. Unlike traditional themes that use fixed colors, the _system_ theme: + +- **Generates gray scale**: Creates a custom gray scale based on your terminal's background color, ensuring optimal contrast. +- **Uses ANSI colors**: Leverages standard ANSI colors (0-15) for syntax highlighting and UI elements, which respect your terminal's color palette. +- **Preserves terminal defaults**: Uses `none` for text and background colors to maintain your terminal's native appearance. + +The system theme is for users who: + +- Want OpenCode to match their terminal's appearance +- Use custom terminal color schemes +- Prefer a consistent look across all terminal applications + +## Using a theme + +You can select a theme by bringing up the theme select with the `/theme` command. Or you can specify it in your [config](/config). + +```json opencode.json highlight={3} +{ + "$schema": "https://opencode.ai/config.json", + "theme": "tokyonight" +} +``` + + +## Custom themes + +OpenCode supports a flexible JSON-based theme system that allows users to create and customize themes easily. + + +### Hierarchy + +Themes are loaded from multiple directories in the following order where later directories override earlier ones: + +1. **Built-in themes** - These are embedded in the binary +2. **User config directory** - Defined in `~/.config/opencode/themes/*.json` or `$XDG_CONFIG_HOME/opencode/themes/*.json` +3. **Project root directory** - Defined in the ` /.opencode/themes/*.json` +4. 
**Current working directory** - Defined in `./.opencode/themes/*.json` + +If multiple directories contain a theme with the same name, the theme from the directory with higher priority will be used. + +### Creating a theme + +To create a custom theme, create a JSON file in one of the theme directories. + +For user-wide themes: + +```bash +mkdir -p ~/.config/opencode/themes +vim ~/.config/opencode/themes/my-theme.json +``` + +And for project-specific themes. + +```bash +mkdir -p .opencode/themes +vim .opencode/themes/my-theme.json +``` + +### JSON format + +Themes use a flexible JSON format with support for: + +- **Hex colors**: `"#ffffff"` +- **ANSI colors**: `3` (0-255) +- **Color references**: `"primary"` or custom definitions +- **Dark/light variants**: `{"dark": "#000", "light": "#fff"}` +- **No color**: `"none"` - Uses the terminal's default color or transparent + +### Color definitions + +The `defs` section is optional and it allows you to define reusable colors that can be referenced in the theme. + +### Terminal defaults + +The special value `"none"` can be used for any color to inherit the terminal's default color. This is particularly useful for creating themes that blend seamlessly with your terminal's color scheme: + +- `"text": "none"` - Uses terminal's default foreground color +- `"background": "none"` - Uses terminal's default background color + +### Example + +Here's an example of a custom theme: + +```json my-theme.json expandable +{ + "$schema": "https://opencode.ai/theme.json", + "defs": { + "nord0": "#2E3440", + "nord1": "#3B4252", + "nord2": "#434C5E", + "nord3": "#4C566A", + "nord4": "#D8DEE9", + "nord5": "#E5E9F0", + "nord6": "#ECEFF4", + "nord7": "#8FBCBB", + "nord8": "#88C0D0", + "nord9": "#81A1C1", + "nord10": "#5E81AC", + "nord11": "#BF616A", + "nord12": "#D08770", + "nord13": "#EBCB8B", + "nord14": "#A3BE8C", + "nord15": "#B48EAD" + }, + "theme": { + "primary": { + "dark": "nord8", + "light": "nord10" + }, + "secondary": { + "dark": "nord9", + "light": "nord9" + }, + "accent": { + "dark": "nord7", + "light": "nord7" + }, + "error": { + "dark": "nord11", + "light": "nord11" + }, + "warning": { + "dark": "nord12", + "light": "nord12" + }, + "success": { + "dark": "nord14", + "light": "nord14" + }, + "info": { + "dark": "nord8", + "light": "nord10" + }, + "text": { + "dark": "nord4", + "light": "nord0" + }, + "textMuted": { + "dark": "nord3", + "light": "nord1" + }, + "background": { + "dark": "nord0", + "light": "nord6" + }, + "backgroundPanel": { + "dark": "nord1", + "light": "nord5" + }, + "backgroundElement": { + "dark": "nord1", + "light": "nord4" + }, + "border": { + "dark": "nord2", + "light": "nord3" + }, + "borderActive": { + "dark": "nord3", + "light": "nord2" + }, + "borderSubtle": { + "dark": "nord2", + "light": "nord3" + }, + "diffAdded": { + "dark": "nord14", + "light": "nord14" + }, + "diffRemoved": { + "dark": "nord11", + "light": "nord11" + }, + "diffContext": { + "dark": "nord3", + "light": "nord3" + }, + "diffHunkHeader": { + "dark": "nord3", + "light": "nord3" + }, + "diffHighlightAdded": { + "dark": "nord14", + "light": "nord14" + }, + "diffHighlightRemoved": { + "dark": "nord11", + "light": "nord11" + }, + "diffAddedBg": { + "dark": "#3B4252", + "light": "#E5E9F0" + }, + "diffRemovedBg": { + "dark": "#3B4252", + "light": "#E5E9F0" + }, + "diffContextBg": { + "dark": "nord1", + "light": "nord5" + }, + "diffLineNumber": { + "dark": "nord2", + "light": "nord4" + }, + "diffAddedLineNumberBg": { + "dark": "#3B4252", + "light": "#E5E9F0" + }, + 
"diffRemovedLineNumberBg": { + "dark": "#3B4252", + "light": "#E5E9F0" + }, + "markdownText": { + "dark": "nord4", + "light": "nord0" + }, + "markdownHeading": { + "dark": "nord8", + "light": "nord10" + }, + "markdownLink": { + "dark": "nord9", + "light": "nord9" + }, + "markdownLinkText": { + "dark": "nord7", + "light": "nord7" + }, + "markdownCode": { + "dark": "nord14", + "light": "nord14" + }, + "markdownBlockQuote": { + "dark": "nord3", + "light": "nord3" + }, + "markdownEmph": { + "dark": "nord12", + "light": "nord12" + }, + "markdownStrong": { + "dark": "nord13", + "light": "nord13" + }, + "markdownHorizontalRule": { + "dark": "nord3", + "light": "nord3" + }, + "markdownListItem": { + "dark": "nord8", + "light": "nord10" + }, + "markdownListEnumeration": { + "dark": "nord7", + "light": "nord7" + }, + "markdownImage": { + "dark": "nord9", + "light": "nord9" + }, + "markdownImageText": { + "dark": "nord7", + "light": "nord7" + }, + "markdownCodeBlock": { + "dark": "nord4", + "light": "nord0" + }, + "syntaxComment": { + "dark": "nord3", + "light": "nord3" + }, + "syntaxKeyword": { + "dark": "nord9", + "light": "nord9" + }, + "syntaxFunction": { + "dark": "nord8", + "light": "nord8" + }, + "syntaxVariable": { + "dark": "nord7", + "light": "nord7" + }, + "syntaxString": { + "dark": "nord14", + "light": "nord14" + }, + "syntaxNumber": { + "dark": "nord15", + "light": "nord15" + }, + "syntaxType": { + "dark": "nord7", + "light": "nord7" + }, + "syntaxOperator": { + "dark": "nord9", + "light": "nord9" + }, + "syntaxPunctuation": { + "dark": "nord4", + "light": "nord0" + } + } +} +``` diff --git a/packages/docs/tools.mdx b/packages/docs/tools.mdx new file mode 100644 index 00000000000..1fe104b34bd --- /dev/null +++ b/packages/docs/tools.mdx @@ -0,0 +1,292 @@ +--- +title: Tools +description: Manage the tools an LLM can use. +--- + +Tools allow the LLM to perform actions in your codebase. OpenCode comes with a set of built-in tools, but you can extend it with [custom tools](/custom-tools) or [MCP servers](/mcp-servers). + +By default, all tools are **enabled** and don't need permission to run. But you can configure this and control the [permissions](/permissions) through your config. + +## Configure + +You can configure tools globally or per agent. Agent-specific configs override global settings. + +By default, all tools are set to `true`. To disable a tool, set it to `false`. + + +### Global + +Disable or enable tools globally using the `tools` option. + +```json opencode.json +{ + "$schema": "https://opencode.ai/config.json", + "tools": { + "write": false, + "bash": false, + "webfetch": true + } +} +``` + +You can also use wildcards to control multiple tools at once. For example, to disable all tools from an MCP server: + +```json opencode.json +{ + "$schema": "https://opencode.ai/config.json", + "tools": { + "mymcp_*": false + } +} +``` + +### Per agent + +Override global tool settings for specific agents using the `tools` config in the agent definition. + +```json opencode.json highlight={3-6,9-12} +{ + "$schema": "https://opencode.ai/config.json", + "tools": { + "write": true, + "bash": true + }, + "agent": { + "plan": { + "tools": { + "write": false, + "bash": false + } + } + } +} +``` + +For example, here the `plan` agent overrides the global config to disable `write` and `bash` tools. + +You can also configure tools for agents in Markdown. 
+ +```markdown ~/.config/opencode/agent/readonly.md +--- +description: Read-only analysis agent +mode: subagent +tools: + write: false + edit: false + bash: false +--- + +Analyze code without making any modifications. +``` + +[Learn more](/agents#tools) about configuring tools per agent. + +## Built-in + +Here are all the built-in tools available in OpenCode. + +### bash + +Execute shell commands in your project environment. + +```json opencode.json highlight={4} +{ + "$schema": "https://opencode.ai/config.json", + "tools": { + "bash": true + } +} +``` + +This tool allows the LLM to run terminal commands like `npm install`, `git status`, or any other shell command. + +### edit + +Modify existing files using exact string replacements. + +```json opencode.json highlight={4} +{ + "$schema": "https://opencode.ai/config.json", + "tools": { + "edit": true + } +} +``` + +This tool performs precise edits to files by replacing exact text matches. It's the primary way the LLM modifies code. + +### write + +Create new files or overwrite existing ones. + +```json opencode.json highlight={4} +{ + "$schema": "https://opencode.ai/config.json", + "tools": { + "write": true + } +} +``` + +Use this to allow the LLM to create new files. It will overwrite existing files if they already exist. + +### read + +Read file contents from your codebase. + +```json opencode.json highlight={4} +{ + "$schema": "https://opencode.ai/config.json", + "tools": { + "read": true + } +} +``` + +This tool reads files and returns their contents. It supports reading specific line ranges for large files. + +### grep + +Search file contents using regular expressions. + +```json opencode.json highlight={4} +{ + "$schema": "https://opencode.ai/config.json", + "tools": { + "grep": true + } +} +``` + +Fast content search across your codebase. Supports full regex syntax and file pattern filtering. + +### glob + +Find files by pattern matching. + +```json opencode.json highlight={4} +{ + "$schema": "https://opencode.ai/config.json", + "tools": { + "glob": true + } +} +``` + +Search for files using glob patterns like `**/*.js` or `src/**/*.ts`. Returns matching file paths sorted by modification time. + +--- + +### list + +List files and directories in a given path. + +```json title="opencode.json" {4} +{ + "$schema": "https://opencode.ai/config.json", + "tools": { + "list": true + } +} +``` + +This tool lists directory contents. It accepts glob patterns to filter results. + +### patch + +Apply patches to files. + +```json title="opencode.json" {4} +{ + "$schema": "https://opencode.ai/config.json", + "tools": { + "patch": true + } +} +``` + +This tool applies patch files to your codebase. Useful for applying diffs and patches from various sources. + +### todowrite + +Manage todo lists during coding sessions. + +```json opencode.json highlight={4} +{ + "$schema": "https://opencode.ai/config.json", + "tools": { + "todowrite": true + } +} +``` + +Creates and updates task lists to track progress during complex operations. The LLM uses this to organize multi-step tasks. + + +**NOTE** + +This tool is disabled for subagents by default, but you can enable it manually. [Learn more](/agents#tools) + + +### todoread + +Read existing todo lists. + +```json opencode.json highlight={4} +{ + "$schema": "https://opencode.ai/config.json", + "tools": { + "todoread": true + } +} +``` + +Reads the current todo list state. Used by the LLM to track what tasks are pending or completed. 
+ ++**NOTE** + +This tool is disabled for subagents by default, but you can enable it manually. [Learn more](/agents#tools) + +### webfetch + +Fetch web content. + +```json opencode.json highlight={4} +{ + "$schema": "https://opencode.ai/config.json", + "tools": { + "webfetch": true + } +} +``` + +Allows the LLM to fetch and read web pages. Useful for looking up documentation or researching online resources. + +## Custom tools + +Custom tools let you define your own functions that the LLM can call. These are defined in your config file and can execute arbitrary code. + +[Learn more](/custom-tools) about creating custom tools. + +## MCP servers + +MCP (Model Context Protocol) servers allow you to integrate external tools and services. This includes database access, API integrations, and third-party services. + +[Learn more](/mcp-servers) about configuring MCP servers. + +## Internals + +Internally, tools like `grep`, `glob`, and `list` use [ripgrep](https://github.com/BurntSushi/ripgrep) under the hood. By default, ripgrep respects `.gitignore` patterns, which means files and directories listed in your `.gitignore` will be excluded from searches and listings. + +### Ignore patterns + +To include files that would normally be ignored, create a `.ignore` file in your project root. This file can explicitly allow certain paths. + +```text .ignore +!node_modules/ +!dist/ +!build/ +``` + +For example, this `.ignore` file allows ripgrep to search within `node_modules/`, `dist/`, and `build/` directories even if they're listed in `.gitignore`. diff --git a/packages/docs/troubleshooting.mdx b/packages/docs/troubleshooting.mdx new file mode 100644 index 00000000000..937f8601efd --- /dev/null +++ b/packages/docs/troubleshooting.mdx @@ -0,0 +1,147 @@ +--- +title: Troubleshooting +description: Common issues and how to resolve them. +--- + +To debug any issues with OpenCode, you can check the logs or the session data +that it stores locally. + +### Logs + +Log files are written to: + +- **macOS/Linux**: `~/.local/share/opencode/log/` +- **Windows**: `%USERPROFILE%\.local\share\opencode\log\` + +Log files are named with timestamps (e.g., `2025-01-09T123456.log`) and the most recent 10 log files are kept. + +You can set the log level with the `--log-level` command-line option to get more detailed debug information. For example, `opencode --log-level DEBUG`. + +### Storage + +opencode stores session data and other application data on disk at: + +- **macOS/Linux**: `~/.local/share/opencode/` +- **Windows**: `%USERPROFILE%\.local\share\opencode` + +This directory contains: + +- `auth.json` - Authentication data like API keys, OAuth tokens +- `log/` - Application logs +- `project/` - Project-specific data like session and message data + - If the project is within a Git repo, it is stored in `.//storage/` + - If it is not a Git repo, it is stored in `./global/storage/` + +## Getting help + +If you're experiencing issues with OpenCode: + +1. **Report issues on GitHub** + + The best way to report bugs or request features is through our GitHub repository: + + [**github.com/sst/opencode/issues**](https://github.com/sst/opencode/issues) + + Before creating a new issue, search existing issues to see if your problem has already been reported. + +2. **Join our Discord** + + For real-time help and community discussion, join our Discord server: + + [**opencode.ai/discord**](https://opencode.ai/discord) + + +## Common issues + +Here are some common issues and how to resolve them. + +### OpenCode won't start + +1. 
Check the logs for error messages +2. Try running with `--print-logs` to see output in the terminal +3. Ensure you have the latest version with `opencode upgrade` + + +### Authentication issues + +1. Try re-authenticating with the `/connect` command in the TUI +2. Check that your API keys are valid +3. Ensure your network allows connections to the provider's API + +### Model not available + +1. Check that you've authenticated with the provider +2. Verify the model name in your config is correct +3. Some models may require specific access or subscriptions + +If you encounter `ProviderModelNotFoundError` you are most likely incorrectly +referencing a model somewhere. +Models should be referenced like so: ` / ` + +Examples: + +- `openai/gpt-4.1` +- `openrouter/google/gemini-2.5-flash` +- `opencode/kimi-k2` + +To figure out what models you have access to, run `opencode models` + +### ProviderInitError + +If you encounter a ProviderInitError, you likely have an invalid or corrupted configuration. + +To resolve this: + +1. First, verify your provider is set up correctly by following the [providers guide](/providers) +2. If the issue persists, try clearing your stored configuration: + + ```bash + rm -rf ~/.local/share/opencode + ``` + +3. Re-authenticate with your provider using the `/connect` command in the TUI. + +### AI_APICallError and provider package issues + +If you encounter API call errors, this may be due to outdated provider packages. opencode dynamically installs provider packages (OpenAI, Anthropic, Google, etc.) as needed and caches them locally. + +To resolve provider package issues: + +1. Clear the provider package cache: + + ```bash + rm -rf ~/.cache/opencode + ``` + +2. Restart opencode to reinstall the latest provider packages + +This will force opencode to download the most recent versions of provider packages, which often resolves compatibility issues with model parameters and API changes. + +### Copy/paste not working on Linux + +Linux users need to have one of the following clipboard utilities installed for copy/paste functionality to work: + +**For X11 systems:** + +```bash +apt install -y xclip +# or +apt install -y xsel +``` + +**For Wayland systems:** + +```bash +apt install -y wl-clipboard +``` + +**For headless environments:** + +```bash +apt install -y xvfb +# and run: +Xvfb :99 -screen 0 1024x768x24 > /dev/null 2>&1 & +export DISPLAY=:99.0 +``` + +opencode will detect if you're using Wayland and prefer `wl-clipboard`, otherwise it will try to find clipboard tools in order of: `xclip` and `xsel`. diff --git a/packages/docs/tui.mdx b/packages/docs/tui.mdx new file mode 100644 index 00000000000..835c8eeffe1 --- /dev/null +++ b/packages/docs/tui.mdx @@ -0,0 +1,359 @@ +--- +title: TUI +description: Using the OpenCode terminal user interface. +--- + + +OpenCode provides an interactive terminal interface or TUI for working on your projects with an LLM. + +Running OpenCode starts the TUI for the current directory. + +```bash +opencode +``` + +Or you can start it for a specific working directory. + +```bash +opencode /path/to/project +``` + +Once you're in the TUI, you can prompt it with a message. + +```text +Give me a quick summary of the codebase. +``` + +## File references + +You can reference files in your messages using `@`. This does a fuzzy file search in the current working directory. + + +**TIP** + +You can also use `@` to reference files in your messages. + + +```text +How is auth handled in @packages/functions/src/api/index.ts? 
+``` + +The content of the file is added to the conversation automatically. + +## Bash commands + +Start a message with `!` to run a shell command. + +```bash frame="none" +!ls -la +``` + +The output of the command is added to the conversation as a tool result. +## Commands + +When using the OpenCode TUI, you can type `/` followed by a command name to quickly execute actions. For example: + +```bash +/help +``` + +Most commands also have keybind using `ctrl+x` as the leader key, where `ctrl+x` is the default leader key. [Learn more](/keybinds). + +Here are all available slash commands: + + + +### connect + +Add a provider to OpenCode. Allows you to select from available providers and add their API keys. + +```bash +/connect +``` + +### compact + +Compact the current session. _Alias_: `/summarize` + +```bash +/compact +``` + +**Keybind:** `ctrl+x c` + +### details + +Toggle tool execution details. + +```bash +/details +``` + +**Keybind:** `ctrl+x d` + +### editor + +Open external editor for composing messages. Uses the editor set in your `EDITOR` environment variable. [Learn more](#editor-setup). + +```bash +/editor +``` + +**Keybind:** `ctrl+x e` + +### exit + +Exit OpenCode. _Aliases_: `/quit`, `/q` + +```bash frame="none" +/exit +``` + +**Keybind:** `ctrl+x q` + +### export + +Export current conversation to Markdown and open in your default editor. Uses the editor set in your `EDITOR` environment variable. [Learn more](#editor-setup). + +```bash frame="none" +/export +``` + +**Keybind:** `ctrl+x x` + + +### help + +Show the help dialog. + +```bash frame="none" +/help +``` + +**Keybind:** `ctrl+x h` + + + +### init + +Create or update `AGENTS.md` file. [Learn more](/rules). + +```bash frame="none" +/init +``` + +**Keybind:** `ctrl+x i` + + + +### models + +List available models. + +```bash frame="none" +/models +``` + +**Keybind:** `ctrl+x m` + + + +### new + +Start a new session. _Alias_: `/clear` + +```bash frame="none" +/new +``` + +**Keybind:** `ctrl+x n` + + + +### redo + +Redo a previously undone message. Only available after using `/undo`. + ++**TIP** + +Any file changes will also be restored. + + +Internally, this uses Git to manage the file changes. So your project **needs to +be a Git repository**. + +```bash frame="none" +/redo +``` + +**Keybind:** `ctrl+x r` + + + +### sessions + +List and switch between sessions. _Aliases_: `/resume`, `/continue` + +```bash frame="none" +/sessions +``` + +**Keybind:** `ctrl+x l` + + + +### share + +Share current session. [Learn more](/share). + +```bash frame="none" +/share +``` + +**Keybind:** `ctrl+x s` + + + +### themes + +List available themes. + +```bash frame="none" +/themes +``` + +**Keybind:** `ctrl+x t` + +### undo + +Undo last message in the conversation. Removes the most recent user message, all subsequent responses, and any file changes. + ++**TIP** + +Any file changes made will also be reverted. + + +Internally, this uses Git to manage the file changes. So your project **needs to +be a Git repository**. + +```bash frame="none" +/undo +``` + +**Keybind:** `ctrl+x u` + + +### unshare + +Unshare current session. [Learn more](/share#un-sharing). + +```bash +/unshare +``` + + + +## Editor setup + +Both the `/editor` and `/export` commands use the editor specified in your `EDITOR` environment variable. 
+ ++ + +Popular editor options include: + +- `code` - Visual Studio Code +- `cursor` - Cursor +- `windsurf` - Windsurf +- `vim` - Vim editor +- `nano` - Nano editor +- `notepad` - Windows Notepad +- `subl` - Sublime Text + ++ ```bash + # Example for nano or vim + export EDITOR=nano + export EDITOR=vim + + # For GUI editors, VS Code, Cursor, VSCodium, Windsurf, Zed, etc. + # include --wait + export EDITOR="code --wait" + ``` + + To make it permanent, add this to your shell profile; + `~/.bashrc`, `~/.zshrc`, etc. + + + ++ ```bash + set EDITOR=notepad + + # For GUI editors, VS Code, Cursor, VSCodium, Windsurf, Zed, etc. + # include --wait + set EDITOR=code --wait + ``` + + To make it permanent, use **System Properties** > **Environment + Variables**. + + + ++ ```powershell + $env:EDITOR = "notepad" + + # For GUI editors, VS Code, Cursor, VSCodium, Windsurf, Zed, etc. + # include --wait + $env:EDITOR = "code --wait" + ``` + + To make it permanent, add this to your PowerShell profile. + + ++**NOTE** + +Some editors like VS Code need to be started with the `--wait` flag. + + +Some editors need command-line arguments to run in blocking mode. The `--wait` flag makes the editor process block until closed. + + + +## Configure + +You can customize TUI behavior through your OpenCode config file. + +```json title="opencode.json" +{ + "$schema": "https://opencode.ai/config.json", + "tui": { + "scroll_speed": 3, + "scroll_acceleration": { + "enabled": true + } + } +} +``` + +### Options + +- `scroll_acceleration` - Enable macOS-style scroll acceleration for smooth, natural scrolling. When enabled, scroll speed increases with rapid scrolling gestures and stays precise for slower movements. **This setting takes precedence over `scroll_speed` and overrides it when enabled.** +- `scroll_speed` - Controls how fast the TUI scrolls when using scroll commands (minimum: `1`). Defaults to `1` on Unix and `3` on Windows. **Note: This is ignored if `scroll_acceleration.enabled` is set to `true`.** + + + +## View customization + +You can customize various aspects of the TUI view using the command palette (`ctrl+x h` or `/help`). These settings persist across restarts. + +### Username display + +Toggle whether your username appears in chat messages. Access this through: + +- Command palette: Search for "username" or "hide username" +- The setting persists automatically and will be remembered across TUI sessions diff --git a/packages/docs/zen.mdx b/packages/docs/zen.mdx new file mode 100644 index 00000000000..1da7be88e9a --- /dev/null +++ b/packages/docs/zen.mdx @@ -0,0 +1,205 @@ +--- +title: Zen +description: Curated list of models provided by OpenCode. +--- + +OpenCode Zen is a list of tested and verified models provided by the OpenCode team. + ++**NOTE** + +OpenCode Zen is currently in beta. + + +Zen works like any other provider in OpenCode. You login to OpenCode Zen and get +your API key. It's **completely optional** and you don't need to use it to use +OpenCode. + + +## Background + +There are a large number of models out there but only a few of +these models work well as coding agents. Additionally, most providers are +configured very differently; so you get very different performance and quality. + ++**TIP** + +We tested a select group of models and providers that work well with OpenCode. + + +So if you are using a model through something like OpenRouter, you can never be +sure if you are getting the best version of the model you want. + +To fix this, we did a couple of things: + +1. 
We tested a select group of models and talked to their teams about how to + best run them. +2. We then worked with a few providers to make sure these were being served + correctly. +3. Finally, we benchmarked the combination of the model/provider and came up + with a list that we feel good recommending. + +OpenCode Zen is an AI gateway that gives you access to these models. + +## How it works + +OpenCode Zen works like any other provider in OpenCode. + ++ + +You are charged per request and you can add credits to your account. + +## Endpoints + +You can also access our models through the following API endpoints. + +| Model | Model ID | Endpoint | AI SDK Package | +| :---------------- | :---------------- | :------------------------------------------------- | :--------------------------- | +| GPT 5.2 | gpt-5.2 | `https://opencode.ai/zen/v1/responses` | `@ai-sdk/openai` | +| GPT 5.1 | gpt-5.1 | `https://opencode.ai/zen/v1/responses` | `@ai-sdk/openai` | +| GPT 5.1 Codex | gpt-5.1-codex | `https://opencode.ai/zen/v1/responses` | `@ai-sdk/openai` | +| GPT 5.1 Codex Max | gpt-5.1-codex-max | `https://opencode.ai/zen/v1/responses` | `@ai-sdk/openai` | +| GPT 5 | gpt-5 | `https://opencode.ai/zen/v1/responses` | `@ai-sdk/openai` | +| GPT 5 Codex | gpt-5-codex | `https://opencode.ai/zen/v1/responses` | `@ai-sdk/openai` | +| GPT 5 Nano | gpt-5-nano | `https://opencode.ai/zen/v1/responses` | `@ai-sdk/openai` | +| Claude Sonnet 4.5 | claude-sonnet-4-5 | `https://opencode.ai/zen/v1/messages` | `@ai-sdk/anthropic` | +| Claude Sonnet 4 | claude-sonnet-4 | `https://opencode.ai/zen/v1/messages` | `@ai-sdk/anthropic` | +| Claude Haiku 4.5 | claude-haiku-4-5 | `https://opencode.ai/zen/v1/messages` | `@ai-sdk/anthropic` | +| Claude Haiku 3.5 | claude-3-5-haiku | `https://opencode.ai/zen/v1/messages` | `@ai-sdk/anthropic` | +| Claude Opus 4.5 | claude-opus-4-5 | `https://opencode.ai/zen/v1/messages` | `@ai-sdk/anthropic` | +| Claude Opus 4.1 | claude-opus-4-1 | `https://opencode.ai/zen/v1/messages` | `@ai-sdk/anthropic` | +| Gemini 3 Pro | gemini-3-pro | `https://opencode.ai/zen/v1/models/gemini-3-pro` | `@ai-sdk/google` | +| Gemini 3 Flash | gemini-3-flash | `https://opencode.ai/zen/v1/models/gemini-3-flash` | `@ai-sdk/google` | +| GLM 4.6 | glm-4.6 | `https://opencode.ai/zen/v1/chat/completions` | `@ai-sdk/openai-compatible` | +| Kimi K2 | kimi-k2 | `https://opencode.ai/zen/v1/chat/completions` | `@ai-sdk/openai-compatible` | +| Kimi K2 Thinking | kimi-k2-thinking | `https://opencode.ai/zen/v1/chat/completions` | `@ai-sdk/openai-compatible` | +| Qwen3 Coder 480B | qwen3-coder | `https://opencode.ai/zen/v1/chat/completions` | `@ai-sdk/openai-compatible` | +| Grok Code Fast 1 | grok-code | `https://opencode.ai/zen/v1/chat/completions` | `@ai-sdk/openai-compatible` | +| Big Pickle | big-pickle | `https://opencode.ai/zen/v1/chat/completions` | `@ai-sdk/openai-compatible` | + +The [model id](/config#models) in your OpenCode config +uses the format `opencode/+You sign in to **[OpenCode Zen](https://opencode.ai/auth)**, add your billing details, and copy your API key. + ++You run the `/connect` command in the TUI, select OpenCode Zen, and paste your API key. + ++Run `/models` in the TUI to see the list of models we recommend. + +`. For example, for GPT 5.1 Codex, you would +use `opencode/gpt-5.1-codex` in your config. + +### Models + +You can fetch the full list of available models and their metadata from: + +```bash +https://opencode.ai/zen/v1/models +``` + +## Pricing + +We support a pay-as-you-go model. 
Below are the prices **per 1M tokens**. + +| Model | Input | Output | Cached Read | Cached Write | +| :-------------------------------- | :----- | :----- | :---------- | :----------- | +| Big Pickle | Free | Free | Free | - | +| Grok Code Fast 1 | Free | Free | Free | - | +| GLM 4.6 | $0.60 | $2.20 | $0.10 | - | +| Kimi K2 | $0.40 | $2.50 | - | - | +| Kimi K2 Thinking | $0.40 | $2.50 | - | - | +| Qwen3 Coder 480B | $0.45 | $1.50 | - | - | +| Claude Sonnet 4.5 (≤ 200K tokens) | $3.00 | $15.00 | $0.30 | $3.75 | +| Claude Sonnet 4.5 (> 200K tokens) | $6.00 | $22.50 | $0.60 | $7.50 | +| Claude Sonnet 4 (≤ 200K tokens) | $3.00 | $15.00 | $0.30 | $3.75 | +| Claude Sonnet 4 (> 200K tokens) | $6.00 | $22.50 | $0.60 | $7.50 | +| Claude Haiku 4.5 | $1.00 | $5.00 | $0.10 | $1.25 | +| Claude Haiku 3.5 | $0.80 | $4.00 | $0.08 | $1.00 | +| Claude Opus 4.5 | $5.00 | $25.00 | $0.50 | $6.25 | +| Claude Opus 4.1 | $15.00 | $75.00 | $1.50 | $18.75 | +| Gemini 3 Pro (≤ 200K tokens) | $2.00 | $12.00 | $0.20 | - | +| Gemini 3 Pro (> 200K tokens) | $4.00 | $18.00 | $0.40 | - | +| Gemini 3 Flash | $0.50 | $3.00 | $0.05 | - | +| GPT 5.2 | $1.75 | $14.00 | $0.175 | - | +| GPT 5.1 | $1.07 | $8.50 | $0.107 | - | +| GPT 5.1 Codex | $1.07 | $8.50 | $0.107 | - | +| GPT 5.1 Codex Max | $1.25 | $10.00 | $0.125 | - | +| GPT 5 | $1.07 | $8.50 | $0.107 | - | +| GPT 5 Codex | $1.07 | $8.50 | $0.107 | - | +| GPT 5 Nano | Free | Free | Free | - | + +You might notice _Claude Haiku 3.5_ in your usage history. This is a [low cost model](/config#models) that's used to generate the titles of your sessions. + + +**NOTE** + +Credit card fees are passed along at cost; we don't charge anything beyond that. + + +The free models: + +- Grok Code Fast 1 is currently free on OpenCode for a limited time. The xAI team is using this time to collect feedback and improve Grok Code. +- Big Pickle is a stealth model that's free on OpenCode for a limited time. The team is using this time to collect feedback and improve the model. + +Contact us if you have any questions. + +## Privacy + +All our models are hosted in the US. Our providers follow a zero-retention policy and do not use your data for model training, with the following exceptions: + +- Grok Code Fast 1: During its free period, collected data may be used to improve Grok Code. +- Big Pickle: During its free period, collected data may be used to improve the model. +- OpenAI APIs: Requests are retained for 30 days in accordance with [OpenAI's Data Policies](https://platform.openai.com/docs/guides/your-data). +- Anthropic APIs: Requests are retained for 30 days in accordance with [Anthropic's Data Policies](https://docs.anthropic.com/en/docs/claude-code/data-usage). + + +## For Teams + +Zen also works great for teams. You can invite teammates, assign roles, curate +the models your team uses, and more. + ++**NOTE** + +Workspaces are currently free for teams as a part of the beta. + + +Managing your workspace is currently free for teams as a part of the beta. We'll be +sharing more details on the pricing soon. + +### Roles + +You can invite teammates to your workspace and assign roles: + +- **Admin**: Manage models, members, API keys, and billing +- **Member**: Manage only their own API keys + +Admins can also set monthly spending limits for each member to keep costs under control. + +### Model access + +Admins can enable or disable specific models for the workspace. Requests made to a disabled model will return an error. 
+ +This is useful for cases where you want to disable the use of a model that +collects data. + +### Bring your own key + +You can use your own OpenAI or Anthropic API keys while still accessing other models in Zen. + +When you use your own keys, tokens are billed directly by the provider, not by Zen. + +For example, your organization might already have a key for OpenAI or Anthropic +and you want to use that instead of the one that Zen provides. + +## Goals + +We created OpenCode Zen to: + +1. **Benchmark** the best models/providers for coding agents. +2. Have access to the **highest quality** options and not downgrade performance or route to cheaper providers. +3. Pass along any **price drops** by selling at cost; so the only markup is to cover our processing fees. +4. Have **no lock-in** by allowing you to use it with any other coding agent. And always let you use any other provider with OpenCode as well.