Runpod’s Public Endpoints expose OpenAI-compatible APIs that work with most AI coding assistants. This page shows you how to configure OpenCode, Cursor, and Cline to use Runpod’s Public Endpoints as a model provider.

Requirements

Before you start, you’ll need:
  • A Runpod account with an API key, and at least $5 in Runpod credits.
  • One or more of the following AI coding tools installed on your local machine:
    • OpenCode: Terminal-based AI coding assistant.
    • Cursor: AI-powered code editor.
    • Cline: VS Code extension for AI-assisted coding.

Available endpoints

Runpod provides two Public Endpoints that can be used to power AI coding tools:
| Model | Base URL | Model ID | Context window |
| --- | --- | --- | --- |
| GPT OSS 120B | https://api.runpod.ai/v2/gpt-oss-120b/openai/v1 | openai/gpt-oss-120b | 131,072 tokens |
| Qwen3 32B AWQ | https://api.runpod.ai/v2/qwen3-32b-awq/openai/v1 | Qwen/Qwen3-32B-AWQ | 32,768 tokens |
Both endpoints follow the OpenAI API specification, so they work with any tool that supports custom OpenAI-compatible providers.
Many AI coding tools can be configured to use any OpenAI-compatible model API. You can build your own OpenAI-compatible endpoint with Runpod Serverless.
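Before wiring up a tool, you can sanity-check an endpoint directly with curl. This is a minimal sketch that assumes your Runpod API key is exported as RUNPOD_API_KEY and targets the GPT OSS 120B endpoint from the table above:

```shell
# Minimal chat completion request to the GPT OSS 120B Public Endpoint.
# Assumes RUNPOD_API_KEY contains a valid Runpod API key (rpa_...).
curl -s https://api.runpod.ai/v2/gpt-oss-120b/openai/v1/chat/completions \
  -H "Authorization: Bearer $RUNPOD_API_KEY" \
  -H "Content-Type: application/json" \
  -d '{
    "model": "openai/gpt-oss-120b",
    "messages": [{"role": "user", "content": "Say hello in one sentence."}]
  }'
```

A successful response is a standard OpenAI chat completion object with a `choices` array; an authentication error usually means the key is missing or mistyped.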

Configure OpenCode

OpenCode supports multiple provider configurations, so you can set up both Runpod endpoints and switch between them.

Step 1: Create the config file

OpenCode looks for its config at ~/.config/opencode/opencode.json. Run this command to create the directory (if it doesn’t exist) and generate the config file:
mkdir -p ~/.config/opencode && cat << 'EOF' > ~/.config/opencode/opencode.json
{
  "$schema": "https://opencode.ai/config.json",
  "provider": {
    "runpod-gpt": {
      "npm": "@ai-sdk/openai-compatible",
      "name": "RunPod GPT OSS 120B",
      "options": {
        "baseURL": "https://api.runpod.ai/v2/gpt-oss-120b/openai/v1",
        "apiKey": "{env:RUNPOD_API_KEY}"
      },
      "models": {
        "gpt-oss-120b": {
          "id": "openai/gpt-oss-120b",
          "name": "GPT OSS 120B (RunPod)",
          "limit": { "context": 131072, "output": 4096 }
        }
      }
    },
    "runpod-qwen": {
      "npm": "@ai-sdk/openai-compatible",
      "name": "RunPod Qwen3",
      "options": {
        "baseURL": "https://api.runpod.ai/v2/qwen3-32b-awq/openai/v1",
        "apiKey": "{env:RUNPOD_API_KEY}"
      },
      "models": {
        "qwen3-32b": {
          "id": "Qwen/Qwen3-32B-AWQ",
          "name": "Qwen3 32B AWQ (RunPod)",
          "limit": { "context": 32768, "output": 4096 }
        }
      }
    }
  }
}
EOF
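Optionally, confirm the file was written and is valid JSON. This check assumes python3 is available on your machine (`jq .` would work just as well):

```shell
# Validate the generated OpenCode config as JSON.
config="$HOME/.config/opencode/opencode.json"
if python3 -m json.tool "$config" > /dev/null 2>&1; then
  config_status="valid"
else
  config_status="missing or invalid"
fi
echo "opencode.json: $config_status"
```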

Step 2: Set your API key

The {env:RUNPOD_API_KEY} syntax in the config file tells OpenCode to read your API key from the RUNPOD_API_KEY environment variable. Run this command to set the variable, replacing rpa_YOUR_API_KEY with your actual API key:
export RUNPOD_API_KEY="rpa_YOUR_API_KEY"
This only sets the environment variable for your current shell session. You can add the export command to your shell profile (~/.bashrc, ~/.zshrc, etc.) so you don’t need to set the environment variable every time you open a new shell.
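To confirm the variable is visible in your current shell before moving on, you can run a quick check (the variable name matches the {env:RUNPOD_API_KEY} reference in the config above):

```shell
# Report whether RUNPOD_API_KEY is set in the current shell.
if [ -n "${RUNPOD_API_KEY:-}" ]; then
  key_status="set"
else
  key_status="unset"
fi
echo "RUNPOD_API_KEY is $key_status"
```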

Step 3: Test the configuration

Run this command to check that the configuration is working:
opencode models
You should see output similar to this:
opencode/big-pickle
opencode/glm-4.7-free
opencode/gpt-5-nano
opencode/kimi-k2.5-free
opencode/minimax-m2.1-free
opencode/trinity-large-preview-free
runpod-gpt/gpt-oss-120b
runpod-qwen/qwen3-32b
After confirming that the Runpod endpoints are listed, start OpenCode:
opencode
Press Ctrl + P to open the command palette, then choose Switch model to pick a Runpod endpoint.

Configure Cursor

Cursor supports a single global OpenAI-compatible endpoint override, so you can only use one Runpod endpoint at a time.
The Qwen3 32B AWQ endpoint is not compatible with Cursor.

Step 1: Open Cursor settings

Launch Cursor and press Shift + Cmd + J (macOS) or Shift + Ctrl + J (Windows/Linux) to open Settings.

Step 2: Navigate to model settings

Go to Cursor Settings > Models and expand the API Keys section.

Step 3: Enter your API key

Find the OpenAI API Key field. Enable it, then enter your Runpod API key (rpa_...).

Step 4: Set the base URL

Enable Override OpenAI Base URL and enter:
https://api.runpod.ai/v2/gpt-oss-120b/openai/v1

Step 5: Add the model

Scroll up and click View All Models to see the list of available models. Click Add Custom Model and enter the model ID exactly as shown (case-sensitive):
openai/gpt-oss-120b

Step 6: Select the model

Select openai/gpt-oss-120b from the model list when using Cursor’s AI features.

Configure Cline

Cline is a VS Code extension with its own settings panel. Unlike Cursor, Cline supports multiple provider profiles, so you can configure both Runpod endpoints and switch between them.

Step 1: Open Cline settings

Click the Cline icon in the sidebar to open the Cline panel, then click the gear icon to open Settings.

Step 2: Select the API provider

Set API Provider to OpenAI Compatible.

Step 3: Enter the connection settings

Fill in the following fields:
| Setting | Value |
| --- | --- |
| Base URL | https://api.runpod.ai/v2/gpt-oss-120b/openai/v1 |
| API Key | rpa_YOUR_API_KEY |
| Model ID | openai/gpt-oss-120b |

Step 4: Save the configuration

Click Save to apply your settings.
To use Qwen3 instead, use these values:
| Setting | Value |
| --- | --- |
| Base URL | https://api.runpod.ai/v2/qwen3-32b-awq/openai/v1 |
| Model ID | Qwen/Qwen3-32B-AWQ |
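If Cline can’t connect, you can test the endpoint and key outside the extension. Assuming your key is exported as RUNPOD_API_KEY, and assuming the endpoint serves the standard OpenAI model-listing route (most OpenAI-compatible servers do), this request should return the available model IDs:

```shell
# List models exposed by the Qwen3 32B AWQ Public Endpoint.
# Assumes RUNPOD_API_KEY contains a valid Runpod API key.
curl -s https://api.runpod.ai/v2/qwen3-32b-awq/openai/v1/models \
  -H "Authorization: Bearer $RUNPOD_API_KEY"
```

The Model ID you enter in Cline must match one of the IDs returned here exactly, including case.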