Deploy OpenClaw in minutes with NEXUS AI
OpenClaw is a lightweight AI coding gateway — Claude Code-compatible — that runs on port 18789 and exposes a secure token-authenticated endpoint. NEXUS AI makes deploying it entirely conversational: no YAML files, no Dockerfiles, no console wrestling. Just ask and it's live.
This guide walks you through deploying OpenClaw on NEXUS AI step by step — covering provider selection, environment setup, connecting your CLI, and optional advanced configuration. By the end you'll have a live gateway you can connect Claude Code or any compatible AI coding tool to.
What is OpenClaw?
OpenClaw (alpine/openclaw:latest) is a minimal, production-ready AI gateway service designed to work with Claude Code and compatible clients. It exposes a single authenticated endpoint on port 18789, identified by a gateway token that's either auto-generated or supplied by you at deploy time.
Once deployed, any Claude Code-compatible tool can point at your OpenClaw instance and authenticate with the token — giving you a self-hosted, cloud-native AI coding backend.
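The token handshake can be sketched as a bearer-token HTTP call. Note that the `Authorization` header scheme here is an illustrative assumption, not documented OpenClaw API, and the snippet prints the request rather than sending it:

```shell
# Illustrative sketch only: the Authorization scheme is an assumption,
# not documented OpenClaw behavior. Both values are placeholders.
GATEWAY_URL="https://your-deployment-url:18789"
GATEWAY_TOKEN="your-gateway-token"

# Print the token-authenticated request a client would make (dry run):
printf 'curl -s -H "Authorization: Bearer %s" %s\n' "$GATEWAY_TOKEN" "$GATEWAY_URL"
```

Any client that can attach that token to its requests can use the gateway as its backend.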
Prerequisites
A NEXUS AI account with at least one project configured
The NEXUS AI MCP connector enabled in Claude — or API access to api.zollo.live/mcp
A supported cloud provider account (for GCP, AWS, or Azure deployments)
The OpenClaw CLI installed locally to connect after deployment
Tip: You can also deploy OpenClaw entirely from within a Claude chat session if you have the NEXUS AI MCP connector enabled. No terminal required for the deploy itself.
Deploying OpenClaw — step by step
01
Choose your provider & environment
NEXUS AI supports four deployment providers out of the box. For local development and testing, Docker is the fastest path. For production workloads, GCP Cloud Run offers the best cold-start performance and auto-scaling.
🐳 Docker Best for local dev & CI
☁️ GCP Cloud Run Serverless, auto-scales
⚡ AWS ECS Fargate Full AWS ecosystem
🔷 Azure Container Apps Enterprise & compliance
Environments map to: DEVELOPMENT for local testing, STAGING for pre-production validation, and PRODUCTION for live traffic.
02
Trigger the deployment
Via the NEXUS AI MCP in Claude, call nexusai_deploy_openclaw with your provider and environment. A minimal invocation looks like this:
If you don't supply a gatewayToken, one is securely auto-generated for you. You can also pass in Claude API credentials (claudeApiKey, claudeWebCookie) if your gateway requires them.
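The same call, with the "NEXUS AI MCP" label and parameters laid out as in the later examples:

NEXUS AI MCP
nexusai_deploy_openclaw(
provider: "gcp_cloud_run",
environment: "PRODUCTION",
name: "openclaw-gateway" // optional
)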
03
Save your deployment credentials
After a successful deploy call, NEXUS AI returns:
| Field | Description | Example |
|---|---|---|
| id | Deployment UUID | 40693d12-… |
| gatewayToken | Auth token for CLI | 6fb631a9d6… |
| status | Current state | queued → running |
| url | Public endpoint | Available once live |
Important: Copy your gatewayToken immediately — it's only returned once. Treat it like a password and store it in a secrets manager or your .env.
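A minimal sketch of the .env route, using the example token from the connection step below (not a real secret):

```shell
# Capture the one-time gatewayToken into a local .env right away.
# The token is the example value from this guide, not a real secret.
umask 077                # new files readable by the owner only
cat > .env <<'EOF'
OPENCLAW_GATEWAY_TOKEN=6fb631a9d6c66c45cb3f5890886b14dc
EOF
grep -c OPENCLAW_GATEWAY_TOKEN .env   # sanity check: prints 1
```

For team or production use, prefer a proper secrets manager over a local file.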
04
Monitor deployment status
Deployments move from queued → provisioning → running. Check status anytime:
NEXUS AI MCP
nexusai_deploy_status(
deploymentId: "40693d12-0839-4821-aa3c-3621f12c74f4"
)
Once status shows running, the url field will be populated with your public endpoint.
05
Connect via the OpenClaw CLI
Point your local tooling at the gateway by setting two environment variables:
BASH
# Set your gateway token
export OPENCLAW_GATEWAY_TOKEN="6fb631a9d6c66c45cb3f5890886b14dc"

# Point CLI at your deployment URL
export OPENCLAW_GATEWAY_URL="https://your-deployment-url"

# Verify the connection
openclaw status
You should see a confirmation that the gateway is reachable and authenticated. Claude Code and any compatible AI coding client will now route through your self-hosted OpenClaw instance.
Tip: For persistent config, add these exports to your .zshrc / .bashrc, or use a .env file with direnv.
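The direnv route can be sketched like this (assumes direnv is installed and hooked into your shell; the `direnv allow` step is left commented since it prompts for interactive approval):

```shell
# .env holds the secrets; .envrc tells direnv to load it when you cd in.
cat > .env <<'EOF'
OPENCLAW_GATEWAY_TOKEN=6fb631a9d6c66c45cb3f5890886b14dc
OPENCLAW_GATEWAY_URL=https://your-deployment-url
EOF
chmod 600 .env            # keep the token private to your user
echo 'dotenv' > .envrc    # direnv stdlib directive: source .env automatically
# direnv allow .          # run once to approve this .envrc (requires direnv)
```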
Advanced configuration
Custom gateway token
If you want a predictable, rotatable token instead of an auto-generated one, pass it explicitly at deploy time:
NEXUS AI MCP
nexusai_deploy_openclaw(
provider: "gcp_cloud_run",
environment: "PRODUCTION",
gatewayToken: "my-secure-token-from-vault"
)
Passing Claude credentials
If your OpenClaw instance needs to authenticate directly with Anthropic, you can pass your Claude session credentials at deploy time via claudeApiKey, claudeWebCookie, or claudeWebSessionKey. These are stored securely in NEXUS AI's secrets system and injected into the container at runtime.
Scaling your deployment
Once your OpenClaw gateway is live, you can scale it up instantly via NEXUS AI without redeploying:
NEXUS AI MCP
nexusai_deploy_scale(
deploymentId: "40693d12-…",
replicas: 3
)
Custom domain
Attach a custom domain to your OpenClaw gateway for a clean, branded endpoint:
NEXUS AI MCP
nexusai_domains_add(
deploymentId: "40693d12-…",
domain: "openclaw.yourdomain.com"
)
Managing your OpenClaw deployment
NEXUS AI gives you full lifecycle control over every deployment. Here's a quick reference:
| Action | MCP Tool | When to use |
|---|---|---|
| Check status | nexusai_deploy_status | After deploying, or to verify health |
| View logs | nexusai_deploy_logs | Debug connection or auth issues |
| Stop gateway | nexusai_deploy_stop | Pause to save compute costs |
| Restart | nexusai_deploy_start | After a stop, or after config changes |
| Rollback | nexusai_deploy_rollback | Revert to a previous revision |
| Delete | nexusai_deploy_delete | Permanently tear down |
Troubleshooting
Gateway stuck in "queued"
Check that your cloud provider credentials are properly configured in NEXUS AI. For GCP, ensure your service account has Cloud Run admin permissions. Use nexusai_deploy_logs to see what's happening inside the container.
CLI returns 401 Unauthorized
Double-check your OPENCLAW_GATEWAY_TOKEN environment variable. Tokens are case-sensitive. If you've lost the token, delete the deployment and redeploy with a custom gatewayToken you control.
Connection refused on port 18789
Verify the deployment status is running (not queued or stopped). Also confirm your OPENCLAW_GATEWAY_URL includes the correct port and protocol.
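A quick shell-side sanity check on the URL's shape (the URL below is a placeholder; substitute your own). This checks only the format, not reachability:

```shell
# Shape check: scheme must be http(s) and an explicit port must be
# present (OpenClaw listens on 18789). Placeholder URL for illustration.
OPENCLAW_GATEWAY_URL="https://your-deployment-url:18789"
case "$OPENCLAW_GATEWAY_URL" in
  http://*:[0-9]*|https://*:[0-9]*) echo "URL shape OK" ;;
  *) echo "URL missing scheme or port" ;;
esac
```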
Wrapping up
OpenClaw on NEXUS AI gives you a fully managed, cloud-native AI coding gateway without any of the infrastructure overhead. You get auto-generated tokens, multi-cloud flexibility, instant scaling, and full lifecycle management — all through natural language or a single API call.
From here, you can explore connecting multiple Claude Code clients to the same gateway, attaching a custom domain, or setting up a staging → production promotion workflow entirely within NEXUS AI.
Try it now: Head to https://nexusai.run and enable the NEXUS AI MCP connector in Claude to deploy your first OpenClaw gateway in under 2 minutes.
