How to Install OpenClaw: Complete Setup Guide for macOS, Linux & Windows (2026)
Step-by-step guide to install OpenClaw (formerly Clawdbot / Moltbot) on macOS, Linux, Ubuntu, and Windows. Covers npm, Docker, source builds, Ollama local models, API key setup, skills, configuration, and troubleshooting.
This is the complete guide to installing OpenClaw on any platform. Whether you are setting up OpenClaw on macOS, Linux, Ubuntu, or Windows, we cover every installation method — npm, Docker, and building from source — plus configuration, API key setup, local model support via Ollama, and troubleshooting.
If you searched for "Clawdbot install" or "Moltbot install," you are in the right place. Both projects were rebranded to OpenClaw in early 2026. Everything in this guide applies.
By the end, you will have a working OpenClaw agent running locally with your model of choice.
TL;DR — Install OpenClaw in 30 seconds
If you already have Node.js 20+ installed:
npm install -g openclaw
openclaw init
openclaw run "List all files in the current directory"
That is it. For Docker, platform-specific guides, local model setup, or configuration details, keep reading.
Prerequisites
Before you start, make sure you have the core dependencies installed. The exact steps vary by platform — see the platform-specific sections below for detailed instructions.
| Dependency | Minimum version | Check command | Required for |
|---|---|---|---|
| Node.js | 20.0+ | node --version | npm install |
| npm | 9.0+ | npm --version | npm install |
| Git | 2.30+ | git --version | Source builds |
| Docker | 20.0+ | docker --version | Docker install |
| Ollama | Latest | ollama --version | Local models |
You also need at least one of the following:
- An API key from OpenAI, Anthropic, or Google — for cloud model usage
- Ollama installed with a pulled model — for fully local, offline usage
If you plan to use a local model, install Ollama first and pull a model like llama3.1 or deepseek-coder-v2 before proceeding. This way OpenClaw can start working immediately without any API key.
Method 1: Install OpenClaw via npm (recommended)
The npm method is the fastest way to install OpenClaw. One command, no containers, no cloning.
Install the package
npm install -g openclaw
Verify the installation:
openclaw --version
# Expected output: openclaw v2.x.x
If you see the version number, you are good. If you get a "command not found" error, your npm global bin directory is not in your PATH — see the troubleshooting section below.
Configure your API key
Run the interactive setup:
openclaw init
This creates a .openclaw/config.yaml file in your home directory. Open it and add your API key:
# ~/.openclaw/config.yaml
model:
  provider: anthropic        # openai, anthropic, google, ollama
  model: claude-sonnet-4-6   # Model identifier
  api_key: sk-ant-your-key-here
agent:
  max_iterations: 25
  timeout: 300
  memory: conversation
You can also set the API key as an environment variable instead of putting it in the config file:
# Add to your ~/.bashrc, ~/.zshrc, or ~/.profile
export ANTHROPIC_API_KEY=sk-ant-your-key-here
# Or for OpenAI:
export OPENAI_API_KEY=sk-your-key-here
Using environment variables for API keys is more secure than hardcoding them in config files. If you work on a shared machine or commit your dotfiles to Git, always use environment variables.
Run your first agent
openclaw run "List all JavaScript files in the current directory and count the total lines of code"
OpenClaw will plan the task, use the file system tool to find files, execute a line-counting script, and report the result. You should see a step-by-step execution log in your terminal as the agent works.
Try a few more commands to get a feel for what OpenClaw can do:
# Summarize a file
openclaw run "Read and summarize the README.md in this directory"
# Generate code
openclaw run "Create a Python script that fetches the weather for a given city using the OpenWeatherMap API"
# Multi-step task
openclaw run "Find all TODO comments in this project, categorize them by priority, and create a summary report"
Method 2: Install OpenClaw with Docker
Docker is the best option if you want process isolation, reproducible environments, or plan to deploy OpenClaw on a server. The Docker install keeps OpenClaw completely sandboxed — it cannot access files or processes outside the container unless you explicitly mount them.
Pull the image
docker pull openclaw/openclaw:latest
Verify the image:
docker run --rm openclaw/openclaw:latest --version
# Expected output: openclaw v2.x.x
Run with your config
docker run -it \
-v ~/.openclaw:/root/.openclaw \
-v $(pwd):/workspace \
-e ANTHROPIC_API_KEY=sk-ant-your-key-here \
openclaw/openclaw:latest \
run "Summarize the README.md file in /workspace"
Here is what each flag does:
| Flag | Purpose |
|---|---|
| `-it` | Interactive mode with a terminal (required for agent output) |
| `-v ~/.openclaw:/root/.openclaw` | Mounts your config directory into the container |
| `-v $(pwd):/workspace` | Mounts your current working directory as /workspace |
| `-e ANTHROPIC_API_KEY=...` | Passes your API key as an environment variable |
Docker Compose for persistent setups
If you want OpenClaw running as a long-lived service — for example, on a VPS or as part of a larger development stack — use Docker Compose:
# docker-compose.yml
version: "3.8"
services:
  openclaw:
    image: openclaw/openclaw:latest
    volumes:
      - ~/.openclaw:/root/.openclaw
      - ./workspace:/workspace
    environment:
      - ANTHROPIC_API_KEY=${ANTHROPIC_API_KEY}
      - OPENAI_API_KEY=${OPENAI_API_KEY}
    stdin_open: true
    tty: true
    restart: unless-stopped
Then start the service and run tasks inside it:
docker compose up -d
docker compose exec openclaw openclaw run "Your task here"
When to choose Docker over npm
Docker is especially useful when:
- You want to restrict file access — the container acts as a natural sandbox
- You are deploying on a VPS or server where you do not want to install Node.js globally
- You need reproducible environments across machines or team members
- You want to run multiple isolated OpenClaw instances with different configs
Docker is the recommended method for running OpenClaw on a VPS or remote server. The container isolates the agent from your system, and you can limit CPU, memory, and network access using standard Docker flags.
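Those limits use ordinary docker run flags, so no extra tooling is needed. A sketch — the numeric values and the task text are illustrative, not recommendations:

```shell
# Cap the agent container at 2 CPU cores, 4 GB of RAM, and 256 processes
docker run -it \
  --cpus 2 \
  --memory 4g \
  --pids-limit 256 \
  -v ~/.openclaw:/root/.openclaw \
  -v "$(pwd)":/workspace \
  openclaw/openclaw:latest \
  run "Summarize the README.md file in /workspace"
```

Adding --network none makes the container fully offline, but that also blocks cloud model APIs, so only pair it with a local model served inside the container.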
Method 3: Build OpenClaw from source
Building from source is for contributors, for anyone who wants the latest development features before they are released, and for anyone who wants to modify OpenClaw's behavior.
Clone and build
git clone https://github.com/openclaw/openclaw.git
cd openclaw
npm install
npm run build
Link globally
npm link
Now you can use openclaw from anywhere on your system. Verify with:
openclaw --version
Stay updated
Since you are running from source, you need to pull updates manually:
cd /path/to/openclaw
git pull origin main
npm install # In case dependencies changed
npm run build
Contributing
If you plan to contribute back to OpenClaw, create a feature branch:
git checkout -b feature/my-improvement
# Make your changes
npm run test
npm run lint
git push origin feature/my-improvement
Then open a pull request on GitHub.
Installing OpenClaw on macOS
macOS (both Intel and Apple Silicon) is the best-supported platform for OpenClaw. Here is the recommended setup:
Install Node.js
If you do not have Node.js 20+ yet, install it via Homebrew:
# Install Homebrew if you don't have it
/bin/bash -c "$(curl -fsSL https://raw.githubusercontent.com/Homebrew/install/HEAD/install.sh)"
# Install Node.js
brew install node
Verify:
node --version # Should be 20.x or higher
npm --version  # Should be 9.x or higher
Install OpenClaw
npm install -g openclaw
openclaw init
macOS-specific notes
- Apple Silicon (M1/M2/M3/M4): OpenClaw runs natively on ARM. No Rosetta needed.
- Gatekeeper warnings: If macOS blocks OpenClaw from running, go to System Settings → Privacy & Security and allow it.
- Ollama on macOS: Ollama runs natively on Apple Silicon and takes advantage of the unified memory architecture. A MacBook with 16GB+ RAM can comfortably run 7B–13B parameter models.
Installing OpenClaw on Linux and Ubuntu
OpenClaw works on all major Linux distributions. Here are the steps for Ubuntu/Debian (the most common setup):
Install Node.js on Ubuntu
The version of Node.js in Ubuntu's default apt repository is often outdated. Use the NodeSource repository instead:
# Add NodeSource repository for Node.js 20
curl -fsSL https://deb.nodesource.com/setup_20.x | sudo bash -
# Install Node.js
sudo apt-get install -y nodejs
# Verify
node --version # Should be 20.x
npm --version
For Fedora/RHEL:
curl -fsSL https://rpm.nodesource.com/setup_20.x | sudo bash -
sudo dnf install -y nodejs
For Arch Linux:
sudo pacman -S nodejs npm
Install OpenClaw
npm install -g openclaw
openclaw init
Linux-specific notes
- Permission errors: If npm install -g fails with EACCES, either fix your npm prefix (npm config set prefix ~/.npm-global and add ~/.npm-global/bin to PATH) or use sudo. Fixing the prefix is the recommended approach.
- Headless servers: OpenClaw works fine over SSH. If you are running it on a VPS without a display, everything works identically — the agent uses the terminal for output.
- Systemd service: For running OpenClaw as a background service, see the Docker Compose method above — it is more maintainable than a custom systemd unit.
Installing OpenClaw on Windows (WSL)
OpenClaw does not run natively on Windows. The recommended approach is WSL 2 (Windows Subsystem for Linux), which gives you a full Linux environment inside Windows.
Step 1: Install WSL 2
Open PowerShell as Administrator and run:
wsl --install
This installs WSL 2 with Ubuntu by default. Restart your machine when prompted.
Step 2: Set up Ubuntu in WSL
Open the Ubuntu app from your Start menu. It will ask you to create a username and password. Then update packages:
sudo apt update && sudo apt upgrade -y
Step 3: Install Node.js in WSL
curl -fsSL https://deb.nodesource.com/setup_20.x | sudo bash -
sudo apt-get install -y nodejs
Step 4: Install OpenClaw
npm install -g openclaw
openclaw init
Windows-specific notes
- Docker alternative: If you prefer Docker Desktop, enable the WSL 2 backend in Docker Desktop settings. Then use the Docker install method from within WSL or PowerShell.
- File access: WSL can access your Windows files at /mnt/c/Users/YourName/. However, performance is significantly better when working with files inside the WSL filesystem (~/projects/) rather than on mounted Windows paths.
- VS Code integration: Install the "WSL" extension in VS Code to edit files inside your WSL environment seamlessly.
- Windows Terminal: Use Windows Terminal (available from the Microsoft Store) for the best experience. It supports tabs, multiple profiles, and proper color rendering for OpenClaw's output.
Configuration deep dive
OpenClaw's configuration file controls everything about how your agent behaves. Understanding these settings is essential for getting the best results.
The config file lives at ~/.openclaw/config.yaml for global settings. You can also create project-specific configs at .openclaw/config.yaml in any project directory — these override the global config.
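For example, a project checkout can pin a lighter model and tighter agent limits while the global config stays untouched. A sketch using the same keys documented below (the values are illustrative):

```yaml
# ./my-project/.openclaw/config.yaml — overrides ~/.openclaw/config.yaml for this project only
model:
  provider: ollama
  model: llama3.1:8b      # cheaper, faster model for routine work in this repo
agent:
  max_iterations: 10      # tighter loop control for a smaller codebase
  confirm_actions: true
```

Keys you omit in the project config fall back to the global values.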
Model configuration
model:
  provider: anthropic        # openai, anthropic, google, ollama
  model: claude-sonnet-4-6   # Model identifier
  api_key: sk-ant-...        # Your API key (or use env var)
  temperature: 0.3           # Lower = more deterministic (0.0-1.0)
  max_tokens: 8192           # Max response length per turn
Provider options and their best models:
| Provider | Best model | Use case |
|---|---|---|
| anthropic | claude-sonnet-4-6 | Best all-round coding and reasoning |
| anthropic | claude-opus-4-6 | Most capable, best for complex multi-step tasks |
| openai | gpt-4o | Strong general-purpose alternative |
| openai | o3 | Best for tasks requiring deep reasoning |
| google | gemini-2.5-pro | Good for long-context tasks |
| ollama | llama3.1:70b | Best local model for general tasks |
| ollama | deepseek-coder-v2:33b | Best local model for coding |
Agent behavior
agent:
  max_iterations: 25      # Max steps before stopping
  timeout: 300            # Seconds before timeout
  memory: conversation    # conversation, knowledge_base, file
  confirm_actions: true   # Ask before destructive actions
  sandbox: true           # Run code in sandboxed environment
- max_iterations: Start with 25. Increase to 50–100 for complex tasks. Lower to 10 if you want tighter control.
- timeout: 300 seconds (5 minutes) is a good default. Increase for tasks that involve slow API calls or large file processing.
- memory: conversation keeps context within a session. file persists context across sessions by writing to disk. knowledge_base uses vector search for long-term retrieval.
- confirm_actions: Always keep this true when getting started. It forces the agent to ask before deleting files, running shell commands, or making API calls.
Tool permissions
tools:
  file_system:
    enabled: true
    allowed_paths:
      - /home/user/projects
      - /tmp
    denied_paths:
      - /etc
      - /var
  code_execution:
    enabled: true
    languages: [python, javascript, bash]
    timeout: 30
  web_browsing:
    enabled: true
    allowed_domains: ["*"]
  api_calls:
    enabled: true
    require_confirmation: true
Always set confirm_actions: true and restrict allowed_paths when you are first getting started. This forces the agent to ask for your approval before taking any destructive action like deleting files or running shell commands.
Environment variables
All config values can also be set via environment variables. This is useful for CI/CD, Docker, or when you do not want to store keys in files:
export OPENCLAW_PROVIDER=anthropic
export OPENCLAW_MODEL=claude-sonnet-4-6
export ANTHROPIC_API_KEY=sk-ant-your-key-here
export OPENCLAW_MAX_ITERATIONS=50
export OPENCLAW_TIMEOUT=600
export OPENCLAW_MEMORY=file
Environment variables take precedence over config file values.
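Because environment variables win, you can override a single setting for one command without touching any file. A sketch — the task text is just an example:

```shell
# One-off model override: the env var beats config.yaml for this invocation only
OPENCLAW_MODEL=claude-opus-4-6 openclaw run "Review the error handling in src/server.ts"
```

The shell scopes the variable to that single command, so the next plain openclaw run falls back to your config file.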
Using OpenClaw with local models (Ollama)
If you prefer to keep everything local — no API keys, no data leaving your machine, no usage costs — OpenClaw works with Ollama for fully offline AI agent usage.
Install Ollama and pull a model
# Install Ollama
curl -fsSL https://ollama.com/install.sh | sh
# Pull a model (choose based on your hardware)
ollama pull llama3.1:70b # Best quality (needs 40GB+ RAM)
ollama pull llama3.1:8b # Good balance (needs 8GB+ RAM)
ollama pull deepseek-coder-v2:33b # Best for coding (needs 20GB+ RAM)
ollama pull mistral:7b # Lightweight option (needs 8GB RAM)
Configure OpenClaw for Ollama
# ~/.openclaw/config.yaml
model:
  provider: ollama
  model: llama3.1:70b
  base_url: http://localhost:11434
No API key needed. Start Ollama (ollama serve) and run OpenClaw as usual:
openclaw run "Create a Python function that validates email addresses"
Best local models for OpenClaw
Choosing the right local model depends on your hardware and the tasks you need:
| Model | Size | RAM needed | Best for | Quality |
|---|---|---|---|---|
| llama3.1:70b | 40GB | 48GB+ | General tasks, reasoning | Excellent |
| llama3.1:8b | 4.7GB | 8GB+ | Simple tasks, quick responses | Good |
| deepseek-coder-v2:33b | 19GB | 24GB+ | Code generation, debugging | Excellent |
| codellama:34b | 19GB | 24GB+ | Code-focused tasks | Very good |
| mistral:7b | 4.1GB | 8GB | Lightweight, fast responses | Decent |
| qwen2:72b | 41GB | 48GB+ | Multi-language, reasoning | Excellent |
Performance: local vs cloud models
Local models work well for:
- Simple tasks: File operations, code generation for single files, text summarization
- Privacy-sensitive work: Anything involving proprietary code or confidential data
- Offline environments: Air-gapped systems or unreliable internet
Cloud models (Claude Opus, GPT-4o, o3) are significantly better for:
- Complex multi-step reasoning: Tasks requiring 10+ agent iterations
- Large codebase understanding: Understanding relationships across many files
- Tool orchestration: Chaining multiple tools in sophisticated workflows
A practical approach: use a local model for day-to-day tasks and switch to a cloud model for complex work. You can maintain multiple config profiles and switch with openclaw --config ~/.openclaw/config-cloud.yaml run "...".
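Concretely, the profile-switching idea looks like this — the file names are just a convention, and each file is a normal config.yaml as documented above:

```shell
# Default profile (~/.openclaw/config.yaml) points at a local Ollama model.
# A second profile pins a cloud model for harder tasks:
openclaw --config ~/.openclaw/config-cloud.yaml run "Trace the data flow across all modules and document it"

# Day-to-day work just uses the default config:
openclaw run "Rename the helper functions in utils.sh for clarity"
```

A shell alias (e.g. alias ocx='openclaw --config ~/.openclaw/config-cloud.yaml') makes the cloud profile a single keystroke away.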
From the maker
Building an AI-powered product, not just running an agent?
OpenClaw is great for local dev workflows. But if you are building something users will pay for — a SaaS, an AI tool, an internal platform — you need auth, payments, a web UI, and deployment. AnotherWrapper gives you the full product stack in Next.js.
“You've turned 3 months of work into 3 weeks man. Worth every penny.” (Kamara, Indie Maker)
Setting up your first real project
Once OpenClaw is installed and configured, here is a practical walkthrough to get productive quickly.
1. Create a project workspace
mkdir my-openclaw-project
cd my-openclaw-project
openclaw init --project
The --project flag creates a project-specific .openclaw/ directory. This is useful when different projects need different models or tool permissions.
2. Start with a simple task
openclaw run "Set up a new Express.js API with TypeScript, add a health check endpoint, and write a basic test"
Watch the execution log carefully. You will see the agent:
- Plan the task into subtasks
- Create files (package.json, tsconfig.json, source files)
- Install dependencies via npm
- Write tests using the testing framework it chose
- Verify the setup by running the tests
3. Review the execution history
openclaw history
This shows the complete execution log — every tool call, every decision, every result. Use it to understand how the agent approached the task and whether it made good choices.
4. Iterate on your project
openclaw run "Add a /users endpoint with CRUD operations and connect it to a SQLite database"
The agent remembers the project context from previous runs if you configured memory: file or memory: knowledge_base. This means it knows about the files it already created and can build on them.
5. Use the OpenClaw CLI effectively
Some useful CLI commands beyond run:
openclaw history # View past executions
openclaw config # Show current configuration
openclaw skills list # List installed skills
openclaw run --model gpt-4o "..." # Override model for one task
openclaw run --verbose "..." # Extra-detailed execution log
openclaw run --dry-run "..."      # Plan without executing
Installing and managing OpenClaw skills
OpenClaw skills are pre-built capability packages that extend what the agent can do. They are one of OpenClaw's most powerful features, and the skills ecosystem is growing rapidly.
What are OpenClaw skills?
A skill is a bundle of tools, prompts, and configurations that teach OpenClaw how to perform a specific type of task. For example:
- web-scraper: Structured web scraping with pagination handling
- git-workflow: Automated PR creation, code review, branch management
- data-analysis: CSV/JSON analysis with chart generation
- api-tester: Automated API endpoint testing with assertions
- db-migration: Database schema migration generation
Browse and install skills
# List all available skills from the registry
openclaw skills list
# Search for skills by keyword
openclaw skills search "database"
# Install a skill
openclaw skills install web-scraper
# View installed skills
openclaw skills installed
Using installed skills
Once a skill is installed, the agent can use it automatically when relevant:
# The web-scraper skill activates automatically for web tasks
openclaw run "Scrape all product listings from example.com and save them as JSON"
# Or invoke a skill explicitly
openclaw run --skill git-workflow "Create a PR for the current branch with a detailed description"
Community skills (awesome-openclaw)
The community maintains an awesome-openclaw repository on GitHub with curated skill collections. Browse it for popular third-party skills:
# Install a community skill from a GitHub repo
openclaw skills install github:username/openclaw-skill-name
Before installing community skills, review their source code. Skills have access to whatever tool permissions you have configured. Only install skills from sources you trust.
How to update OpenClaw
Keeping OpenClaw up to date ensures you get the latest features, bug fixes, and model support.
npm update
npm update -g openclaw
openclaw --version
Docker update
docker pull openclaw/openclaw:latest
If you use Docker Compose:
docker compose pull
docker compose up -d
Source build update
cd /path/to/openclaw
git pull origin main
npm install
npm run build
openclaw --version
Check for updates without installing
npm outdated -g openclaw
How to uninstall OpenClaw
If you need to remove OpenClaw from your system, here is how to do a clean uninstall for each installation method.
Uninstall npm installation
npm uninstall -g openclaw
Remove Docker installation
# Remove the image
docker rmi openclaw/openclaw:latest
# If using Docker Compose, stop and remove
docker compose down
docker compose rm -f
Clean up configuration files
Regardless of installation method, OpenClaw stores configuration and memory in your home directory:
# Remove global config
rm -rf ~/.openclaw
# Remove project-specific configs (run in each project directory)
rm -rf .openclaw/Before deleting ~/.openclaw, check if it contains any memory or knowledge base data you want to keep. The ~/.openclaw/memory/ directory may have accumulated useful project context.
Migrating from Clawdbot or Moltbot
If you have the older Clawdbot or Moltbot packages installed, remove them first:
npm uninstall -g clawdbot
npm uninstall -g moltbot
Your existing ~/.clawdbot/ or ~/.moltbot/ config directories are compatible. Copy them to ~/.openclaw/:
# If migrating from Clawdbot
cp -r ~/.clawdbot/ ~/.openclaw/
# If migrating from Moltbot
cp -r ~/.moltbot/ ~/.openclaw/
Troubleshooting
"command not found" after npm install
Your npm global bin directory is not in your PATH. Find it:
npm config get prefix
Then add <prefix>/bin to your shell profile:
# For Bash (~/.bashrc)
echo 'export PATH="$PATH:'$(npm config get prefix)'/bin"' >> ~/.bashrc
source ~/.bashrc
# For Zsh (~/.zshrc)
echo 'export PATH="$PATH:'$(npm config get prefix)'/bin"' >> ~/.zshrc
source ~/.zshrcAPI key errors
Make sure your key is correctly set. Test it:
# Check if the environment variable is set
echo $ANTHROPIC_API_KEY
# If empty, set it
export ANTHROPIC_API_KEY=sk-ant-your-key-here
Common mistakes:
- Extra whitespace in the key (copy-paste artifacts)
- Wrong provider set in config (e.g., config says provider: openai but you set ANTHROPIC_API_KEY)
- Expired or revoked key — regenerate it in your provider's dashboard
How to get a Claude API key
- Go to console.anthropic.com
- Sign up or log in
- Navigate to API Keys in the left sidebar
- Click Create Key
- Copy the key (starts with sk-ant-) and add it to your config or environment variable
Agent gets stuck in a loop
This usually means the model is not capable enough for the task, or the task is ambiguous. Try:
- Reduce max_iterations to catch loops early (set to 10–15)
- Use a more capable model — switch from a 7B local model to Claude Sonnet or GPT-4o
- Make your prompt more specific — instead of "fix the bugs," say "fix the TypeError on line 42 of server.ts"
- Enable confirm_actions — this forces the agent to pause and ask you before each destructive action, letting you course-correct
Docker permission issues
On Linux, the Docker daemon socket is owned by root by default, so docker commands fail for regular users. Add your user to the docker group:
sudo usermod -aG docker $USER
Then log out and log back in (or restart your terminal). Verify with:
docker run hello-world
On macOS, Docker Desktop handles permissions automatically.
Memory errors with large projects
If the agent runs out of context when working with large codebases, switch to file-based memory:
agent:
  memory: file
  memory_path: .openclaw/memory
This stores context on disk instead of keeping everything in the LLM's context window. The agent can then retrieve relevant context as needed rather than loading everything upfront.
OpenClaw is slow
Performance depends primarily on your model choice:
- Cloud models: Response time depends on the provider's API latency. Claude and GPT-4o typically respond in 2–5 seconds per turn.
- Local models: Speed depends on your hardware. GPU acceleration makes a massive difference — a 7B model on an M2 Mac runs at 30+ tokens/sec, but a 70B model on CPU-only hardware may take minutes per response.
- Network issues: If using cloud models, check your internet connection. High latency or packet loss will slow every agent iteration.
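If a local setup feels sluggish, the config file gives you two easy levers. A sketch using the keys documented earlier (the values are illustrative trade-offs, not recommendations):

```yaml
# ~/.openclaw/config.yaml — trade some quality for speed
model:
  provider: ollama
  model: llama3.1:8b   # a smaller model responds far faster than a 70B one
  max_tokens: 2048     # shorter responses shorten every agent turn
```

Switch back to a larger model (or a cloud profile) when a task actually needs the extra reasoning.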
SSL/TLS errors behind a corporate proxy
If you are behind a corporate proxy, you may see SSL certificate errors. Set your proxy:
export HTTPS_PROXY=http://your-proxy:8080
export HTTP_PROXY=http://your-proxy:8080
export NODE_EXTRA_CA_CERTS=/path/to/corporate-ca.pem
What OpenClaw does not give you
OpenClaw is excellent for local development and automation. But if you are building something that other people will use — a product, a SaaS, an internal tool with a web UI — you need more than an agent framework.
You need:
- Authentication — User signup, login, sessions, OAuth
- A web interface — Not just a terminal, but a polished UI your users interact with
- Payments — Subscription management, billing, usage tracking
- Database — Structured data storage with proper schemas and migrations
- Deployment — CI/CD, hosting, monitoring, error tracking
- Email — Transactional and marketing emails
- Analytics — Usage tracking, funnel analysis, user behavior
These are not OpenClaw's job. They are the product layer. And building them from scratch takes months.
From the maker
Need the product layer that OpenClaw skips?
AnotherWrapper is a production-ready Next.js starter with Supabase auth, Stripe payments, database, email, AI integrations, and 10+ working demo apps. Build what OpenClaw cannot — ship in days, not months.
“I launched my AI writing assistant in a weekend instead of 6 weeks. Got my first paying customer within 3 days.” (john200ok, Reddit r/SaaS)
Next steps
Now that OpenClaw is running:
- Read What is OpenClaw? for the full architecture and capabilities overview
- Check our security and privacy guide before using it with sensitive data
- Browse OpenClaw alternatives to compare your options
Frequently asked questions
What are the system requirements for OpenClaw?
You need Node.js 20 or later and Git. OpenClaw runs on macOS (Intel and Apple Silicon), Linux (Ubuntu, Debian, Fedora, Arch), and Windows via WSL 2. For Docker installs, you need Docker Engine 20+. Local model usage via Ollama requires 16GB+ RAM for 7B+ parameter models, and 48GB+ for the best 70B models.
How do I install OpenClaw?
The fastest method is npm: npm install -g openclaw, then openclaw init. You can also install via Docker (docker pull openclaw/openclaw:latest) or build from source by cloning the GitHub repository. See the full walkthrough above for each method.
Can I use OpenClaw without an API key?
Yes — if you run a local model via Ollama. Set provider: ollama in your config and no external API key is needed. Everything stays on your machine. Popular local models include Llama 3.1, DeepSeek Coder V2, and Mistral.
Does OpenClaw work on Windows?
OpenClaw works on Windows via WSL 2 (Windows Subsystem for Linux) or Docker Desktop with WSL 2 backend. Native Windows support is limited. WSL 2 with Ubuntu is the recommended approach — see the Windows installation section above.
Is OpenClaw the same as Clawdbot?
Yes. Clawdbot was rebranded to OpenClaw in early 2026. The core technology is identical — same command-line interface, same configuration format, same plugin architecture. If you had Clawdbot installed, uninstall it with npm uninstall -g clawdbot and install OpenClaw fresh. Your existing config files are compatible. Read Clawdbot is now OpenClaw for the full story.
Is OpenClaw the same as Moltbot?
Yes. Moltbot was the original name before it became Clawdbot, and then rebranded to OpenClaw. The migration path is the same: uninstall the old package and install openclaw. Your configuration files are compatible. Read Moltbot is now OpenClaw for details.
What is the best local model for OpenClaw?
For coding tasks, DeepSeek Coder V2 (33B) delivers the best results. For general-purpose agent work, Llama 3.1 70B is the top recommendation. On machines with limited RAM (8–16GB), Llama 3.1 8B and Mistral 7B handle simpler tasks well. See the full model comparison table above.
How do I get a Claude API key for OpenClaw?
Sign up at console.anthropic.com, navigate to API Keys in the left sidebar, and create a new key. Copy the key (it starts with sk-ant-) and add it to your OpenClaw config at ~/.openclaw/config.yaml under model.api_key, or set the ANTHROPIC_API_KEY environment variable.
How do I install OpenClaw skills?
Use the built-in skill manager: openclaw skills install skill-name. Browse available skills with openclaw skills list or search with openclaw skills search "keyword". Community-maintained skill collections are available in the awesome-openclaw GitHub repository.
How do I update OpenClaw?
For npm: npm update -g openclaw. For Docker: docker pull openclaw/openclaw:latest. For source builds: git pull origin main && npm run build. Check your version anytime with openclaw --version.
How do I uninstall OpenClaw?
For npm: npm uninstall -g openclaw. For Docker: docker rmi openclaw/openclaw:latest. To fully clean up, also delete the config directory at ~/.openclaw and any project-level .openclaw directories.
Is OpenClaw free to use?
Yes. OpenClaw is free and open-source under the MIT license. You can use it, modify it, and distribute it without restriction. The only costs are the API calls to your chosen model provider (OpenAI, Anthropic, Google). If you use local models via Ollama, everything is completely free.
Can I use OpenClaw with Docker on Windows?
Yes. Install Docker Desktop for Windows with the WSL 2 backend enabled. Then use the Docker install method: docker pull openclaw/openclaw:latest. Docker Desktop handles the Linux virtualization for you, so you do not need to set up WSL separately if you only want to run OpenClaw via Docker.
Looking for legacy install instructions? If you searched for "Moltbot install" or "Clawdbot install," those projects have been rebranded to OpenClaw. All commands, configs, and features are the same — just the name changed. Read Moltbot is now OpenClaw or Clawdbot is now OpenClaw for the full story.
Fekri
Building tools for the next generation of AI-powered startups. Sharing what I learn along the way.