Getting started with running OpenHands on your own machine.
### MacOS

Open Docker Desktop, go to Settings > Advanced, and ensure "Allow the default Docker socket to be used" is enabled.
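A quick way to confirm the daemon is reachable over the default socket once that setting is on:

```bash
# With the default Docker socket enabled, the socket file should exist
# and the daemon should answer over it.
ls -l /var/run/docker.sock
docker info --format '{{.ServerVersion}}'
```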
### Linux

Make sure Docker is installed and running.
### Windows

Run `wsl --version` in PowerShell and confirm `Default Version: 2`. Then open Docker Desktop, go to Settings, and confirm the following:

- "Use the WSL 2 based engine" is enabled.
- "Enable integration with my default WSL distro" is enabled.
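To confirm the WSL integration is working, you can run the Docker CLI from inside your default WSL distro:

```bash
# Run inside the WSL distro: this should report both client and server versions
# when the WSL 2 engine and distro integration are enabled.
docker version
```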
Then pull the runtime image and start the app:

```bash
docker pull docker.all-hands.dev/all-hands-ai/runtime:0.49-nikolaik

docker run -it --rm --pull=always \
    -e SANDBOX_RUNTIME_CONTAINER_IMAGE=docker.all-hands.dev/all-hands-ai/runtime:0.49-nikolaik \
    -e LOG_ALL_EVENTS=true \
    -v /var/run/docker.sock:/var/run/docker.sock \
    -v ~/.openhands:/.openhands \
    -p 3000:3000 \
    --add-host host.docker.internal:host-gateway \
    --name openhands-app \
    docker.all-hands.dev/all-hands-ai/openhands:0.49
```
Note: If you used OpenHands before version 0.44, you may want to run `mv ~/.openhands-state ~/.openhands` to migrate your conversation history to the new location.
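If you prefer a guarded version of that migration, a minimal sketch:

```bash
# Move pre-0.44 state to the new location only if the old directory exists
# and the new one has not been created yet.
if [ -d ~/.openhands-state ] && [ ! -d ~/.openhands ]; then
    mv ~/.openhands-state ~/.openhands
fi
```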
You’ll find OpenHands running at http://localhost:3000!
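To confirm the app is up, check the container and the UI port:

```bash
# The container should be listed as running, and the UI should answer on port 3000.
docker ps --filter name=openhands-app
curl -I http://localhost:3000
```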
After launching OpenHands, you must select an LLM Provider and LLM Model and enter a corresponding API Key. This can be done during the initial settings popup or by selecting the Settings button (gear icon) in the UI.

If the required model does not exist in the list, go to Settings, open the LLM tab, toggle Advanced options, and manually enter it with the correct prefix in the Custom Model text box. The Advanced options also allow you to specify a Base URL if required.
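For example, a locally served model might be entered with a provider prefix in the Custom Model box, such as `ollama/llama3`, with a Base URL like `http://host.docker.internal:11434`. These are illustrative values only; the prefix and URL depend on how your local server is exposed. Inside the container, `host.docker.internal` resolves to your host machine thanks to the `--add-host` flag in the run command above.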
Provider options include:

- Anthropic (Claude)
- Google (Gemini)
- Local LLM (e.g. LM Studio, llama.cpp, Ollama)

If you're running a local LLM, you can enter any placeholder value for the API Key (e.g. `local-key` or `test123`); it won't be used. To enable web search, you can also add a Search API Key (Tavily) under the LLM tab.
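If you're pointing OpenHands at a local model server, it helps to confirm the server is reachable before entering it in Settings. A minimal check, assuming an Ollama server on its default port:

```bash
# List the models a local Ollama server exposes (default port 11434).
# If this fails on the host, the Base URL entered in OpenHands won't work either.
curl -s http://localhost:11434/api/tags
```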
To pin a specific version, replace `$VERSION` in `openhands:$VERSION` and `runtime:$VERSION` with the version number. For example, `0.9` will automatically point to the latest `0.9.x` release, and `0` will point to the latest `0.x.x` release.

To use the development build, replace `$VERSION` in `openhands:$VERSION` and `runtime:$VERSION` with `main`. This version is unstable and is recommended for testing or development purposes only.
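As a sketch of that substitution (the tag here mirrors the command above; substitute the release you want):

```bash
# Pin both images to one release by setting the tag once.
VERSION=0.49
docker pull "docker.all-hands.dev/all-hands-ai/runtime:${VERSION}-nikolaik"
docker pull "docker.all-hands.dev/all-hands-ai/openhands:${VERSION}"
```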