This page provides a reference of environment variables that can be used to configure OpenHands. Environment variables provide an alternative to TOML configuration files and are particularly useful for containerized deployments, CI/CD pipelines, and cloud environments.
## Environment Variable Naming Convention

OpenHands follows a consistent naming pattern for environment variables:

- Core settings: Direct uppercase mapping (e.g., `debug` → `DEBUG`)
- LLM settings: Prefixed with `LLM_` (e.g., `model` → `LLM_MODEL`)
- Agent settings: Prefixed with `AGENT_` (e.g., `enable_browsing` → `AGENT_ENABLE_BROWSING`)
- Sandbox settings: Prefixed with `SANDBOX_` (e.g., `timeout` → `SANDBOX_TIMEOUT`)
- Security settings: Prefixed with `SECURITY_` (e.g., `confirmation_mode` → `SECURITY_CONFIRMATION_MODE`)
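The naming rule above can be sketched as a small helper. This is purely illustrative — it is not the actual OpenHands configuration loader, just a restatement of the convention:

```python
# Illustrative sketch of the naming convention; the real OpenHands
# configuration loader may differ in details.
PREFIXES = {
    "core": "",          # core settings map directly to uppercase
    "llm": "LLM_",
    "agent": "AGENT_",
    "sandbox": "SANDBOX_",
    "security": "SECURITY_",
}

def env_var_name(section: str, key: str) -> str:
    """Map a config.toml (section, key) pair to its environment variable name."""
    return PREFIXES[section] + key.upper()

print(env_var_name("core", "debug"))             # DEBUG
print(env_var_name("llm", "model"))              # LLM_MODEL
print(env_var_name("sandbox", "timeout"))        # SANDBOX_TIMEOUT
```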
## Two-Container Architecture (V1)

This section is important for understanding how environment variables work in Docker deployments.

When running OpenHands with Docker (the default), the system uses a two-container architecture:

- **App-server container** (`openhands-app`) - Handles the web UI, API endpoints, and orchestration
- **Agent-server container** (`oh-agent-server-*`) - Runs the actual AI agent and executes commands
Environment variables set on your host machine or in docker-compose only directly affect the app-server container. To configure the agent-server, you need to understand how variables are forwarded.
### Auto-Forwarded Variables

The following environment variable prefixes are automatically forwarded from the app-server to the agent-server:

| Prefix | Description | Examples |
|---|---|---|
| `LLM_*` | LLM configuration settings | `LLM_TIMEOUT`, `LLM_NUM_RETRIES`, `LLM_MODEL` |

This means you can set `LLM_TIMEOUT=3600` on your host, and it will automatically be available inside the agent-server container.
### Explicit Agent-Server Configuration

For environment variables that are not auto-forwarded, you can use the `OH_AGENT_SERVER_ENV` variable to pass arbitrary settings to the agent-server:

```bash
# Pass custom environment variables to the agent-server as a JSON object
export OH_AGENT_SERVER_ENV='{"DEBUG": "true", "CUSTOM_VAR": "value", "LOG_LEVEL": "debug"}'
```

Values in the `OH_AGENT_SERVER_ENV` JSON take precedence over auto-forwarded variables. For example, if you set both `LLM_TIMEOUT=3600` and `OH_AGENT_SERVER_ENV='{"LLM_TIMEOUT": "7200"}'`, the agent-server will use `7200`.
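The precedence rule can be sketched as a dict merge: forwarded variables are collected first, then overlaid with the parsed JSON. This is a simplification of what the app-server actually does, shown only to make the ordering concrete:

```python
import json

def agent_server_env(host_env: dict[str, str]) -> dict[str, str]:
    """Sketch: build the agent-server environment from the host environment."""
    # 1. Auto-forward LLM_* variables from the host.
    env = {k: v for k, v in host_env.items() if k.startswith("LLM_")}
    # 2. Overlay OH_AGENT_SERVER_ENV values, which take precedence.
    env.update(json.loads(host_env.get("OH_AGENT_SERVER_ENV", "{}")))
    return env

host = {
    "LLM_TIMEOUT": "3600",
    "OH_AGENT_SERVER_ENV": '{"LLM_TIMEOUT": "7200", "DEBUG": "true"}',
}
print(agent_server_env(host))  # {'LLM_TIMEOUT': '7200', 'DEBUG': 'true'}
```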
### Agent-Server Networking

| Environment Variable | Type | Default | Description |
|---|---|---|---|
| `AGENT_SERVER_USE_HOST_NETWORK` | boolean | `false` | Use host networking mode for agent-server containers. Useful for reverse proxy setups. |
| `SANDBOX_USE_HOST_NETWORK` | boolean | `false` | (Legacy) Same as `AGENT_SERVER_USE_HOST_NETWORK`; kept for backward compatibility |
| `SANDBOX_CONTAINER_URL_PATTERN` | string | `"http://localhost:{port}"` | URL pattern for accessing exposed agent-server ports. Use `{port}` as a placeholder. |
| `SANDBOX_HOST_PORT` | integer | `3000` | The port the app-server is running on (for webhook callbacks) |
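The `{port}` placeholder in `SANDBOX_CONTAINER_URL_PATTERN` behaves like ordinary string substitution. A minimal sketch (the reverse-proxy pattern below is hypothetical — adjust it to your setup):

```python
def container_url(pattern: str, port: int) -> str:
    """Expand the {port} placeholder in SANDBOX_CONTAINER_URL_PATTERN."""
    return pattern.format(port=port)

# Default pattern: exposed ports are reached on localhost.
print(container_url("http://localhost:{port}", 41234))
# → http://localhost:41234

# Hypothetical reverse-proxy pattern mapping ports into a path:
print(container_url("https://example.com/runtime/{port}", 41234))
# → https://example.com/runtime/41234
```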
### Example: Configuring LLM Timeouts

```bash
# These LLM_* variables are auto-forwarded to the agent-server
export LLM_TIMEOUT=3600        # 1 hour timeout
export LLM_NUM_RETRIES=10      # More retries for flaky connections
export LLM_RETRY_MIN_WAIT=30   # 30 second minimum wait between retries

# Start OpenHands - these settings will apply to the agent
docker run -e LLM_TIMEOUT -e LLM_NUM_RETRIES -e LLM_RETRY_MIN_WAIT ...
```
### Example: Host Network Mode

```bash
# Enable host network mode for the agent-server
# (useful when running behind a reverse proxy)
export AGENT_SERVER_USE_HOST_NETWORK=true
export SANDBOX_CONTAINER_URL_PATTERN="http://localhost:{port}"
```
## Core Configuration Variables

These variables correspond to the `[core]` section in `config.toml`:

| Environment Variable | Type | Default | Description |
|---|---|---|---|
| `DEBUG` | boolean | `false` | Enable debug logging throughout the application |
| `DISABLE_COLOR` | boolean | `false` | Disable colored output in the terminal |
| `CACHE_DIR` | string | `"/tmp/cache"` | Directory path for caching |
| `SAVE_TRAJECTORY_PATH` | string | `"./trajectories"` | Path to store conversation trajectories |
| `REPLAY_TRAJECTORY_PATH` | string | `""` | Path to load and replay a trajectory file |
| `FILE_STORE_PATH` | string | `"/tmp/file_store"` | File store directory path |
| `FILE_STORE` | string | `"memory"` | File store type (`memory`, `local`, etc.) |
| `FILE_UPLOADS_MAX_FILE_SIZE_MB` | integer | `0` | Maximum file upload size in MB (`0` = no limit) |
| `FILE_UPLOADS_RESTRICT_FILE_TYPES` | boolean | `false` | Whether to restrict file upload types |
| `FILE_UPLOADS_ALLOWED_EXTENSIONS` | list | `[".*"]` | List of allowed file extensions for uploads |
| `MAX_BUDGET_PER_TASK` | float | `0.0` | Maximum budget per task (`0.0` = no limit) |
| `MAX_ITERATIONS` | integer | `100` | Maximum number of iterations per task |
| `RUNTIME` | string | `"docker"` | Runtime environment (`docker`, `local`, `cli`, etc.) |
| `DEFAULT_AGENT` | string | `"CodeActAgent"` | Default agent class to use |
| `JWT_SECRET` | string | auto-generated | JWT secret for authentication |
| `RUN_AS_OPENHANDS` | boolean | `true` | Whether to run as the `openhands` user |
| `VOLUMES` | string | `""` | Volume mounts in the format `host:container[:mode]` |
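The `host:container[:mode]` format can be illustrated with a small parser. This is a sketch of the documented format, not the actual OpenHands code; the comma-separated multi-mount form is the one shown later for `SANDBOX_VOLUMES`:

```python
def parse_volume(spec: str) -> tuple[str, str, str]:
    """Parse one 'host:container[:mode]' volume spec; mode defaults to 'rw'."""
    parts = spec.split(":")
    if len(parts) == 2:
        host, container = parts
        mode = "rw"
    elif len(parts) == 3:
        host, container, mode = parts
    else:
        raise ValueError(f"invalid volume spec: {spec!r}")
    return host, container, mode

# Multiple mounts are comma-separated, as in SANDBOX_VOLUMES.
specs = "/host/workspace:/workspace:rw,/host/data:/data:ro"
print([parse_volume(s) for s in specs.split(",")])
# [('/host/workspace', '/workspace', 'rw'), ('/host/data', '/data', 'ro')]
```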
## LLM Configuration Variables

These variables correspond to the `[llm]` section in `config.toml`:

| Environment Variable | Type | Default | Description |
|---|---|---|---|
| `LLM_MODEL` | string | `"claude-3-5-sonnet-20241022"` | LLM model to use |
| `LLM_API_KEY` | string | `""` | API key for the LLM provider |
| `LLM_BASE_URL` | string | `""` | Custom API base URL |
| `LLM_API_VERSION` | string | `""` | API version to use |
| `LLM_TEMPERATURE` | float | `0.0` | Sampling temperature |
| `LLM_TOP_P` | float | `1.0` | Top-p sampling parameter |
| `LLM_MAX_INPUT_TOKENS` | integer | `0` | Maximum input tokens (`0` = no limit) |
| `LLM_MAX_OUTPUT_TOKENS` | integer | `0` | Maximum output tokens (`0` = no limit) |
| `LLM_MAX_MESSAGE_CHARS` | integer | `30000` | Maximum characters sent to the model in observation content |
| `LLM_TIMEOUT` | integer | `0` | API timeout in seconds (`0` = no timeout) |
| `LLM_NUM_RETRIES` | integer | `8` | Number of retry attempts |
| `LLM_RETRY_MIN_WAIT` | integer | `15` | Minimum wait time between retries (seconds) |
| `LLM_RETRY_MAX_WAIT` | integer | `120` | Maximum wait time between retries (seconds) |
| `LLM_RETRY_MULTIPLIER` | float | `2.0` | Exponential backoff multiplier |
| `LLM_DROP_PARAMS` | boolean | `false` | Drop unsupported parameters without raising an error |
| `LLM_CACHING_PROMPT` | boolean | `true` | Enable prompt caching if supported |
| `LLM_DISABLE_VISION` | boolean | `false` | Disable vision capabilities to reduce cost |
| `LLM_CUSTOM_LLM_PROVIDER` | string | `""` | Custom LLM provider name |
| `LLM_OLLAMA_BASE_URL` | string | `""` | Base URL for the Ollama API |
| `LLM_INPUT_COST_PER_TOKEN` | float | `0.0` | Cost per input token |
| `LLM_OUTPUT_COST_PER_TOKEN` | float | `0.0` | Cost per output token |
| `LLM_REASONING_EFFORT` | string | `""` | Reasoning effort for o-series models (`low`, `medium`, `high`) |
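With the retry defaults above, the wait between attempts grows exponentially, clamped to the min/max bounds. The sketch below assumes a standard exponential-backoff formula (`multiplier * 2^(attempt-1)`, clamped); the exact scheme OpenHands uses internally may differ:

```python
def retry_wait(attempt: int, min_wait: float = 15.0, max_wait: float = 120.0,
               multiplier: float = 2.0) -> float:
    """Wait (seconds) before retry `attempt` (1-based), clamped to [min_wait, max_wait].

    Assumption: standard exponential backoff; not necessarily the exact
    formula used by OpenHands.
    """
    return min(max_wait, max(min_wait, multiplier * 2 ** (attempt - 1)))

# Wait schedule for the 8 default retries:
print([retry_wait(a) for a in range(1, 9)])
# [15.0, 15.0, 15.0, 16.0, 32.0, 64.0, 120.0, 120.0]
```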
### AWS Configuration

| Environment Variable | Type | Default | Description |
|---|---|---|---|
| `LLM_AWS_ACCESS_KEY_ID` | string | `""` | AWS access key ID |
| `LLM_AWS_SECRET_ACCESS_KEY` | string | `""` | AWS secret access key |
| `LLM_AWS_REGION_NAME` | string | `""` | AWS region name |
## Agent Configuration Variables

These variables correspond to the `[agent]` section in `config.toml`:

| Environment Variable | Type | Default | Description |
|---|---|---|---|
| `AGENT_LLM_CONFIG` | string | `""` | Name of the LLM config group to use |
| `AGENT_FUNCTION_CALLING` | boolean | `true` | Enable function calling |
| `AGENT_ENABLE_BROWSING` | boolean | `false` | Enable the browsing delegate |
| `AGENT_ENABLE_LLM_EDITOR` | boolean | `false` | Enable the LLM-based editor |
| `AGENT_ENABLE_JUPYTER` | boolean | `false` | Enable Jupyter integration |
| `AGENT_ENABLE_HISTORY_TRUNCATION` | boolean | `true` | Enable history truncation |
| `AGENT_ENABLE_PROMPT_EXTENSIONS` | boolean | `true` | Enable prompt extensions (skills, formerly known as microagents) |
| `AGENT_DISABLED_MICROAGENTS` | list | `[]` | List of skills to disable |
## Sandbox Configuration Variables

These variables correspond to the `[sandbox]` section in `config.toml`:

| Environment Variable | Type | Default | Description |
|---|---|---|---|
| `SANDBOX_TIMEOUT` | integer | `120` | Sandbox timeout in seconds |
| `SANDBOX_USER_ID` | integer | `1000` | User ID for sandbox processes |
| `SANDBOX_BASE_CONTAINER_IMAGE` | string | `"nikolaik/python-nodejs:python3.12-nodejs22"` | Base container image |
| `SANDBOX_USE_HOST_NETWORK` | boolean | `false` | Use host networking |
| `SANDBOX_RUNTIME_BINDING_ADDRESS` | string | `"0.0.0.0"` | Runtime binding address |
| `SANDBOX_ENABLE_AUTO_LINT` | boolean | `false` | Enable automatic linting |
| `SANDBOX_INITIALIZE_PLUGINS` | boolean | `true` | Initialize sandbox plugins |
| `SANDBOX_RUNTIME_EXTRA_DEPS` | string | `""` | Extra dependencies to install |
| `SANDBOX_RUNTIME_STARTUP_ENV_VARS` | dict | `{}` | Environment variables for the runtime |
| `SANDBOX_BROWSERGYM_EVAL_ENV` | string | `""` | BrowserGym evaluation environment |
| `SANDBOX_VOLUMES` | string | `""` | Volume mounts (replaces the deprecated workspace settings) |
| `AGENT_SERVER_IMAGE_REPOSITORY` | string | `""` | Runtime container image repository (e.g., `ghcr.io/openhands/agent-server`) |
| `AGENT_SERVER_IMAGE_TAG` | string | `""` | Runtime container image tag (e.g., `1.11.4-python`) |
| `SANDBOX_KEEP_RUNTIME_ALIVE` | boolean | `false` | Keep the runtime alive after the session ends |
| `SANDBOX_PAUSE_CLOSED_RUNTIMES` | boolean | `false` | Pause instead of stopping closed runtimes |
| `SANDBOX_CLOSE_DELAY` | integer | `300` | Delay before closing idle runtimes (seconds) |
| `SANDBOX_RM_ALL_CONTAINERS` | boolean | `false` | Remove all containers when stopping |
| `SANDBOX_ENABLE_GPU` | boolean | `false` | Enable GPU support |
| `SANDBOX_CUDA_VISIBLE_DEVICES` | string | `""` | Specify GPU devices by ID |
| `SANDBOX_VSCODE_PORT` | integer | auto | Specific port for the VSCode server |
### Sandbox Environment Variables

Variables prefixed with `SANDBOX_ENV_` are passed through to the sandbox environment:

| Environment Variable | Description |
|---|---|
| `SANDBOX_ENV_*` | Any variable with this prefix is passed to the sandbox (e.g., `SANDBOX_ENV_OPENAI_API_KEY`) |
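The pass-through can be sketched as filtering the process environment for the prefix. Note that prefix-stripping inside the sandbox is an assumption made for this illustration — verify the exact behavior against the OpenHands source:

```python
PREFIX = "SANDBOX_ENV_"

def sandbox_env(environ: dict[str, str]) -> dict[str, str]:
    """Collect SANDBOX_ENV_* variables for the sandbox.

    Assumption: the prefix is stripped before injection (illustrative only;
    check the actual OpenHands behavior).
    """
    return {k[len(PREFIX):]: v for k, v in environ.items() if k.startswith(PREFIX)}

host = {"SANDBOX_ENV_OPENAI_API_KEY": "sk-example", "PATH": "/usr/bin"}
print(sandbox_env(host))  # {'OPENAI_API_KEY': 'sk-example'}
```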
## Security Configuration Variables

These variables correspond to the `[security]` section in `config.toml`:

| Environment Variable | Type | Default | Description |
|---|---|---|---|
| `SECURITY_CONFIRMATION_MODE` | boolean | `false` | Enable confirmation mode for actions |
| `SECURITY_SECURITY_ANALYZER` | string | `"llm"` | Security analyzer to use (`llm`, `invariant`) |
| `SECURITY_ENABLE_SECURITY_ANALYZER` | boolean | `true` | Enable security analysis |
## Debug and Logging Variables

| Environment Variable | Type | Default | Description |
|---|---|---|---|
| `DEBUG` | boolean | `false` | Enable general debug logging |
| `DEBUG_LLM` | boolean | `false` | Enable LLM-specific debug logging |
| `DEBUG_RUNTIME` | boolean | `false` | Enable runtime debug logging |
| `LOG_TO_FILE` | boolean | auto | Log to file (auto-enabled when `DEBUG=true`) |
## Runtime-Specific Variables

### Docker Runtime

| Environment Variable | Type | Default | Description |
|---|---|---|---|
| `SANDBOX_VOLUME_OVERLAYS` | string | `""` | Volume overlay configurations |

### Remote Runtime

| Environment Variable | Type | Default | Description |
|---|---|---|---|
| `SANDBOX_API_KEY` | string | `""` | API key for the remote runtime |
| `SANDBOX_REMOTE_RUNTIME_API_URL` | string | `""` | Remote runtime API URL |

### Local Runtime

| Environment Variable | Type | Default | Description |
|---|---|---|---|
| `RUNTIME_URL` | string | `""` | Runtime URL for the local runtime |
| `RUNTIME_URL_PATTERN` | string | `""` | Runtime URL pattern |
| `RUNTIME_ID` | string | `""` | Runtime identifier |
| `LOCAL_RUNTIME_MODE` | string | `""` | Enable local runtime mode (set to `1` to enable) |
## Integration Variables

### GitHub Integration

| Environment Variable | Type | Default | Description |
|---|---|---|---|
| `GITHUB_TOKEN` | string | `""` | GitHub personal access token |

### Third-Party API Keys

| Environment Variable | Type | Default | Description |
|---|---|---|---|
| `OPENAI_API_KEY` | string | `""` | OpenAI API key |
| `ANTHROPIC_API_KEY` | string | `""` | Anthropic API key |
| `GOOGLE_API_KEY` | string | `""` | Google API key |
| `AZURE_API_KEY` | string | `""` | Azure API key |
| `TAVILY_API_KEY` | string | `""` | Tavily search API key |
## Server Configuration Variables

These are primarily used when running OpenHands as a server:

| Environment Variable | Type | Default | Description |
|---|---|---|---|
| `FRONTEND_PORT` | integer | `3000` | Frontend server port |
| `BACKEND_PORT` | integer | `8000` | Backend server port |
| `FRONTEND_HOST` | string | `"localhost"` | Frontend host address |
| `BACKEND_HOST` | string | `"localhost"` | Backend host address |
| `WEB_HOST` | string | `"localhost"` | Web server host |
| `SERVE_FRONTEND` | boolean | `true` | Whether to serve the frontend |
## Deprecated Variables

These variables are deprecated and should be replaced:

| Environment Variable | Replacement | Description |
|---|---|---|
| `WORKSPACE_BASE` | `SANDBOX_VOLUMES` | Use volume mounting instead |
| `WORKSPACE_MOUNT_PATH` | `SANDBOX_VOLUMES` | Use volume mounting instead |
| `WORKSPACE_MOUNT_PATH_IN_SANDBOX` | `SANDBOX_VOLUMES` | Use volume mounting instead |
| `WORKSPACE_MOUNT_REWRITE` | `SANDBOX_VOLUMES` | Use volume mounting instead |
## Usage Examples

### Basic Setup with OpenAI

```bash
export LLM_MODEL="gpt-4o"
export LLM_API_KEY="your-openai-api-key"
export DEBUG=true
```

### Docker Deployment with Custom Volumes

```bash
export RUNTIME="docker"
export SANDBOX_VOLUMES="/host/workspace:/workspace:rw,/host/data:/data:ro"
export SANDBOX_TIMEOUT=300
```

### Remote Runtime Configuration

```bash
export RUNTIME="remote"
export SANDBOX_API_KEY="your-remote-api-key"
export SANDBOX_REMOTE_RUNTIME_API_URL="https://your-runtime-api.com"
```

### Security-Enhanced Setup

```bash
export SECURITY_CONFIRMATION_MODE=true
export SECURITY_SECURITY_ANALYZER="llm"
export DEBUG_RUNTIME=true
```
## Notes

- **Boolean Values**: Environment variables expecting boolean values accept `true`/`false`, `1`/`0`, or `yes`/`no` (case-insensitive).
- **List Values**: Lists should be provided as Python literal strings, e.g., `AGENT_DISABLED_MICROAGENTS='["skill1", "skill2"]'`.
- **Dictionary Values**: Dictionaries should be provided as Python literal strings, e.g., `SANDBOX_RUNTIME_STARTUP_ENV_VARS='{"KEY": "value"}'`.
- **Precedence**: Environment variables take precedence over TOML configuration files.
- **Docker Usage**: When using Docker, pass environment variables with the `-e` flag:

  ```bash
  docker run -e LLM_API_KEY="your-key" -e DEBUG=true openhands/openhands
  ```

- **Validation**: Invalid environment variable values are logged as errors, and the affected settings fall back to their defaults.
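The boolean convention in the notes above can be illustrated with a small parser. This is a sketch of the documented rule only; the real parser may accept additional spellings:

```python
TRUE_VALUES = {"true", "1", "yes"}
FALSE_VALUES = {"false", "0", "no"}

def parse_bool(value: str) -> bool:
    """Parse a boolean environment variable value, case-insensitively."""
    v = value.strip().lower()
    if v in TRUE_VALUES:
        return True
    if v in FALSE_VALUES:
        return False
    raise ValueError(f"not a boolean: {value!r}")

print(parse_bool("TRUE"), parse_bool("0"), parse_bool("Yes"))  # True False True
```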