This page provides a reference of environment variables that can be used to configure OpenHands. Environment variables provide an alternative to TOML configuration files and are particularly useful for containerized deployments, CI/CD pipelines, and cloud environments.

Environment Variable Naming Convention

OpenHands follows a consistent naming pattern for environment variables:
  • Core settings: direct uppercase mapping (e.g., debug → DEBUG)
  • LLM settings: prefixed with LLM_ (e.g., model → LLM_MODEL)
  • Agent settings: prefixed with AGENT_ (e.g., enable_browsing → AGENT_ENABLE_BROWSING)
  • Sandbox settings: prefixed with SANDBOX_ (e.g., timeout → SANDBOX_TIMEOUT)
  • Security settings: prefixed with SECURITY_ (e.g., confirmation_mode → SECURITY_CONFIRMATION_MODE)
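Putting the convention together, here is a minimal sketch of equivalent TOML keys and environment variables (the values shown are illustrative):

```shell
# config.toml                         environment variable
# [core]    debug = true         →    DEBUG=true
# [llm]     model = "gpt-4o"     →    LLM_MODEL="gpt-4o"
# [sandbox] timeout = 120        →    SANDBOX_TIMEOUT=120
export DEBUG=true
export LLM_MODEL="gpt-4o"
export SANDBOX_TIMEOUT=120
```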

Two-Container Architecture (V1)

This section is important for understanding how environment variables work in Docker deployments.
When running OpenHands with Docker (the default), the system uses a two-container architecture:
  1. App-server container (openhands-app) - Handles the web UI, API endpoints, and orchestration
  2. Agent-server container (oh-agent-server-*) - Runs the actual AI agent and executes commands
Environment variables set on your host machine or in docker-compose only directly affect the app-server container. To configure the agent-server, you need to understand how variables are forwarded.

Auto-Forwarded Variables

The following environment variable prefixes are automatically forwarded from the app-server to the agent-server:
| Prefix | Description | Examples |
|--------|-------------|----------|
| LLM_* | LLM configuration settings | LLM_TIMEOUT, LLM_NUM_RETRIES, LLM_MODEL |
This means you can set LLM_TIMEOUT=3600 on your host, and it will automatically be available inside the agent-server container.

Explicit Agent-Server Configuration

For environment variables that are not auto-forwarded, you can use the OH_AGENT_SERVER_ENV variable to pass arbitrary settings to the agent-server:
# Pass custom environment variables to the agent-server as a JSON object
export OH_AGENT_SERVER_ENV='{"DEBUG": "true", "CUSTOM_VAR": "value", "LOG_LEVEL": "debug"}'
The OH_AGENT_SERVER_ENV JSON values take precedence over auto-forwarded variables. For example, if you set both LLM_TIMEOUT=3600 and OH_AGENT_SERVER_ENV='{"LLM_TIMEOUT": "7200"}', the agent-server will use 7200.

Agent-Server Networking

| Environment Variable | Type | Default | Description |
|----------------------|------|---------|-------------|
| AGENT_SERVER_USE_HOST_NETWORK | boolean | false | Use host networking mode for agent-server containers. Useful for reverse proxy setups. |
| SANDBOX_USE_HOST_NETWORK | boolean | false | (Legacy) Same as above, kept for backward compatibility |
| SANDBOX_CONTAINER_URL_PATTERN | string | "http://localhost:{port}" | URL pattern for accessing exposed agent-server ports. Use {port} as placeholder. |
| SANDBOX_HOST_PORT | integer | 3000 | The port the app-server is running on (for webhook callbacks) |

Example: Configuring LLM Timeouts

# These LLM_* variables are auto-forwarded to the agent-server
export LLM_TIMEOUT=3600           # 1 hour timeout
export LLM_NUM_RETRIES=10         # More retries for flaky connections
export LLM_RETRY_MIN_WAIT=30      # 30 second minimum wait between retries

# Start OpenHands - these settings will apply to the agent
docker run -e LLM_TIMEOUT -e LLM_NUM_RETRIES -e LLM_RETRY_MIN_WAIT ...

Example: Host Network Mode

# Enable host network mode for the agent-server
# (useful when running behind a reverse proxy)
export AGENT_SERVER_USE_HOST_NETWORK=true
export SANDBOX_CONTAINER_URL_PATTERN="http://localhost:{port}"

Core Configuration Variables

These variables correspond to the [core] section in config.toml:
| Environment Variable | Type | Default | Description |
|----------------------|------|---------|-------------|
| DEBUG | boolean | false | Enable debug logging throughout the application |
| DISABLE_COLOR | boolean | false | Disable colored output in terminal |
| CACHE_DIR | string | "/tmp/cache" | Directory path for caching |
| SAVE_TRAJECTORY_PATH | string | "./trajectories" | Path to store conversation trajectories |
| REPLAY_TRAJECTORY_PATH | string | "" | Path to load and replay a trajectory file |
| FILE_STORE_PATH | string | "/tmp/file_store" | File store directory path |
| FILE_STORE | string | "memory" | File store type (memory, local, etc.) |
| FILE_UPLOADS_MAX_FILE_SIZE_MB | integer | 0 | Maximum file upload size in MB (0 = no limit) |
| FILE_UPLOADS_RESTRICT_FILE_TYPES | boolean | false | Whether to restrict file upload types |
| FILE_UPLOADS_ALLOWED_EXTENSIONS | list | [".*"] | List of allowed file extensions for uploads |
| MAX_BUDGET_PER_TASK | float | 0.0 | Maximum budget per task (0.0 = no limit) |
| MAX_ITERATIONS | integer | 100 | Maximum number of iterations per task |
| RUNTIME | string | "docker" | Runtime environment (docker, local, cli, etc.) |
| DEFAULT_AGENT | string | "CodeActAgent" | Default agent class to use |
| JWT_SECRET | string | auto-generated | JWT secret for authentication |
| RUN_AS_OPENHANDS | boolean | true | Whether to run as the openhands user |
| VOLUMES | string | "" | Volume mounts in format host:container[:mode] |
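A sketch combining a few of the core settings above (the specific values are illustrative, not recommendations):

```shell
export MAX_BUDGET_PER_TASK=2.50          # stop a task after roughly $2.50 of LLM spend
export MAX_ITERATIONS=50                 # cap agent iterations per task
export SAVE_TRAJECTORY_PATH="./trajectories"
export RUNTIME="docker"
```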

LLM Configuration Variables

These variables correspond to the [llm] section in config.toml:
| Environment Variable | Type | Default | Description |
|----------------------|------|---------|-------------|
| LLM_MODEL | string | "claude-3-5-sonnet-20241022" | LLM model to use |
| LLM_API_KEY | string | "" | API key for the LLM provider |
| LLM_BASE_URL | string | "" | Custom API base URL |
| LLM_API_VERSION | string | "" | API version to use |
| LLM_TEMPERATURE | float | 0.0 | Sampling temperature |
| LLM_TOP_P | float | 1.0 | Top-p sampling parameter |
| LLM_MAX_INPUT_TOKENS | integer | 0 | Maximum input tokens (0 = no limit) |
| LLM_MAX_OUTPUT_TOKENS | integer | 0 | Maximum output tokens (0 = no limit) |
| LLM_MAX_MESSAGE_CHARS | integer | 30000 | Maximum characters sent to the model in observation content |
| LLM_TIMEOUT | integer | 0 | API timeout in seconds (0 = no timeout) |
| LLM_NUM_RETRIES | integer | 8 | Number of retry attempts |
| LLM_RETRY_MIN_WAIT | integer | 15 | Minimum wait time between retries (seconds) |
| LLM_RETRY_MAX_WAIT | integer | 120 | Maximum wait time between retries (seconds) |
| LLM_RETRY_MULTIPLIER | float | 2.0 | Exponential backoff multiplier |
| LLM_DROP_PARAMS | boolean | false | Drop unsupported parameters without error |
| LLM_CACHING_PROMPT | boolean | true | Enable prompt caching if supported |
| LLM_DISABLE_VISION | boolean | false | Disable vision capabilities for cost reduction |
| LLM_CUSTOM_LLM_PROVIDER | string | "" | Custom LLM provider name |
| LLM_OLLAMA_BASE_URL | string | "" | Base URL for Ollama API |
| LLM_INPUT_COST_PER_TOKEN | float | 0.0 | Cost per input token |
| LLM_OUTPUT_COST_PER_TOKEN | float | 0.0 | Cost per output token |
| LLM_REASONING_EFFORT | string | "" | Reasoning effort for o-series models (low, medium, high) |
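As a sketch, pointing OpenHands at a local Ollama server might look like the following. The model name and the "ollama/" prefix are illustrative (LiteLLM-style naming); verify them against your provider setup:

```shell
# Route LLM calls to a local Ollama instance (model name is illustrative)
export LLM_MODEL="ollama/llama3"
export LLM_OLLAMA_BASE_URL="http://localhost:11434"
export LLM_NUM_RETRIES=4                 # fewer retries for a local server
```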

AWS Configuration

| Environment Variable | Type | Default | Description |
|----------------------|------|---------|-------------|
| LLM_AWS_ACCESS_KEY_ID | string | "" | AWS access key ID |
| LLM_AWS_SECRET_ACCESS_KEY | string | "" | AWS secret access key |
| LLM_AWS_REGION_NAME | string | "" | AWS region name |
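A minimal sketch of supplying AWS credentials for an AWS-hosted model (the placeholder values and region are illustrative):

```shell
export LLM_AWS_ACCESS_KEY_ID="your-access-key-id"
export LLM_AWS_SECRET_ACCESS_KEY="your-secret-access-key"
export LLM_AWS_REGION_NAME="us-east-1"
```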

Agent Configuration Variables

These variables correspond to the [agent] section in config.toml:
| Environment Variable | Type | Default | Description |
|----------------------|------|---------|-------------|
| AGENT_LLM_CONFIG | string | "" | Name of LLM config group to use |
| AGENT_FUNCTION_CALLING | boolean | true | Enable function calling |
| AGENT_ENABLE_BROWSING | boolean | false | Enable browsing delegate |
| AGENT_ENABLE_LLM_EDITOR | boolean | false | Enable LLM-based editor |
| AGENT_ENABLE_JUPYTER | boolean | false | Enable Jupyter integration |
| AGENT_ENABLE_HISTORY_TRUNCATION | boolean | true | Enable history truncation |
| AGENT_ENABLE_PROMPT_EXTENSIONS | boolean | true | Enable skills (formerly microagents) as prompt extensions |
| AGENT_DISABLED_MICROAGENTS | list | [] | List of skills to disable |
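For example, a sketch enabling the optional agent capabilities above:

```shell
# Turn on the optional browsing delegate and Jupyter integration
export AGENT_ENABLE_BROWSING=true
export AGENT_ENABLE_JUPYTER=true
```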

Sandbox Configuration Variables

These variables correspond to the [sandbox] section in config.toml:
| Environment Variable | Type | Default | Description |
|----------------------|------|---------|-------------|
| SANDBOX_TIMEOUT | integer | 120 | Sandbox timeout in seconds |
| SANDBOX_USER_ID | integer | 1000 | User ID for sandbox processes |
| SANDBOX_BASE_CONTAINER_IMAGE | string | "nikolaik/python-nodejs:python3.12-nodejs22" | Base container image |
| SANDBOX_USE_HOST_NETWORK | boolean | false | Use host networking |
| SANDBOX_RUNTIME_BINDING_ADDRESS | string | "0.0.0.0" | Runtime binding address |
| SANDBOX_ENABLE_AUTO_LINT | boolean | false | Enable automatic linting |
| SANDBOX_INITIALIZE_PLUGINS | boolean | true | Initialize sandbox plugins |
| SANDBOX_RUNTIME_EXTRA_DEPS | string | "" | Extra dependencies to install |
| SANDBOX_RUNTIME_STARTUP_ENV_VARS | dict | {} | Environment variables for runtime |
| SANDBOX_BROWSERGYM_EVAL_ENV | string | "" | BrowserGym evaluation environment |
| SANDBOX_VOLUMES | string | "" | Volume mounts (replaces deprecated workspace settings) |
| AGENT_SERVER_IMAGE_REPOSITORY | string | "" | Runtime container image repository (e.g., ghcr.io/openhands/agent-server) |
| AGENT_SERVER_IMAGE_TAG | string | "" | Runtime container image tag (e.g., 1.11.4-python) |
| SANDBOX_KEEP_RUNTIME_ALIVE | boolean | false | Keep runtime alive after session ends |
| SANDBOX_PAUSE_CLOSED_RUNTIMES | boolean | false | Pause instead of stopping closed runtimes |
| SANDBOX_CLOSE_DELAY | integer | 300 | Delay before closing idle runtimes (seconds) |
| SANDBOX_RM_ALL_CONTAINERS | boolean | false | Remove all containers when stopping |
| SANDBOX_ENABLE_GPU | boolean | false | Enable GPU support |
| SANDBOX_CUDA_VISIBLE_DEVICES | string | "" | Specify GPU devices by ID |
| SANDBOX_VSCODE_PORT | integer | auto | Specific port for VSCode server |
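A sketch of a GPU-enabled sandbox configuration (the device IDs are illustrative):

```shell
export SANDBOX_ENABLE_GPU=true
export SANDBOX_CUDA_VISIBLE_DEVICES="0,1"   # expose only GPUs 0 and 1 to the sandbox
export SANDBOX_KEEP_RUNTIME_ALIVE=true      # avoid re-initializing the runtime between sessions
```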

Sandbox Environment Variables

Variables prefixed with SANDBOX_ENV_ are passed through to the sandbox environment:
| Environment Variable | Description |
|----------------------|-------------|
| SANDBOX_ENV_* | Any variable with this prefix is passed to the sandbox (e.g., SANDBOX_ENV_OPENAI_API_KEY) |
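For example (the second variable name is hypothetical, purely to show the pattern):

```shell
# Make an API key and a custom flag visible inside the sandbox environment
export SANDBOX_ENV_OPENAI_API_KEY="your-openai-key"
export SANDBOX_ENV_MY_FEATURE_FLAG="enabled"    # hypothetical variable name
```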

Security Configuration Variables

These variables correspond to the [security] section in config.toml:
| Environment Variable | Type | Default | Description |
|----------------------|------|---------|-------------|
| SECURITY_CONFIRMATION_MODE | boolean | false | Enable confirmation mode for actions |
| SECURITY_SECURITY_ANALYZER | string | "llm" | Security analyzer to use (llm, invariant) |
| SECURITY_ENABLE_SECURITY_ANALYZER | boolean | true | Enable security analysis |

Debug and Logging Variables

| Environment Variable | Type | Default | Description |
|----------------------|------|---------|-------------|
| DEBUG | boolean | false | Enable general debug logging |
| DEBUG_LLM | boolean | false | Enable LLM-specific debug logging |
| DEBUG_RUNTIME | boolean | false | Enable runtime debug logging |
| LOG_TO_FILE | boolean | auto | Log to file (auto-enabled when DEBUG=true) |

Runtime-Specific Variables

Docker Runtime

| Environment Variable | Type | Default | Description |
|----------------------|------|---------|-------------|
| SANDBOX_VOLUME_OVERLAYS | string | "" | Volume overlay configurations |

Remote Runtime

| Environment Variable | Type | Default | Description |
|----------------------|------|---------|-------------|
| SANDBOX_API_KEY | string | "" | API key for remote runtime |
| SANDBOX_REMOTE_RUNTIME_API_URL | string | "" | Remote runtime API URL |

Local Runtime

| Environment Variable | Type | Default | Description |
|----------------------|------|---------|-------------|
| RUNTIME_URL | string | "" | Runtime URL for local runtime |
| RUNTIME_URL_PATTERN | string | "" | Runtime URL pattern |
| RUNTIME_ID | string | "" | Runtime identifier |
| LOCAL_RUNTIME_MODE | string | "" | Enable local runtime mode (1 to enable) |
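A minimal sketch of switching to the local runtime using the variables above:

```shell
# Run the agent directly on the host instead of in a Docker sandbox
export RUNTIME="local"
export LOCAL_RUNTIME_MODE=1
```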

Integration Variables

GitHub Integration

| Environment Variable | Type | Default | Description |
|----------------------|------|---------|-------------|
| GITHUB_TOKEN | string | "" | GitHub personal access token |

Third-Party API Keys

| Environment Variable | Type | Default | Description |
|----------------------|------|---------|-------------|
| OPENAI_API_KEY | string | "" | OpenAI API key |
| ANTHROPIC_API_KEY | string | "" | Anthropic API key |
| GOOGLE_API_KEY | string | "" | Google API key |
| AZURE_API_KEY | string | "" | Azure API key |
| TAVILY_API_KEY | string | "" | Tavily search API key |

Server Configuration Variables

These are primarily used when running OpenHands as a server:
| Environment Variable | Type | Default | Description |
|----------------------|------|---------|-------------|
| FRONTEND_PORT | integer | 3000 | Frontend server port |
| BACKEND_PORT | integer | 8000 | Backend server port |
| FRONTEND_HOST | string | "localhost" | Frontend host address |
| BACKEND_HOST | string | "localhost" | Backend host address |
| WEB_HOST | string | "localhost" | Web server host |
| SERVE_FRONTEND | boolean | true | Whether to serve frontend |
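For example, a sketch moving both servers off the default ports (the port numbers are illustrative):

```shell
export BACKEND_PORT=8080
export FRONTEND_PORT=3001
export SERVE_FRONTEND=true
```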

Deprecated Variables

These variables are deprecated and should be replaced:
| Environment Variable | Replacement | Description |
|----------------------|-------------|-------------|
| WORKSPACE_BASE | SANDBOX_VOLUMES | Use volume mounting instead |
| WORKSPACE_MOUNT_PATH | SANDBOX_VOLUMES | Use volume mounting instead |
| WORKSPACE_MOUNT_PATH_IN_SANDBOX | SANDBOX_VOLUMES | Use volume mounting instead |
| WORKSPACE_MOUNT_REWRITE | SANDBOX_VOLUMES | Use volume mounting instead |
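A before/after sketch of migrating off the deprecated workspace variables (the host path is illustrative):

```shell
# Before (deprecated):
#   export WORKSPACE_BASE="/home/user/project"
# After — a single volume mount expresses the same thing:
export SANDBOX_VOLUMES="/home/user/project:/workspace:rw"
```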

Usage Examples

Basic Setup with OpenAI

export LLM_MODEL="gpt-4o"
export LLM_API_KEY="your-openai-api-key"
export DEBUG=true

Docker Deployment with Custom Volumes

export RUNTIME="docker"
export SANDBOX_VOLUMES="/host/workspace:/workspace:rw,/host/data:/data:ro"
export SANDBOX_TIMEOUT=300

Remote Runtime Configuration

export RUNTIME="remote"
export SANDBOX_API_KEY="your-remote-api-key"
export SANDBOX_REMOTE_RUNTIME_API_URL="https://your-runtime-api.com"

Security-Enhanced Setup

export SECURITY_CONFIRMATION_MODE=true
export SECURITY_SECURITY_ANALYZER="llm"
export DEBUG_RUNTIME=true

Notes

  1. Boolean Values: Environment variables expecting boolean values accept true/false, 1/0, or yes/no (case-insensitive).
  2. List Values: Lists should be provided as Python literal strings, e.g., AGENT_DISABLED_MICROAGENTS='["skill1", "skill2"]'.
  3. Dictionary Values: Dictionaries should be provided as Python literal strings, e.g., SANDBOX_RUNTIME_STARTUP_ENV_VARS='{"KEY": "value"}'.
  4. Precedence: Environment variables take precedence over TOML configuration files.
  5. Docker Usage: When using Docker, pass environment variables with the -e flag:
    docker run -e LLM_API_KEY="your-key" -e DEBUG=true openhands/openhands
    
  6. Validation: Invalid environment variable values will be logged as errors and fall back to defaults.
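The boolean, list, and dictionary conventions from notes 1–3 in shell form (the skill names and the startup variable are illustrative):

```shell
# Boolean: true/false, 1/0, or yes/no (case-insensitive)
export DEBUG=yes
# List: a Python literal, single-quoted so the shell passes it through verbatim
export AGENT_DISABLED_MICROAGENTS='["skill1", "skill2"]'
# Dictionary: also a Python literal
export SANDBOX_RUNTIME_STARTUP_ENV_VARS='{"PIP_INDEX_URL": "https://pypi.org/simple"}'
```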