Native performance via Termux & glibc. No Proot. No VM. Pure ARM64 power.
Quick Start • Features • Local LLM • Docs • Architecture • Community
Your old Android phone is not e-waste. It's a powerful ARM64 server waiting to happen.
OCA seamlessly installs the OpenClaw AI ecosystem directly onto your device via Termux. This completely bypasses sluggish Linux distributions (like Ubuntu on Proot), running natively with full glibc compatibility instead of Android's default Bionic libraries.
| Before OCA ❌ | After OCA ✅ |
|---|---|
| Slow Proot containers | Native ARM64 execution |
| Bionic libc limitations | Full glibc compatibility |
| Manual setup (hours) | One-command install (2 mins) |
| Limited AI tools | 4+ AI CLIs pre-configured |
| No remote access | SSH server included |
| Static installation | Auto-updating ecosystem |
Configure Developer Options, Stay Awake, and battery optimization to prevent Android from killing Termux.
Read the Phone Setup Guide for step-by-step instructions.
⚠️ Important: The Play Store version of Termux is discontinued. You must install from F-Droid.
- Open a browser and go to f-droid.org
- Search for Termux → tap Download APK
- Install the APK and allow "Install from unknown sources" if prompted
Open Termux and run:
```bash
pkg update -y && pkg install -y curl
curl -sL https://raw.githubusercontent.com/PsProsen-Dev/OpenClaw-On-Android/master/bootstrap.sh | bash && source ~/.bashrc
```

Takes 3-10 minutes depending on network and device.
```bash
openclaw onboard
```

Follow the on-screen instructions.
```bash
openclaw gateway
```

⚠️ Important: Run this directly in the Termux app, not over SSH; if the SSH session disconnects, the gateway stops with it.
That's it. You now have a full AI server on your phone.
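As a quick sanity check (a sketch, assuming the installer placed `oa`, `node`, and `openclaw` on your PATH), you can verify the main commands are available:

```shell
# Report which OCA-related commands are on PATH; a "missing" entry
# means the corresponding install step did not complete.
for cmd in oa node openclaw; do
  if command -v "$cmd" >/dev/null 2>&1; then
    echo "found: $cmd"
  else
    echo "missing: $cmd"
  fi
done
```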
| Native Execution | Full Node.js v24 |
|---|---|
| OpenClaw runs bare-metal in Termux | glibc-patched Node.js v24.14.0 |
| No Proot overhead | Official linux-arm64 binaries |
| 100% ARM64 optimized | Bypasses Android linker restrictions |
| AI CLI Tools | Local LLM |
|---|---|
| Qwen Code, Claude Code, Gemini, Codex | node-llama-cpp + Ollama support |
| Zero-config setup | Run models locally (experimental) |
| Cloud API routing | NEW: Ollama Cloud Models |
| Remote Access | Safe Root |
|---|---|
| SSH server on port 8022 | `oca-root` wrapper for rooted devices |
| Remote terminal access | Selective root commands |
| 24/7 headless operation | No system compromise |
| Auto-Updates | Unified CLI |
|---|---|
| One-command updates via `oa --update` | `oa` command for all operations |
| Automatic security patches | Update, status, install, uninstall |
| Rolling release model | Platform-aware architecture |
After installation, use the `oa` command to manage your installation:
| Command | Description |
|---|---|
| `oa --update` | Update OpenClaw and Android patches |
| `oa --install` | Install optional tools (tmux, code-server, AI CLIs) |
| `oa --uninstall` | Remove OpenClaw on Android |
| `oa --status` | Show installation status |
| `oa --version` | Show version |
| `oa --help` | Show available options |
Update example:

```bash
oa --update && source ~/.bashrc
```

This updates: OpenClaw core, code-server, OpenCode, AI CLI tools, and Android patches.
OCA now supports local LLM inference via node-llama-cpp and Ollama (including cloud models!).
Run powerful models in the cloud with zero local RAM/storage usage!
```bash
# Pull and launch with a cloud model
ollama pull kimi-k2.5:cloud
ollama launch openclaw --model kimi-k2.5:cloud
```

Recommended Cloud Models:

- `kimi-k2.5:cloud` - Multimodal reasoning (64k context)
- `minimax-m2.5:cloud` - Fast coding (64k context)
- `glm-5:cloud` - Reasoning & code generation
- `gpt-oss:120b-cloud` - High-performance (128k context)
⚠️ Important Constraints (click to expand)
| Constraint | Requirement | Reality Check |
|---|---|---|
| RAM | 2-4GB free | Phone RAM is shared with Android |
| Storage | 4-70GB+ | Model files are large |
| Speed | CPU-only | No GPU offloading on Android |
| Use Case | Testing/Experimentation | Cloud APIs for production |
```bash
# Option 1: node-llama-cpp (Recommended)
npm install -g node-llama-cpp --ignore-scripts

# Option 2: Ollama (Full Server)
curl -fsSL https://ollama.com/install.sh | sh
```

| Model | Size | RAM Needed | Speed | Best For |
|---|---|---|---|---|
| TinyLlama 1.1B | ~670MB | 2GB | ⚡⚡⚡ | Testing |
| Phi-3 Mini | ~2.3GB | 4GB | ⚡⚡ | Light tasks |
| Llama 3.2 1B | ~670MB | 2GB | ⚡⚡⚡ | Mobile-friendly |
| Mistral 7B | ~4.1GB | 8GB | ⚡ | Advanced only |
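Since the RAM column above refers to *free* memory, and phone RAM is shared with Android, it can help to check what is actually available before pulling a model. A minimal sketch (not part of the OCA tooling; it reads the standard Linux `/proc/meminfo`):

```shell
# Compare available memory against a model's requirement; the ~2GB
# threshold here matches the TinyLlama / Llama 3.2 1B rows above.
required_mb=2048

free_mb=$(awk '/MemAvailable/ {print int($2/1024)}' /proc/meminfo)
if [ "$free_mb" -ge "$required_mb" ]; then
  echo "OK: ${free_mb}MB available for a ${required_mb}MB-class model"
else
  echo "Too tight: only ${free_mb}MB available, want ${required_mb}MB"
fi
```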
Read the Full Local LLM Guide for detailed setup, troubleshooting, and cloud comparison.
```
┌─────────────────────────────────────────────────────────┐
│ Android Device (Termux)                                 │
│            │                                            │
│            ▼                                            │
│   ┌──────────────────┐                                  │
│   │   glibc-runner   │                                  │
│   │ (ld.so wrapper)  │                                  │
│   └────────┬─────────┘                                  │
│            │                                            │
│            ▼                                            │
│   ┌──────────────────┐                                  │
│   │   Node.js v24    │                                  │
│   │   linux-arm64    │                                  │
│   └────────┬─────────┘                                  │
│            │                                            │
│         ┌──┴──────────────┐                             │
│         ▼                 ▼                             │
│  ┌──────────────┐  ┌──────────────┐                     │
│  │   OpenClaw   │  │  Local LLM   │                     │
│  │   Gateway    │  │  (Optional)  │                     │
│  │              │  │              │                     │
│  │ • AI CLIs    │  │ • llama.cpp  │                     │
│  │ • SSH        │  │ • Ollama     │                     │
│  │ • clawdhub   │  │ • GGUF       │                     │
│  └──────────────┘  └──────────────┘                     │
└─────────────────────────────────────────────────────────┘
```
- Pacman `glibc-runner`: injects `ld-linux-aarch64.so.1` to bypass Android's restricted linker
- Path Rewriting: UNIX paths (`/tmp`, `/bin/sh`) dynamically mapped to Termux prefixes
- JS Runtime Shims: `glibc-compat.js` polyfills `os.cpus()` and `os.networkInterfaces()` for V8
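The path-rewriting idea can be illustrated with a small shell sketch. The function and its mappings here are illustrative, not OCA's actual implementation:

```shell
# Map standard UNIX paths onto the Termux prefix, similar in spirit
# to OCA's path rewriting. PREFIX matches Termux's default root.
PREFIX="${PREFIX:-/data/data/com.termux/files/usr}"

rewrite_path() {
  case "$1" in
    /tmp|/tmp/*) printf '%s%s\n' "$PREFIX" "$1" ;;  # /tmp -> $PREFIX/tmp
    /bin/sh)     printf '%s/bin/sh\n' "$PREFIX" ;;  # shell interpreter
    *)           printf '%s\n' "$1" ;;              # leave others alone
  esac
}

rewrite_path /tmp/app.sock   # prints the path relocated under $PREFIX
```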
| Guide | Description |
|---|---|
| Quick Start | Get running in 5 minutes |
| Phone Setup | Developer Options, Stay Awake, battery |
| Installation | Full 8-step installer breakdown |
| AI CLI Tools | Qwen, Claude, Gemini, Codex setup |
| Local LLM | Run models locally (node-llama-cpp, Ollama) |
| Dashboard Connect | Multi-device management from PC |
| SSH Setup | Remote access configuration |
| Configuration | Manage settings and preferences |
| Troubleshooting | Common errors and fixes |
| Phantom Process Killer | Android 12+ fix |
Browse all docs: `docs/`
If you see `[Process completed (signal 9)]`, Android's Phantom Process Killer has terminated Termux.
Fix it in 30 seconds:
```bash
adb shell settings put global development_settings_enabled 1
adb shell settings put global max_phantom_processes 64
```

Read the Full Fix Guide.
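If you reset or reflash devices often, the two `adb` settings writes above can be wrapped in a tiny helper. A sketch (the function name is ours; it requires `adb` installed on the PC and USB debugging enabled on the phone):

```shell
# Apply the Phantom Process Killer fix in one call; both settings are
# exactly the ones shown above, plus a read-back to confirm.
apply_phantom_fix() {
  adb shell settings put global development_settings_enabled 1
  adb shell settings put global max_phantom_processes 64
  # Read the value back to confirm it stuck (should print 64)
  adb shell settings get global max_phantom_processes
}
```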
Android may kill background processes or throttle them when the screen is off. For 24/7 operation:
| Setting | Purpose | How To |
|---|---|---|
| Developer Options | Enable advanced controls | Settings β About β Tap Build Number 7x |
| Stay Awake | Prevent CPU throttling | Developer Options β Stay Awake |
| Battery Optimization | Prevent app killing | Settings β Apps β Termux β Battery β Unrestricted |
| Charge Limit | Protect battery during 24/7 use | Use AccuBattery or similar |
Read the Complete Phone Setup Guide for detailed instructions.
Access your OCA dashboard from a PC browser via SSH tunnel:

```bash
# From your PC terminal
ssh -L 3000:localhost:3000 -L 8080:localhost:8080 u0_aXXX@192.168.X.X -p 8022
```

Read the SSH Setup Guide for complete instructions.
Running OCA on multiple devices? Use Dashboard Connect to manage them from your PC:
- Save connection settings (IP, token, ports) for each device with a nickname
- Generates SSH tunnel commands automatically
- Switch between devices with one click
- Your data stays local: settings are saved only in browser localStorage
💡 Tip: Name your devices (e.g., "Old Pixel", "Bedroom Phone") for easy identification.
OCA uses a platform-plugin architecture that separates platform-agnostic infrastructure from platform-specific code:
```
┌──────────────────────────────────────────────────────────────┐
│ Orchestrators (install.sh, update-core.sh, uninstall.sh)     │
│  └─ Platform-agnostic. Read config.env and delegate.         │
├──────────────────────────────────────────────────────────────┤
│ Shared Scripts (scripts/)                                    │
│  ├─ L1: install-infra-deps.sh (always)                       │
│  ├─ L2: install-glibc.sh, install-nodejs.sh (conditional)    │
│  └─ L3: Optional tools (user-selected)                       │
├──────────────────────────────────────────────────────────────┤
│ Platform Plugins (platforms/<platform>/)                     │
│  ├─ config.env: declares dependencies                        │
│  └─ install.sh / update.sh / uninstall.sh / ...              │
└──────────────────────────────────────────────────────────────┘
```
Dependency layers:
| Layer | Scope | Examples | Controlled by |
|---|---|---|---|
| L1 | Infrastructure (always) | git, pkg update | Orchestrator |
| L2 | Platform runtime (conditional) | glibc, Node.js, build tools | config.env flags |
| L3 | Optional tools (user-selected) | tmux, code-server, AI CLIs | User prompts |
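A platform's `config.env` might look like the following sketch, letting the orchestrators decide which L2 scripts to run. The variable names are illustrative assumptions, not the project's actual keys:

```shell
# platforms/<platform>/config.env (hypothetical example)
NEEDS_GLIBC=true      # would trigger scripts/install-glibc.sh
NEEDS_NODEJS=true     # would trigger scripts/install-nodejs.sh
NODE_VERSION=24       # runtime version pinned by the platform plugin
```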
- **Core Infrastructure (L1):** git
- **Platform Runtime (L2):** pacman, glibc-runner, Node.js v24, python, make, cmake, clang, binutils
- **OpenClaw Platform:** OpenClaw, clawdhub, PyYAML, libvips
- **Optional Tools (L3):** tmux, ttyd, dufs, android-tools, code-server, OpenCode, Claude Code, Gemini CLI, Codex CLI
| Feature | Description |
|---|---|
| node-llama-cpp | Prebuilt binary support with `--ignore-scripts` |
| Ollama Integration | Full server with model management |
| Model Guide | Recommendations for RAM/Storage constraints |
| Cloud vs Local | Comparison table for decision making |
View Release Notes
Join the discussion! Ask questions, share your Android setup, or request features.
GitHub Discussions • Report an Issue

