I Gave an AI Full Control of My Android Phone. No Cloud, No API Keys, No Shame.

Source: DEV Community
Every AI phone assistant I've tried phones home. Your screen content, your messages, your app usage — all of it goes to a server you don't control, and someone else's model decides what to do next. I got tired of it. So I built an alternative.

PokeClaw runs Gemma 4 (2.3B parameters) entirely on your Android device. No Wi-Fi required after setup. No API key. No subscription. The model reads your screen through Android Accessibility, picks a tool, executes it, reads the result, and repeats until the task is done. All of it happens on your CPU or GPU. All of it stays on your phone.

Why "On-Device" Actually Matters Here

This isn't just a privacy talking point. When your AI assistant runs on-device, the latency profile changes completely. There's no round-trip to a server for every tool call. The model reads the accessibility tree, decides what to tap, taps it, reads the new state — all local. The bottleneck is your phone's processor, not your internet connection. There's also a category of
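The read-screen / pick-tool / execute / repeat loop described above can be sketched in a few lines of Kotlin. This is a minimal illustration under my own assumptions, not PokeClaw's actual code: `ScreenReader`, `Model`, `Tool`, and `ToolCall` are hypothetical names standing in for the accessibility-tree dump, the on-device LLM, and the action executors.

```kotlin
// Hypothetical sketch of an on-device agent loop (names are illustrative,
// not PokeClaw's real API).

interface ScreenReader { fun dump(): String }             // e.g. accessibility-tree text
data class ToolCall(val name: String, val arg: String)     // model's chosen action
interface Model { fun decide(screen: String): ToolCall }   // on-device LLM inference
interface Tool { fun run(arg: String): String }            // tap, type, scroll, ...

class Agent(
    private val reader: ScreenReader,
    private val model: Model,
    private val tools: Map<String, Tool>,
    private val maxSteps: Int = 20,                        // guard against infinite loops
) {
    fun run(): String {
        repeat(maxSteps) {
            val screen = reader.dump()                     // 1. read current UI state
            val call = model.decide(screen)                // 2. model picks a tool
            if (call.name == "done") return call.arg       //    model signals completion
            val tool = tools[call.name]
                ?: return "unknown tool: ${call.name}"
            tool.run(call.arg)                             // 3. execute, then loop to 1
        }
        return "gave up after $maxSteps steps"
    }
}
```

The key point the sketch makes concrete: every iteration is a local function call, so the only per-step cost is inference on your phone's processor, with no network round-trip anywhere in the loop.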