phingr controls devices via USB HID + camera — like a real finger on the screen. Works on iOS, Android, smartwatches, anything with a USB port and a display. Unlock the lock screen, toggle Wi-Fi, navigate any menu. No SDK. No jailbreak. No cloud.
XCTest, Espresso, Appium — they all run inside the device OS, bound by accessibility hooks and sandbox rules. phingr doesn't run on the device at all. It talks to a Raspberry Pi that pretends to be a USB mouse and keyboard, and watches the screen through a camera. If a human finger can reach it, phingr can automate it. Lock screen, system settings, third-party apps, system dialogs — any OS, any app, any screen.
Everything runs on a Raspberry Pi sitting next to your mobile device. There is no cloud component. No telemetry. No external API calls. No accounts on someone else's server. Your test data, your screen content, your device interactions — none of it ever leaves your network. This is not a feature. It is the architecture.
The full source code is on GitHub. With an active subscription, you can read it, modify it, extend it, and adapt it to your workflow. Add custom API endpoints. Change the OCR pipeline. Integrate with your internal tools. The code is yours to work with. No black boxes.
We believe in human honesty.
There is no license server. No activation key. No DRM. No usage tracking. No protection mechanism of any kind. The software runs without restriction whether you have a subscription or not.
Subscribing is voluntary. If you use phingr and it brings value to your work, we trust you to subscribe. That's it. No enforcement. No nagging. No crippled features. The full product is the full product.
We chose this model because we believe the right relationship between a tool and its users is one built on trust, not locks.
A Raspberry Pi, a USB cable, a camera. That's the entire system.
Plug the Raspberry Pi Zero 2W into your mobile device via USB. Mount the CSI camera facing the screen. That's your hardware.
Run one setup script. It configures the USB HID gadget, starts the camera, and launches the test automation server. Done in minutes.
Declare flows in YAML for simple scripts, or use the async Python API (PhingrSession) for dynamic logic that branches on UI state. Template matching + OCR locate elements; normalized coordinates work across screen sizes.
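As a sketch of what a declarative flow could look like — the step names and keys below are illustrative assumptions, not the documented schema; check the phingr docs for the real DSL:

```yaml
# Hypothetical flow: open Settings and turn Bluetooth on.
name: toggle_bluetooth
steps:
  - press_key: home
  - swipe_until_found:
      element: settings_icon
      direction: LEFT
  - tap_on: settings_icon
  - tap_on:
      element: bt_toggle_off
      offset: [0.85, 0.5]   # normalized coordinates, screen-size independent
```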
XCTest, Espresso, Appium — they all run inside the OS. phingr doesn't.
| Capability | SDK-Based (XCTest, Espresso, Appium) | phingr (USB HID + camera) |
|---|---|---|
| Lock screen | Cannot access | Unlock, interact |
| System Settings | Cannot navigate | Full access |
| Wi-Fi / Bluetooth toggle | Cannot toggle | Toggle freely |
| System alerts & dialogs | Limited | Tap any button |
| Third-party apps | Sandboxed | Any app, any screen |
| Cross-app workflows | Not supported | Switch freely |
| Smartwatches, embedded displays | Not supported | Anything with a screen |
| Requires device-side agent/SDK | Yes | None |
| Requires host IDE (Xcode/AS) | Yes | No — runs on Raspberry Pi |
A declarative YAML DSL for simple flows, and a dynamic async Python API for scripts that need to branch on what's actually on screen.
PhingrSession.find() returns a bbox + score, exists() returns a bool — write any logic Python allows.
Register UI elements by cropping them from screenshots in the web editor. phingr finds them in the live camera feed using OpenCV template matching. Use OCR (text:) for anything with readable labels — no accessibility IDs required.
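phingr's matcher is built on OpenCV template matching; purely as an illustration of the underlying idea (this is not phingr's actual pipeline), here is a minimal normalized cross-correlation matcher in plain NumPy — slide the cropped template over the screen capture and keep the best-scoring position:

```python
import numpy as np


def match_template(screen: np.ndarray, tmpl: np.ndarray):
    """Slide tmpl over a grayscale screen image.

    Returns ((x, y) of the best top-left position, NCC score in [-1, 1]).
    A score near 1.0 means a near-exact match.
    """
    th, tw = tmpl.shape
    sh, sw = screen.shape
    t = tmpl - tmpl.mean()                 # zero-mean template
    tnorm = np.sqrt((t ** 2).sum())
    best_pos, best_score = (0, 0), -1.0
    for y in range(sh - th + 1):
        for x in range(sw - tw + 1):
            w = screen[y:y + th, x:x + tw]
            wc = w - w.mean()              # zero-mean window
            denom = np.sqrt((wc ** 2).sum()) * tnorm
            score = (wc * t).sum() / denom if denom else 0.0
            if score > best_score:
                best_score, best_pos = score, (x, y)
    return best_pos, best_score
```

Real implementations (like OpenCV's `cv2.matchTemplate` with `TM_CCOEFF_NORMED`) compute the same correlation far faster via integral images and FFTs; the brute-force loops above just make the math visible.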
```python
import asyncio

from phingr import PhingrSession


async def main():
    async with PhingrSession(
        server_url="http://localhost:8800",
        device_url="http://phingr.local:8080",
    ) as s:
        await s.press_key("home")

        # Find element, get bbox + score
        settings = await s.find("settings_icon")
        if not settings:
            await s.swipe_until_found("settings_icon", direction="LEFT")
        await s.tap_on("settings_icon")

        # Branch on current UI state
        if await s.exists("bt_toggle_on"):
            await s.tap_on("bt_toggle_on", offset=(0.85, 0.5))
        elif await s.exists("bt_toggle_off"):
            await s.tap_on("bt_toggle_off", offset=(0.85, 0.5))

        # OCR: find text regions on screen
        matches = await s.find_text("Bluetooth")
        for m in matches:
            print(f"{m['text']} @ ({m['x']:.2f}, {m['y']:.2f})")


asyncio.run(main())
```
Per device, per month. Full source code access with every plan.
7 days, full access
For individual developers
For teams & organizations
Open source with subscription. The code is on GitHub. Subscribers can read, modify, and extend it for their own use.
Phantom Finger Remote. It controls your mobile device as if a ghost were tapping the screen — driving input over USB HID while watching the display with a camera. No software on the device, no SDK, no sandbox.
Any device that accepts a USB HID mouse and keyboard — iOS 14+, Android (most versions), smartwatches, Linux tablets, even embedded displays. Because phingr controls the device externally, there's no OS-specific agent or SDK to install. If the OS sees a USB mouse cursor, phingr can drive it.
Those tools all run inside the device OS — they need either a test harness built into your app (XCTest, Espresso) or an accessibility agent running on the device (Appium). That means they can't touch the lock screen, can't navigate System Settings, can't toggle Wi-Fi, and can't cross app boundaries freely. phingr doesn't run on the device at all. The Raspberry Pi acts as an external USB mouse/keyboard and uses a camera to see the screen — so it can reach anything a human finger can, on any OS.
Yes. phingr can swipe to unlock and type a passcode via USB keyboard, just like a person would. This is impossible with XCTest-based tools.
Yes. phingr can navigate to System Settings and tap any toggle — Wi-Fi, Bluetooth, Airplane Mode, anything. It sees and interacts with the actual screen, not an app's internal view hierarchy.
A Raspberry Pi Zero 2W, a CSI camera (Arducam IMX519 or RPi Camera v3), and a USB data cable. Total hardware cost is under $50.
No. There is no cloud component at all. The Raspberry Pi processes everything locally — camera capture, OCR, HID control. Nothing is sent to any external server. Ever.
No. phingr uses the standard USB HID protocol — the same way any external keyboard or mouse connects. Stock iOS (14+), stock Android, and stock Wear OS all work unmodified. No jailbreak, no root, no developer-mode toggle on iOS, no ADB on Android.
Yes. The source code is fully available on GitHub. With an active subscription, you can modify it, add custom endpoints, change the OCR pipeline, or adapt it to your specific test workflow. The code is yours to work with.
Yes. The entire API is HTTP-based. Integrate with Jenkins, GitHub Actions, GitLab CI, or any tool that can make HTTP requests. Run your mobile test suite as part of your pipeline.
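As a sketch of what a CI integration could look like — the `/api/flows/run` route and the request payload below are assumptions for illustration, not phingr's documented endpoints; consult the server's actual API reference:

```python
import json
import urllib.request


def build_run_request(server_url: str, flow: str) -> urllib.request.Request:
    """Build a POST request asking the phingr server to run a named flow.

    The endpoint path and JSON body shape here are hypothetical.
    """
    body = json.dumps({"flow": flow}).encode()
    return urllib.request.Request(
        f"{server_url}/api/flows/run",
        data=body,
        headers={"Content-Type": "application/json"},
        method="POST",
    )
```

In a pipeline step you would send this with `urllib.request.urlopen(req)` (or curl, or your CI tool's HTTP action) and fail the job on a non-2xx response — no device-side agent or IDE involved.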