iphone-mirroir-mcp

Give your AI an iPhone.

MCP server that controls a real iPhone through macOS iPhone Mirroring. Screenshot, tap, swipe, type — from any MCP client. Works with any app on screen, no source code required.

Install iphone-mirroir-mcp

$ /bin/bash -c "$(curl -fsSL https://mirroir.dev/get-mirroir.sh)"

Paste that into a macOS Terminal. The script clones the repo, builds from source, installs the helper daemon, and configures Karabiner.

or via npx:

$ npx -y iphone-mirroir-mcp install

or via Homebrew:

$ brew tap jfarcand/tap && brew install iphone-mirroir-mcp

Scenarios: intents, not scripts

Scenarios are reusable YAML files that describe automation flows as intents, not scripts. Steps like tap: "Email" don't specify coordinates — the AI finds the element by OCR matching and adapts to unexpected dialogs and layout changes. Test an app or automate your morning routine. Share them on the community repository.

Branching + looping

name: Expo Go Signup + Onboarding
app: Expo Go
description: >
  Sign up, handle error or success,
  then swipe through onboarding.

steps:
  - launch: "Expo Go"
  - wait_for: "LoginDemo"
  - tap: "LoginDemo"
  - tap: "Sign Up"
  - tap: "Email"
  - type: "${TEST_EMAIL}"
  - tap: "Password"
  - type: "${TEST_PASSWORD}"
  - tap: "Create Account"
  - condition:
      if_visible: "already exists"
      then:
        - tap: "Sign In Instead"
        - tap: "Email"
        - type: "${TEST_EMAIL}"
        - tap: "Password"
        - type: "${TEST_PASSWORD}"
        - tap: "Sign In"
      else:
        - wait_for: "Welcome"
  - assert_visible: "Welcome"
  - screenshot: "authenticated"
  - repeat:
      until_visible: "Get Started"
      max: 5
      steps:
        - swipe: "left"
        - screenshot: "onboarding"
  - tap: "Get Started"
  - assert_visible: "Dashboard"
  - screenshot: "complete"

Cross-app workflow

name: Commute ETA Notification
app: Waze, Messages
description: Get ETA from Waze, text it via iMessage.

steps:
  - launch: "Waze"
  - wait_for: "Where to?"
  - tap: "Where to?"
  - tap: "${DESTINATION:-Work}"
  - tap: "Go"
  - wait_for: "min"
  - remember: "Read the commute ETA."
  - press_home: true
  - launch: "Messages"
  - tap: "New Message"
  - tap: "To:"
  - type: "${RECIPIENT}"
  - tap: "${RECIPIENT}"
  - tap: "iMessage"
  - type: "On my way! ETA {eta}"
  - press_key: "return"
  - screenshot: "message_sent"

${VAR} placeholders resolve from environment variables; defaults use shell syntax, as in ${DESTINATION:-Work}. Install scenarios in Claude Code*:

$ claude plugin marketplace add marketplace.mirroir.dev

* Also supported by GitHub Copilot.

Search the community scenario library

Fail-closed by default

Without a config file, only read-only tools are exposed. Mutating tools are hidden from the MCP client entirely — it never sees them unless you allow them.

{
  "allow": ["tap", "swipe", "type_text", "press_key", "launch_app"],
  "deny": [],
  "blockedApps": ["Wallet", "Banking"]
}

Drop this in ~/.iphone-mirroir-mcp/permissions.json to control exactly which tools your AI agent can use.

What Does iphone-mirroir-mcp Do?

22 tools exposed over MCP, grouped below; see the protocol sketch after the list.

Touch

  • tap: Tap at screen coordinates
  • double_tap: Double-tap for zoom or text selection
  • long_press: Hold for context menus
  • swipe: Quick flick between two points
  • drag: Slow drag for sliders and icons

Input

  • type_text: Type text via virtual keyboard
  • press_key: Send special keys with modifiers
  • shake: Shake gesture for undo or dev menus

Observe

  • screenshot: Capture screen as PNG
  • describe_screen: OCR with tap coordinates
  • start_recording: Begin video recording
  • stop_recording: End recording, get file path
  • get_orientation: Portrait or landscape
  • status: Connection and device readiness
  • check_health: Full setup diagnostic

Navigate

  • launch_app: Open app by name via Spotlight
  • open_url: Open URL in Safari
  • press_home: Return to home screen
  • press_app_switcher: Show recent apps
  • spotlight: Open Spotlight search
  • list_scenarios: List available YAML scenarios
  • get_scenario: Read scenario with env substitution
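
On the wire, each of these is a standard MCP tools/call request over stdio. A sketch of a client tapping the screen; the x and y argument names are an assumption, and the real schema is whatever the server reports via tools/list:

{
  "jsonrpc": "2.0",
  "id": 3,
  "method": "tools/call",
  "params": {
    "name": "tap",
    "arguments": { "x": 180, "y": 560 }
  }
}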

Works with

Any MCP client that supports stdio transport: plug it into a coding assistant in your editor, wire it into an agent framework, or build your own agent.
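
For clients that register servers in a JSON config (Claude Desktop and Cursor both use this mcpServers shape), a minimal sketch; whether the bare npx invocation starts the server without a subcommand is an assumption, so adjust the command to match your install:

{
  "mcpServers": {
    "iphone-mirroir": {
      "command": "npx",
      "args": ["-y", "iphone-mirroir-mcp"]
    }
  }
}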

FAQ

Is this safe? Can the AI access my banking apps?

Without a config file, only read-only tools are exposed. Mutating tools require explicit opt-in. Use blockedApps in permissions.json to deny access to sensitive apps. Closing iPhone Mirroring kills all input immediately.

Why does my cursor jump when the AI is working?

macOS routes HID input to the frontmost app. The server must activate iPhone Mirroring before each input. Put it in a separate macOS Space to keep your workspace undisturbed.

Does it work with any iPhone app?

Yes. It operates at the screen level through iPhone Mirroring — no source code, SDK, or jailbreak required. If you can see it on screen, the AI can interact with it.

Why Karabiner? Can I use it without Karabiner?

iPhone Mirroring ignores programmatic CGEvent injection. Karabiner provides a DriverKit virtual HID device that appears as real hardware — the only path that works.

Can I restrict which tools the AI can use?

Yes. Drop a permissions.json with allow and deny lists. Tools not in the allow list are hidden from the MCP client entirely.
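
For instance, a sketch of a config that permits gestures but never typing, reusing tool names from the list above (that deny wins when a tool appears in both lists is an assumption):

{
  "allow": ["tap", "swipe", "launch_app", "press_home"],
  "deny": ["type_text", "press_key"],
  "blockedApps": ["Wallet", "Banking"]
}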

Read the full FAQ

Further Documentation

README & docs on GitHub

Community Discussion

Join the conversation

Donate

Sponsor on GitHub