Understand and explore the Omi Flutter mobile app's UI flows, navigation patterns, and widget architecture. Use when developing features, fixing bugs, or verifying changes in app/lib/ Dart files. Provides agent-flutter commands to explore the live app, understand how screens connect, and verify your work.
This skill teaches you the Omi Flutter mobile app's navigation structure, screen architecture, and widget patterns. Use it when developing features (to understand how the app works), fixing bugs (to navigate to the affected screen), or verifying changes (to confirm your code works in the live app).
You can interact with the running app via agent-flutter — a CLI that taps widgets, reads the widget tree, and captures screenshots through Flutter's Marionette debug protocol.
```bash
# 1. Emulator must be running
adb devices   # should show emulator-5554
# If not: sg kvm -c "$ANDROID_HOME/emulator/emulator -avd omi-dev -no-window -gpu swiftshader_indirect -no-audio -no-boot-anim &"

# 2. Set system language to English (REQUIRED — non-English IME breaks text input)
adb shell "settings put system system_locales en-US"
adb shell "setprop persist.sys.locale en-US"

# 3. App must be running in debug mode with flutter run stdout captured
cd app && flutter run -d emulator-5554 --flavor dev > /tmp/omi-flutter.log 2>&1 &
# Wait for the "VM Service" line to appear in the log

# 4. Connect agent-flutter (AGENT_FLUTTER_LOG must point to flutter run stdout, NOT logcat)
AGENT_FLUTTER_LOG=/tmp/omi-flutter.log agent-flutter connect
agent-flutter snapshot -i --json   # see what's on screen
```
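The "wait for the VM Service line" step above can be scripted instead of eyeballed. A minimal sketch; the helper name, the 60-second default timeout, and the polling interval are assumptions, and the log path is the one used above:

```bash
# Poll the captured flutter run log until the "VM Service" line appears.
# Returns 0 once found, 1 on timeout. (Sketch — name and timeout are assumptions.)
wait_for_vm() {
  local log="$1" tries="${2:-60}" i=1
  while [ "$i" -le "$tries" ]; do
    grep -q "VM Service" "$log" 2>/dev/null && return 0
    sleep 1
    i=$((i + 1))
  done
  return 1
}
# Usage: wait_for_vm /tmp/omi-flutter.log && AGENT_FLUTTER_LOG=/tmp/omi-flutter.log agent-flutter connect
```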
Prerequisites:
- AVD `omi-dev` exists (check: `$ANDROID_HOME/emulator/emulator -list-avds`)
- Access to the `kvm` group (use `sg kvm -c "..."` if it is not active in the current session)
- Dev flavor `com.friend.ios.dev` installed
- System language set to English (a non-English IME breaks `fill` commands)
- `marionette_flutter: ^0.3.0` in `pubspec.yaml`

| Command | Purpose | Example |
|---|---|---|
| `snapshot -i --json` | See all interactive widgets with refs, types, bounds | `agent-flutter snapshot -i --json` |
| `press @ref` | Tap a widget by ref | `agent-flutter press @e3` |
| `press x y` | Tap by coordinates (ADB input tap) | `agent-flutter press 540 1200` |
| `press @ref --adb` | Tap by ref using ADB (for stale refs) | `agent-flutter press @e3 --adb` |
| `dismiss` | Dismiss system dialogs (location, permissions) | `agent-flutter dismiss` |
| `find type X press` | Find widget by type and tap | `agent-flutter find type button press` |
| `find text "X" press` | Find by visible text and tap | `agent-flutter find text "Settings" press` |
| `find type X --index N press` | Tap Nth match (0-indexed) | `agent-flutter find type switch --index 0 press` |
| `fill @ref "text"` | Type into a text field | `agent-flutter fill @e7 "search"` |
| `scroll down/up` | Scroll the current view | `agent-flutter scroll down` |
| `back` | Android back button | `agent-flutter back` |
| `screenshot PATH` | Capture the current screen | `agent-flutter screenshot /tmp/screen.png` |
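The commands in the table compose into short verification sequences. A hypothetical helper, sketched under assumptions: a connected agent-flutter session, a label that is actually visible on screen, and a 1-second settle delay that is a guess:

```bash
# Hypothetical helper: tap a widget by its visible text, then capture evidence.
# Assumes agent-flutter is connected to a live session; delay value is a guess.
verify_screen() {
  local label="$1" shot="$2"
  agent-flutter find text "$label" press || return 1
  sleep 1   # allow the route transition to settle before capturing
  agent-flutter screenshot "$shot"
}
# Usage: verify_screen "Settings" /tmp/settings.png
```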
Key rules:
- `find type X` is more stable than hardcoded `@ref` numbers.
- `AGENT_FLUTTER_LOG` must point to `flutter run` stdout (not logcat).
- If refs go stale: `disconnect` → wait 3s → `connect`.
- Use `type`, `flutterType`, or `bounds` to identify widgets.

```bash
# "No isolate with Marionette" → bring app to foreground + reconnect
adb -s emulator-5554 shell am start -n com.friend.ios.dev/com.friend.ios.MainActivity
agent-flutter disconnect && agent-flutter connect

# Unhealthy widget tree → hot restart
kill -SIGUSR2 $(pgrep -f "flutter_tools.*run" | head -1)
sleep 3 && agent-flutter disconnect && agent-flutter connect
```
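Since the foreground-and-reconnect sequence comes up whenever a session dies, it can be wrapped in one helper. A sketch only; the function name is an assumption, and the serial, package, and activity are taken from the commands above:

```bash
# Recover a dead Marionette session: foreground the dev app, then reconnect.
# (Sketch — wraps the two recovery commands shown above.)
recover_marionette() {
  adb -s emulator-5554 shell am start -n com.friend.ios.dev/com.friend.ios.MainActivity
  agent-flutter disconnect
  sleep 3
  agent-flutter connect
}
```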
```
Onboarding (wrapper.dart) — 11-step wizard
├── 0: Auth (auth.dart) — Google/Apple sign-in
├── 1: Name (name_widget.dart)
├── 2: Primary Language (primary_language_widget.dart)
├── 3: Found Omi (found_omi_widget.dart)
├── 4: Permissions (permissions_widget.dart)
├── 5: User Review (user_review_page.dart)
├── 6-7: Welcome / Find Devices (placeholders)
├── 8: Speech Profile (speech_profile_widget.dart)
├── 9: Knowledge Graph (knowledge_graph_step.dart)
└── 10: Complete (complete_screen.dart)
```
```
Home (page.dart) — main app after auth
├── [top bar] Connect Device | Search | History | Settings gear
├── [center] Daily Score card → Add Goal
├── [Ask Omi button] → Chat (chat/page.dart)
│   └── Text input, voice recorder, AI responses, message actions
├── [record button] → Conversation Capturing (conversation_capturing/page.dart)
│   └── Live transcript, waveform, stop button
│
├── [tab 0] Conversations (conversations_page.dart)
│   ├── Folder tabs (All, Starred, custom folders)
│   ├── Daily summaries toggle
│   ├── Today's tasks widget
│   └── Conversation item → Detail (conversation_detail/page.dart)
│       └── Transcript, Summary, Action Items tabs, share, audio
│
├── [tab 1] Action Items (action_items_page.dart)
│   ├── Categories: Today, Tomorrow, Later, No Deadline, Overdue
│   ├── FAB → Create task sheet (action_item_form_sheet.dart)
│   ├── Task checkboxes, drag-drop reorder
│   └── Task → Goal linking
│
├── [tab 2] Memories (memories/page.dart)
│   ├── Search bar, graph button, management button
│   ├── FAB → Add memory dialog
│   ├── Category chips filter
│   ├── Memory item → Quick edit sheet (memory_edit_sheet.dart)
│   ├── Graph → Memory Graph (memory_graph_page.dart)
│   └── Management → Category management sheet
│
├── [tab 3] Apps (apps/page.dart)
│   ├── Search, filter, create buttons
│   ├── Popular apps (horizontal scroll)
│   ├── Category sections → Category apps page
│   ├── App item → App Detail (app_detail/app_detail.dart)
│   │   └── Reviews, capabilities, install/enable
│   └── Create → Custom app or MCP server
│
└── [settings gear] → Settings Drawer (settings_drawer.dart)
    ├── Profile (profile.dart)
    │   ├── Name → Change name dialog
    │   ├── Email (read-only)
    │   ├── Language → Language Settings (language_settings_page.dart)
    │   ├── Custom Vocabulary (custom_vocabulary_page.dart)
    │   ├── Speech Profile (speech_profile/page.dart)
    │   ├── Identifying Others (people.dart)
    │   ├── Payment Methods (payments/payments_page.dart)
    │   ├── Conversation Display (conversation_display_settings.dart)
    │   ├── Data Privacy (data_privacy_page.dart)
    │   └── Delete Account (delete_account.dart)
    ├── Notifications (notifications_settings_page.dart)
    │   ├── Frequency slider (0-5)
    │   ├── Daily Summary toggle + time picker
    │   └── Daily Reflection toggle
    ├── Plan & Usage (usage_page.dart)
    ├── Offline Sync (sync_page.dart)
    │   ├── Local storage, recordings list
    │   ├── Fast transfer settings
    │   └── Private cloud sync
    ├── Device Settings (device_settings.dart) — requires BLE device
    │   ├── Device info (name, ID, firmware, SD card)
    │   ├── LED brightness slider, mic gain slider
    │   └── Double tap action picker
    ├── Integrations (integrations_page.dart) — BETA
    │   └── Google Calendar, Gmail, Apple Health
    ├── Phone Calls (phone_call_settings_page.dart)
    │   └── Verified numbers list, delete button
    ├── Transcription Settings (transcription_settings_page.dart)
    │   ├── Source toggle: Omi Cloud vs Custom STT
    │   ├── Provider selector, API key, model config
    │   └── Advanced JSON editors, logs viewer
    ├── Developer Settings (developer.dart)
    │   ├── Custom STT provider config
    │   ├── API key management
    │   └── MCP API keys
    ├── What's New → Changelog sheet
    ├── Referral Program (referral_page.dart) — NEW
    └── Sign Out → Confirmation dialog
```
```
Persona Profile (persona_profile.dart) — AI clone management
├── Avatar (100x100), name with verified badge
├── Share Public Link button
├── Make Public toggle
└── 10 social link rows (omi, Twitter active; others Coming Soon)
    └── Twitter → Social Handle Entry → Verify Identity → Clone Success
```

```
Connected Device (home/device.dart) — requires BLE
├── Device name, connection status, battery
├── Actions: Firmware Update, SD Card Sync, Disconnect, Unpair
└── Device info: Product, Model, Manufacturer, Firmware, ID, Serial
```

```
Speech Profile (speech_profile/page.dart)
├── Device animation, intro text
├── Get Started / Do It Again button
├── Question flow: text, progress bar, skip
└── Listen to Speech Profile (if samples exist)
```
Finding common widgets:
- Bottom navigation bar: `InkWell` widgets at `bounds.y > 780`, sorted left-to-right by `bounds.x`. From `snapshot -i --json`, filter `flutterType == 'InkWell'` and `bounds.y > 780`.
- Settings gear: `button` widget in the top bar; sort by `bounds.x` descending and take the first.
- Settings rows: `gesture` widgets with `bounds.width > 300`.
- Switch toggles: `switch` widgets in the snapshot.
- Bottom sheet pickers: `gesture` rows with `bounds.y > 380`.

Changing the app locale via shared preferences:

```bash
DEVICE=emulator-5554; APP_PKG=com.friend.ios.dev

# Read current
adb -s $DEVICE shell "run-as $APP_PKG cat shared_prefs/FlutterSharedPreferences.xml" | grep app_locale

# Change to Spanish, hot restart to apply
adb -s $DEVICE shell "run-as $APP_PKG cat shared_prefs/FlutterSharedPreferences.xml" > /tmp/prefs.xml
sed -i 's|flutter.app_locale">[^<]*|flutter.app_locale">es|' /tmp/prefs.xml
adb -s $DEVICE push /tmp/prefs.xml /data/local/tmp/FlutterSharedPreferences.xml
adb -s $DEVICE shell "run-as $APP_PKG cp /data/local/tmp/FlutterSharedPreferences.xml shared_prefs/FlutterSharedPreferences.xml"
kill -SIGUSR2 $(pgrep -f "flutter_tools.*run" | head -1)
sleep 3 && agent-flutter disconnect && agent-flutter connect
```
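The bounds-based heuristics can also be applied offline to a saved snapshot. A sketch that assumes the snapshot JSON is a flat array of objects with `ref`, `flutterType`, and `bounds` fields; the real schema may differ, so adjust the field names against actual `snapshot -i --json` output:

```bash
# Save a snapshot first: agent-flutter snapshot -i --json > /tmp/snap.json
# Then print bottom-nav tab refs in left-to-right (tab-index) order.
# (Sketch — JSON schema is an assumption.)
filter_bottom_nav() {
  python3 - "$1" <<'PY'
import json, sys
widgets = json.load(open(sys.argv[1]))
navs = [w for w in widgets
        if w.get("flutterType") == "InkWell"
        and w.get("bounds", {}).get("y", 0) > 780]
navs.sort(key=lambda w: w["bounds"]["x"])  # left-to-right = tab order
for w in navs:
    print(w["ref"])
PY
}
```

For example, `filter_bottom_nav /tmp/snap.json` prints one ref per line, so tab N is line N of the output.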
Every flow lists prerequisites: conditions that MUST be true before running. These describe the real user state, with no bypasses and no shortcuts. Test the app the way users experience it.
| Prerequisite | What it means | How to achieve (Android) | How to achieve (iOS) |
|---|---|---|---|
| `auth_ready` | User completed sign-in (Google or Apple); app shows the home screen | Run `bash setup.sh android` → launch app → complete Google Sign-In flow → complete onboarding | Run `bash setup.sh ios` → launch on simulator or device → complete Google/Apple Sign-In → complete onboarding |
| `signed_out` | Fresh app, user NOT signed in, shows Get Started screen | Uninstall + reinstall, or clear app data via Settings → Apps → Omi → Clear Data | Delete app from simulator/device and reinstall |
| `microphone_permission` | App has mic permission granted | When the app requests mic permission during use, tap "Allow". Or pre-grant: `adb shell pm grant com.friend.ios.dev android.permission.RECORD_AUDIO` | When the app requests mic permission, tap "Allow" in the iOS permission dialog |
| `ble_on` | Bluetooth enabled on the device | Enable Bluetooth in device Settings → Connected Devices. Emulators/simulators do not support BLE — requires a physical device | Enable Bluetooth in device Settings. The iOS Simulator has no BLE — requires a physical iPhone |
| `omi_device_connected` | Omi hardware paired and connected via BLE | Power on the Omi device within BLE range → app auto-discovers it on the home screen → tap Connect. Physical device only | Same — power on the Omi; the app discovers it. Physical iPhone only |
| `phone_number_verified` | Phone number added and verified in settings | Settings → Phone Calls → add phone number → receive SMS → enter code. Requires a real phone number | Same flow — requires a real phone number that receives SMS |
| `developer_settings_enabled` | Developer Settings screen is open | Settings drawer → scroll down → tap "Developer Settings" (visible to all users) | Same navigation path |
| `adb_access` | Shell access for locale/prefs manipulation (Android only) | Debug build + `adb` in PATH. Verify: `adb shell run-as com.friend.ios.dev ls shared_prefs/` | Not applicable — iOS equivalent uses `xcrun simctl` for simulator or Xcode for device |
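The `microphone_permission` pre-grant from the table can be scripted. A sketch; the helper name and the default device serial are assumptions, while the package and permission come from the table:

```bash
# Pre-grant RECORD_AUDIO to the dev flavor so the runtime prompt never appears.
# (Sketch — serial default is an assumption; override with DEVICE=...)
grant_mic() {
  adb -s "${DEVICE:-emulator-5554}" shell pm grant com.friend.ios.dev android.permission.RECORD_AUDIO
}
```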
```
signed_out ─── (fresh install, no prior state)
auth_ready ─── launch app → sign in with Google/Apple → complete onboarding
 │   (this is the REAL user flow — no bypasses)
 ├── microphone_permission (grant when prompted, or pre-grant via platform tools)
 ├── developer_settings_enabled (navigate in-app to Settings → Developer Settings)
 ├── phone_number_verified (in-app SMS verification — manual step)
 ├── ble_on + omi_device_connected (physical device + physical Omi hardware)
 └── adb_access (Android debug builds only — for locale manipulation)
```
```bash
# Android: setup + build + launch
cd app && bash setup.sh android
# → completes: keystore, Firebase config, .dev.env, flutter run --flavor dev
# → sign in with Google when app launches, complete onboarding

# iOS: setup + build + launch
cd app && bash setup.sh ios
# → completes: Firebase config, .dev.env, flutter run --flavor dev
# → sign in with Google/Apple when app launches, complete onboarding
```
Important: Both platforms require completing the real sign-in and onboarding flows. Never bypass auth or onboarding — these are user-facing flows that must work correctly.
Each flow file uses schema v2: