visionOS game engine patterns and tradeoffs. Use when choosing between Unity PolySpatial, Unreal Engine, Godot, or native Metal for visionOS development — provides engine capabilities, rendering paths, simulator support, and integration patterns.
Path 1 — RealityKit / Shared Space:
Path 2 — CompositorServices / Full Space: the app drives Metal rendering itself through a LayerRenderer. Use CompositorLayer (not MTKView) as the SwiftUI scene type for Path 2.
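For contrast with the CompositorLayer example later in this document, a minimal Path 1 scene can be sketched as follows. This is an illustrative sketch, not from a specific Apple sample; `GameApp` and the box entity are placeholder names.

```swift
import SwiftUI
import RealityKit

@main
struct GameApp: App {
    var body: some Scene {
        // Path 1: a volumetric window. The system compositor renders the
        // RealityKit content alongside other apps in the Shared Space.
        WindowGroup {
            RealityView { content in
                let box = ModelEntity(mesh: .generateBox(size: 0.2))
                content.add(box)
            }
        }
        .windowStyle(.volumetric)
    }
}
```

Because the system owns compositing here, the app never touches a Metal drawable; that is the core tradeoff against Path 2.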
| Feature | Unity PolySpatial | Unreal 5.5+ | Godot (pending) | Native RealityKit |
|---|---|---|---|---|
| Shared Space | Yes | No | Yes (windowed) | Yes |
| Simulator | Yes | No (device only) | Likely | Yes |
| Custom shaders | No (MaterialX only) | Yes | Yes | Metal only |
| Hand tracking | Yes (XR Hands) | Yes (ARKit) | Yes (ARKit) | Yes (ARKit) |
| Passthrough (visionOS 2.0+) | Yes | Yes (UE 5.5+) | Pending | Yes |
| Embed in native app | Partial (separate SwiftUI windows) | No documented path | Unknown | N/A (it IS native) |
| License cost | Pro required | Free | Free (MIT) | Free |
Min versions: Unity 6 for 2.x packages; Unity 2022.3.18f1 for 1.x packages. Apple Silicon Mac required.
App modes (Project Settings > XR Plug-in Manager > Apple visionOS > App Mode):
- Mixed Reality - Volume or Immersive Space → RealityKit rendering
- Virtual Reality - Fully Immersive Space → Metal/CompositorServices rendering
- Hybrid (2.x only) → runtime switch between the two

SwiftUI interop pattern: *InjectedScene.swift files auto-merge into the app's top-level App; C# reaches native Swift via DllImport.

What doesn't work in PolySpatial (RealityKit) mode:
Versions: 5.4+ experimental (Full Immersion), 5.5+ adds Mixed Immersion (visionOS 2.0 required)
Key constraints:
Status (early 2026): Apple contributing 3-stage PR to Godot master branch. Stage 1 (windowed) merged. Stages 2-3 (Swift lifecycle, VR plugin) in progress. Not in binary releases yet — compile from source.
Community alternative: GodotVision (godot.vision) wraps Godot in RealityKit, already functional.
Mix native RealityKit chrome with game engine 3D content:
```swift
ZStack {
    MetalGameView()       // bottom: game engine renders to texture
    RealityView { ... }   // top: RealityKit 3D frame/chrome
}
```
@State variables drive dynamic frame changes (level transitions, etc.).
Unity's formal Hybrid Mode (2.x) manages this programmatically — performance overhead scales with scene complexity.
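The @State-driven pattern can be sketched as below. This is a hedged sketch: `HybridView` and the `MetalGameView` body are assumptions (reusing the placeholder view name from the ZStack pattern above), not a canonical implementation.

```swift
import SwiftUI
import RealityKit

struct HybridView: View {
    // Assumed state driving chrome changes (e.g. level transitions)
    @State private var level = 1

    var body: some View {
        ZStack {
            MetalGameView()                  // game engine output underneath
            RealityView { content in
                // build the initial RealityKit frame/chrome entities
            } update: { content in
                // re-runs when `level` changes: swap chrome entities here
                _ = level
            }
        }
    }
}
```

Reading `level` inside the `update:` closure is what ties the RealityKit chrome to SwiftUI state changes.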
```swift
// Scene type for immersive Metal app
ImmersiveSpace {
    CompositorLayer(configuration: MyConfig()) { layerRenderer in
        MyRenderer(layerRenderer: layerRenderer).run()
    }
}
```

```swift
// Render loop pattern
func run() {
    while true {
        guard let frame = layerRenderer.queryNextFrame() else { return }
        frame.startSubmission()
        // encode Metal commands for left+right eye
        // use .layered layout + vertex amplification for stereo
        frame.endSubmission()
    }
}
```
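The stereo encoding hinted at in the render-loop comments can be sketched as encoder setup. This is an outline under stated assumptions (`encodeStereo` and `passDesc` are illustrative; the drawable is assumed to provide a two-slice layered color target), not the full CompositorServices drawable plumbing.

```swift
import Metal

// Sketch: single-pass stereo via vertex amplification.
// Assumes `passDesc` targets a layered texture with one slice per eye,
// and the render pipeline was built with maxVertexAmplificationCount >= 2.
func encodeStereo(commandBuffer: MTLCommandBuffer,
                  passDesc: MTLRenderPassDescriptor) {
    passDesc.renderTargetArrayLength = 2          // one slice per eye
    guard let encoder = commandBuffer
        .makeRenderCommandEncoder(descriptor: passDesc) else { return }

    // Map amplified view 0 → slice 0 (left eye), view 1 → slice 1 (right eye)
    var mappings = (0..<2).map { view in
        MTLVertexAmplificationViewMapping(
            viewportArrayIndexOffset: UInt32(view),
            renderTargetArrayIndexOffset: UInt32(view))
    }
    encoder.setVertexAmplificationCount(2, viewMappings: &mappings)

    // ... draw calls: each vertex shader invocation runs once per view
    encoder.endEncoding()
}
```

The vertex shader reads the amplification index to pick the per-eye view/projection matrices, so scene geometry is submitted once instead of twice.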
Key notes:
- .layered texture layout + Metal vertex amplification for single-pass stereo
- ARKit anchors (HandAnchor, WorldAnchor) are delivered separately from CompositorServices

Only Unity and native RealityKit work in the Simulator. Anything targeting CompositorServices in full immersion can technically run in the Simulator, but Unreal's toolchain doesn't support it.
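"Separate from CompositorServices" means tracking data comes from an ARKitSession the app runs itself, alongside the render loop. A sketch, assuming hand tracking only (the loop body is illustrative):

```swift
import ARKit

let session = ARKitSession()
let handTracking = HandTrackingProvider()

func startTracking() async throws {
    // Runs independently of the CompositorLayer render loop
    try await session.run([handTracking])
    for await update in handTracking.anchorUpdates {
        let anchor = update.anchor   // HandAnchor: joint transforms, chirality
        // hand the anchor off to the renderer / game logic
        _ = anchor
    }
}
```

The renderer then samples the latest anchors each frame; nothing in CompositorServices pushes tracking data to you.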
Full expertise: /Users/quartershots/.claude/research/visionos-game-engines/expertise.md