Helps build Unity MR applications for Magic Leap 2 connecting to IoT sensors (MQTT), InfluxDB, and LLM interfaces.
This skill assists AI agents in constructing Magic Leap 2 mixed-reality applications inside Unity. The application integrates real-time IoT sensor data (via MQTT), historical data (via InfluxDB), and natural language processing (via LLMs) to create an Assistant interface.
When starting a project, ensure the following setup steps are taken in Unity:
- Install OpenXR and the Magic Leap XR Plugin.
- Enable the OpenXR and Magic Leap Feature Group.
- Enable the Magic Leap Controller Profile.
- Add an XROrigin (AR) and configure it for Magic Leap.

Ensure the Unity project follows this directory structure:
Assets/
├── Scripts/
│ ├── MQTT/
│ ├── InfluxDB/
│ ├── LLM/
│ ├── MagicLeap/
│ ├── UI/
│ └── Config/
├── Scenes/
└── Materials/
The templates/ folder within this skill provides base C# scripts, including a client that writes sensor readings (via UnityWebRequest) to an InfluxDB time-series database. Wrap UnityWebRequest calls in async Task methods or yield return coroutines for network I/O.

To test the application easily on an iPhone or Android phone without needing the Magic Leap headset, a WebXR version is provided in the WebAR-Client/ folder.
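The InfluxDB write path can be sanity-checked outside Unity first. Below is a minimal Python sketch of building an InfluxDB 2.x line-protocol record; the measurement, field, and tag names are illustrative, not taken from the templates:

```python
import time

def to_line_protocol(measurement, field, value, tags=None, ts_ns=None):
    """Build one InfluxDB line-protocol record:
    measurement[,tag=val...] field=value timestamp_ns"""
    tag_str = "".join(f",{k}={v}" for k, v in (tags or {}).items())
    ts = ts_ns if ts_ns is not None else time.time_ns()
    return f"{measurement}{tag_str} {field}={value} {ts}"

# Illustrative record for an ultrasonic distance reading:
print(to_line_protocol("ultrasonic", "distance_cm", 42.5,
                       tags={"room": "ward_3"}, ts_ns=1700000000000000000))
# ultrasonic,room=ward_3 distance_cm=42.5 1700000000000000000
```

In the C# template, the same string would be POSTed with UnityWebRequest to InfluxDB 2.x's /api/v2/write endpoint (with org and bucket query parameters and an `Authorization: Token` header).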
The Arduino/ folder contains a basic Wired_Arduino.ino sketch designed for standard boards like the Arduino Uno (no WiFi needed).
To forward this data to the Unity scene and WebAR client, use the attached Python script in the same folder:
1. Install the dependencies: pip install pyserial paho-mqtt
2. Run the bridge: python serial_mqtt_bridge.py

The bridge publishes each reading to the hospital/sensors/ultrasonic topic, which all other clients will see. Use the MqttManager to subscribe to the same topic.
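At its core the bridge just reads serial lines, parses floats, and publishes them. A hedged Python sketch of what serial_mqtt_bridge.py plausibly contains — the serial port, baud rate, and broker address are assumptions:

```python
TOPIC = "hospital/sensors/ultrasonic"   # topic the bridge publishes on

def parse_reading(raw):
    """Parse one serial line such as b"42.5\\r\\n" into a float; None if malformed."""
    try:
        return float(raw.decode("ascii").strip())
    except (UnicodeDecodeError, ValueError):
        return None

def main():
    # Imported lazily so parse_reading() stays unit-testable without hardware libs.
    import serial                       # pyserial
    import paho.mqtt.client as mqtt

    ser = serial.Serial("/dev/ttyACM0", 9600, timeout=1)  # assumed port and baud
    client = mqtt.Client()              # paho-mqtt 1.x constructor; 2.x also takes a CallbackAPIVersion
    client.connect("localhost", 1883)   # assumed local broker
    client.loop_start()
    while True:
        value = parse_reading(ser.readline())
        if value is not None:
            client.publish(TOPIC, str(value))

if __name__ == "__main__":
    main()
```

Keeping the parsing pure (bytes in, float or None out) means malformed serial lines are dropped instead of crashing the bridge.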
Forward parsed float values to a SensorVisualizer component that updates a TextMeshPro UI element anchored in MR space.
Periodically pass the last 10 seconds of sensor data from the MQTT manager to the LLMClient. Have the user press a trigger or use a Voice Intent to ask, "Is the measurement stable?", and have the LLM respond based on the passed context.
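The rolling 10-second context can be prototyped independently of Unity. A Python sketch of the windowing and prompt-building logic — SensorWindow and the prompt wording are illustrative; in the C# templates this logic would sit between the MQTT manager and the LLMClient:

```python
from collections import deque

class SensorWindow:
    """Keep (timestamp, value) samples, pruning anything older than window_s."""
    def __init__(self, window_s=10.0):
        self.window_s = window_s
        self.samples = deque()

    def add(self, t, value):
        self.samples.append((t, value))
        while self.samples and t - self.samples[0][0] > self.window_s:
            self.samples.popleft()

    def to_prompt(self):
        values = ", ".join(f"{v:.1f}" for _, v in self.samples)
        return (f"Readings from the last {self.window_s:.0f} seconds (cm): "
                f"{values}. Is the measurement stable?")

w = SensorWindow()
for t, v in [(0, 41.9), (5, 42.1), (12, 42.0)]:
    w.add(t, v)
print(len(w.samples))   # 2 -- the t=0 sample fell out of the window
print(w.to_prompt())
```

On the headset, add() would be called from the MQTT message callback and to_prompt() when the trigger press or Voice Intent fires.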