# Magic Leap 2 Support

`v0.2.0-pre.6`   `24 Oct 2025`

The **MeshMap Magic Leap 2 Support** package contains features and sample assets for creating location-based AR experiences with the Magic Leap 2 headset.

{% hint style="info" %}
This package depends on a minimally modified version of the Magic Leap SDK to prevent compilation errors when building for multiple target XR devices with Unity OpenXR Plugin 1.15.0 (used in [MeshMap XR](https://docs.meshmap.com/unity-sdk/xr)). In accordance with the [Magic Leap 2 Software License Agreement](https://www.magicleap.com/legal/software-license-agreement-ml2), the modified SDK is included in a [sample project](https://github.com/MeshMap/mmsdk-samples-magicleap2/) but is not redistributed on its own.
{% endhint %}

[**Changelog**](https://github.com/MeshMap/com.meshmap.sdk.magicleap2/blob/main/CHANGELOG.md)

***

## Features

In addition to the samples below, the package provides:

* **Dimming** — Adjust the Magic Leap Global Dimming setting with a UI slider.
* **Haptics** — Simple helper class for preset and custom haptic feedback.
* **Occlusion** — Apply occlusion shaders/materials to objects and UI elements.
* **Rigs** — A reliable XR rig to use in Magic Leap projects.

***

### Samples

To import a sample into an existing Unity project, go to `Window > Package Manager > MeshMap Magic Leap 2 Support > Samples`.

* **Marker Tracking** — Demonstrates how to set up a simple scene with marker tracking and a calibration UI to localize content for location-based AR experiences.
* **Space Localization** — Demonstrates how to localize using Magic Leap 2 Spaces.
* **Spatial Anchors** — Demonstrates how to position AR content across sessions using the Magic Leap 2 Spatial Anchors Subsystem.
* **Physical Occlusion** — Demonstrates how to occlude AR content behind the physical environment, nearby objects, hands, and controllers using the experimental Magic Leap 2 Physical Occlusion Feature.
* **User Calibration** — Demonstrates how to check the user's headset fit and eye calibration status using the experimental Magic Leap 2 User Calibration Feature.
* **Eye Tracking** — Demonstrates how to read the user's eye tracker data using the experimental Magic Leap 2 Eye Tracking Feature. We recommend pairing it with the User Calibration sample.
* **Voice Commands** — Demonstrates how to set up custom voice commands using Magic Leap Voice Intents.

For more samples, we recommend the official [Magic Leap Unity (OpenXR) Example Project](https://developer-docs.magicleap.cloud/docs/guides/unity-openxr/openxr-unity-samples/).

***

## Getting Started

The easiest way to get started is to clone the [**MeshMap Samples for Magic Leap 2** repository](https://github.com/MeshMap/mmsdk-samples-magicleap2/) and open the project using [Unity Hub](https://unity.com/unity-hub).

The samples are pre-imported and the project settings are already configured.

***

### Requirements

* Windows or macOS
* **Unity 2022.3.11 LTS** or **6000.0.58f2 LTS**
  * Android Build Support
  * Universal Render Pipeline (URP)

{% hint style="info" %}
Other Unity versions *may* work but have **not** been tested. We strongly recommend using one of the versions listed above.
{% endhint %}

***

### Package Dependencies

* VContainer v1.16.8 (scoped registry)
* Unity TextMeshPro v3.0.7
  * Make sure to [Import TMP Essential Resources](https://docs.unity3d.com/Packages/com.unity.textmeshpro@4.0/manual/index.html).
* Unity Input System v1.11.2
* Unity XR Interaction Toolkit v2.6.3
* Unity OpenXR v1.13.0
* Magic Leap SDK v2.6.0-pre.R15-meshmap
* [MeshMap Building Blocks v0.0.13](https://github.com/MeshMap/com.meshmap.sdk.bb)
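If you add the dependencies to your own project rather than cloning the sample repository, VContainer is typically resolved through a scoped registry. A minimal sketch of the relevant `Packages/manifest.json` entries, assuming VContainer is pulled from the OpenUPM registry (the registry URL and scope shown are assumptions, not part of this package):

```json
{
  "scopedRegistries": [
    {
      "name": "OpenUPM",
      "url": "https://package.openupm.com",
      "scopes": ["jp.hadashikick.vcontainer"]
    }
  ],
  "dependencies": {
    "jp.hadashikick.vcontainer": "1.16.8"
  }
}
```

Unity resolves packages under the listed scopes from the scoped registry and everything else from the default Unity registry.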

***

### Import

The [com.meshmap.sdk.magicleap2](https://github.com/MeshMap/com.meshmap.sdk.magicleap2) package can be added to an existing project via Unity Package Manager (UPM) from Git. Follow the import [instructions](https://docs.meshmap.com/unity-sdk/overview/using-our-packages).
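As a sketch of what the UPM Git dependency looks like in `Packages/manifest.json` (the exact URL form and any version tag may differ; follow the linked instructions for the authoritative steps):

```json
{
  "dependencies": {
    "com.meshmap.sdk.magicleap2": "https://github.com/MeshMap/com.meshmap.sdk.magicleap2.git"
  }
}
```

Appending a tag or revision after `#` (e.g. `...git#v0.2.0-pre.6`) pins the package to a specific release, assuming the repository tags its releases.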

***

### Basic Usage

After import, the package files are available in the Packages section of the Project window in the Unity Editor.

For further support, we highly recommend the official [Magic Leap Developer Documentation](https://developer-docs.magicleap.cloud/docs/category/unity-openxr/) and [Forum](https://forum.magicleap.cloud/).

***

## Settings and Permissions in Unity

* In `Project Settings > Player > Android > Other Settings`, verify the following are set correctly:
  * Graphics API: Vulkan
  * Texture compression format: DXTC
  * Minimum API level: Android 10.0 (API level 29)
  * Scripting backend: IL2CPP
  * Active input handling: Input System (new)
  * Target architecture: x86-64
  * Write permission: Internal
* Check `Project Settings > MagicLeap > Permissions > API Level 29`.
* In `Project Settings > XR Plug-in Management`, enable the OpenXR Plugin and Magic Leap 2 feature groups.
* In `Project Settings > XR Plug-in Management > OpenXR > Enabled Interaction Profiles & Feature Groups`, check that the required interaction profiles, feature groups, and subsystems are enabled.
  * Magic Leap 2 Controller Interaction Profile, Eye Gaze Interaction Profile, and Hand Interaction Profile (if using hand tracking).
* In `Project Settings > XR Plug-in Management > Project Validation`, verify that all checks pass.
* In `Project Settings > MagicLeap > Permissions`, include all Android permissions that you anticipate your app needing.
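Permissions declared in the MagicLeap settings UI end up as `<uses-permission>` entries in the Android manifest. A sketch of what a custom `AndroidManifest.xml` might declare; the permission names shown are examples from the Magic Leap 2 permission set, so include only those your app actually uses:

```xml
<manifest xmlns:android="http://schemas.android.com/apk/res/android">
  <!-- Runtime ("dangerous") permissions also need to be granted on device -->
  <uses-permission android:name="android.permission.CAMERA" />
  <uses-permission android:name="com.magicleap.permission.EYE_TRACKING" />
  <uses-permission android:name="com.magicleap.permission.VOICE_INPUT" />
  <!-- Normal permissions are granted automatically at install time -->
  <uses-permission android:name="com.magicleap.permission.SPATIAL_ANCHOR" />
</manifest>
```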

***

## Installing an App to Magic Leap 2

To install an .apk to your Magic Leap 2, enable [Developer Mode](https://developer-docs.magicleap.cloud/docs/guides/getting-started/enable-developer-mode/). Then connect the headset to your computer via USB-C and use [Magic Leap Hub 3](https://developer-docs.magicleap.cloud/docs/guides/developer-tools/ml-hub-3/get-started/) > Device Bridge.

*If your computer only has USB-A ports, you may need a USB-A-to-USB-C cable.*
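If you prefer the command line over Device Bridge, the standard Android `adb` tool (included with the Android SDK platform-tools) can also sideload a build onto a Developer Mode headset; the .apk path below is a placeholder:

```shell
# Confirm the headset is visible over USB
adb devices

# Install the build produced by Unity (-r reinstalls, keeping app data)
adb install -r path/to/YourApp.apk
```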

***

## Settings and Permissions on Device

* Manually grant permissions (e.g., camera, voice input) in `Settings > Apps & Notifications > App info > your-app > Permissions`.
* For the Voice Commands sample, enable Voice Commands on the device at `Settings > Magic Leap Inputs > Voice`.
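Runtime permissions can also be granted from a connected computer with `adb` instead of the on-device Settings UI; the application ID below is a placeholder, and the permission must already be declared in the app's manifest:

```shell
# Grant a runtime ("dangerous") permission to an installed app
adb shell pm grant com.yourcompany.yourapp android.permission.CAMERA
adb shell pm grant com.yourcompany.yourapp com.magicleap.permission.VOICE_INPUT
```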
