Passthrough relighting (for mixed reality)
- MRUK: Use the Mixed Reality Utility Kit (MRUK) to achieve passthrough relighting.
- Import the sample package: Import the necessary passthrough relighting sample package into your project to access the required components.
- Replace the OVR Scene Manager: The MRUK component replaces the OVR Scene Manager and includes an Effect Mesh component that applies a material to your room meshes, handling highlights and shadows correctly.
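The MRUK setup above can be sketched in a small script. This is a hedged example, not the official setup flow: the `MRUK`, `EffectMesh`, and `MeshMaterial` names come from the MRUK package but should be checked against your installed version.

```csharp
using Meta.XR.MRUtilityKit; // MRUK package namespace
using UnityEngine;

// Sketch: once MRUK has loaded the room model, apply a shadow-receiving
// material to the generated room meshes via the Effect Mesh component.
// Field names are assumptions based on the MRUK package.
public class RelightingSetup : MonoBehaviour
{
    [SerializeField] private Material relightingMaterial; // receives highlights/shadows

    private void Start()
    {
        // Wait for MRUK to finish loading the scene model before touching meshes.
        MRUK.Instance.RegisterSceneLoadedCallback(() =>
        {
            var effectMesh = FindObjectOfType<EffectMesh>();
            if (effectMesh != null)
            {
                // The Effect Mesh applies this material to every room mesh it creates.
                effectMesh.MeshMaterial = relightingMaterial;
            }
        });
    }
}
```

In practice you would usually assign the material on the Effect Mesh component in the Inspector; the script form is only useful when the material must be chosen at runtime.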
This video explains how to use passthrough relighting on Quest 3 with Unity:
Mixed reality (MR) passthrough lighting
Quest 3 passthrough applications must solve the complex problem of blending virtual and real-world lighting. This requires using the Meta SDK to create virtual lights and shadows that interact with your actual physical environment.
1. Enable passthrough relighting
- Add the OVR components: Begin by setting up passthrough in your scene with the OVR Manager and OVR Passthrough Layer components. In the OVR Manager, set Passthrough Support to Supported.
- Import MRUK: Passthrough relighting capabilities ship with the Mixed Reality Utility Kit (MRUK) package. Import it into your Unity project to gain access to the necessary components and prefabs.
- Add the MRUK prefab: Add the MRUK prefab to your scene hierarchy. The MRUK handles the generation of the real-world meshes that your virtual lights and shadows can interact with.
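A minimal bootstrap script can verify that the components from step 1 are present when the scene starts. Note that Passthrough Support is an Inspector/project setting on the OVR Manager and is not toggled from code; `OVRManager` and `OVRPassthroughLayer` are real Meta XR Core SDK components, but this script is only an illustrative sketch.

```csharp
using UnityEngine;

// Sketch: confirm the passthrough components exist and add the passthrough
// layer if it is missing. Attach to any GameObject in the scene.
public class PassthroughBootstrap : MonoBehaviour
{
    private void Awake()
    {
        // OVRManager normally lives on the OVRCameraRig.
        var manager = FindObjectOfType<OVRManager>();
        Debug.Assert(manager != null, "Add an OVRManager to the scene.");

        // The passthrough layer renders the camera feed behind virtual content.
        var layer = FindObjectOfType<OVRPassthroughLayer>();
        if (layer == null && manager != null)
        {
            layer = manager.gameObject.AddComponent<OVRPassthroughLayer>();
        }
        if (layer != null)
        {
            layer.textureOpacity = 1f; // fully visible camera feed
        }
    }
}
```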
2. Configure shadows and occlusion
- Enable occlusion meshes: The Meta SDK generates a virtual mesh of your room. This “scene mesh” is required for shadows and occlusion to work correctly.
- Cast shadows on the real world: With the scene mesh in place, you can configure your virtual lights to cast shadows onto the real-world environment. This adds depth and realism to your mixed-reality experience.
- Virtual lights will cast shadows onto the generated scene mesh, and objects marked as casting shadows will appear to block real-world light.
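The shadow configuration above uses only standard Unity APIs; a short sketch makes the required settings explicit. Which renderers should cast shadows is a per-project decision, so the loop below is deliberately broad.

```csharp
using UnityEngine;
using UnityEngine.Rendering;

// Sketch: configure a virtual light and virtual objects so that shadows
// land on the MRUK-generated scene mesh. Pure Unity API, no Meta calls.
public class ShadowConfig : MonoBehaviour
{
    private void Start()
    {
        // A realtime light is required so shadows track moving virtual objects.
        var sceneLight = GetComponent<Light>();
        sceneLight.type = LightType.Directional;
        sceneLight.shadows = LightShadows.Soft;

        // Any virtual object that should appear to block real-world light
        // must have shadow casting enabled on its renderer.
        foreach (var renderer in FindObjectsOfType<MeshRenderer>())
        {
            renderer.shadowCastingMode = ShadowCastingMode.On;
        }
    }
}
```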
3. Estimate real-world lighting for reflections
To make virtual objects reflect real-world light, you can leverage the headset’s cameras to generate a real-time reflection probe.
- Create a real-time reflection probe: Add a reflection probe to your scene.
- Map to a cubemap: Map the passthrough camera feed to a cubemap and project this onto the reflection probe. This allows the probe to capture the surrounding real-world lighting environment.
- Improve PBR material plausibility: When used on physically based rendering (PBR) materials, this technique allows virtual objects to realistically reflect the real-world environment, dramatically improving visual plausibility.
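The probe setup can be sketched as follows. This is a deliberately naive version: `GetPassthroughTexture()` is a placeholder for however you obtain the headset camera feed (e.g. via the Media Projection API), not a real Meta SDK call, and a production implementation would reproject frames into the cubemap using the camera pose rather than copying one frame into every face.

```csharp
using UnityEngine;
using UnityEngine.Rendering;

// Sketch: feed a camera texture into a custom cubemap and use it as the
// reflection source for PBR materials via a custom-mode reflection probe.
public class PassthroughReflections : MonoBehaviour
{
    [SerializeField] private ReflectionProbe probe;
    private Cubemap envCubemap;

    private void Start()
    {
        envCubemap = new Cubemap(256, TextureFormat.RGBA32, false);
        probe.mode = ReflectionProbeMode.Custom; // we supply the texture ourselves
        probe.customBakedTexture = envCubemap;
    }

    private void Update()
    {
        // Placeholder: must return a 256x256 frame to match the cubemap faces.
        Texture2D feed = GetPassthroughTexture();
        if (feed == null) return;

        // Naive projection: the same frame on all six faces. Good enough for
        // rough ambient reflections; reproject per-face for accuracy.
        for (int face = 0; face < 6; face++)
        {
            envCubemap.SetPixels(feed.GetPixels(), (CubemapFace)face);
        }
        envCubemap.Apply();
    }

    private Texture2D GetPassthroughTexture() => null; // stub, see lead-in
}
```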
Realtime Lighting Estimation on Quest 3
This video demonstrates what Meta Quest 3 camera access via the Media Projection API unlocks. By mapping the camera feed to a cubemap and projecting it onto a real-time reflection probe in Unity, you can use the real environment to light your PBR materials. This dramatically improves how plausible a 3D model looks in its surroundings, especially in mixed reality. The video also shows how this approach adapts to changing lighting conditions in real time.