---
uid: arkit-meshing
---
# Meshing
ARKit supports the scene reconstruction feature introduced in ARKit 3.5, which is available on iOS devices with a LiDAR scanner, such as the iPad Pro.
ARKit scene reconstruction provides a meshing feature that generates a mesh based on scanned real-world geometry. The mesh manager enables and configures this functionality.
## Requirements
This mesh functionality requires Xcode 11.4 or later, and it only works on iOS devices with a LiDAR scanner, such as the iPad Pro.
## Use meshing in a scene
Using the LiDAR scanner, ARKit scene reconstruction scans the environment to create mesh geometry representing the real-world environment. Additionally, ARKit provides an optional classification of each triangle in the scanned mesh. The per-triangle classification identifies the type of surface corresponding to the triangle's location in the real world.
To use ARKit meshing with AR Foundation, you need to add the [ARMeshManager](xref:UnityEngine.XR.ARFoundation.ARMeshManager) component to your scene.
![ARFoundation ARMeshManager component](images/arfoundation-mesh-manager.png)
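With the manager in the scene, a script can react as scanned meshes are added, updated, or removed by subscribing to the manager's [meshesChanged](xref:UnityEngine.XR.ARFoundation.ARMeshManager.meshesChanged) event. The following is a minimal sketch; the `MeshLogger` class name and the Inspector-assigned reference are illustrative, not part of AR Foundation:

```csharp
using UnityEngine;
using UnityEngine.XR.ARFoundation;

// Hypothetical example: logs mesh changes reported by an ARMeshManager.
public class MeshLogger : MonoBehaviour
{
    [SerializeField]
    ARMeshManager meshManager; // assign in the Inspector

    void OnEnable() => meshManager.meshesChanged += OnMeshesChanged;
    void OnDisable() => meshManager.meshesChanged -= OnMeshesChanged;

    void OnMeshesChanged(ARMeshesChangedEventArgs args)
    {
        Debug.Log($"Meshes added: {args.added.Count}, " +
                  $"updated: {args.updated.Count}, " +
                  $"removed: {args.removed.Count}");
    }
}
```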
### Mesh Prefab
You need to set the [meshPrefab](xref:UnityEngine.XR.ARFoundation.ARMeshManager.meshPrefab) to a prefab that is instantiated for each scanned mesh. This prefab must contain at least a [MeshFilter](xref:UnityEngine.MeshFilter) component.
If you want to render the scanned meshes, add a [MeshRenderer](xref:UnityEngine.MeshRenderer) component with a [Material](xref:UnityEngine.Material) assigned to the mesh prefab.
If you want to have virtual content that interacts physically with the real-world scanned meshes, add a [MeshCollider](xref:UnityEngine.MeshCollider) component to the mesh prefab.
This image demonstrates a mesh prefab configured with the required [MeshFilter](xref:UnityEngine.MeshFilter) component, an optional [MeshCollider](xref:UnityEngine.MeshCollider) component to allow for physics interactions, and an optional [MeshRenderer](xref:UnityEngine.MeshRenderer) component with a [Material](xref:UnityEngine.Material) assigned to render the mesh.
![Mesh prefab example](images/arfoundation-mesh-prefab.png)
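You would typically author this prefab in the Editor as shown above, but as a sketch, an equivalent template can also be assembled in code and assigned to the manager. The `MeshPrefabSetup` class name and the serialized references are illustrative assumptions:

```csharp
using UnityEngine;
using UnityEngine.XR.ARFoundation;

// Hypothetical helper that builds a mesh template at runtime and assigns
// it as the ARMeshManager's mesh prefab.
public class MeshPrefabSetup : MonoBehaviour
{
    [SerializeField] ARMeshManager meshManager; // assign in the Inspector
    [SerializeField] Material meshMaterial;     // material for rendering scanned meshes

    void Awake()
    {
        var template = new GameObject("ScannedMesh");
        var meshFilter = template.AddComponent<MeshFilter>();              // required
        template.AddComponent<MeshRenderer>().sharedMaterial = meshMaterial; // optional: rendering
        template.AddComponent<MeshCollider>();                             // optional: physics
        meshManager.meshPrefab = meshFilter; // meshPrefab is typed as MeshFilter
    }
}
```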
### Normals
As ARKit is constructing the mesh geometry, the vertex normals for the mesh are calculated. If you don't need the mesh vertex normals, disable [normals](xref:UnityEngine.XR.ARFoundation.ARMeshManager.normals) to save on memory and CPU time.
### Concurrent queue size
To avoid blocking the main thread, the tasks of converting the ARKit mesh into a Unity mesh and creating the physics collision mesh (if the mesh prefab contains a [MeshCollider](xref:UnityEngine.MeshCollider) component) are moved into a job queue processed on a background thread. [concurrentQueueSize](xref:UnityEngine.XR.ARFoundation.ARMeshManager.concurrentQueueSize) specifies the number of meshes to be processed concurrently.
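The two settings above can be adjusted from a script before meshing begins. This is a minimal sketch; the `MeshingTuning` class name and the chosen values are illustrative:

```csharp
using UnityEngine;
using UnityEngine.XR.ARFoundation;

// Hypothetical tuning script: skips vertex normal calculation and raises
// the number of meshes processed concurrently on the background thread.
public class MeshingTuning : MonoBehaviour
{
    [SerializeField] ARMeshManager meshManager; // assign in the Inspector

    void Awake()
    {
        meshManager.normals = false;         // save memory and CPU if normals aren't needed
        meshManager.concurrentQueueSize = 4; // process up to 4 meshes concurrently
    }
}
```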
### Other `ARMeshManager` settings
For the ARKit implementation, only the settings mentioned above affect the performance and output of ARKit meshing. ARKit does not implement the following features of the `ARMeshManager`:
- Density
- Tangents
- Texture Coordinates
- Colors
## Sample scenes
Three sample scenes exist in the [arfoundation-samples](https://github.com/Unity-Technologies/arfoundation-samples) repository:
- The **ClassificationMeshes** scene uses the mesh classification functionality to generate colored overlays on top of the real world. Each color represents a unique surface type detected by ARKit.
![ClassificationMeshes](images/arfoundation-arkit-classified-meshing.gif)
- The **NormalMeshes** scene renders an overlay on top of the real world. The color of the mesh varies based on the normal of the mesh geometry.
![NormalMeshes](images/arfoundation-arkit-normal-meshing.gif)
- The **OcclusionMeshes** scene might appear to be doing nothing at first. However, it is rendering a depth texture on top of the scene based on the real-world geometry, which allows the real world to occlude virtual content. The scene has a script that fires a red ball into the scene when you tap the screen. To see the occlusion working, fire some red balls into a space, then move the device so that a real-world object sits between the camera and the balls; the real-world object occludes the virtual red balls.
![OcclusionMeshes](images/arfoundation-arkit-occlusion-meshing.gif)
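The tap-to-fire behavior of the **OcclusionMeshes** sample can be sketched as follows. This is a hypothetical simplification using the built-in `Input` touch API; the actual script in the arfoundation-samples repository may differ:

```csharp
using UnityEngine;

// Hypothetical sketch: each tap spawns a red ball with a Rigidbody and
// launches it forward from the camera. Scanned-mesh MeshColliders let
// the ball collide with real-world geometry.
public class BallLauncher : MonoBehaviour
{
    [SerializeField] float launchSpeed = 5f;

    void Update()
    {
        if (Input.touchCount > 0 && Input.GetTouch(0).phase == TouchPhase.Began)
        {
            var cam = Camera.main.transform;
            var ball = GameObject.CreatePrimitive(PrimitiveType.Sphere);
            ball.transform.localScale = Vector3.one * 0.1f;
            ball.transform.position = cam.position;
            ball.GetComponent<Renderer>().material.color = Color.red;
            ball.AddComponent<Rigidbody>().velocity = cam.forward * launchSpeed;
        }
    }
}
```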
## Meshing behaviors
**Note:** It usually takes about 4 seconds after the Made With Unity logo disappears (or when a new AR Session starts) before the scanned meshes start to show up.
Additionally, the LiDAR scanner alone might produce a slightly uneven mesh on a real-world surface. If you add and enable an [ARPlaneManager](xref:UnityEngine.XR.ARFoundation.ARPlaneManager) to your scene, ARKit considers that plane information when constructing a mesh and smooths out the mesh where it detects a plane on that surface.
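To take advantage of this smoothing, run plane detection alongside meshing. A minimal sketch, assuming both managers already exist in the scene (the `PlaneAssistedMeshing` class name and the serialized references are illustrative):

```csharp
using UnityEngine;
using UnityEngine.XR.ARFoundation;

// Hypothetical sketch: enable plane detection alongside meshing so ARKit
// can smooth the mesh where it detects planar surfaces.
public class PlaneAssistedMeshing : MonoBehaviour
{
    [SerializeField] ARPlaneManager planeManager; // assign in the Inspector
    [SerializeField] ARMeshManager meshManager;   // assign in the Inspector

    void Start()
    {
        planeManager.enabled = true; // plane information informs mesh smoothing
        meshManager.enabled = true;
    }
}
```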
## Additional information
For more information about ARKit 3.5, refer to [Introducing ARKit 3.5](https://developer.apple.com/augmented-reality/arkit/).
For more information about scene reconstruction, see [Visualizing and Interacting with a Reconstructed Scene](https://developer.apple.com/documentation/arkit/world_tracking/visualizing_and_interacting_with_a_reconstructed_scene?language=objc).
[!include[](snippets/apple-arkit-trademark.md)]