Automatic environment occlusion

AR Foundation support for ARKit 4 Depth


On an iPad Pro with a LiDAR scanner, ARKit 4 produces a depth image for each frame. Each pixel in the depth image specifies the scanned distance between the device and a real-world surface.

AR Foundation 4.1 includes an AROcclusionManager that incorporates this depth information when rendering the camera background: the background renderer writes the scanned depth image into the depth buffer. Virtual content that is closer to the camera than the real-world surface at a given pixel passes the depth test and occludes the real world; virtual content that is farther away fails the depth test at those pixels and is not rendered, so physical objects hide it.
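In practice, enabling this is mostly a matter of adding the AROcclusionManager component to the AR camera and requesting an environment depth mode. The sketch below is a minimal example against AR Foundation 4.1's API; the script name and logging are illustrative, not part of the library.

```csharp
using UnityEngine;
using UnityEngine.XR.ARFoundation;
using UnityEngine.XR.ARSubsystems;

// Attach to the AR camera GameObject (the one that already has the
// ARCameraManager and ARCameraBackground components).
[RequireComponent(typeof(AROcclusionManager))]
public class EnableEnvironmentOcclusion : MonoBehaviour
{
    AROcclusionManager m_OcclusionManager;

    void Awake()
    {
        m_OcclusionManager = GetComponent<AROcclusionManager>();

        // Request the highest-quality environment depth the platform offers.
        // On unsupported devices the request is simply not granted.
        m_OcclusionManager.requestedEnvironmentDepthMode = EnvironmentDepthMode.Best;
    }

    void Update()
    {
        // currentEnvironmentDepthMode reports what the platform actually granted,
        // which may differ from the requested mode.
        if (m_OcclusionManager.currentEnvironmentDepthMode == EnvironmentDepthMode.Disabled)
            Debug.Log("Environment depth is unavailable on this device.");
    }
}
```

With the manager enabled, the ARCameraBackground's depth write happens automatically; no extra shader work is needed for basic occlusion of virtual content by the physical environment.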
