In its keynote at the Worldwide Developers Conference (WWDC), Apple unveiled ARKit 3, its latest set of developer tools for iOS augmented reality (AR) applications. According to Apple’s site, “ARKit 3 goes further than ever before, naturally showing AR content in front of or behind people using People Occlusion, tracking up to three faces at a time, supporting collaborative sessions, and more.”

Apple reports many updates. To start with, the company touts Motion Capture for integrating people’s movement into apps, while People Occlusion will seamlessly pass AR content behind and in front of the people being tracked. Previous versions of ARKit struggled with depth as it related to people and objects: virtual objects often appeared in front of everything in the scene, regardless of where real people and objects stood, frequently breaking immersion. Apple boasts that this tech not only creates more immersive AR experiences but also, more loftily, enables “green screen-style effects in almost any environment.”
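For developers, the ARKit API Apple has published makes occlusion an opt-in on the session configuration. Here’s a minimal sketch in Swift, assuming the rest of the AR session setup lives elsewhere in the app (the function name is ours):

```swift
import ARKit

// Minimal sketch: opting a world-tracking session into People Occlusion.
func makePeopleOcclusionConfiguration() -> ARWorldTrackingConfiguration {
    let configuration = ARWorldTrackingConfiguration()

    // People Occlusion needs recent hardware, so check support first.
    if ARWorldTrackingConfiguration.supportsFrameSemantics(.personSegmentationWithDepth) {
        // Segment people in the camera feed and estimate their depth,
        // so virtual content can pass both in front of and behind them.
        configuration.frameSemantics.insert(.personSegmentationWithDepth)
    }
    return configuration
}
```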

Apple detailing the motion capture capabilities of ARKit 3 at WWDC 2019.
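On the developer side, motion capture surfaces as a new body-tracking configuration whose anchors carry a full skeleton. A minimal sketch, assuming a session whose delegate is wired up as shown (the class name is ours):

```swift
import ARKit

// Minimal sketch: running body tracking and reading the captured skeleton.
class BodyTrackingHandler: NSObject, ARSessionDelegate {
    func start(on session: ARSession) {
        // Body tracking is limited to devices with recent Apple silicon.
        guard ARBodyTrackingConfiguration.isSupported else { return }
        session.delegate = self
        session.run(ARBodyTrackingConfiguration())
    }

    func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
        for case let bodyAnchor as ARBodyAnchor in anchors {
            // Each joint's transform can drive a rigged character
            // or be recorded as animation data.
            let jointTransforms = bodyAnchor.skeleton.jointModelTransforms
            _ = jointTransforms
        }
    }
}
```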

The press release also states the tool can track up to three faces at a time while utilizing the front and back cameras simultaneously. Other improvements include live collaborative sessions, automatic size estimation of objects, more robust 3D-object detection, detection of up to 100 images at a time, and faster plane detection driven by machine learning.
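Judging from the API Apple has published, both the three-face limit and the simultaneous front/back camera use hang off the face-tracking configuration, and collaboration is a single flag on world tracking. A minimal sketch (function names are ours):

```swift
import ARKit

// Minimal sketch: tracking up to three faces while the back camera
// keeps running world tracking.
func makeFaceTrackingConfiguration() -> ARFaceTrackingConfiguration? {
    guard ARFaceTrackingConfiguration.isSupported else { return nil }
    let configuration = ARFaceTrackingConfiguration()

    // Track as many faces as the device allows (up to three on ARKit 3).
    configuration.maximumNumberOfTrackedFaces =
        ARFaceTrackingConfiguration.supportedNumberOfTrackedFaces

    // Let front-camera face data drive content in the back-camera scene.
    if ARFaceTrackingConfiguration.supportsWorldTracking {
        configuration.isWorldTrackingEnabled = true
    }
    return configuration
}

// Collaborative sessions are a single flag on world tracking; peers then
// exchange ARSession.CollaborationData over any network transport.
func makeCollaborativeConfiguration() -> ARWorldTrackingConfiguration {
    let configuration = ARWorldTrackingConfiguration()
    configuration.isCollaborationEnabled = true
    return configuration
}
```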

Also announced alongside ARKit 3 were Reality Composer and RealityKit, both designed to help first-time AR creators. Reality Composer is an app for both iOS and Mac that should allow users to move seamlessly between devices with live linking, and it ships with a built-in AR library of hundreds of ready-to-use objects, animations, and audio clips. RealityKit is a high-level framework built specifically for AR, with “photo-realistic rendering, camera effects, animations, physics and more.” Apple claims that RealityKit “seamlessly blends virtual content with the real world using realistic physically-based materials, environment reflections, grounding shadows, camera noise, motion blur, and more, making virtual content nearly indistinguishable from reality.”
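To get a feel for how little code RealityKit asks for, here’s a minimal sketch that drops one physically-based object onto a detected surface. The function name is ours, but the calls come from the RealityKit API Apple published:

```swift
import RealityKit
import UIKit

// Minimal sketch: one physically-based object, anchored to a real surface.
func addDemoBox(to arView: ARView) {
    // A 10 cm box with a metallic, physically-based material.
    let box = ModelEntity(
        mesh: .generateBox(size: 0.1),
        materials: [SimpleMaterial(color: .gray, isMetallic: true)]
    )

    // Anchor it to the first horizontal plane found; environment
    // reflections and grounding shadows come along automatically.
    let anchor = AnchorEntity(plane: .horizontal)
    anchor.addChild(box)
    arView.scene.addAnchor(anchor)
}
```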

Lastly, AR Quick Look promises quick visualization of virtual objects in apps on iOS 12 and later, such as Messages and Mail, creating interactive experiences with incredible detail: Apple states the objects will show “reflections of real world surroundings in shiny virtual objects.”
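Third-party apps get the same treatment through a standard Quick Look preview of a USDZ file. A minimal sketch, where the class and asset names are placeholders of ours:

```swift
import QuickLook
import UIKit

// Minimal sketch: previewing a bundled USDZ model with AR Quick Look.
class ModelPreviewViewController: UIViewController, QLPreviewControllerDataSource {
    func showModel() {
        let preview = QLPreviewController()
        preview.dataSource = self
        present(preview, animated: true)
    }

    func numberOfPreviewItems(in controller: QLPreviewController) -> Int { 1 }

    func previewController(_ controller: QLPreviewController,
                           previewItemAt index: Int) -> QLPreviewItem {
        // Quick Look recognizes the .usdz extension and offers the AR view,
        // reflections and all. "robot" is a placeholder asset name.
        let url = Bundle.main.url(forResource: "robot", withExtension: "usdz")!
        return url as QLPreviewItem
    }
}
```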

We here at Th3rdEyeXR are excited to see how well these claims stack up against what the tech actually delivers when it launches. Check back soon for more updates.