Contrary to rumors circulating earlier this summer, the programming framework for Apple’s augmented reality (AR) glasses, called “StarBoard,” remains intact in the final version of iOS 13 and the iOS 13.1 beta. According to a report by Tom’s Guide writer Jesus Diaz, Apple developer Steve Troughton-Smith and 9to5Mac’s Guilherme Rambo found the StarBoard system shell, used to launch and run stereo AR apps, in iOS 13.

This integration with iOS 13 lends credence to the original report by analyst Ming-Chi Kuo, which stated that the AR glasses would act as a display powered by the iPhone in the user’s pocket – borrowing processes similar to those used by the Apple Watch. Troughton-Smith noted that Apple “had time to rip out all the Apple Tag references in iOS 13,” yet references to the mysterious AR glasses were left in the update. Given these developments, the initially reported Q2 2020 production timeframe could be on the mark.

“The picture of Apple’s AR efforts from iOS 13 is very different to what one might expect,” Troughton-Smith said in a tweet. “It points to the headset being a much more passive display accessory for iPhone than a device with an OS of its own. The iPhone seems to do everything; ARKit is the compositor.”

Over the course of the year, Apple has made a number of plays in AR. Its WWDC keynote this year revealed a slew of updates for ARKit 3, and on the personnel front, the company promoted former iPhone executive and QuickTime developer Frank Casanova to Head of Marketing for AR and hired former Valve VR engineer Nat Brown to focus on graphics for Apple platforms. These subtle moves toward AR glasses are best summed up by Apple CEO Tim Cook’s statement during a 2018 earnings call: “I see AR as being profound. AR has the ability to amplify human performance instead of isolating humans. So I am a huge, huge believer in AR. We put a lot of energy on AR. We’re moving very fast.”