Apple has announced the expansion of its Apple Vision Pro developer labs to New York City and Sydney, giving more developers the opportunity to test their apps on the innovative device that combines augmented reality and virtual reality.
The Apple Vision Pro developer labs are a series of sessions where developers can work directly on the device and get feedback from Apple experts on how to optimize their visionOS, iPadOS, and iOS apps for the new platform.
The labs also offer guidance on how to use the new features and frameworks that Apple Vision Pro provides, such as spatial audio, hand tracking, and face capture.
“We’re thrilled with the excitement and enthusiasm from developers around the world at the Apple Vision Pro developer labs, and we’re pleased to announce new labs in New York City and Sydney,” Apple said on its developer website.
The labs have been running since early August in Cupertino, London, Munich, Shanghai, Singapore, and Tokyo, and have received positive responses from developers who have participated.
According to Apple, the labs have helped developers create immersive and engaging experiences for Apple Vision Pro users across categories such as games, education, entertainment, and productivity.
Developers who are interested in joining the Apple Vision Pro developer labs can apply on the Apple Developer website.
The labs are free of charge, but space is limited and applicants must meet certain criteria, such as having a visionOS, iPadOS, or iOS app in development or on the App Store.
Apple Vision Pro is expected to launch in early 2024, and will be the first device to run on visionOS, a new operating system that combines elements of iOS and iPadOS with new capabilities for augmented and virtual reality.
The device will feature a sleek design, a high-resolution display, advanced sensors, and a powerful processor. Apple Vision Pro will also be compatible with existing iOS and iPadOS apps, as well as new apps designed specifically for the device.
Priced at $3,499, Apple Vision Pro will offer users a spatial computing experience that blends augmented and virtual reality.
The device's user interface is built around three constructs: Windows, Volumes, and Spaces. Windows are 2D interfaces that resemble iOS and iPadOS apps. Volumes are bounded containers for 3D content that users can view from any angle. Spaces are the environments in which windows and volumes appear: apps run side by side in the Shared Space by default, or can open a dedicated Full Space for a more immersive experience.
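On visionOS, these three constructs map onto SwiftUI scene types. The following is a minimal sketch, not a definitive implementation; the scene identifiers and placeholder views are illustrative, and a real app would put RealityKit content inside the volume and immersive space:

```swift
import SwiftUI

@main
struct SpatialApp: App {
    var body: some Scene {
        // Window: a conventional 2D interface, like an iPad app.
        WindowGroup(id: "main") {
            Text("Hello, visionOS")
        }

        // Volume: a bounded container for 3D content,
        // viewable from any angle in the Shared Space.
        WindowGroup(id: "globe") {
            Text("3D content goes here") // placeholder for a RealityKit view
        }
        .windowStyle(.volumetric)
        .defaultSize(width: 0.5, height: 0.5, depth: 0.5, in: .meters)

        // Space: a dedicated Full Space an app can open
        // for a fully immersive experience.
        ImmersiveSpace(id: "immersive") {
            Text("Immersive environment") // placeholder for a RealityView
        }
    }
}
```

An app declares all three scene types up front and opens the volume or immersive space at runtime, so a single app can move fluidly from a flat window to a fully immersive environment.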
The device's headline specifications include a custom Apple M2 chip paired with an R1 coprocessor dedicated to processing input from the device's cameras and sensors, a high-resolution display with HDR and wide color gamut (WCG) support, a LiDAR scanner, a TrueDepth camera, a 3D camera for capturing spatial photos and videos, a light seal, and up to two hours of battery life from its external battery. It will also sync with iPhone, iPad, and Mac devices, and let users share spatial photos, videos, and audio with others.
Apple is reportedly facing production challenges due to the complexity of the device's components and dissatisfaction with the output of some of its manufacturing partners. The initial production forecast has reportedly been cut from 1 million units to 400,000, and some component suppliers have lowered their projections as well. Apple is said to be working to resolve these issues and ensure the quality of the device.