Apple announces software tools for developers to create apps for Vision Pro

    Earlier this month, Apple unveiled Vision Pro, its first spatial computing device. The headset lets users interact with digital content in their physical space using their eyes, hands, and voice. Apple has now announced that its global developer community can start creating spatial computing apps for Vision Pro.

    Developers can use familiar Apple tools and frameworks, including Xcode, SwiftUI, RealityKit, ARKit, and TestFlight. According to Apple, these let developers build apps offering a range of immersive experiences: windows that present content with depth and can host 3D objects, volumes that display 3D content viewable from any angle, and spaces that fully immerse users in an environment with unbounded 3D content.
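
    For illustration, the sketch below shows how a visionOS app might declare all three experience types in SwiftUI with RealityKit. It is a minimal, hypothetical example: the scene identifiers and the "Globe" asset name are placeholders and are not part of Apple's announcement.

```swift
import SwiftUI
import RealityKit

@main
struct ExampleVisionApp: App {
    var body: some Scene {
        // Window: familiar 2D UI presented with depth in the user's space.
        WindowGroup(id: "main") {
            Text("Hello, visionOS")
        }

        // Volume: bounded 3D content the user can view from any angle.
        WindowGroup(id: "globe") {
            Model3D(named: "Globe") // "Globe" is a placeholder asset name
        }
        .windowStyle(.volumetric)

        // Space: fully immersive surroundings built from RealityKit entities.
        ImmersiveSpace(id: "immersive") {
            RealityView { content in
                let sphere = ModelEntity(
                    mesh: .generateSphere(radius: 0.2),
                    materials: [SimpleMaterial(color: .white, isMetallic: false)]
                )
                content.add(sphere)
            }
        }
    }
}
```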

    Additionally, Apple has introduced a new tool called Reality Composer Pro, available with Xcode, which helps developers optimize 3D content for their visionOS apps and games. The tool lets them preview and prepare 3D models, animations, images, and sounds. Developers can also use the visionOS simulator to interact with and test their apps under different room layouts and lighting conditions.
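
    As a rough sketch of how content prepared in Reality Composer Pro might be used at runtime, the snippet below loads an authored entity into a RealityKit view. The "RealityKitContent" package, the realityKitContentBundle constant, and the "Scene" asset name are assumptions based on the Xcode visionOS project template, not details from Apple's announcement.

```swift
import SwiftUI
import RealityKit
// Hypothetical Swift package generated by the Xcode visionOS template,
// which bundles assets authored in Reality Composer Pro.
import RealityKitContent

struct PreparedSceneView: View {
    var body: some View {
        RealityView { content in
            // "Scene" is a placeholder name for an asset edited in Reality Composer Pro.
            if let scene = try? await Entity(named: "Scene", in: realityKitContentBundle) {
                content.add(scene)
            }
        }
    }
}
```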

    To provide hands-on support, Apple will open developer labs in Cupertino, London, Munich, Shanghai, Singapore, and Tokyo starting next month, where developers can test their apps on Apple Vision Pro hardware and get assistance from Apple engineers. Developer kits will also be available so development teams can quickly build, iterate, and test their apps on Apple Vision Pro.

    Susan Prescott, Apple’s Vice President of Worldwide Developer Relations, stated, “Apple Vision Pro redefines what’s possible on a computing platform. Developers can use familiar frameworks to build visionOS apps and explore new experiences for their users with innovative tools like Reality Composer Pro.”