Apple on Wednesday released the visionOS SDK, a collection of tools that lets developers create applications for the Apple Vision Pro mixed reality headset. The Cupertino company's first spatial computer will go on sale in the US early next year, and Apple is giving app developers the resources they need to build content for the device ahead of launch. Unlike existing Apple products, the upcoming headset combines three forms of interaction: a person's eyes, hands, and voice. The visionOS SDK will let developers build apps tailored to this input model by taking advantage of the device's specialised hardware.

The visionOS SDK is now available to developers, according to an announcement on the Apple Developer website. To build spatial computing applications for the Apple Vision Pro, developers will need to download Xcode 15 beta 2, which bundles the latest visionOS SDK along with Reality Composer Pro, a tool for previewing and preparing 3D content for the headset.
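To give a sense of what building against the SDK looks like, here is a minimal sketch of a visionOS app entry point: a single SwiftUI window scene that the platform presents as a floating window. The app and view names are placeholders, not anything defined by Apple.

```swift
import SwiftUI

// Minimal visionOS app sketch: one SwiftUI window scene.
// "HelloVisionApp" and "ContentView" are placeholder names.
@main
struct HelloVisionApp: App {
    var body: some Scene {
        WindowGroup {
            ContentView()
        }
    }
}

struct ContentView: View {
    var body: some View {
        Text("Hello, visionOS")
            .font(.largeTitle)
            .padding()
    }
}
```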

Apple says visionOS developers will also have access to a simulator that lets them interact with their apps during development and see how they look under different lighting conditions and room layouts. The SDK can be used either to bring an existing app project to the headset or to build a new app from scratch (see the sketch below for one way an existing view might be adapted).
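As a rough illustration of adapting an existing project rather than starting fresh, a shared SwiftUI view can branch on the os(visionOS) compilation condition so the same code keeps its current behaviour elsewhere. The view below is hypothetical, and the visionOS-specific styling is just one possible choice.

```swift
import SwiftUI

// Sketch of adapting an existing SwiftUI view for visionOS.
// "StatusBadge" is a hypothetical view from an existing app.
struct StatusBadge: View {
    var body: some View {
        #if os(visionOS)
        // On the headset, give the badge a glass background so it
        // reads as a floating element in the user's space.
        Label("Online", systemImage: "circle.fill")
            .padding()
            .glassBackgroundEffect()
        #else
        // Unchanged appearance on iOS and other platforms.
        Label("Online", systemImage: "circle.fill")
            .padding()
        #endif
    }
}
```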

Apple also announced that, beginning in July, it will open labs where developers can test their apps on Apple Vision Pro hardware. These developer labs will be located in Cupertino, London, Munich, Shanghai, Singapore, and Tokyo, and Apple engineers will be on hand to assist developers building software for the headset.

The company will also begin accepting applications for Apple Vision Pro developer kits. These kits will let developers build and test apps on the mixed reality headset directly, rather than waiting for a visit to one of the company's developer labs. Apple has not said, however, whether the kits will be available only to developers in the US, or what the requirements for applying will be.

The Apple Vision Pro, unveiled by the company earlier this month at WWDC 2023, is its first mixed reality headset, supporting both augmented reality (AR) and virtual reality (VR). It is controlled with the user's eyes, hands, and voice, and it includes Apple's EyeSight technology, which helps wearers stay aware of the people around them. When the Vision Pro goes on sale in the US next year, users who need vision correction will be supported with optical inserts, and prescription lenses will be available.
