Mr Cook said: “Just as the Mac introduced us to personal computing, and iPhone introduced us to mobile computing, Apple Vision Pro introduces us to spatial computing.”
The system is controlled with hand movements, eye tracking and voice commands.
Unlike existing VR headsets, the device lets wearers see the world around them while using it. Other people will also be able to see the user’s eyes through the headset’s screen.
Vision Pro is powered by Apple’s dual-chip design, which pairs its existing high-powered M2 processor with a brand-new chip, the R1.
The R1 chip processes information from the headset’s 12 cameras, five sensors and six microphones eight times faster than the blink of an eye, providing users with a virtually lag-free experience.
The chips drive the headset’s spatial audio system and high-resolution display, which uses micro-OLED technology to fit 23 million pixels across two displays – one for each eye.
They also enable eye tracking: invisible light patterns are projected onto the user’s eyes and captured by high-speed cameras.
The Vision Pro will “change the way we communicate, collaborate, work and enjoy entertainment”, according to Mr Cook.
One way is using Vision Pro for work – either at the office or remotely.
The device can project enormous, 3D versions of Apple’s apps – including Safari, Notes, iMessage and Apple Music – into the wearer’s field of view.
Microsoft and Adobe are also creating versions of their apps designed for the new headset.
Virtual work meetings can become more immersive with Vision Pro’s spatial audio, which allows users to arrange FaceTime participants around the room in life-size tiles.