They will be priced at $3,499, with a release scheduled for next year. They run a new operating system and introduce a co-processor that works alongside the Apple Silicon M2.
The new era: Apple Vision Pro, a technological arsenal.
Apple's first mixed reality glasses are named Apple Vision Pro.
The visor is made of polished glass that offers a clear view of the surroundings, framed by an aluminum body that encases it. The strap is made of fabric and is flexible enough to adapt to different head sizes (we'll have to see how it copes with dirt, since it resembles the straps of the Apple Watch).
These straps feature small capsules with built-in speakers that deliver spatial audio. The immersive experience is particularly important to Apple, so these capsules sit right next to our ears (and the side pads can be adjusted to our liking).
These glasses can be connected to a power source for all-day use. Alternatively, they can run from their external battery, which lasts around two hours. That puts them in line with competitors like the Meta Quest, although Apple has chosen to keep the battery outside the headset to reduce its weight.
Inside, looking into the visors (with Zeiss lenses), the two internal displays pack 23 million pixels in total, more than 4K resolution per eye, using micro-OLED technology. They also feature a dual-camera system capable of capturing "spatial" content: three-dimensional photos and videos recorded with spatial audio, aiming to faithfully recreate captured moments.
Inside the visors, there is a high-performance eye-tracking system based on LEDs and infrared cameras. It projects invisible light patterns onto each eye. The goal? To accurately track eye movements, enabling interface control using just our gaze.
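For developers, this gaze tracking is not exposed as raw eye data; it surfaces through the system's hover behavior. As a minimal, hypothetical SwiftUI sketch (the view and its label are our own invention), a control can opt into the standard hover effect, which on Apple Vision Pro highlights the element the user is looking at:

```swift
import SwiftUI

// Hypothetical visionOS view: the hover effect is driven by eye tracking,
// so the button lights up when the user looks at it; a pinch then activates it.
struct GazeButton: View {
    var body: some View {
        Button("Open Gallery") {
            print("Activated with a look and a pinch")
        }
        .hoverEffect() // system-provided gaze feedback
        .padding()
    }
}
```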
As for audio, both capsules (one on each side of the strap) can recreate spatial sound. Much like AirPods, they analyze the environment in order to adapt to it. Apple calls this "audio ray tracing": a system that maps the 3D space around us in real time, using the headset's sensors and cameras, in order to tailor the sound to the room.
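Apple hasn't detailed how that adaptation works internally, but the basic idea of positioning sound around a listener can be sketched with Apple's public AVFoundation audio API. The following is a minimal sketch under our own assumptions (the function name and coordinates are invented), placing a mono source in 3D space relative to the listener:

```swift
import AVFoundation

// Sketch: a minimal 3D audio graph (player -> environment -> output).
// Not Apple's internal pipeline; just the public API for positional sound.
func makeSpatialEngine() throws -> AVAudioEngine {
    let engine = AVAudioEngine()
    let environment = AVAudioEnvironmentNode()
    let player = AVAudioPlayerNode()

    engine.attach(environment)
    engine.attach(player)

    // 3D positioning requires a mono source.
    let mono = AVAudioFormat(standardFormatWithSampleRate: 44_100, channels: 1)
    engine.connect(player, to: environment, format: mono)
    engine.connect(environment, to: engine.mainMixerNode, format: nil)

    // Listener at the origin; source slightly ahead and to the right.
    environment.listenerPosition = AVAudio3DPoint(x: 0, y: 0, z: 0)
    player.renderingAlgorithm = .HRTFHQ
    player.position = AVAudio3DPoint(x: 1, y: 0, z: -2)

    try engine.start()
    // player.scheduleFile(_:at:) would then queue an audio file for playback.
    return engine
}
```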
In addition to an array of sensors both inside and outside the visor, Apple Vision Pro boasts two high-resolution cameras and a LiDAR scanner. Together they handle precise head and hand tracking, real-time 3D mapping of the environment, and the capture of photos and videos, the latter with spatial audio.
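On the software side, visionOS exposes this kind of tracking to apps through ARKit-style data providers. The snippet below is a rough sketch under our own assumptions (it presumes an immersive visionOS app with the relevant permissions granted, and the function name is ours):

```swift
import ARKit

// Sketch: subscribing to hand-tracking and scene-reconstruction updates
// on visionOS. Assumes an immersive space and user permission for hand tracking.
func startTracking() async throws {
    let session = ARKitSession()
    let handTracking = HandTrackingProvider()
    let sceneReconstruction = SceneReconstructionProvider()

    // Both providers run in the same ARKit session.
    try await session.run([handTracking, sceneReconstruction])

    // Stream hand poses as the headset updates them.
    for await update in handTracking.anchorUpdates {
        let hand = update.anchor
        print("\(hand.chirality) hand updated, tracked: \(hand.isTracked)")
    }
}
```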
To power this hardware arsenal, Apple has gone above and beyond with its choice of processor. These glasses feature the Apple M2 chip, the same one found in some of Apple's best computers. Alongside it sits the new Apple R1, a dedicated chip for processing input from the cameras, sensors, and microphones, streaming images to the displays within 12 milliseconds. The goal is to process everything the glasses see in real time, without perceptible delay.
Apple's true differentiator:
While the hardware is top-notch, the real differentiator of these glasses lies in one of Apple's strongest suits: the software.
visionOS and the usage scenarios of Apple Vision Pro...
With the Apple Vision Pro, we're talking about a combination of augmented reality and virtual reality in a way we haven't seen before. Its direct competitors have focused either on using external beacons to achieve virtual reality or on projecting content onto our surroundings to create augmented reality. The software responsible for managing this proposition is called visionOS.
"visionOS features a new three-dimensional interface that makes digital content look and feel present in a user's physical world. By dynamically responding to natural light and casting shadows, it helps the user understand scale and distance. To enable user navigation and interaction with spatial content, Apple Vision Pro introduces a completely new input system controlled by a person's eyes, hands, and voice." - Apple
In Apple's case, the integration is much more organic, with different interfaces that adapt to various environments we may encounter, as well as configurations designed for each application we run on the glasses.
The size of the interface itself is adjustable, as we can zoom in on both multimedia content and applications (browser windows, galleries, movies, etc.). The goal is for the interface to adapt to the environment at all times, and for us to have control over its size.
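For app developers, that adjustability maps onto SwiftUI's scene configuration. Below is a small sketch under our own assumptions (the app and view names are invented): a window declares a default size, and the user is then free to resize it in place.

```swift
import SwiftUI

// Hypothetical visionOS app: the window opens at a default size,
// and the user can scale it up or down within their environment.
@main
struct GalleryApp: App {
    var body: some Scene {
        WindowGroup {
            ContentView()
        }
        .defaultSize(width: 800, height: 600)    // starting size, in points
        .windowResizability(.contentMinSize)     // user may resize beyond the content's minimum
    }
}

struct ContentView: View {
    var body: some View {
        Text("Gallery")
            .font(.largeTitle)
            .padding()
    }
}
```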
We'll continue to share more about this game-changing device tomorrow.