Hands-on: The Apple Vision Pro is the best AR and VR experience I’ve ever tried (and I’ve tried many)

Apple’s long-rumored mixed reality (MR) headset, the Vision Pro, is here. Well, sort of. It was announced yesterday at WWDC, and I got to wear the headset for about 25 minutes today, but it isn’t going on sale until “early next year,” and its $3,499 retail price is going to be the subject of much contentious debate.

I’m no stranger to MR eyewear (or AR glasses, as they’re more widely known), having covered most of the products from upstart Chinese brands like XReal (formerly Nreal) and Rokid for a few years. In all my testing, I concluded that they’re great as screen-mirroring eyewear but not much else. Apple’s Vision Pro is a far more ambitious (and pricier) product. It isn’t just mirroring a smartphone screen in front of the wearer’s face. Instead, it’s truly a wearable computer, with its own SoCs, displays, sound system, and operating system.

You know those photos and videos of the Vision Pro in action that Apple showed off during the unveiling? The ones with multiple virtual windows floating in front of the wearer, straight out of a sci-fi movie? I can confirm those are accurate depictions of the real experience of wearing it.

The setup process

Yup, it really felt like that for me.

The Vision Pro has a calibration and fitting process for first-time users. When I entered a private room to demo the headset, Apple staff handed me an iPhone to scan my face and ears with its TrueDepth camera system. The face scan does two things. First, it creates a depth map of my face so the Vision Pro’s modular pieces (a light-sealing faceplate and headband) can be adjusted for a personalized fit. Second, it lets the Vision Pro create a digital avatar of my face for FaceTime calls I make while wearing the headset.

Meanwhile, the ear scan helps the Vision Pro understand the shape of my ears and ear canals so it can build a personalized spatial audio profile for the sound, which is pumped out by the headset’s dual-driver speaker pods toward the user’s ears.

One of the two dual-driver ear pods on the Vision Pro.

Next, an Apple-employed optometrist attempted to scan my eyes (using a different tool, not an iPhone) so he could create customized Zeiss-branded optical inserts for me. I didn’t need them since I have perfect vision, but a bespectacled media friend who had a pair made said the Zeiss inserts let him see clearly. An Apple exec confirmed this whole setup experience will be offered to U.S. consumers at Apple Stores, so the Vision Pro doesn’t appear to be something you can just buy off Amazon.

Putting the Vision Pro on

The forward-facing camera sensors.

The first thing I realized when I put on the headset was that the Vision Pro covers my vision entirely. There are no see-through lenses, so the “real world” I see is actually real-time video captured by the forward-facing cameras. The footage looked sharp and had low enough latency that it didn’t feel disorienting, but it wasn’t 100% natural either; I could clearly tell I was looking at video of my surroundings, and there’s a subtle loss of sharpness at the edges. The forward-facing cameras aren’t just feeding footage to my eyes, either. They’re constantly mapping the surrounding space, detecting my hand movements, and capturing footage for photos and videos (more on this later).

The Vision Pro did feel a bit heavy on my head. Apple declined to reveal its official weight, but I’d say it’s at least 1.5 pounds. It’s not heavy enough for me to say it’s uncomfortable to wear, but it’s not exactly comfortable, either. However, I do think I could get used to the weight if I sat back while leaning on a pillow or cushion behind my head.

Once the headset was on my face and booted up, the first thing I saw was the floating home screen menu, which looks like a giant iPad home screen mapped onto my real-world surroundings. The menu also stayed centered in front of my face no matter where I moved my head.

The floating home screen dock of the Vision Pro

This was where the first sign of Apple magic kicked in: I could navigate the UI with my eyes. Even though I knew about this feature going into the demo, it still felt surreal to experience it for the first time. I simply looked at an app icon and it enlarged. Where the Vision Pro really stood out, though, was the finger gestures. For example, I could tap my thumb and index finger together to open an app. Other AR/VR headsets I’ve tried with hand-gesture controls require exaggerated motions, with my arms fully extended. Here, I didn’t even have to lift my arms; I tapped my fingers together with my hands resting on my lap.
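On the developer side, that look-and-pinch input appears to map onto ordinary SwiftUI controls on visionOS rather than a separate gesture API, at least based on what Apple has described so far. Here is a minimal sketch of my own (the app and view names are hypothetical, not Apple sample code) of a window whose buttons highlight under the wearer’s gaze and fire on the pinch “tap”:

```swift
import SwiftUI

// Minimal visionOS-style sketch: looking at a control highlights it (hover
// effect) and a thumb-to-index pinch registers as a tap, so ordinary SwiftUI
// buttons and tap gestures work without any special gesture handling.
@main
struct DemoApp: App {
    var body: some Scene {
        WindowGroup {
            HomeView()
        }
    }
}

struct HomeView: View {
    @State private var launchedApp: String?

    var body: some View {
        VStack(spacing: 24) {
            Text(launchedApp.map { "Opened \($0)" } ?? "Look at an icon, then pinch")
                .font(.title2)

            HStack(spacing: 32) {
                ForEach(["Photos", "Safari", "Music"], id: \.self) { name in
                    Button(name) {
                        launchedApp = name   // fired by the gaze + pinch "tap"
                    }
                    .hoverEffect()           // subtle highlight while the user looks at it
                }
            }
        }
        .padding(48)
    }
}
```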

Things I tried: Immersive content, extended virtual screens, truly ‘augmented’ reality

An Apple render of the Vision Pro

During my demo session, Apple execs showed off several impressive things the headset can do. The Vision Pro can use its array of forward-facing cameras to capture 3D photos and videos, which I can then view on the headset. Capturing my own content and reliving it at a large size with a convincing 3D effect is going to change the way we revisit memories. Granted, needing to wear the headset to take the photos or videos would make the capturing experience awkward, but imagine being able to relive your child’s birthday party, or your dog running through the park.

Earlier I mentioned that the headset lets me see “through” to the real world, but this can easily be turned off with a twist of the Digital Crown. A full twist essentially blacks out the background so you only see the graphical interface; twist it halfway, or to various degrees in between, for intermediate levels of transparency. The haptics felt great, and being able to slowly shut out the real world felt surreal.
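For the curious, the visionOS SDK Apple previewed exposes this idea to apps as “immersion styles”: an app can offer a progressive immersive space that the wearer dials open or closed with the Digital Crown, or a full one that shuts the surroundings out entirely. A rough sketch, assuming the SwiftUI scene APIs Apple has described (the space ID and views here are hypothetical):

```swift
import SwiftUI

// Sketch of an app opting into Digital Crown-controlled immersion:
// .progressive starts as a partial "portal" the wearer can dial in or out,
// while .full blacks out the real-world passthrough entirely.
@main
struct ImmersiveDemoApp: App {
    @State private var style: any ImmersionStyle = .progressive

    var body: some Scene {
        WindowGroup {
            LaunchView()
        }

        ImmersiveSpace(id: "theater") {
            Text("Immersive content goes here")
                .font(.largeTitle)
        }
        .immersionStyle(selection: $style, in: .mixed, .progressive, .full)
    }
}

struct LaunchView: View {
    @Environment(\.openImmersiveSpace) private var openImmersiveSpace

    var body: some View {
        Button("Enter the theater") {
            Task { _ = await openImmersiveSpace(id: "theater") }
        }
    }
}
```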

Shutting the real world out is the ideal way to watch videos and get total immersion. With hand gestures, I could pinch and drag to enlarge videos or photos. Viewing content with the headset felt sublime, with panoramic photos spanning my entire field of view. The Vision Pro can detect when someone steps into my field of view and will automatically turn on transparency so I can see the person. The other person will also see my eyes — not my real eyes (remember, there’s no actual optical pass-through), but a video rendering of them.

Apple also showed off a series of professionally shot clips it calls “immersive video,” which covered my entire field of vision (not quite 360 degrees like full virtual reality, but at least 180 degrees, enough that I’d have to drastically turn my head to break the illusion). These included a courtside NBA game that felt so real my jaw dropped; footage shot at the edge of a cliff that had my fear of heights kicking in; and a CGI dinosaur that jumped into an otherwise tranquil nature scene and gave me a legitimate jump scare.

The Vision Pro’s ability to blend realities is impressive. During those videos, I stuck my hand out, and the headset’s sensors detected it in a split second and overlaid it on the scene.

So those are the fun scenarios. But I could also see myself wearing the Vision Pro to turn my entire field of view into a workspace. Since the headset is a standalone computer with its own processor, displays, software, and input system, I can in theory run Apple’s apps directly in the visionOS UI and work that way. Apple also confirmed I can extend a Mac’s display into the virtual space, so I can have my MacBook screen floating in front of me as if I were working on a 70-inch monitor. And yes, the text is sharp enough that I think I could write articles on it.

Early thoughts: I am blown away

An image showing a person wearing Apple's Vision Pro headset.

Source: Apple

Look, I know there’s been no shortage of snark and mockery on Twitter over the past 24 hours about the Vision Pro’s $3,500 price. I think at least half of those skeptics would change their tune if they got to demo it. It may not change their opinion that $3,500 is a lot of money, but they’d probably drop the snark and concede the technology justifies the price.

I spoke to about eight peers from various countries who got to try the Vision Pro, and not one of them had a bad thing to say. All of them used words like “amazing,” “unbelievable,” and “jaw-dropping” to describe the experience.

And those familiar with my work should know I’m a very vocal critic of the iPhone, so I’m far from an Apple loyalist. But the Vision Pro really feels like a game-changer to me. It’s easily the best VR experience I’ve had, and easily the best AR eyewear I’ve tried. Heck, it’s the best demo of any consumer product I’ve seen in my nine years covering tech.

Is the product worth $3,500? Only you can make that call. But I’ll leave you with this advice: don’t knock it until you’ve tried it.