Apple's new Vision Pro is a very cool piece of tech, but how does it work for people who don't have a typical anatomy?

What's the problem?

For those who aren't aware, I have an exceedingly rare condition known as aniridia: a partial or complete absence of the iris. In my case, I have no irises at all! This makes the very premise of the Vision Pro rather difficult. For one, its "Optic ID" is an iris scan; for another, tracking somebody's eye movement is almost certainly heavily dependent on the iris. The setup process effectively confirms this: it keeps increasing the background lighting and asking you to select various dots by looking at them. My suspicion is that the lighting change is specifically to calibrate eye tracking across various pupil dilations. In my case, however, my pupils are always 100% dilated.

How did it work?

Setup for me was an absolute nightmare. It started out easy enough: put on the Vision Pro, be impressed by "Hello" written across the physical world, and enjoy my partner laughing at how stupid I looked waving at thin air with these large, awkward goggles strapped to my face.

It starts off simply enough: bring your phone close and, through typical Apple magic, it starts to pair with your new device. Done! Then, an immediate challenge: it asked me to look at the code that came with my prescription lenses. I stared at the code and... nothing. I figured maybe it just wasn't calibrated and needed help, so I set the code down on my table and bent over so it was basically the only thing in my field of view... success!

Next up comes calibration: hold out my hands... success; flip them over... success. And here's where the Vision Pro experience fell apart for me: stare at six dots and select each one by tapping your fingers together. The Vision Pro couldn't remotely tell where I was looking; random dots would activate nowhere near where I was focused.

After having me select six dots, failing to register which ones I'd selected, and making me try again, it would announce "Completing Eye Setup... Eye Setup Failed." Of course, there's no information to be found online, and contacting Apple support is rather burdensome since they're overwhelmed with calls from folks asking for help.

So is it Remotely Usable?

Actually, yes! While I couldn't find a reasonable way during setup to change how tracking worked, I was able to triple-tap the crown to turn on VoiceOver. For those not familiar, VoiceOver is a screen reader designed to help blind people navigate Apple products. This allowed me to get past the setup screens, but it came with the downside that it kept trying to move the VoiceOver cursor based on where I was looking.
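As a quick aside for any developers reading: apps can tell when VoiceOver is on and adapt to it. Here's a minimal sketch using UIKit's UIAccessibility APIs (my own illustration, not anything from the setup flow):

```swift
import UIKit

// Minimal sketch (my own illustration): how an app can tell whether
// VoiceOver is running and react when the user toggles it, e.g. by
// triple-tapping the crown as described above.
final class VoiceOverObserver {
    private var token: NSObjectProtocol?

    init() {
        // True while VoiceOver is driving the interface.
        print("VoiceOver running: \(UIAccessibility.isVoiceOverRunning)")

        // Posted whenever VoiceOver is turned on or off.
        token = NotificationCenter.default.addObserver(
            forName: UIAccessibility.voiceOverStatusDidChangeNotification,
            object: nil,
            queue: .main
        ) { _ in
            print("VoiceOver running: \(UIAccessibility.isVoiceOverRunning)")
        }
    }
}
```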

Getting past the setup was still very cumbersome because the VO cursor kept moving to wherever it thought I was looking, but eventually I was able to use the virtual keyboard to create a password and get through to the home screen!

A dive into the Accessibility Settings finally gave me what I really wanted: a way to tell the Vision Pro not to use eye tracking to move the cursor. There were several options (with a quick sketch of how apps see these inputs after the list):

  • Eyes -- bad for me
  • Head -- This actually works quite well, and it's what I went with. The downside is that the Vision Pro is heavy, and this involves a fair bit of neck movement
  • Wrist -- I might give this more of a shot. It involves a lot more "waving your hands around," but it comes with the advantage that when you tap your fingers to select something, it's picked up much better and more reliably by the Vision Pro
  • Index Finger -- I liked this the least of the options that worked; it involved a little too much hand movement for my taste
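The nice part is that, as far as I can tell, apps don't have to care which of these you pick: the system translates whatever pointer you chose into the same "hover here, then tap your fingers to select" events. Here's a minimal SwiftUI sketch of what that looks like from an app's side (my own illustration, assuming standard SwiftUI behavior on visionOS):

```swift
import SwiftUI

// Minimal sketch (my own illustration): a plain SwiftUI button.
// Whichever Pointer Control mode is active -- eyes, head, wrist, or
// index finger -- the system draws the same hover highlight wherever
// the pointer rests, and the finger tap fires the same action.
struct SelectDotView: View {
    @State private var selected = false

    var body: some View {
        Button(selected ? "Selected!" : "Select me") {
            selected.toggle() // triggered by the finger-tap gesture
        }
        .hoverEffect(.highlight) // follows the chosen pointer, not just gaze
        .padding()
    }
}
```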

It's also worth noting there was a way to turn on a display that showed you where the cursor was. This was honestly hilarious when I had "Eyes" selected, because I could see exactly how hard it was for the Vision Pro to track my eye movement: the cursor manically jumped all around my virtual environment.

What about Optic ID?

Optic ID, unsurprisingly, doesn't remotely work with my anatomy. It scans my eyes, tells me to raise the headset a bit, then just fails. I honestly never expected this to work, as it was always described as an iris scan, which notably requires somebody to have irises.
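For what it's worth, third-party apps don't talk to Optic ID by name: they go through the same LocalAuthentication framework that backs Face ID and Touch ID, and the standard policy falls back to the device passcode when biometrics aren't available. A rough sketch (my own illustration, not from Apple's setup flow):

```swift
import LocalAuthentication

// Rough sketch (my own illustration): apps request "device owner
// authentication" rather than Optic ID specifically. On Vision Pro
// that maps to Optic ID when it's enrolled; the policy below falls
// back to the device passcode when it isn't.
func authenticateUser() {
    let context = LAContext()
    var error: NSError?

    guard context.canEvaluatePolicy(.deviceOwnerAuthentication, error: &error) else {
        print("Authentication unavailable: \(error?.localizedDescription ?? "unknown")")
        return
    }

    context.evaluatePolicy(.deviceOwnerAuthentication,
                           localizedReason: "Unlock your data") { success, authError in
        print(success ? "Authenticated" : "Failed: \(authError?.localizedDescription ?? "")")
    }
}
```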

Verdict

Apple gets a gold star for having different pointer options; they make the headset decidedly usable even for people like me. If you've got any of a number of conditions that would make eye tracking difficult, rest assured there is some way to use the $3,500+ device. Apple gets a big fat thumbs down, though, for the setup experience being so difficult to navigate when you have an irregular anatomy.

Tyler Thompson

Tyler Thompson is a Principal Engineer with over 15 years of experience. He currently works as a Principal Software Development Engineer for Zillow Group. Before Zillow, he was a Principal Software Engineer for World Wide Technology, working across many different industries.