🟡 Vision Pro: Optics and Passthrough

A photo of my cat, Miss Honey, on my lap, as taken by my Vision Pro. If you're thinking that it's kind of a crappy photo, you're right.
Last summer, when select members of the tech press got brief demos of the Vision Pro, the reviewers’ impressions really oversold the experience of the passthrough video. I don’t think the reviewers were trying to oversell it; I suspect it was a side effect of how tightly Apple controlled the demo environments, which kept reviewers from ever running into any issues. I was left thinking I’d be able to wear Vision Pro and reasonably see out of it to do household tasks, albeit with a bit of a haze.

In exactly one aspect, the video passthrough is like magic. As you move your head around, what you see on screen keeps up with you perfectly, with no lag whatsoever. That really is quite good. I can competently have a conversation with my boyfriend and see him pretty well, and I can basically move around the house without feeling disoriented. It is easy to forget that you are looking at a pair of screens and not just looking through some glass.

The problem is that although it feels like you’re looking through glass, it’s a really lousy piece of glass. You can’t make out fine details in the world around you with Vision Pro on; doing something nontrivial on your phone would be an exercise in frustration. As you move your head around, you’ll see some motion blur on the things you’re looking at. It seems worse on things in passthrough, but apps in the Vision Pro exhibit it too, to a lesser extent. It’s weird. But again, I must emphasize: the view itself keeps up with your head movements absolutely perfectly.

For a VR headset, I suspect this is the correct tradeoff. The brain is very sensitive to vision lagging behind head movement, and that lag quickly leads to motion sickness in a lot of people. It’s better for rendering to always happen with minimal latency (we’re talking 12 milliseconds) at the cost of some image fidelity, because the brain tolerates reduced fidelity far better than it tolerates lag.

One feature I was hoping to see in Vision Pro was the ability to have a window stay in one position in my field of view and follow me as I move around, like a real-life picture-in-picture, so I could move about the house and do some tasks while casually watching a video or something. But I get why that feature isn’t there; it just isn’t pleasant to rely on the passthrough view to actually do anything with what you see. It’s mostly there to give you a sense of place.
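For what it’s worth, the building blocks for this seem to exist in the platform: inside an immersive space, RealityKit can anchor content to your head so it follows you around. Here’s a rough sketch of what a follow-me panel might look like; the view name, panel size, and placement offset are all my invention, not anything Apple ships:

```swift
import SwiftUI
import RealityKit

// Sketch of a head-anchored "picture in picture" panel on visionOS.
// AnchorEntity(.head) is a real RealityKit anchor target; the rest of
// this (names, sizes, placement) is hypothetical.
struct FollowingPanelView: View {
    var body: some View {
        RealityView { content in
            // Anchor an entity to the wearer's head so it moves with them.
            let headAnchor = AnchorEntity(.head)

            // A flat panel standing in for the video surface.
            let panel = ModelEntity(
                mesh: .generatePlane(width: 0.4, height: 0.225),
                materials: [SimpleMaterial(color: .white, isMetallic: false)]
            )

            // Push the panel about a meter out, offset toward a corner
            // of the field of view so it doesn't block what's ahead.
            panel.position = SIMD3<Float>(0.3, 0.2, -1.0)

            headAnchor.addChild(panel)
            content.add(headAnchor)
        }
    }
}
```

My understanding is that Apple’s design guidance actively steers developers away from head-locked content like this, which squares with my guess about why the feature doesn’t exist.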

The app Crouton, a recipe manager for Apple platforms, made a Vision Pro app, and they did some super cool stuff, like letting you place timers throughout your kitchen so that each timer sits next to the thing it’s timing. But I’m not sure I’d feel comfortable trying to cook a meal with a Vision Pro on; it would limit my vision too much. I might give it a shot this week and see.
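I have no idea how Crouton actually does it, but conceptually this is world anchoring: pinning an entity to a fixed position in the room. A toy sketch of the idea in RealityKit, with a hypothetical function and a text mesh standing in for a real timer UI:

```swift
import SwiftUI
import RealityKit

// Sketch: pin a timer label to a fixed spot in the room, e.g. next to
// the pot it's timing. AnchorEntity(world:) is a real RealityKit
// initializer; the function and its parameters are made up.
func placeTimer(_ label: String,
                at position: SIMD3<Float>,
                in content: RealityViewContent) {
    // The anchor stays put in world space even as the wearer moves.
    let anchor = AnchorEntity(world: position)

    // A 3D text mesh standing in for a real timer readout.
    let text = ModelEntity(
        mesh: .generateText(label,
                            extrusionDepth: 0.005,
                            font: .systemFont(ofSize: 0.05)),
        materials: [SimpleMaterial(color: .white, isMetallic: false)]
    )

    anchor.addChild(text)
    content.add(anchor)
}
```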

I don’t think the problem is with the 4K displays in each eye, or even necessarily the cameras (allegedly they’re 6.5 megapixels each, which isn’t great but isn’t terrible either). The challenge is that Vision Pro needs to render your field of view constantly, in real time, with so little latency that you don’t feel motion sickness or get disoriented. Rendering all of those pixels in perfect focus would take more compute power than Apple could cram into this first Vision Pro. Even when I’m holding still, the Vision Pro doesn’t render my entire field of view in focus; it uses foveated rendering, tracking where my eyes are looking and putting more effort into rendering that part of the image sharply, while the parts in my peripheral vision are rendered at lower fidelity.
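On Apple GPUs, the mechanism behind this kind of variable-quality rendering is exposed to developers through Metal’s rasterization rate maps, which let different zones of the screen be rasterized at different resolutions. A sketch of the idea; the zone grid and rates here are made-up numbers, and Vision Pro’s actual foveation is driven by eye tracking inside the system compositor, not by app code like this:

```swift
import Metal

// Sketch: a rasterization rate map with full quality in the center and
// reduced quality at the edges. In real foveated rendering the
// high-quality zone would follow the gaze point from eye tracking.
// (Device support checks are omitted; the numbers are illustrative.)
func makeFoveatedRateMap(device: MTLDevice) -> MTLRasterizationRateMap? {
    let descriptor = MTLRasterizationRateMapDescriptor()
    descriptor.screenSize = MTLSize(width: 1920, height: 1080, depth: 1)

    // A 5x5 grid of zones, with per-row and per-column quality factors.
    let layer = MTLRasterizationRateLayerDescriptor(
        sampleCount: MTLSize(width: 5, height: 5, depth: 1))
    let rates: [Float] = [0.25, 0.5, 1.0, 0.5, 0.25]
    for (i, rate) in rates.enumerated() {
        layer.horizontal[i] = rate as NSNumber
        layer.vertical[i] = rate as NSNumber
    }
    descriptor.setLayer(layer, at: 0)

    return device.makeRasterizationRateMap(descriptor: descriptor)
}
```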

Overall, the rendering of on-screen things like apps in the Vision Pro is decent. I use two 27-inch 5K Retina displays at my desk, and the rendering isn’t quite that nice, but it’s still good: text isn’t pixelated, though sometimes things are blurry, like I’m looking through a microscope whose focus knob needs a slight turn. Focus is uneven too; if you’re looking at just the right angle, things look perfect, but the outer edges of the displays have a bit of a vignetting effect and a little chromatic aberration, and stuff that isn’t dead center is blurrier than what I’m looking right at. I can improve this somewhat by fiddling with the positioning of the headset on my head, but I sometimes wish there were more fine-tuning available.

Some of this makes me think I should go to an Apple Store with one of every Zeiss lens insert strength, try a few different ones, and see if something else is better, then see if I can get my optometrist to write me an Rx specifically for new lenses, simply because the blurriness feels optical in nature, not like the result of something like foveated rendering. But professional reviews noted issues like this too, even those from people who don’t wear glasses, so I think this might just be a v1 limitation.

I originally scored this as 🔴, but after spending more time with it I’ve softened a bit; I now accept it as a thing that needs some work, not a dealbreaker.
