By BC Holmes, Chief Technologist | Published: June 29, 2023 in Blog
I am eager to try Apple’s new Vision Pro.
Ever since Apple announced this new product at WWDC, there have been countless think pieces expressing two talking points. First, it’s a big deal because it has the backing of Apple, it’s a new product category for them, and Apple fans are going to fawn all over it. So, it’d be foolish to ignore the announcement.
More soberly, the second talking point is that commentators can't really believe this product will be a success. That more cynical take rests on two key arguments:
1. The price. It’s really expensive.
2. Nobody can imagine why anybody would want one.
I’m not going to lie: the price point is a bit intimidating. The price tag moves me from “I’m going to get one of these to try out” to “I might need to think about this”. (It’s also not entirely clear when Vision Pro will be available in Canada; my fear is that, as with the HomePod, Apple will take entirely too long to make it available here.)
But the second complaint is the one that interests me more, because it feels like such a failure of imagination on the part of the critics. We’ve heard this critique before: for the iPad, for the Apple Watch, and for the HomePod. When the Apple Watch originally came out, for example, people regularly asked, “Why would I want this on my wrist?” and “What’s the killer app for the Watch?” Was it just a status symbol? Was it a miniature iPhone? The trade press struggled to articulate why people might want one.
But at the same time, the Apple Watch drove an industry-wide 61% jump in smartwatch sales and quietly became the leading wearable device. In some recent quarters, it has accounted for over half of all smartwatch purchases.
Why is Vision Pro more interesting than people are giving it credit for?
First, I’m fascinated by the relatively low-friction operation of the device. Everyone who attended the Vision Pro hands-on event at WWDC raves about how good Apple’s implementations of eye tracking and finger-tapping-to-select are. The phrase “the best I’ve ever seen” comes up constantly among those reviewers. This low friction matters: today, VR headsets like the Meta Quest are largely controlled with hand-held controllers, which sometimes fail, go missing, run out of battery, or have connection issues. The naturalness of this interaction model will be, I suspect, as revolutionary as the iPhone’s tap-and-gesture interface. The iPhone created a deeper form of engagement than had existed with other phones or digital assistants, and I think Vision Pro will do likewise.
Secondly, I’m very interested in how Apple has positioned Vision Pro. Whereas other VR headsets have largely been sold as gaming appliances, Apple spent far more time presenting it as a media device, an extension of your computer, and a video-conferencing tool.
I’m also fascinated by how the media experience is untethered from location. For example, today I have one television in my home. When I want to watch a movie or TV show, I go to where the TV is and watch it there. I could watch a show on my iPad or iPhone, but I’m always conscious that the experience is not as good. The screen is smaller; there might be glare; and I will probably have to hold the device through the entire viewing (I also have cats who will be more than happy to get between me and the iPad). The best viewing experience is in my living room, because the picture quality is high, the TV is large, and I have my Apple TV connected to my HomePod speakers for good sound.
Vision Pro purports to give me the “big screen TV” experience in a portable way. I’ve never wanted to put a TV in my bedroom, but if Vision Pro gives me a big-screen TV experience while I’m lying in bed, I’d try that out. I might watch something in the backyard or in the bath. I often read things on my iPad in the bath — why not watch TV on a big screen while soaking away with a nice bath bomb and a cold drink? Vision Pro might, in fact, be a superior experience to my physical TV, given the claims about how it can virtually adjust the room to be darker, more immersive, and so forth.
There’s a moment in the Vision Pro introductory video where we see a person using it in an airplane seat. It’s not hard to imagine why you might want a device like that on a plane: the seat-back screens are terrible (when they actually work!), and who doesn’t want to tune out the other passengers?
Of course, that idea makes me think about the use of Vision Pro as a device to tune out the outside world and to enable better focus. Pre-lockdown, I knew of many tech development shops that would encourage, for example, the use of noise-cancelling headphones because they understand that interrupting developers causes context switching and kills productivity. If I am already using Vision Pro as a virtual monitor for my computer, it’s easy to think that the experience will be like creating a personal environment to tune out distractions and get into the zone.
Apple was quick to point to the features that ensure you don’t completely tune out the kids you’re supposed to be looking after while working on the computer, but, candidly, I don’t want that; I want the opposite.
More than just a focus aid, the virtual-monitor capabilities — whether connected to a laptop or using the computing power of the Vision Pro itself — also provide me with a lot of options.
I recall that one of the early effects of the lockdown’s push toward remote work was that people suddenly needed extra monitors for their home offices. With Vision Pro, everything in front of me can be part of my “monitor”. I can fill my field of vision with all kinds of virtual application windows, and I can take those windows with me if I’m working at a client site, or if I decide, one day, to set up in a coffee shop or a remote workspace.
In Norm Chan’s tested.com review of his Vision Pro hands-on demo, he described an experience in which he collaborated with another Vision Pro user. As he tells it, the two of them interacted with a shared screen much as one would interact with a whiteboard. Norm noted that the demo required a bit of coordination so that the two of them made the same choices about the layout of their virtual space. That’s a weirdness we don’t have in the physical world. But what I take from it is that Vision Pro appears to enable a deeper form of virtual interaction: rather than just being a face on a conferencing app or a shared monitor window, it sounds like this is getting closer to a virtual presence that better supports body language, gestures, and being able to point at a shared image and refer to “that part there.”
Comparison of this capability with Google’s Project Starline is inevitable. The selling feature of Starline is that it creates a stronger sense of physical presence, which purports to translate into a better, more immersive meeting experience. For me, the limiting factor for Starline is the hardware requirement: it uses a specific configuration of cameras that will almost certainly stay, permanently, in a specialized meeting room, and that device has no purpose other than to be a Starline-enabled meeting appliance. People are probably not going to have this kind of set-up at home, and that will also limit its usefulness. Candidly, the hardware requirements feel self-marginalizing. This is probably not an unsolvable problem; I just think Google has a short runway to solve it before another solution becomes the de facto standard.
Unlike with Starline, the reviewers who’ve tried Vision Pro report that the imagery in Apple’s video-conferencing capability doesn’t try to be hyper-realistic; they report a veneer of artificiality in the way people are digitally re-created. But in other respects, I feel Vision Pro has the potential to be more accessible as a video-conferencing tool: I can use it wherever I have the headset, and I’m more likely to have the headset on hand because I’m also using it for other things.
And, of course, you can write apps for Vision Pro. For example, a Vision Pro app could give an organization something like what Amazon has in its warehouses. Much has been written about Amazon’s use of augmented reality to help warehouse workers find specific stock or receive messages about work tasks (although, admittedly, by all accounts, Amazon’s implementation is Kafkaesque). Amazon, as a FAANG company, has invested significant developer effort to build those features; a Vision Pro app could give other warehouse operations access to that kind of functionality without nearly as much proprietary technology.
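To give a sense of how approachable that kind of development could be, here’s a minimal sketch of a visionOS app written in SwiftUI that floats a pick list of warehouse tasks in the worker’s space. This is purely illustrative: the app name, the task model, and the hard-coded data are my own assumptions, not anything Apple or Amazon has shown.

```swift
import SwiftUI

// Hypothetical model: one "pick" task for a warehouse worker.
struct PickTask: Identifiable {
    let id = UUID()
    let aisle: Int
    let bin: Int
    let sku: String
}

@main
struct WarehousePickerApp: App {  // hypothetical app name
    // Hard-coded sample data; a real app would fetch this from a backend.
    let tasks = [
        PickTask(aisle: 4, bin: 12, sku: "8841"),
        PickTask(aisle: 7, bin: 3, sku: "1022"),
    ]

    var body: some Scene {
        // On visionOS, a WindowGroup appears as a window floating in the
        // user's surroundings, which they can pin wherever it's convenient.
        WindowGroup {
            List(tasks) { task in
                Text("Aisle \(task.aisle), Bin \(task.bin): SKU \(task.sku)")
            }
        }
    }
}
```

The point is that this is ordinary SwiftUI: a team that already builds iPhone or iPad apps could produce a basic spatial tool like this without inventing any AR infrastructure of its own.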
And we’ve barely explored the ability to enjoy new immersive experiences. Many people have already talked about buying virtual ringside seats at a sporting event as one kind of experience enabled by the virtual reality capabilities of the Vision Pro. Media reps who experienced the Vision Pro hands-on demo also talked about the 3D dinosaur demo: how effective it seemed and how realistic it looked. But I can also foresee great interest in virtual tours of Angkor Wat, exploring volcanoes, or wandering around the inside of the Starship Enterprise.
I am super stoked to try Vision Pro. I don’t know when I’ll get that chance, but when it comes, I’ll jump on it. At the same time, I think the media commentators who can’t conceive of why this device might be attractive need to up their imagination skills.
Does Apple Vision Pro have a role in your collaborative or remote work strategy? Contact the Intelliware team today to learn more.