Random Thoughts – Randocity!

Is Apple’s Vision Pro worth the money?

Posted in Apple, botch, business, computers by commorancy on February 2, 2024

Let me preface this article by saying that it is not intended to review the Apple Vision Pro. Instead, it is intended as an analysis of the technology and the design behind the Apple Vision Pro headset. The Vision Pro’s hefty price tag begins at $3500 and goes up from there depending on selected features. Let’s explore.

Price Tag vs. Theft Target

The first elephant in the room to address with this Virtual Reality (VR) headset is its price tag. Because there is presently only one model of this headset, anyone who sees you wearing it knows the value of this headset instantly. This means that if you’re seen out and about in public wearing one, you’ve made yourself a target not simply for theft, but for a possible outright mugging. Thieves are emboldened when they know you’re wearing a $3500 device on your person. Because the Vision Pro is a relatively portable device, it would be easy to scoop up the entire device and all of its accessories in just a few seconds and walk off with it.

Like an expensive diamond necklace or a Rolex watch, the Vision Pro flaunts wealth. It says that you have disposable income and wouldn’t really mind the loss of your $3500 device. While that might not be exactly true, it has grains of truth in it. If you’re wealthy enough to plop down $3500 for a Vision Pro, you can likely afford to buy another one should it go missing.

However, if you’re considering investing in a Vision Pro VR headset, you’d do well to also invest in a quality insurance policy that covers both theft and damage, whether intentional or accidental. Unfortunately, a loss policy won’t cover any injuries you might sustain from a mugging. Be careful and remain alert when wearing a Vision Pro in public spaces.

The better choice is not to wear the headset in public spaces at all. Don’t use it on trains, on planes, at Starbucks, or while sitting in airport or hotel lobbies. For maximum safety, use the Vision Pro in the privacy and safety of your hotel room or your own home. Should you don this headset on public transportation to and from work, expect not only looks from the people around you, but also the attention of thieves looking to take it from you, potentially by force. With that safety tip out of the way, let’s dive into the design of this VR headset.

What exactly is a VR headset useful for?

While Apple is attempting to redefine what a VR headset is, they’re not really doing a very good job of it, especially given the Vision Pro’s costly price tag. The answer to the question that heads this section is very simple.

A VR headset is simply a strap-on 3D display. That’s it. That’s what it is. That’s how it works. Keep reading much further down for the best use cases of 3D stereoscopic displays. The resolution of the display, the eye tracking, the face tracking, the augmented reality features, these are all bells and whistles that ship alongside the headset and largely drive the price tag. The reality is as stated: a VR headset is simply a strap-on video display, like your TV or a computer monitor. The only difference from a TV screen or monitor is that a VR headset offers 3D stereoscopic visuals. Because of the way the lenses on a VR headset are designed, the headset can use its separate display for each eye to project flat screens that appear to float convincingly both at a distance and at a scale that appears realistically large, some even immensely large, like an IMAX screen.

These VR flat screens float in your vision like the floating displays featured in many futuristic movies. However, a VR headset is likewise a personal, private experience. Only the wearer can partake in the visuals on the display. Everyone else around you has no idea what you’re seeing, doing or experiencing… except they will know something when you’re using the Vision Pro, because of one glaring design flaw involving the audio system (more on this below). Let’s simply keep in mind that all a VR headset boils down to is a set of goggles containing two built-in displays, one for each eye; displays which produce a stereoscopic image. Think of any VR headset as the technological equivalent of a View-Master, that old toy popular in the 1970s with paper image discs (reels) and a pull-down lever that switches images.
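
To make the “one display per eye” point concrete, here is a minimal sketch, in Swift and with made-up numbers, of how any stereoscopic renderer derives its two camera positions. Nothing in it is specific to the Vision Pro.

```swift
// A minimal sketch of the stereoscopic idea described above: render the same
// scene from two camera positions separated by roughly the wearer's
// interpupillary distance (IPD). Numbers are illustrative, not Vision Pro specs.
let ipdMeters = 0.063                        // ~63 mm, a typical adult IPD
let head = (x: 0.0, y: 1.7, z: 0.0)          // head position in meters

let leftEyeCamera  = (x: head.x - ipdMeters / 2, y: head.y, z: head.z)
let rightEyeCamera = (x: head.x + ipdMeters / 2, y: head.y, z: head.z)

// Each per-eye display shows the scene from its own camera. The small
// horizontal disparity between the two images is what the brain fuses into
// depth, which is how a flat "screen" can appear to float at a convincing
// distance and scale.
print("Left eye camera:  \(leftEyeCamera)")
print("Right eye camera: \(rightEyeCamera)")
```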

How the video information is fed to those displays is entirely up to each VR headset device.

Feeding the Vision Pro

The Vision Pro is really no different from any of a myriad of other VR headsets on the market. Apple wants you to think that theirs is “the best” simply because it’s brand new, and that being brand new somehow makes it different. In reality, the Vision Pro doesn’t really stand out. Oh sure, it utilizes some newer features, such as better eye tracking and easier hand gestures, but that’s interface semantics. We’ll get into the hand gesture problems below. Getting easy access to visual data on the Vision Pro is about as simple as using an iPad. That ease is to Apple’s credit, but it also exists because the iPad already exists, allowing its interface conventions to be slipped into, leveraged and utilized by the Vision Pro.

In reality, the Vision Pro OS might as well be an iPad attached to a strap-on headset. That’s really how the Vision Pro has been designed. The interface on the iPad is already touch capable, so it makes perfect sense to take the iPadOS and extract and expand it into what drives the Vision Pro, except using the aforementioned eye tracking, cameras and pinch gesture.

The reason the Vision Pro is capable of all of this is that Apple has effectively married the technology guts of an iPad into the chassis of the Vision Pro. This means that unlike many VR headsets, which are dumb displays with very little processing power internally, the Vision Pro crams a whole iPad-class computer inside the headset chassis.

That design choice is both good and bad. Let’s start with the good. Because the display is driven by an M2-based motherboard design, like a recent iPad, it has more than enough power to drive the Vision Pro with a fast refresh rate and a responsive interface. This means a decent, friendly, familiar and easy to use interface. If you’re familiar with how to use an iPad or an iPhone, then you can drop right into the Vision Pro with little to no learning curve. This is what Apple is banking on, literally. Because it’s so similar to their existing devices, it’s simple to strap one on and be up and running in just a few minutes.

Let’s move on to the bad. Because the processor system is built directly into the headset, the headset will become obsolete within a year of its release. As soon as Apple releases its next M-series chip, the Vision Pro will be obsolete. This is a big problem. Expecting people to drop $3500 every 12 months is insane. It’s bad enough with an iPhone that costs $800, but for a device that costs $3500? Yeah, that’s a big no go.

iPhone and Vision Pro

The obvious choice in the Vision Pro’s design is to marry these two devices together. What I mean by this marriage is that you’re already carrying around a CPU device capable of driving the Vision Pro headset in the palm of your hand. Instead, Apple should have designed their VR headset to be a thin client display device. As a thin client, the device’s internal processor doesn’t need to be super fast. It simply needs to be fast enough to drive the display at the refresh rates needed to act as a remote display. In other words, turn the Vision Pro into a mostly dumb remote display device, not unlike a computer monitor, except using a much better wireless protocol. Then, allow all Apple devices to pair with and use the Vision Pro headset as a remote display.

This means that instead of carrying around two (or rather three, when you count that battery pack) hefty devices, the Vision Pro can be made much lighter and will run less hot. It also means that the iPhone becomes the CPU device that does the heavy lifting for the Vision Pro. You’re already carrying around a mobile phone anyway. It might as well be the driving force behind the Vision Pro. Simply connect it and go.

Removing all of that motherboard hardware (save a bit of processor power to drive the display) from inside the Vision Pro does several things at once. It removes the planned obsolescence issue and turns the headset into a display device that could last 10 years instead of one that must be replaced every 12-24 months. Instead of replacing the headset each year, we simply continue replacing our iPhones as we always have. This business model fits right into Apple’s style.

A CPU inside the headset would still need to be fast enough to read and understand the cameras built into the Vision Pro so that eye tracking and the rest of these technologies work well. However, it doesn’t need to be a full-fledged computer. Instead, connect up an iPhone, iPad or even a MacBook for the heavy CPU lifting.
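
For illustration only, here is a rough Swift sketch of the split argued for above: the phone renders, while the headset acts as a thin display-plus-sensor endpoint. Every type and function name here is hypothetical; Apple ships no such API.

```swift
// Hypothetical thin-client split: the iPhone renders, the headset displays.
// All names here are invented for illustration; this is not a real Apple API.
protocol HeadsetLink {
    func send(encodedFrame: [UInt8])          // phone -> headset display
    func latestSensorPacket() -> [UInt8]?     // headset -> phone (eye/hand tracking)
}

struct PhoneSideRenderer {
    let link: any HeadsetLink

    // Called once per display refresh, on the phone.
    func tick() {
        // 1. Pull the newest eye/hand tracking data from the headset.
        let sensors = link.latestSensorPacket()
        // 2. Render and compress the next stereo frame on the phone's SoC.
        let frame = renderStereoFrame(using: sensors)
        // 3. Ship the frame to the headset, which only decodes and displays it.
        link.send(encodedFrame: frame)
    }

    private func renderStereoFrame(using sensors: [UInt8]?) -> [UInt8] {
        // Placeholder: real rendering and video encoding would happen here.
        return []
    }
}
```

Under this split, a faster phone automatically means a faster headset, and the headset itself only needs replacing when its displays or optics improve.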

Vision Pro Battery Pack

The second flaw of the Vision Pro is its hefty battery pack. The flaw isn’t the battery pack itself. It’s the fact that the battery pack case should have been used to house the CPU and motherboard instead of putting them inside the headset. If the CPU main board lived in the battery pack case, it would be a simple matter to replace the battery pack with an updated main board each year without replacing the headset itself. This would allow updating the M2 chip regularly with something faster to drive the headset.

The display technology used inside the Vision Pro isn’t something that’s likely to change very often. However, the main board and CPU will need to be changed and updated frequently to increase the snap and performance of the headset, year over year. Not taking advantage of the external battery pack case, which must be carried around anyway, to house the main board along with the battery is a huge design flaw for the Vision Pro.

Perhaps they’ll consider this change with the Vision Pro 2. Better still, design the iPhone so that its battery and CPU drive both the phone itself and the Vision Pro headset. By marrying the iPhone and the Vision Pro together, you get the best of both worlds and Apple gets two purchases at the same time… an iPhone purchase and a Vision Pro purchase. Even an iPad should be well capable of driving a Vision Pro device, including supplying power to it. Apple will simply need to rethink the battery sizes.

Why carry around that clunky battery thing when you’re already carrying around an iPhone that has enough battery power and enough computing power to drive the Vision Pro?

Clunky Headset

All VR headsets are clunky, heavy and sometimes hot to wear. The worst VR headset I’ve worn is, hands down, the PSVR headset. The long, clunky cables in combination with absolutely zero ventilation and its heavy weight make for an incredibly uncomfortable experience. Even Apple’s Vision Pro suffers from a lot of weight hanging from your cheeks. To offset that, Apple does supply an over-the-head strap that helps distribute the weight a little better. Even still, VR headset wearing fatigue is a real thing. How long do you want to wear a heavy thing resting on your cheekbones and nose that ultimately digs in and leaves red marks? Even the best padding won’t solve this fundamental wearability problem.

The Vision Pro is no different in this regard. The Vision Pro might be lighter than the PSVR, but that doesn’t make it light enough not to be a problem. But this problem cuts Apple far deeper than weight alone.

Closing Yourself Off

The fundamental problem with any VR headset is its closed-in nature. When you don a VR headset, you’re closing yourself off from the world around you. For the Vision Pro, Apple has opted for the questionable choice of an aimed spatial audio system. Small slits in the side of the headset aim audio into the wearer’s ears. The trouble is, this audio can be heard by others around you, even if faintly. That extraneous audio bleed could become a problem in public environments, such as on a plane. If you’re watching a particularly loud movie, those around you might be disturbed by the Vision Pro’s audio bleed. To combat this audio bleed problem, you’ll need to buy some AirPods Pro earbuds and use those instead.

The problem is, how many people will actually do this? Not many. The primary design flaw was in offering an aimed but noisy audio experience by default instead of including a pair of AirPods Pro earbuds as the default audio experience when using the Vision Pro. How dumb did the designers have to be to not see the problem coming? More than likely, some airlines might choose to restrict the use of the Vision Pro entirely on commercial flights simply to avoid the passenger conflicts that might ensue when a passenger doesn’t have any AirPods to use with it. It’s easier to tell passengers that the device cannot be used at all than to argue with a passenger about putting in AirPods they may or may not have.

It goes deeper than this, though. Once you don a headset, you’ve closed yourself off. Apple has attempted to combat the closed-off nature of a VR headset by offering front facing cameras and detecting when to allow someone to barge into the VR world and have a discussion with the wearer. This is an okay idea so long as enough people understand that this barge-through idea exists. That will take some getting used to, both for the Vision Pro wearer and for the person trying to get the wearer’s attention. That also assumes barge-through even works well enough to do that. I suspect that the wearer will simply need to remove the headset to have a conversation and then put it back on to resume whatever they were previously doing.

Better Design Choice

Instead of a clunky, closed-off VR headset, Apple should have focused on a system like the Google Glass product. Google has since discontinued production of Google Glass, mostly because it really didn’t work out well, but that’s more because of Google itself than because of the idea behind the product.

Yes, a wearable display system could be very handy, particularly with a floating display in front of the user’s vision. However, the system needs to work in a much more open way, like Google Glass. Glasses are the obvious solution here: a floating display hooked up to a pair of glasses in front of the user makes the most sense. Glasses are light and easy to use. They can be easily put on and taken off. Glasses are easy to store and even easier to carry. Thick, heavy VR headsets are none of these things.

Wearing glasses keeps the person aware of their surroundings, allowing for talking to and seeing someone right in front of you. The Vision Pro, while it can recreate the environment around you with various cameras, still closes off the user from the rest of the world. Only Apple’s barge-through system, depending on its reliability, has a chance to sort of mitigate this closed-off nature. However, it’s pretty much guaranteed that the barge-through system won’t work as well as wearing a technology like Google Glass.

For this reason, Apple should have focused on creating a floating display in front of the user that was attached to a pair of glasses, not to a bulky and clunky headset. Yes, the Vision Pro headset is quite clunky.

Front Facing Cameras

You might be asking, if Google Glass was such a great alternative to a bulky headset, why did Google discontinue it? Simple: privacy concerns over the front facing camera, which led to a backlash. Because Google Glass shipped with a front facing camera enabled, anyone wearing it, particularly when entering a restaurant or bar, could end up recording the patrons in that establishment. Because restaurants and bars are privately owned spaces, all patron privacy needs to be respected. To that end, owners of restaurants and bars ultimately barred anyone wearing Google Glass devices from using them in their establishments.

Why is this important to mention? Because Apple’s Vision Pro may suffer the same fate. Because the Vision Pro also has front facing cameras, cameras that support the barge-through feature among other potentially privacy-busting uses, restaurants and bars again face the real possibility of another Google Glass-like product interfering with the privacy of their patrons.

I’d expect Apple to fare no better in bar and restaurant situations than Google Glass. In fact, I’d expect those same restaurants and bars that banned Google Glass wearers from using those devices to likewise ban any users who don a Vision Pro in their restaurants or bars.

Because the Vision Pro is so new, restaurant and bar owners aren’t exactly sure how it works. If you’re a restaurant or bar owner, know that the Vision Pro has front facing cameras that can be capturing what’s in front of the wearer at any time, just like Google Glass. If you’ve previously banned Google Glass use, you’ll probably want to ban the use of Vision Pro headsets in your establishment for the same reasons. Because you can’t know whether a Vision Pro user has enabled a Persona, it’s safer to simply ban all Vision Pro usage than to try to determine whether the user has set one up.

Why does having a Persona matter? Once a Persona is created, the front facing cameras run almost all of the time. If a Persona has not been created, the headset may or may not run the front facing cameras. Once a Persona is created, the front facing display shows a 3D virtual representation of the wearer’s eyes using the 3D Persona (aka avatar). What you’re seeing in the image of the eyes is effectively a live, computer-generated image.

Apple claims that the Vision Pro does not run the front cameras without a Persona created, but bugs, updates and whatnot may change the reality of that claim. Worse, though, is that there’s no easy way to determine whether the user has created a Persona. That’s also not really a restaurant staff or flight attendant job. If you’re a restaurant or bar owner or even a flight attendant, you must assume that all users have created a Persona and that the front facing cameras are indeed active and recording. There’s no other stance to take on this. If even one user has created a Persona, then the assumption must be that the front facing cameras are active and running on all Vision Pro headsets. Thus, it is wise to ban the use of Apple’s Vision Pro headsets in and around restaurant and bar areas and even on airline flights… lest they be used to surreptitiously record others.

Here’s another design flaw that Apple should have seen coming. It only takes about five minutes to read Google Glass’s Wikipedia page, its flaws… and why it’s no longer being sold. If Apple’s engineers had done this research during the design phase of the Vision Pro, they might have decided not to include front facing cameras at all. Even when the cameras are supposedly locked down and unavailable, that doesn’t preclude Apple’s own use of these cameras, for Apple’s own surveillance purposes, when someone is out and about. Restaurant owners, beware. All of Apple’s assurances mean nothing if a video clip of somebody in your establishment surfaces on a social media site recorded via the Vision Pro’s front cameras.

Better Ideas?

Google Glass represents a better technological and practical design solution; a design that maintains an open visual field so that the user is not closed off and can interact with and see the world around them. However, because Google Glass also included a heads-up display in the user’s vision, some legislators took issue with the possibility that a user distracted by the heads-up display might attempt to operate a motor vehicle dangerously. There shouldn’t be a danger of this situation when using a Vision Pro, or at least one would hope not. However, because the Vision Pro is capable of recreating a live 3D image of what’s presently surrounding the user, inevitably someone will attempt to drive a car while wearing one, and all of these legislative arguments will resurface… amid various lawsuits should something happen while wearing it.

Circling Back Around

Let’s circle back to the original question asked by this article. Is the Vision Pro worth the money?

Considering its price tag and its comparative functional sameness to an iPad and to other similar but less expensive VR headsets, not really. Right now, the Vision Pro doesn’t sport a “killer app” that makes anyone need to run out and buy one. If you’re looking for a device with a 3D stereoscopic display that acts like an iPad and that plays nice in the Apple universe, this might suffice… assuming you can swallow the hefty sticker shock that goes with it.

However, Apple more or less overkilled the product by adding the barge-through feature, which requires the front facing camera(s) and the mostly decorative front facing lenticular 3D display, solely to support this one “outside friendly” feature. Yes, the front facing OLED lenticular display is similar to the Nintendo 3DS’s 3D lenticular display. The lenticular design means that you probably need to stand in a very specific position for the front facing display to actually display 3D in full; otherwise it will simply look weird. The front facing display is of essentially no use to the wearer. It’s simply there as a convenience to anyone who might walk by, which makes it a waste of money and design dollars. Even then, this display remains of almost no use until the user has set up their Persona.

Once the wearer has set up a Persona, the unit will at times display computer-generated 3D eyes, similar to the image above. When the eyes do appear, the lenticular display places them at the correct depth on the face so they look like the wearer’s real 3D eyes, no glasses required. However, the virtual Persona created is fairly static and falls rather heavily into the uncanny valley. It’s just realistic enough to elicit interest, but just unrealistic enough to feel creepy and weird. Yes, even the eyes. This is something that Apple usually nails. However, this time it seems Apple got the Persona system wrong… oh so wrong. If Apple had settled on a more or less cartoon-like figure with exaggerated features, the Persona system might have worked better, particularly if it used anime eyes or something fun like that. When it attempts to mimic the real eyes of the user, it simply turns out creepy.

In reality, the front facing display is a costly lenticular OLED addition that offers almost no direct benefit to the Vision Pro user. The internal displays, however, sport around 23 million pixels across both eyes, or roughly 11.5 million pixels per eye, which is somewhat less than a 5K display (about 14.7 million pixels) but more than a 4K display (about 8.3 million pixels). Combined across both eyes, that resolution allows for the creation of a convincing 4K floating display. However, the Vision Pro would not be able to create an 8K floating display due to its lack of pixel density, and it wouldn’t even be able to create a true 5K display for the same reason.
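
As a quick sanity check on those numbers, here is the arithmetic, using standard 4K UHD and 5K panel resolutions and the roughly 23-million combined-pixel figure:

```swift
// Back-of-the-envelope pixel math for the resolution claims above.
let fourK  = 3840 * 2160        // ≈ 8.3 million pixels (4K UHD panel)
let fiveK  = 5120 * 2880        // ≈ 14.7 million pixels (5K panel)
let perEye = 23_000_000 / 2     // ≈ 11.5 million pixels per eye

print("4K: \(fourK) px, 5K: \(fiveK) px, Vision Pro per eye: \(perEye) px")
// The per-eye count lands between 4K and 5K, which is why a virtual 4K screen
// is achievable but a true 5K (let alone 8K) virtual screen is not.
```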

Because many 5K flat and curved LCD displays are now priced under $800 and are likely to drop further, you can buy two 5K displays for less than half the cost of one Vision Pro headset. Keep in mind that these are 5K monitors. They’re not 3D and they’re relatively big. They don’t offer floating 3D displays appearing in your vision, and there are limits to a flat or curved screen. However, if you’re looking for sheer screen real estate for your computing work, two 5K displays offer far more usable real estate than the Vision Pro. Having two monitors in front of you is also easier to navigate than being required to look up, down, left and right and perhaps crane your neck to see all of the real estate that the Vision Pro affords… in addition to getting the hang of pinch controls.

The physical monitor comparison is, in many ways, apples to oranges when set against a Vision Pro headset. However, this comparison simply shows what you can buy for less money. With $3400 you can buy a full computer rig including a mouse, keyboard, headphones and likely both of those 5K monitors for less than the cost of a single Vision Pro headset. You might even be able to throw in a gaming chair. Keeping these buying options in perspective keeps you informed.
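
Roughly, that is the math in play; the street prices here are assumptions and will vary:

```swift
// Rough cost comparison; prices are assumptions and vary by retailer.
let visionPro    = 3500
let fiveKMonitor = 800
let twoMonitors  = 2 * fiveKMonitor          // $1,600 for two 5K panels
let leftover     = visionPro - twoMonitors   // ~$1,900 left for the computer, keyboard, mouse, etc.
print("Two 5K monitors: $\(twoMonitors); remaining budget vs. one Vision Pro: $\(leftover)")
```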

The Bad

Because the headset offers a closed and private environment that only the wearer can see, it opens the door to bad situations when used in a place of business or even out in public. For example, if an office manager were to buy an employee a Vision Pro instead of a couple of new, big monitors, there’s no way to know what that worker might be doing with those floating displays, precisely because the Vision Pro is a closed, private environment. They could, for instance, be watching porn at the same time as doing work in another window. This is the danger of not being able to see and monitor your staff’s computers, even by simply walking by. Apple, however, may have added a business-friendly drop-in feature to allow managers to monitor what employees are seeing and doing in their headsets.

You can bet that should a VR headset become a replacement for monitors in the workplace, many staff will use the technology to surf inappropriate sites, up to and including watching porn. This won’t go over well for the employee’s productivity or for the manager who must manage that employee. If an employee approaches you asking for a Vision Pro to perform work, be cautious when considering spending $3500 for this device. There may be some applicable uses for the Vision Pro headset in certain work environments, but it’s also worth remaining cautious for the above reasons when considering such a purchase for any employee.

On the flip side, for personal use, buy whatever tickles your fancy. If you feel justified in spending $3500 or more for an Apple VR headset, go for it. Just know that you’re effectively buying a headset based monitor system.

Keyboard, Eye Tracking and The Pinch

Because the Vision Pro is affixed to your head, Apple had to devise a way to obtain input within the VR environment. To that end, Apple decided on the pinch motion. You pinch your thumb and forefinger together in a sort of tapping motion. Each tapping motion activates whatever you are looking at (eye tracking). Whenever the headset “sees” (using its many cameras) your pinching motion, it activates wherever your eyes are focused. This means that in order to open an application from the iPad-ish icon list, you must be looking directly at the icon to activate it. If your eyes flutter around and you perform the pinch motion the instant your eyes look someplace else, the app will not activate. You might even activate something unintended.

Keep in mind that this is still considered a beta product, which is weird coming from Apple. This is the first time I can recall Apple explicitly releasing a beta product for review.

That said, there are definitely some improvements that could be made to this eye tracking system. For example, the system could detect and count linger time. The longer the eye lingers, the more likely it is that the user wants to activate the thing that the eyes lingered on the longest, even if the eyes are not currently looking at it. This means that even if your eyes dart away at the moment you pinch, the system would still understand that you want to activate the icon that was lingered on the longest. As far as I understand it, the OS doesn’t presently work this way. It only activates the icon or control you are presently looking at. Adding a fuzzy eye-linger system, sketched below, could reduce errors from selecting or activating the wrong thing.
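
To make the suggestion concrete, here is a minimal sketch of what such a “fuzzy linger” selector might look like. Everything here, the type names and the 1.5-second window, is invented for illustration and is not part of visionOS.

```swift
import Foundation

// Hypothetical "fuzzy linger" target selection: accumulate how long the gaze
// rests on each UI element and, when a pinch arrives, activate the element
// with the most accumulated recent dwell time rather than whatever the eyes
// happen to be on at that exact instant.
struct GazeSample {
    let elementID: String
    let duration: TimeInterval   // how long this fixation lasted
    let timestamp: Date
}

struct LingerSelector {
    var window: TimeInterval = 1.5        // only consider recent fixations
    private var samples: [GazeSample] = []

    mutating func record(_ sample: GazeSample) {
        samples.append(sample)
        // Drop fixations older than the window.
        let cutoff = Date().addingTimeInterval(-window)
        samples.removeAll { $0.timestamp < cutoff }
    }

    // Called when the pinch gesture fires.
    func targetForPinch() -> String? {
        // Sum dwell time per element within the window...
        var dwell: [String: TimeInterval] = [:]
        for s in samples {
            dwell[s.elementID, default: 0] += s.duration
        }
        // ...and pick the element the eyes lingered on the longest.
        return dwell.max { $0.value < $1.value }?.key
    }
}
```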

If you need to move a window around or expand the size of it, you must be looking directly at the control that performs that action. Once you’re looking at that specific control, the pinch and move will activate the control for as long as the pinch and move continues.

Unfortunately, this system falls down hard when you want to use the on-screen keyboard. The keyboard only works if you poke each key with a forefinger on each hand. This means hunt-and-peck typing. If you’re a touch typist, you’re going to feel horribly out of place being forced into single-finger hunt-and-peck. Apple will need to make significant improvements to keyboard typing on the Vision Pro.

On the flip side, it seems that the Vision Pro may want you to use the microphone and voice to input longer strings of text instead of typing. This means that for web searches, you’re likely going to fill in fields using voice dictation. I will say that Apple’s dictation system is fair. It works in many cases, but it also makes many mistakes. For example, most dictation systems can’t understand the difference between its and it’s, preferring to use it’s whenever possible, even when that usage is incorrect. The same problem exists with there, their and they’re, and several similar words, when dictating. Typing is usually the better option over dictating long sentences of text, but it also means you’re going to need to pair a Bluetooth keyboard. Then you’ll type on that keyboard blind, because the Vision Pro won’t show you your hands or the keyboard in the VR display when the keyboard is sitting in your lap. Even if the keyboard is sitting on a desk, it might not show the keyboard properly without looking down at the keyboard instead of the window into which you’re typing.

For example, I would never attempt to blog an article this long using a VR headset. Not only would the headset eventually become too uncomfortable on my head, dictating everything by voice would get to be a pain in the butt because of all of the constant corrections. Even Apple’s active correction system leaves a lot to be desired, changing words from what you had actually wanted into something that doesn’t make any sense after you read it back. These problems will immediately be carried into the Vision Pro simply because these systems already exist in Apple’s other operating systems and those existing systems will be pulled into the Vision Pro exactly as they are, warts and all.

What Apple needs to create is a pseudo Augmented Reality (AR) keyboard: a keyboard where the VR system uses AR to pick up and read what you’re typing. Sure, the keyboard could be connected, but the AR system could simply watch the keys you’re pressing and then input those key presses via camera detection rather than via Bluetooth. In this way, the on-screen keyboard can still show which key is being typed in your vision, yet give you the option of touch typing on a keyboard. A rough sketch of the idea follows.
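
As a thought experiment only, a camera-watched keyboard boils down to mapping observed fingertip positions onto a known key layout. The types below are entirely hypothetical and are not an existing ARKit or visionOS API.

```swift
// Hypothetical camera-watched keyboard: the headset's cameras estimate which
// physical key a fingertip is pressing and the system injects that keystroke,
// instead of relying on the Bluetooth link alone. All names are invented.
struct KeyRegion {
    let character: Character
    let minX: Double, maxX: Double   // key bounds in keyboard-plane coordinates
    let minY: Double, maxY: Double
}

struct FingertipObservation {
    let x: Double, y: Double
    let isPressing: Bool             // e.g. the fingertip dips below the key plane
}

// Map an observed fingertip onto the known key layout and emit a character.
func keystroke(for tip: FingertipObservation, on layout: [KeyRegion]) -> Character? {
    guard tip.isPressing else { return nil }
    return layout.first {
        tip.x >= $0.minX && tip.x <= $0.maxX &&
        tip.y >= $0.minY && tip.y <= $0.maxY
    }?.character
}
```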

Pinch Motion

Apple’s chosen pinch motion seems like a fine choice and might become a sort of standard across the industry for other VR headsets and applications. Many VR headsets have struggled to produce a solid, standardized input system. The pinch is a relatively easy, intuitive control and it works well for most use cases in the Vision Pro, but it’s definitely not perfect for all of them. The cameras around the Vision Pro unit seem sensitive enough that you don’t have to hold your hands directly out in front of you in an awkward position like many VR headsets require. Instead, you can sit comfortably with your hands in your lap or on a desk and the unit will still pick up your pinch taps. You will need to move your hand(s) around, though, to activate resize and movement controls as well as when typing on the on-screen keyboard.

However, I do think it would be great for Apple to offer a lighted wand or other physical object that can supplement, augment and/or replace the pinch control. For people who don’t have fine motor control in their hands, an alternative control method using an external device could be ideal for accessibility purposes.

VR Motion Sickness

One thing that cannot be easily overcome is VR motion sickness. It doesn’t matter who the headset manufacturer is; this problem cannot easily be solved by software. Apple has done nothing to address this issue with the Vision Pro. If you have previously encountered VR sickness while wearing a headset, you’re likely to encounter it with the Vision Pro eventually. The transparent effect of showing you your present surroundings might help reduce this problem, but if you replace your present surroundings with a forest or beach scene or some other fantasy environment, your body will be at odds with what your eyes are seeing.

VR motion sickness is typically exacerbated by rapid movements, such as riding a VR roller coaster or riding in a high speed car chase in VR. These are situations where the mind sees motion, but the body feels nothing. This disparity between the physical body sensations and the motion the mind is experiencing can easily lead to VR motion sickness.

If you stick to using the Vision Pro strictly for computer purposes, such as an extended monitor or for other productivity or entertainment purposes, you might not experience sickness. If you wish to get into full 3D virtual gaming, the reason most people want to purchase a VR headset, then you’re inviting motion sickness.

Keep in mind that VR motion sickness is not the same as real motion sickness. I can ride on planes, boats and even buffeting roller coasters, all without any sickness or issues. However, the moment I strap on a VR headset and begin riding a VR roller coaster or ride around in a fast VR car, the VR sickness begins to kick in. When it arrives, the only solution is to take off the headset and let it subside. It also means exceedingly short VR sessions. When the VR sickness comes on, it comes on rapidly. Perhaps even as fast as 5 minutes after experiencing a lot of motion on the VR screen.

If you’ve never bought into or tried a VR headset in the past, you should make sure you can return the headset should you experience VR sickness while using it.

Overall

The Vision Pro is a pricey VR headset. While it’s not the most expensive VR headset on the market, it’s definitely up there in price. The question remains whether the Vision Pro is a suitable or efficient alternative to using a keyboard, mouse and monitor when computing. This author thinks that the presently clumsy, slow input systems used in VR headsets (yes, that includes the pinch) don’t make a VR headset the most efficient product for computing when compared to mouse and keyboard input.

The best use cases for 3D stereoscopic VR headsets are immersive 3D virtual gaming (assuming you can get past the motion sickness) and consuming movies and TV shows. The large floating screens in front of your vision are ideal for presenting flat and 3D movies as well as TV shows, making you feel like you’re watching entertainment in a theater environment. This aspect is actually quite uncanny. However, for consuming music, a VR headset is a fail. You simply need earbuds, such as Apple’s AirPods, for that. You don’t need to spend $3400 to listen to music, even if the Vision Pro is capable of layering reverb and echo effects onto the music to make it sound more spatial.

Personally, I want to hear the music as it was crafted by the musician. I don’t want third-party effects added that are more likely to detract from and muddy the final music product. If a musical artist has recorded a Dolby Atmos version of their music, then playing that version back exactly in its original recorded spatial form is perfectly fine, but devices shouldn’t layer anything else on top.

Overall, the Vision Pro is a fair addition to the VR headset space. However, it’s nowhere near perfect and it needs a lot of nuanced tweaking in subsequent models before it can become a real contender. This first released model is both overkill and naive at the same time, adding bells and whistles that, while interesting, add to the hefty price tag without adding substantial benefit to the final product.

The built-in M2 main board ensures that the unit will become obsolete in 1-2 years and need to be replaced, adding yet more computer junk to our already overflowing landfills. Apple needs to firmly get behind product longevity here rather than planned obsolescence every 12 months. Decoupling the main board and placing it into the battery case would go a long way toward longevity AND allow for easy replacement of both the battery and the main board. This change alone would enable a Vision Pro headset’s display to remain viable for years to come, while simply replacing the obsolete computer and battery that drive it. This one is a big miss by Apple’s design team.

Rating: 2.5 out of 5 (Apple tried to do too much, but actually did very little to improve VR. Apple’s design increases landfill chances; not a green product.)
Recommendation: Skip and wait for the next iteration
