Apple’s biggest hardware launch in nearly a decade happened on Feb. 2, when the Vision Pro spatial computer hit the United States market. Based on reported preorder sales and the company’s history of hardware hits, there are plenty of reasons for metaverse enthusiasts to be optimistic about the future.
The Apple Vision Pro is a spatial computing device that takes the form of an augmented reality headset. Unlike a virtual reality headset, the Vision Pro gives users a view of the world outside the device and overlays that view with digital imagery. Essentially, it turns the physical world into your computer desktop.
Spatial computing, not VR
The distinction between spatial computing and virtual reality is an important one. Despite positive gaming sales and steady enterprise adoption, virtual reality hasn’t managed to gain a foothold in the mainstream consumer market.
This isn’t surprising. As numerous analysts have noted over the past five years, a significant portion of the population experiences “VR sickness” when using VR hardware. And even some of the most seasoned VR reviewers have reported disorientation or nausea from time to time.
Spatial computing has, so far, proven more universally comfortable. Instead of immersing wearers in a virtual space, it uses outward-facing cameras to show them the space they’re actually in. Because of this, users are less isolated from their surroundings, can interact with the people around them, and are safer, since they can see and avoid real-world obstacles while wearing the headset.
This could make for an easier on-ramp to adoption for Apple than VR manufacturers have had. And if the major players in the metaverse space can imagine its future in augmented reality instead of virtual reality, the next major paradigm shift in tech hardware could be here.
Remember the iPhone
The iPhone is, arguably, the most successful piece of consumer technology of the 21st century. Its launch drove Apple, a company that a decade earlier had relied on a cash injection from Microsoft to stay afloat, to become the most valuable company in the world for several years running.
At launch in 2007, the iPhone cost the equivalent of a little more than $700 in 2024. Apple sold just 270,000 units in the U.S. in its first week.
The Apple Vision Pro, on the other hand, costs five times as much today ($3,499). And while first-week sales aren’t out yet as of this article’s publication, preorders reportedly exceeded 200,000.
Going back to the iPhone, Apple sold just 1.4 million iPhones in 2007, generating revenue of about $630 million. By 2021, however, iPhone sales alone generated nearly $200 billion in revenue for the company.
Face computers
It might seem silly to imagine wearing a computer on your face at work or in public, but it’s worth remembering that many pundits predicted touchscreen phones would fail. The general thinking then was that consumers would never give up the tactile feel of typing out texts on real buttons for tapping on glass.
Today, it’s hard to imagine a world where everyone walks around with a computer on their face, even if that computer is an augmented reality headset. However, back in 2007, it would have been impossible to predict a future where people spend vast swathes of their time staring at their phone screens, even at public events.
Augmented reality headsets could actually bring people closer together again. Instead of staring at screens, people could be looking at the world around them through their headsets and overlaying useful information as they need it.
People could hold face-to-face discussions featuring shared multimedia graphics without having to keep looking away from one another to stare at a screen.
And thanks to cameras, artificial intelligence and augmented reality, wearers could even appear to make eye contact with the people around them, despite having computers on their faces.