First came VR. Then came a wave of AR headsets that were high-priced and full of promises of wild mixed reality worlds. Apple now seems to be readying its own pair of smart glasses, at long last, seven years after Google Glass and four years after the debut of Oculus Rift. These reports have extended back for several years, including a story broken by CNET’s Shara Tibken in 2018.
Apple has been waiting in the wings all this time without any headset at all, although the company's AR ambitions have been clear and well telegraphed on iPhones and iPads for years. Each year, Apple has made significant strides on iOS with its AR tools. How soon the hardware will emerge is still debated: next year, the year after, or even further down the road. So is whether Apple will proceed with just glasses, or with a mixed-reality VR/AR headset, too.
With Apple likely to reveal a lot more AR-related news soon, where do glasses fall in the picture? It's unlikely that Apple will unveil an AR headset next week, but more of the software underpinnings should definitely emerge.
I’ve worn more AR and VR headsets than I can even recall, and been tracking the whole landscape for years. In a lot of ways, a future Apple AR headset’s logical flight path should be clear from just studying the pieces already laid out. Apple just acquired VR media-streaming company NextVR, and previously acquired AR headset lens maker Akonia Holographics.
I've had my own thoughts on what the long-rumored headset might be, and so far the reports feel well aligned with that. Much like the Apple Watch, which emerged among many other smartwatches and had plenty of features I'd seen in other forms before, Apple's glasses probably won't be a massive surprise if you've been following the beats of the AR/VR landscape lately.
Remember Google Glass? How about Snap's Spectacles? Other companies are working on smart glasses, too. The landscape could suddenly get crowded fast.
Here’s where Apple is likely to go based on what’s been reported, and how the company could avoid the pitfalls of those earlier platforms.
Apple declined to comment on this story.
Normal glasses, maybe with a normal name
Getting people to put on an AR headset is hard. I've found it a struggle to remember to pack smart glasses, and to find room to carry them. Most of them aren't comfortable enough to wear all day, either.
Apple always touted the Apple Watch, first and foremost, as a "great watch." I expect the same approach with its glasses. If Apple designs glasses that look like normal eyewear and makes them available, Warby Parker-style, in seasonal frames from its Apple Stores, that might be enough for people, provided the frames are good looking.
From there, Apple could add AR features and let newcomers settle into the experience. Augmented reality is weird, potentially off-putting, and people will need to feel out how much of it is right for them. The original Apple Watch was designed to be glanced at for five seconds at a time. Maybe the same idea is in the works for Apple AR features.
Apple Glass is the purported name for the glasses. Not surprising, since the watch is Apple Watch and the TV box is Apple TV. Apple could have gone the "Air" route with something like "AirFrames," but I wonder if these things will end up being tethered to an iPhone some of the time.
A recent patent filing also shows Apple looking to solve vision conditions with adaptive lenses. If true, this could be the biggest killer app of Apple’s intelligent eyewear.
Lower cost than you’d think?
A new report from Apple leaker Jon Prosser says Apple Glass will start at $499, with add-ons like prescription lenses costing extra. That could still push the price beyond what I pay for my glasses, but it stays in a realm that isn't insane. The HoloLens and Magic Leap cost thousands of dollars, but they're not targeted at regular consumers at all. VR headsets run anywhere from $200 to $1,000, and $400 to $500 seems like a good settling point. The original iPad started at $500; the Apple Watch was around the same. If the glasses are accessories meant to go with a watch, AirPods and an iPhone, they can't cost too much.
Recent smart glasses have been telegraphing the next wave of headsets: many of them lean on a connected phone. Phone-powered headsets can be lower weight, carrying just the key onboard cameras and sensors to measure movement and capture information, while the phone does the heavy lifting without draining headset battery life.
Apple's star device is the iPhone, and it's already loaded with advanced chipsets that can handle tons of AR and computer-vision computation. It could power an AR headset now; imagine what it could do in another year or two.
A world of QR codes, and maybe location-aware objects
Prosser's report corroborates earlier reports of QR codes in an upcoming iOS 14 AR app that will launch 3D experiences when a code is scanned in a physical location, like a Starbucks. Apple Glass would scan these codes and use them to quick-launch AR experiences.
This idea of QR codes triggering AR isn't new: earlier devices have shipped with packs of QR-style cards that worked with baked-in AR games, too.
Maybe QR codes will help accelerate AR working in the "dumb" world. Apple's latest iPhones have a U1 ultra-wideband chip that can help with AR object placement, and also more quickly locate other Apple devices that have the U1 chip. Reports of tracker tiles arriving as soon as this year, which could be found via an iPhone app using AR, could extend into Apple's glasses. If all of Apple's objects recognize each other, they could act as beacons in a home, and the U1 chips could become indoor navigation tools for added precision.
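To make the quick-launch idea concrete, here's a minimal Python sketch of how a scanned QR payload might be routed to an AR experience. Everything here is hypothetical: the `apple-ar://` scheme and the payload format are invented for illustration, and nothing below is a real Apple API.

```python
from urllib.parse import urlparse

def experience_from_qr(payload: str):
    """Map a scanned QR payload to an AR experience descriptor.

    Entirely hypothetical: 'apple-ar://' is an invented scheme used
    only to illustrate the reported quick-launch idea. A real app would
    hand the result to its AR renderer to load the matching 3D scene.
    """
    url = urlparse(payload)
    # Reject anything that isn't our (made-up) AR scheme,
    # or that lacks a venue and a scene to launch.
    if url.scheme != "apple-ar" or not url.netloc or not url.path.strip("/"):
        return None
    return {
        "venue": url.netloc,                         # e.g. a Starbucks location
        "scene": url.path.strip("/").split("/")[0],  # which experience to load
    }
```

A code printed on a coffee cup might encode `apple-ar://starbucks/menu`, which this sketch resolves to the venue `starbucks` and the scene `menu`; anything else (a plain web URL, say) falls through to `None` so the glasses could ignore it.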
Apple’s newest iPad has the sensor tech it needs
Apple's already deeply invested in camera arrays that can sense the world at short and long distances. The front-facing TrueDepth camera on every Face ID iPhone is like a shrunken-down depth camera that can scan a few feet out, sensing 3D information with high enough accuracy to be used for a secure face scan. The newer lidar sensor on the latest iPad Pro can scan out much further, several meters away. That's the range that glasses would need.
Apple's iPad Pro lidar scanner is more for depth sensing than photo-real object scanning, according to developers: the array of dots it sends out to ping the world is less fine-grained, but good enough to mesh its surroundings and scan a landscape, noting furniture, people and more. Recent iPad Pro apps use the lidar to map rooms and even improve the camera's understanding of room details. That lidar sensor array is reported to be the basis of Apple's AR glasses sensors, and it makes complete sense. The iPad Pro and the next iPhone could end up acting as a living development kit for the glasses' sensors, and there's already code in iOS 13, plus reported iOS 14 support, for a handheld controller-remote.
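As an illustration of the meshing idea described above, here's a toy Python sketch that turns a coarse grid of depth samples (like the sparse dot array a lidar returns) into a simple triangle mesh. This is not how Apple's scene reconstruction actually works; it only shows the basic principle of stitching depth samples into surface geometry.

```python
def depth_grid_to_mesh(depth, fx=1.0, fy=1.0):
    """Turn an H x W grid of depth samples (distances in meters) into a
    simple triangle mesh, two triangles per grid cell.

    Illustrative only: fx/fy stand in for a toy pinhole-camera model,
    and real AR meshing is far more sophisticated than this.
    """
    h, w = len(depth), len(depth[0])
    # Unproject each sample to a 3D point (x, y, z).
    verts = [((x - w / 2) * depth[y][x] / fx,
              (y - h / 2) * depth[y][x] / fy,
              depth[y][x])
             for y in range(h) for x in range(w)]
    # Connect neighboring samples into triangles.
    tris = []
    for y in range(h - 1):
        for x in range(w - 1):
            i = y * w + x
            tris.append((i, i + 1, i + w))          # upper-left triangle
            tris.append((i + 1, i + w + 1, i + w))  # lower-right triangle
    return verts, tris
```

Even a coarse dot array yields a continuous mesh this way, which is why a less fine-grained sensor is still "good enough" to note walls, furniture and people.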
How bleeding-edge will the visuals be?
Will Apple push the bleeding edge of realistic holographic AR, or aim for style and a few key functions and build up from there? Undoubtedly the latter. The first Apple Watch was feature-packed but still lacked some key things other watches had, like GPS and cellular connectivity. So did the first iPhone, which had no app store, no 3G and no GPS. Apple tends to market its new products as doing a few key things exceedingly well.
High-end mixed-reality headsets like the HoloLens 2 and Magic Leap, which show advanced 3D effects, are heavy. Smaller, more normal smart glasses like North Focals or the Vuzix Blade are more like Google Glass used to be: they present bits of heads-up info on a flat 2D screen.
There aren't many lightweight AR headsets yet, but that's going to change. Plug-in glasses like the nReal Light show some Magic Leap-like 3D graphics and run off a phone. That comes closer to what Apple could be making.
Apple's dual displays could leapfrog the competition and offer better image quality for their size. We've already seen regular-looking glasses lenses that can embed waveguides to make images float invisibly, and that display tech should only improve over time.
Look to AirPods for ease of use — and audio augmented reality
I've thought about how AirPods, with their once-weird design, were an early experiment in how all-day wearable tech could be accepted and become normal. AirPods are expensive compared with in-box wired buds, but also utilitarian. They're relaxed. Apple Glass needs to feel the same way.
AirPods could also gain spatially aware audio, surfacing information from nearby locations that could pop up and prompt someone to turn on their glasses. Maybe the two would work together. Immersive audio is casual, and we do it all the time; immersive video is hard and not always needed. I could see AR working audio-first, like a ping. Apple Glass could potentially do the world-scanning spatial awareness that would let that spatial audio work.
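The audio "ping" idea can be sketched simply: a sound arriving from a direction around the wearer gets panned between the two ears. Real spatial audio uses head-related transfer functions (HRTFs), so this equal-power pan is only a toy stand-in for the concept.

```python
import math

def pan_gains(azimuth_deg):
    """Equal-power stereo pan for an audio 'ping' from a direction
    around the listener (0 deg = straight ahead, +90 deg = hard right).

    A toy stand-in for real HRTF-based spatial audio rendering.
    """
    # Clamp to the frontal arc and map left-to-right onto [0, 1].
    a = max(-90.0, min(90.0, azimuth_deg))
    t = (a + 90.0) / 180.0
    # Equal-power law keeps perceived loudness constant while panning.
    left = math.cos(t * math.pi / 2)
    right = math.sin(t * math.pi / 2)
    return left, right
```

A ping straight ahead comes out equally in both ears, while one at +90 degrees plays almost entirely in the right ear, which is all a directional alert really needs.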
Apple Watch and AirPods could be great Glass companions
Apple already has a collection of wearable devices that connect with the iPhone, and both the AirPods and the Apple Watch make sense with glasses. AirPods can pair for audio (although maybe the glasses will have their own Bose Frames-like audio, too), while the watch could be a helpful remote control. The Apple Watch already acts as a remote at times, for the Apple TV or when linking up with the iPhone camera. Apple's glasses could also look at the watch and virtually expand its display, offering enhanced extras that show up discreetly, like a halo.
The Apple Watch could also provide something that it’ll be hard to get from hand gestures or touch-sensitive frames on a pair of glasses: haptics. The rumbling feedback on the Watch could lend some tactile response to virtual things, possibly.
Could Qualcomm and Apple’s reconciliation also be about XR?
Qualcomm and Apple have settled their differences and are working together again on future iPhones, and I don't think it's just about modems. 5G is a key feature for phones, no doubt, but it's also a killer element for next-gen AR and VR. Qualcomm has already been exploring how remote rendering could let 5G-enabled phones and connected glasses link up with streaming content and cloud-connected location data. Glasses could eventually stand on their own and use 5G for advanced computing, much like the Apple Watch eventually worked over cellular.
Qualcomm’s chipsets are in almost every self-contained AR and VR headset I can think of (Oculus Quest, HoloLens 2, a wave of new smart glasses, the latest version of Google Glass, Vive Focus). Apple’s hardware will likely interface with some of Qualcomm’s emerging XR tools, too.
Expect the iPhone to support other VR and AR, too
While Apple Glass may be Apple's biggest focus, that doesn't mean there can't, or shouldn't, be competitors. There are tons of smartwatches and fitness trackers that work with the iPhone, for instance. Where it gets annoying for those other trackers and watches is how they're walled off into a more limited interaction with iOS than the Apple Watch gets. The same could hold down the road if connected VR and AR headsets are allowed to work with a future iOS update. It's where Qualcomm is heading with its phone chips, and Google's Android is likely to follow.
Launch date: 2021, 2022, 2023… or later?
New Apple products tend to be announced months before they arrive, sometimes even longer. The iPhone, Apple Watch, HomePod and iPad all followed this path. Prosser's report says a first announcement could come alongside the next iPhone this fall, if it's a standard in-person Apple event as originally planned pre-coronavirus (which it probably won't be). Even then, actual availability might not come until 2021. That would line up with earlier reports.
Bloomberg's Mark Gurman has since contested Prosser's report, and other noted analysts like Ming-Chi Kuo say the glasses could come in 2022. A 2019 report from The Information, based on a reportedly leaked Apple presentation, suggested 2022 for an Oculus Quest-like AR/VR headset and 2023 for glasses. Maybe Apple will take a staggered strategy with AR and release several devices: a higher-priced one for creators first, and one for everyday wearers later.
Either way, developers will need a long head start to get used to developing for Apple's glasses, and to make apps work and flow with whatever Apple's design guidance turns out to be. That's where Apple's existing AR tools on iPhones and iPads could be a leaping-off point, letting AR software mature well before any hardware is formally announced, as Apple has already encouraged for years.
Apple Glass sounds like the culmination of years of acquisitions, hires and behind-the-scenes drama, but the glasses may not arrive quite as soon as you'd think.