

Apple’s AR/VR Headset: What Could Be Coming in 2023

The company’s next big product looks set to arrive in 2023. Here’s what we expect.

This story is part of Focal Point iPhone 2022, CNET’s collection of news, tips and advice around Apple’s most popular product.

Apple has been integrating augmented reality into its devices for years, but the company looks like it will leap right into the territory of Meta, Microsoft and Magic Leap with a long-expected mixed-reality headset in 2023.

The target date for this AR/VR headset keeps sliding, with the latest report, in early December from noted analyst Ming-Chi Kuo, suggesting an arrival in the second half of 2023. With an announcement event that could happen as soon as January, every Apple event now feels like the one where the company could finally pull the covers off this device. Bloomberg’s Mark Gurman reported in early January that he’s heard the company is aiming to unveil the headset in the spring, ahead of the annual Worldwide Developers Conference in June.

2023 looks like a year full of virtual reality headsets we originally expected in 2022, including the PlayStation VR 2 and Meta Quest 3. Apple, which has been active in AR on its iPhones and iPads for years, has already laid down plenty of clues hinting at what its mixed-reality future could hold.

As for what the device could be like, odds are strong that it will work from a similar playbook to Meta’s recent high-end headset, the Quest Pro: a focus on work, mixed reality and onboard eye tracking.

Here’s what we’re expecting.

Is its name Reality Pro? Is the software called xrOS?

The latest report from noted Apple reporter Mark Gurman at Bloomberg suggests the operating system for this headset could be called “xrOS,” but that may not indicate the name of the headset itself. Recent trademark filings reported by Bloomberg show the name “Reality” popping up a lot: Reality One, Reality Pro and Reality Processor. Apple’s existing AR software framework for iOS is named RealityKit, and previous reports suggested that “Reality OS” could be the name for the new headset’s ecosystem.

No one really expected the Apple Watch’s name (remember iWatch?), so to some degree, names don’t matter at this point. But the filings do indicate that Apple is moving forward on a product and its software.

One of several headsets?

The headset has been cooking for a long while. Reports have been going around for several years, including a story broken by former CNET Managing Editor Shara Tibken in 2018. Apple’s been building more advanced AR tools into its iPhones and iPads for years, setting the stage for something more.
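Those tools are already in developers’ hands through ARKit and Apple’s RealityKit framework. As a rough sketch of the groundwork that exists today (these are current iOS APIs, not anything confirmed for the headset or its rumored xrOS software), here’s how little code it takes to pin a virtual object to a real-world surface:

```swift
import RealityKit
import UIKit

// A minimal RealityKit sketch: anchor a virtual box to the first horizontal
// surface ARKit detects. These are today's iPhone and iPad APIs; whatever
// Apple ships on a headset is still unconfirmed.
let arView = ARView(frame: UIScreen.main.bounds)

let anchor = AnchorEntity(plane: .horizontal)   // waits for a detected plane
let box = ModelEntity(
    mesh: .generateBox(size: 0.1),              // a 10 cm cube
    materials: [SimpleMaterial(color: .systemBlue, isMetallic: false)]
)
anchor.addChild(box)
arView.scene.addAnchor(anchor)                  // RealityKit keeps it pinned in place
```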

Whatever the headset might become, it’s looking a lot more real lately. A detailed report from The Information earlier this year discussed likely specs, which include what Bloomberg’s Mark Gurman says is Apple’s latest M2 chip. According to another report from Bloomberg earlier this year, Apple’s board of directors has already seen a demonstration of the mixed-reality headset.

The expected arrival of this headset has kept sliding for years. Kuo previously predicted that Apple’s VR-AR headset would arrive in the fourth quarter of 2022 with Wi-Fi 6 and 6E support. But this VR-type headset could be the start of several lines of products, similar again to how Meta has been targeting future AR glasses. Kuo has previously predicted that Apple smart glasses may arrive in 2025.

Apple could take a dual headset approach, leading the way with a high-end AR-VR headset that may be more like what Meta has done with the Quest Pro, according to Bloomberg’s Gurman. Gurman also suggests a focus on gaming, media and communication on this first-wave headset. In terms of communication, Gurman believes FaceTime on the rumored headset could rely on Memoji and SharePlay: Instead of seeing the person you’re talking to, you’d see a 3D version of their personalized Memoji avatar.

Eventually, Apple’s plans for this headset could become larger. The company’s “goal is to replace the iPhone with AR in 10 years,” Kuo explained in a note to investors, seen by MacRumors. The device could be relatively lightweight, about 300 to 400 grams (roughly 10.5 to 14 ounces), according to Kuo. That’s lighter than Meta’s Oculus Quest 2. However, it’s larger than a normal pair of glasses, with early renders of its possible design looking a lot more like futuristic ski goggles.


The headset could be expensive, maybe $2,000 or more, with 8K displays, eye tracking and cameras that can scan the world and blend AR and VR together, according to a report from The Information last year. That’s to be expected, considering the Quest Pro costs $1,500 and AR headsets like the Magic Leap 2 and HoloLens 2 are around $3,000.

It’s expected to feature advanced processors, likely based on Apple’s recent M2 chips, and work as a stand-alone device. But it could also connect with Apple’s other devices. That’s not a surprising move. In fact, most of the reports on Apple’s headset seem to line right up with how VR is evolving: lighter-weight, with added mixed-reality features via more advanced pass-through cameras. Much like the Quest Pro, this will likely be a bridge to future AR glasses efforts.

Previous reports on Apple’s AR/VR roadmap suggested internal disagreements, or a split strategy that could mean a VR headset first, and more normal-looking augmented reality smart glasses later. But recent reports seem to be settling down to tell the story of a particular type of advanced VR product leading the way. What’s increasingly clear is that the rest of the AR and VR landscape is facing a slower-than-expected road to AR glasses, too.

VR, however, is a more easily reachable goal in the short term.

Apple has been waiting in the wings all this time without any headset at all, although the company’s AR aspirations have been clear and well-telegraphed on iPhones and iPads for years. Each year, Apple has made significant strides with its AR tools on iOS. It’s been debated how soon this hardware will emerge: this year, the year after or even further down the road. It’s also unclear whether Apple will proceed with just glasses, or with a mixed-reality VR and AR headset, too.

I’ve worn more AR and VR headsets than I can even recall, and have been tracking the whole landscape for years. In a lot of ways, a future Apple AR headset’s logical flight path should be clear from just studying the pieces already laid out. Apple acquired VR media-streaming company NextVR in 2020 and it bought AR headset lens-maker Akonia Holographics in 2018.

I’ve had my own thoughts on what the long-rumored headset might be, and so far, the reports align well with those expectations. Much like the Apple Watch, which emerged among many other smartwatches and had a lot of features I’d seen in other forms before, Apple’s glasses probably won’t be a massive surprise if you’ve been paying attention to the AR and VR landscape lately.

Remember Google Glass? How about Snapchat’s Spectacles? Or the HoloLens or Magic Leap? Meta is working on AR glasses too, as are Snap and Niantic. The landscape got crowded fast.

Here’s where Apple is likely to go based on what’s been reported, and how the company could avoid the pitfalls of those earlier platforms.

Apple declined to comment on this story.

Launch date: Looks likely for 2023

New Apple products tend to be announced months before they arrive, maybe even earlier. The iPhone, Apple Watch, HomePod and iPad all followed this path.

The latest reports from Kuo point to a possible delay in the headset’s release to the second half of 2023, but an event announcing it could happen as soon as January. That timeframe would make a lot of sense, giving developers time to understand the concept well ahead of the hardware’s release, and possibly allowing Apple’s WWDC developer conference (usually in June) to go over specifics of the software.

Either way, developers would need a long head start to get used to developing for Apple’s headset, and making apps work and flow with whatever Apple’s design guidance will be. That’s going to require Apple giving a heads-up on its hardware well in advance of its actual arrival.

An Apple headset could be a lot like the Meta Quest, but higher end

There’s already one well-polished success story in VR, and the Quest 2 looks to be as good a model as any for where future headsets could aim. Gurman’s report makes a potential Apple VR headset sound a lot like Facebook’s stand-alone device, with controller-free hand tracking and spatial room awareness that could be achieved with Apple’s lidar sensor technology, introduced on the iPad Pro and iPhone 12 Pro.

Apple’s headset could end up serving a more limited professional or creative crowd. But it could also go for a mainstream focus on gaming or fitness. My experiences with the Oculus Quest’s fitness tools feel like a natural direction for Apple to head in, now that the Apple Watch is extending to subscription fitness training, pairing with TVs and other devices.

The Oculus Quest 2 (now officially the Meta Quest 2) can see through to the real world and overlay some virtual objects, like room boundaries, but Apple’s headset could explore passthrough augmented reality to a greater degree. I’ve seen impressive examples of this in headsets from companies such as Varjo. It could be a stepping stone for Apple to develop 3D augmented reality tech for smaller glasses designs down the road.

Right now, no smart glasses manufacturer has been able to make normal-looking glasses that achieve advanced, spatially aware 3D overlays of holographic objects. Some devices like the Nreal Light have tried, with mixed success. Meta’s first smart glasses, Ray-Ban Stories, weren’t AR at all. Meta is working on ways to achieve that tech later on. Apple might take a similar approach with glasses, too.

The VR headset could be a ‘Pro’ device

Most existing reports suggest Apple’s VR headset would likely be so expensive — and powerful — that it will have to aim for a limited crowd rather than the mainstream. If so, it could target the same business and creative professionals that more advanced VR headsets like the Varjo XR-3 and Meta Quest Pro are already aiming for.

I tried Varjo’s hardware, and my experience with it could hint at what Apple’s headset might be focusing on. It has a much higher-resolution display (which Apple is apparently going to try to achieve), can blend AR and VR into mixed reality using its passthrough cameras, and is designed for pro-level creative tools. Apple could pull off something similar with its lidar sensors. The Quest Pro takes the same approach, but as a standalone device without as high-end a display.

Varjo’s headset, and most professional VR headsets, are tethered to PCs with a number of cables. Apple’s headset could work as a standalone device, like the Quest 2 and Quest Pro, and also work when connected to a Mac or iPad, much like the Quest 2 already does with Windows gaming PCs. Apple’s advantage could be making a pro headset that is a lot more lightweight and seamlessly standalone than any other current PC-ready gear. But what remains unknown is how many apps and tools Apple will be able to introduce to make its headset feel like a tool that’s truly useful for creators.

Controls: Hand tracking or a small wearable device?

The Information’s previous reports on Apple’s headset suggest a more pared-down control system than the elaborate, game controller-like peripherals used by many VR headsets right now. Apple’s headset should work using hand tracking, much like many VR and AR headsets already do. But Apple would likely need some sort of controller-type accessory for inputs, too. Cracking the control and input challenge seems to be one of the bigger hurdles Apple could face.

Recent patent filings point to a possible smart ring-type device that could work for air gestures and motion, and maybe even work with accessories. It’s also possible that Apple might lean on some of its own existing hardware to act as inputs, too.

Could that controller be an Apple Watch? Possibly, but the Apple Watch’s motion-control capabilities and touchscreen may not be enough for the deeper interactions an Apple headset would need. Maybe iPhones could pair and be used as controllers, too. That’s how Qualcomm is envisioning its next wave of phone-connected glasses.

Future AR smart glasses may also be in the works

Getting people to put on an AR headset is hard. I’ve found it a struggle to remember to pack smart glasses, and find room to carry them. Most of them don’t support my prescription, either. Developer-focused AR glasses made by Snap that I tried at home show what everyday AR glasses could look like someday, but they’re still a work in progress.

Qualcomm’s plans for AR glasses show a wave of devices arriving between 2023 and 2025, but at this point no one has been able to crack making a perfect pair. Software, battery life and even common cross-platform interfaces remain a big challenge.

Kuo’s prediction of AR glasses coming a few years after a VR-AR goggle-type headset would line up with what other companies are promising. The challenges with AR glasses are a lot greater than VR. No one’s figured out how wearing them all the time would work, or how you’d interact with virtual objects: Hand tracking? A watch or a ring? Voice? Neural inputs?

Apple always touted the Apple Watch, first and foremost, as a “great watch.” I would expect the same from its glasses. If Apple makes prescription glasses and makes them available, Warby Parker-style, in seasonal frames from its Apple Stores, that might be enough for people if the frames look good. Apple’s VR headset, according to Gurman, will also offer prescription lenses. That could be a stepping stone to developing glasses later on.

Google acquired smart glasses manufacturer North in 2020, which made prescription, almost normal-looking eyewear. North’s concept for glasses might be too similar to Google Glass for Apple’s tastes, but the idea of AR glasses doubling as functional everyday glasses sounds extremely Apple-like. More recently, Vuzix’s planned smart glasses for 2021 show how far the tech has shrunk, but even those glasses won’t be able to spatially scan the world and overlay augmented reality: They’ll be more like advanced glasses with heads-up displays and 3D audio.

A report from The Information in 2020 said new AR lenses were entering a trial production phase for Apple’s AR hardware (9to5Mac also broke the report down). These lenses sound much closer to normal glasses than current AR headsets allow, but when would those be ready?

Could Apple make its first smart glasses something more basic, letting it slowly add more AR features over time and letting newcomers settle into the experience? Or would Apple try to crack the AR challenge with its first pair of glasses? Augmented reality is a weird concept for eyewear, and potentially off-putting. Maybe Apple will aim for subtlety. The original Apple Watch was designed to be glanced at for just 5 seconds at a time.

A recent patent filing also showed Apple looking to solve vision conditions with adaptive lenses. If true, this could be the biggest killer app of Apple’s intelligent eyewear.

Are the AirPods Max a sign of how expensive a headset could be?

The business-focused HoloLens and Magic Leap cost thousands of dollars. Current VR headsets have trended towards $500 or more.

The latest price reports suggest something between $2,000 and $3,000, which is in the territory of business-focused AR headsets like the HoloLens 2, or business-creative VR headsets like those from Varjo. An analysis from TrendForce published in February also estimates that an Apple headset’s hardware would cost in the thousands, and it predicts that Apple would employ a “monthly subscription-based software solution.”

Apple’s headphones, the AirPods Max, indicate that the pricing could climb high. At $549, they cost more than a PlayStation 5, and those are just headphones. A pair of smart glasses or an advanced VR headset would pack in far more technology.

iPhone-connected, too?

Qualcomm’s AR and VR plans telegraph the next wave of headsets: Many of them will be driven by phones. Phone-powered glasses can be lighter and just have key onboard cameras and sensors to measure movement and capture information. Meanwhile the phone does the heavy lifting and doesn’t drain headset battery life.

Apple’s star device is the iPhone, and it’s already loaded with advanced chipsets that can do tons of AR and computer vision computation. It could already power an AR headset right now; imagine what could happen in another year or two.

Apple could also have its own high-end dedicated chip in its first wave of VR and AR headsets, as reports suggest, but they’ll also undoubtedly dovetail with more advanced processors in Apple’s phones, tablets and Macs. Over time, this could mean smaller glasses that lean on connecting to other Apple devices, or the cloud.

How Apple could blend the real world with AR and VR

Apple already dabbles in AR overlays tied to real-world locations: QR code and NFC-enabled App Clips can launch experiences from physical places with a tap or scan. These micro apps are made to work with AR, too: With glasses or an AR headset, they could eventually launch interactions at a glance.

Maybe QR codes can help accelerate AR working in the “dumb” world. Apple’s iPhones also have a U1 chip that can be used to improve accuracy in AR object placement, and also to more quickly locate other Apple devices that have the U1 chip, too.

Apple’s AirTags arrived in 2021, using ultrawideband technology similar to Samsung’s SmartTags Plus. The tags can be located via an AR view on an iPhone, an approach that could extend into Apple’s future VR or AR headsets. If all of Apple’s objects recognize each other, they could act as beacons in a home, and the U1 chips could also serve as indoor navigation tools for added precision.
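Developers can already tap that U1 hardware through Apple’s NearbyInteraction framework, which reports the distance (and, on supported devices, the direction) to another U1 device. Here’s a minimal sketch of how that ranging looks on an iPhone today; the peer token exchange is hand-waved, since a real app would share it over a network connection:

```swift
import NearbyInteraction

// Sketch of U1 ultrawideband ranging via NearbyInteraction. In a real app,
// the peer's discovery token is exchanged over a side channel such as
// MultipeerConnectivity; here it's simply passed in.
final class U1Ranger: NSObject, NISessionDelegate {
    private let session = NISession()

    func start(with peerToken: NIDiscoveryToken) {
        session.delegate = self
        session.run(NINearbyPeerConfiguration(peerToken: peerToken))
    }

    // Distance updates arrive continuously; this is the kind of data behind
    // the AirTag finding experience in the Find My app.
    func session(_ session: NISession, didUpdate nearbyObjects: [NINearbyObject]) {
        for object in nearbyObjects {
            if let distance = object.distance {
                print("Peer is \(distance) meters away")
            }
        }
    }
}
```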

Microsoft’s collaborative mixed-reality platform, Mesh, shows how meetings with people in virtual spaces could happen instantly and in work-like environments. Apple already enables multiperson AR in real places, but a necessary next step would be to allow a platform for collaboration in AR and VR like Microsoft is developing.

Apple’s depth-sensing hardware is already here

Apple is already deeply invested in camera arrays that can sense the world from short and long distances. The front-facing TrueDepth camera, which Apple has used on every Face ID iPhone since the X, is like a shrunken-down Microsoft Kinect and can scan a few feet out, sensing 3D information with high enough accuracy to be used for a secure face scan. Apple’s lidar technology on its recent iPhones and iPads can scan out much further, several meters away. That’s the range that glasses would need.

Apple’s existing lidar technology, combined with cameras, is already good enough to scan environments and 3D objects. Add to this the wider-scale lidar scanning Apple is doing in Maps to enable overlays of real-world locations with virtual objects via a technology called Location Anchors, and suddenly it seems like the depth-scanning Apple is introducing could expand to worldwide ambitions.
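Both pieces are already exposed to developers in ARKit. Here’s a hedged sketch of what that looks like on a lidar-equipped iPhone today; the coordinate is purely an illustrative example, and Location Anchors only work in cities Apple has mapped for the feature:

```swift
import ARKit
import CoreLocation

// Sketch of the two ARKit features described above, as they exist on iOS now.
// In practice you'd pick one configuration per session; both are shown here
// only to illustrate the APIs.
let session = ARSession()

// 1. Lidar scene reconstruction: build a live 3D mesh of the surroundings.
let worldConfig = ARWorldTrackingConfiguration()
if ARWorldTrackingConfiguration.supportsSceneReconstruction(.mesh) {
    worldConfig.sceneReconstruction = .mesh
}
session.run(worldConfig)

// 2. Location Anchors: pin virtual content to real-world coordinates using
// Apple Maps data. The coordinate below (San Francisco's Ferry Building) is
// just an example.
ARGeoTrackingConfiguration.checkAvailability { available, _ in
    guard available else { return }
    session.run(ARGeoTrackingConfiguration())
    let ferryBuilding = CLLocationCoordinate2D(latitude: 37.7956, longitude: -122.3937)
    session.add(anchor: ARGeoAnchor(coordinate: ferryBuilding))
}
```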

Apple’s new Mac chips already point toward VR-AR compatibility

Apple’s M1 Macs and their successors are far better equipped to deliver the power needed to run AR and VR, and they handle graphics much the way iPhones and iPads do. Developing a common groundwork across devices could allow a headset to feasibly run on an iPhone, iPad or Mac, making it a universal Apple device accessory.

That would be essential if Apple intends on its VR or AR headsets to have any role in creative workflows, or be used for games or apps. It’s one of the limitations of existing VR headsets, which need to run off particular Windows gaming PCs and still don’t play that well with iOS or Android phones.

Look to AirPods for ease of use — and audio augmented reality

I’ve thought about how the AirPods’ comfort, and their weird design, made them an early experiment in wearing Apple’s hardware directly on our faces, and it was a success. It proved that doing so could be accepted and become normal. AirPods are expensive compared to in-box wired buds, but they’re also utilitarian and relaxed. If Apple’s working on AR or VR headsets, they’ll need to feel the same way.

The AirPods Pro’s spatial audio, which the AirPods Max and AirPods 3 also have, points to where future ideas could head. Immersive audio is casual, and we use it all the time; immersive video is hard and not always needed. I could see AR working with an audio-first approach, like a ping. Apple glasses could provide the world-scanning spatial awareness that spatial audio needs. In the meantime, Apple’s already developing the spatial audio tech its VR headset would need.
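The head-tracking signal that makes spatial audio work is already something developers can read through CoreMotion’s CMHeadphoneMotionManager. Whether a headset would build on this exact plumbing is my assumption, but it shows how much of the audio groundwork is in place. A minimal sketch:

```swift
import CoreMotion

// Sketch: read the head pose that AirPods spatial audio relies on.
// Whether Apple's headset reuses this exact API is an assumption.
let headphoneMotion = CMHeadphoneMotionManager()

if headphoneMotion.isDeviceMotionAvailable {
    headphoneMotion.startDeviceMotionUpdates(to: .main) { motion, _ in
        guard let attitude = motion?.attitude else { return }
        // Yaw, pitch and roll of the listener's head: the signal that keeps
        // audio anchored to the world instead of following your ears.
        print("yaw: \(attitude.yaw), pitch: \(attitude.pitch), roll: \(attitude.roll)")
    }
}
```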

Apple Watch and AirPods could be great companions

Apple already has a pair of wearable devices that connect with the iPhone, and both make sense alongside glasses. Its AirPods can pair for audio (although maybe the glasses have their own Bose Frames-like audio, too), while the Watch could be a helpful remote control. The Apple Watch already acts as a remote at times, for the Apple TV or for linking up with the iPhone camera. Apple’s future headsets could also look to the Watch and expand its display virtually, offering enhanced extras that show up discreetly, like a halo. Or they could use the Watch as some sort of controller.

The Apple Watch could also provide something that it’ll be hard to get from hand gestures or touch-sensitive frames on a pair of glasses: haptics. The rumbling feedback on the Watch could lend some tactile response to virtual things, possibly.

There’s already a low-cost pair of phone goggles, the HoloKit X, that explores these ideas. It uses an iPhone for the headset’s display and cameras and can channel spatial audio to AirPods, and use an Apple Watch for gesture controls. Apple could do the same.

Could Qualcomm and Apple’s reconciliation also be about XR?

Qualcomm and Apple are working together again on future iPhones, and I don’t think it’s just about modems. 5G is a key feature for phones, no doubt. But it’s also a killer element for next-gen AR and VR. Qualcomm has already been exploring how remote rendering could allow 5G-enabled phones and connected glasses to link up to streaming content and cloud-connected location data. Glasses could eventually stand on their own and use 5G to do advanced computing, in a way like the Apple Watch eventually working over cellular.

Qualcomm’s chipsets are in almost every self-contained AR and VR headset I can think of (Meta Quest, HoloLens 2, a wave of new smart glasses, the latest version of Google Glass, Vive Focus). Will Apple’s tech dovetail at all with Qualcomm’s cross-device platforms?


Invincible VS Is a Tag-Team Brawler Packed With Bloody Superhero Carnage

The Invincible franchise is heading to Xbox.

Microsoft’s Xbox Games Showcase had its share of surprises, including a new game from Pokemon developer Game Freak and the ROG Xbox Ally handheld. Another surprise was a fighting game featuring characters from the Invincible comic and show.

Invincible VS is a three-versus-three tag fighting game featuring characters from the Invincible universe. The trailer showed several characters from the show, including Invincible, Omni-Man, Atom Eve, Rex Splode, Bulletproof, and two Viltrumites — the powerful alien species Omni-Man and Invincible belong to.

The game has a comic book art style, but its action is more along the lines of Mortal Kombat. The fighting is very bloody, which is faithful to the comic and show, though no fatalities were shown in the trailer. There are also a couple of familiar settings from the show. While we saw only a handful of characters in this first glimpse of Invincible VS, there is a wealth of heroes and villains that could be added to the game before it launches.

Robert Kirkman’s Invincible started as a comic in 2003 and ended its run in 2018. In 2021, an animated series based on the comic made its debut on Amazon Prime Video. The show wrapped up its third season in March and has already been renewed for a fourth season.

Skybound Games is publishing Invincible VS with development handled by Quarter Up, an in-house studio led by members of the team that created 2013’s Killer Instinct. 

Invincible VS will be released sometime in 2026 for PC and Xbox Series consoles. 



I Played With the ROG Xbox Ally, the Upcoming Xbox Handheld

The new handheld console was revealed during the Xbox Games Showcase, and I got to spend some hands-on time with it.

Microsoft revealed its long-rumored Xbox handheld console running Windows 11 during the Xbox Games Showcase — two models called the ROG Xbox Ally and ROG Xbox Ally X — and I spent a short time playing around with one soon after. 

Unfortunately, I wasn’t allowed to take any pictures or videos of the demo since the hardware we got to test wasn’t final. That became evident when our designated guide had HDMI connection issues with the unit. I was able to play around with the Xbox full-screen experience and the various settings menus and play the beginning minutes of Gears of War Reloaded, which comes out this summer.  

The device is quite comfortable to hold, with slightly textured grips. The face buttons, triggers and analog sticks all felt familiar, very similar to what I’m used to on an Xbox controller. 

What’s really exciting is that you can download your games, remote play from your Xbox or stream from the cloud, making this more useful than PlayStation’s Portal, which can only stream and play remotely. That’s one of the major benefits of being inside Xbox’s ecosystem: You can play a game on any of its devices, regardless of where you bought it, whether that’s an Xbox console, PC, the cloud or this new handheld. This more open-platform approach makes the Xbox Ally closer in spirit to a Steam Deck than to a Nintendo Switch, which is locked into Nintendo’s own ecosystem.

When it ships — expected in time for the winter holidays — you’ll be able to navigate via a full-screen Xbox app, which combines your Xbox game library with installed games from several other marketplaces into a single Xbox experience. The company specifically mentioned Xbox, Game Pass, Battle.net (owned by Microsoft) and “other leading PC storefronts,” which I’m hoping includes Steam. Much like on an Xbox, each game has an icon depicting which platform it’s from. In my demo, the only example of a different storefront was Hearthstone, which had a Battle.net icon.

The Xbox Ally consoles use the Game Bar, and if you’ve used the Xbox app on PC, then you’ll find it familiar. In fact, pressing the new Xbox button opens an almost identical version of the guide when playing Xbox games on PC. However, there’s also a new Command Center tab on the far left to adjust settings for power consumption and performance, similar to what we’ve seen on Steam Deck.

In Game Bar, you can quickly jump to the home screen or your library, launch games, open apps, chat with friends, adjust settings and more. And this Game Bar works alongside Asus’s Armoury Crate overlay. That’s a little worrisome, as Armoury Crate has usually felt more like unnecessary bloatware, but when we test the device later this year, we’ll see whether Asus has stripped it down to the relevant functions rather than just adding more on top.

Since it’s a Windows 11 device, you’ll also be able to launch and use apps like Discord and Twitch and access game mods. The Xbox Ally boots directly into the “Xbox full screen experience,” similar to how a Steam Deck launches into Big Picture mode. The full-screen experience is optimized specifically for handheld gaming, and Xbox told me the device minimizes background activity and allocates more system resources to gameplay, like Game Mode does on Windows. This means more memory and potentially higher framerates for your games.

The ROG Ally and Ally X have been out for a bit now, but the Xbox models have some unique features. In addition to the Xbox button, the ROG Xbox Ally also has larger, contoured grips. The previous ROG Ally is more rectangular; the Xbox Ally is closer to the design of the PlayStation Portal, with dedicated, slightly separated hand grips that mimic the look and feel of a standard game controller. The Xbox models also have upgraded components over the earlier Asus versions.

The handheld comes in two versions: a white Xbox Ally and a more powerful, black Xbox Ally X. The lower-end Ally is powered by an AMD Ryzen Z2A processor, comes with 16GB of RAM and 512GB of SSD storage, weighs 23.6 ounces (670 grams) and has a 60Wh battery. The Ally X has an AMD Ryzen AI Z2 Extreme processor, 24GB of RAM, 1TB of SSD storage, weighs a bit more at 25.2 ounces (715 grams) and has an 80Wh battery.

Both models are equipped with a 7-inch, 120Hz 1080p screen, the same as on the original Asus versions of the devices. They also have RGB lights surrounding the analog sticks, something I hope I’ll be able to turn off when I spend some real time playing on the device. The Ally X did feel on the heavier side, but then again, the recently released Switch 2 and my Steam Deck OLED are also pretty heavy, so I think that’s just what handhelds weigh these days.

Xbox hasn’t yet revealed the pricing or release date, aside from “this holiday.”



Everything I Suspect Will Be Announced at WWDC 2025 Monday

We could see new iPhone, Mac and Apple Watch software called iOS 26, MacOS Tahoe and WatchOS 26. Apple is rumored to overhaul all of its OSes with a unifying visual interface.

Apple’s developer conference, WWDC 2025, kicks off Monday at 10 a.m. PT. At its last two WWDC events, Apple launched itself into new territories, jumping into both AR/VR and generative AI. There’s pressure on the company to match, if not top, what it’s done in the past. CNET has editors and writers attending in person to report on live WWDC 2025 developments as they break.

There was the Vision Pro in 2023, and then Apple Intelligence in 2024. What big announcement is coming in 2025? With both the Vision Pro and Apple Intelligence having faced slow and heavily criticized starts, the big message at this year’s WWDC doesn’t seem clear at all. Apple might focus on operating system redesigns and gradual improvements across the board.

WWDC is usually a showcase for Apple’s future-forward ideas. It’s also where the company discusses its developer tools, as you’d expect. And it’s where previews of all the new OS versions are revealed, giving an early look at what’s coming to the iPhone, iPad, Mac and other Apple devices.

It’s possible Apple will reveal a new home device — a display-enabled HomePod — or even a new Apple Pencil with a calligraphy mode. But the biggest rumors so far suggest a new cross-OS redesign and renaming that could be Apple’s way of deflecting some attention away from not having big new AI features to show off.

OS by year: Will it be iOS 26?

Recent reports from Bloomberg’s often-correct Mark Gurman say Apple is going to ditch the existing numbered OS convention it’s used for years and instead label all of its annual OSes by year: Instead of iOS 19, we’ll have iOS 26, along with iPadOS 26, MacOS 26, WatchOS 26, TVOS 26 and VisionOS 26. Samsung made a similar move in 2020, jumping from the Galaxy S10 in 2019 to the Galaxy S20 in 2020.

Apple’s numbering has felt pretty disjointed as the numbers have gone ever-higher across multiple device categories. A yearly number would at least help people know if they’re on the current version. 

Glass as the new look

The WWDC invites, featuring a hazy transparent ring, hint at a reported redesign of all the company’s software to a new «glass» look. Bloomberg’s Gurman reported on a large incoming cross-OS design shift, calling it a dramatic redesign and one of the biggest Apple’s done in years. The design may mirror the Vision Pro’s VisionOS feel, which has lots of frosted glass panes, layers of transparency and circular app icons. Front Page Tech’s Jon Prosser showed a preview of the expected design based on information from his sources, and it definitely looks VisionOS-esque.

Beyond a coat of paint, will the OSes start to feel more similar in function too? I’m particularly curious about how iPadOS and MacOS start to close in on each other even more. Apple’s iPad has slowly inched toward acting like a computer, with features like Stage Manager for multitasking, and it’s felt inevitable that the tablet line would eventually provide a comparable experience to the MacBook.

WatchOS should get Apple Intelligence, and the Health app may be part of it

One of the devices that’s missed out on Apple Intelligence so far has been the Apple Watch, and that should be changing soon. Apple is expected to put more AI on the next Watch OS, which could help with message summaries, translation and maybe even composing messages. It could also bring overdue health and fitness upgrades. Reports say Apple could be working on adding generative AI insights to its Health app data and even using AI as a medical service, with a launch target of 2026. Health could possibly get a paid subscription tier, similar to Fitness and what many of Apple’s current services are adding. This could be like what Google is doing with Wear OS, which has long used Fitbit Premium as a health subscription (a broader Gemini rollout is on the way too). 

I like AI coaching and insights on a watch, but I don’t like subscriptions. We’ll see what happens, and if Apple gets into any of these future plans at this WWDC.

Battery life boosts

Another recent report (again, Gurman) says AI will help Apple improve battery life on its devices. How many devices? The iPhone, but hopefully the Apple Watch, too — these are the products in the lineup that I find I need to charge more than I’d like. For me, at least, iPads and Macs are mostly fine on battery life as is, but I’ll never refuse longer battery life for anything.

Apple has made gradual boosts to its battery features over time, but maybe there will be more intelligently applied power modes this time.

Game news?

Apple may be pushing the importance of games again, just as the Nintendo Switch 2 debuts. Bloomberg reports that the company could release a new app to act as a hub for games and game services including Apple Arcade, becoming an overdue overhaul of Game Center.

A number of game controller accessories, like Backbone, already have app hubs that function as game launchers, but Apple has never done much to help organize games on its devices in a way that feels more like what you find on a console. A new app seems like a good fit for those types of controllers, too.

Apple just acquired its first game studio: RAC7, the developers of hit Apple Arcade game Sneaky Sasquatch.

Apple could also have VR gaming news, if older reports come true: PlayStation VR 2 controllers have been expected to work with Vision Pro headsets, in a push to expand gaming on Apple’s VR/AR headset. Maybe that’ll be part of a push to get more developers onboard, as Apple could be readying a less expensive version of the Vision Pro in the next year. Right now the headset can’t compete with Meta’s more affordable Quest headsets in the gaming department.

AI: Live translation, and maybe Vision camera advancements

Apple opened up camera access to enterprise developers last year, and now it’s time for AI tools to emerge for everyone else — tools that could help describe what you’re seeing, or help you remember things too. Apple has already added assistive support for some camera-enabled functions on the Vision Pro and other products, suggesting more to come.

Though Apple’s WWDC keynote presentation isn’t expected to include many announcements of AI strides, the company still needs to compete with Google, OpenAI, Perplexity and many others that are making such strides. Reports say live translation will come to some AirPods models, which would mirror what Google and Meta have been doing on glasses, earbuds and phones.

The biggest VisionOS move I’d expect to see is some introduction of camera-aware AI. Apple Intelligence debuted on Apple’s VR/AR spatial computer headset earlier this year, but none of the AI can take advantage of the system’s cameras to «see» what you’re seeing. At least not yet. Google’s use of Gemini to access the cameras on upcoming headsets and glasses, and Meta’s support of camera access for Quest developers (and its expanding AI tools on Ray-Bans), suggest Apple needs to move this way now to begin paving a way for camera-aware AI to work on future headsets and eventually glasses. 

Apple Pencil

We could see either a brand-new Apple Pencil or updated features that make the current device feel new, according to a report from Bloomberg. Expect to see a new digital reed calligraphy pen feature unveiled. It’s unclear whether this new software will be for both the original Apple Pencil and the Apple Pencil 2, or if we’ll actually see a brand-new version of the stylus.

A new HomePod-slash-iPad?

There could be a new product emerging at WWDC: a look at a long-expected screen-enabled HomePod that may be part of a bigger push into smarter smart home tech. Reports suggest it’ll be something like a HomePod now — speaker-enabled, with an array of mics — but with a touchscreen. Would it be a screen big enough to act as a photo frame, or something more like a control panel? Where would this thing live, exactly? And what would it cost? Originally, reports of this device even suggested a robotic arm that would allow the screen to follow your face, but those plans seem to be off the table for now.

Of all the wild-card product ideas Apple could announce at this show, this seems the most likely.

WWDC/Gurman potpourri

There are, of course, a number of other rumors from Gurman. Here are some that caught our attention:

  • Messages app: iOS could get the ability to add backgrounds to chats and group chats.
  • iPadOS: Apple may reveal an iPadOS version of the Preview app.
  • iPadOS: MacOS-like multitasking might come to the iPad.
  • iPhone Camera app: The interface could get an overhaul focused on making it simpler to use.

We’ll know more soon

WWDC is happening June 9, with the keynote video presentation streaming at 10 a.m. Pacific. We’ll be there at Apple Park, too, covering it in person. We’ll know more about how all this software could be hinting at new products, and get a check-in on where exactly Apple is with its AI strategies. And maybe we’ll get a bit of product news, too — you never know.


