Technologies
Apple’s Mixed Reality Headset: What to Expect From WWDC’s Big Reveal
Long-awaited and still mysterious, Apple’s VR headset could be the spark for a whole new wave of hardware and software.
Apple’s next big product looks like it’ll cost $3,000, rest on your face and need to be tethered to a battery pack. Whatever this expected VR headset ends up being, it isn’t immediately clear what it’ll do or who it’s for. The Reality Pro headset, as it’s expected to be called, will likely be unveiled at Apple’s WWDC developer conference on June 5, making it Apple’s biggest new product in nearly a decade. It’s also totally different from anything Apple has made before.
VR headsets have been standard consumer tech for years, and your family, or a family you know, may already have one lying in a corner. They’re used for games, fitness, creative collaboration, even theater. Still, VR and AR have been outlier technologies, not deeply connected enough to the phones, tablets and laptops most of us use every day.
Apple could change that. And of course, don’t expect the word “metaverse” to be uttered even once. The metaverse became Meta’s buzzword to envision its future of AR and VR. Apple will have its own parallel, possibly unique, pitch.
A connection to everything?
I pair my Quest 2, from Meta, to my phone, and it gets my texts and notifications. I connect it to my Mac to cast extra monitors around my desk using an app called Immersed. But VR and AR don’t often feel deeply intertwined with the devices I use. They aren’t seamless in the way my watch feels when used with an iPhone, or AirPods feel when used with an iPad or Mac.
Apple needs this headset to bridge all of its devices, or at least make a good starting effort. Reports say the headset will run iPad apps on its built-in 4K displays, suggesting a common app ecosystem. It’s also possible that the Apple Watch could be a key peripheral, tracking fitness and also acting as a vibrating motion-control accessory.
VR is a self-contained experience, but mixed reality – which Apple’s headset should lean on heavily – uses pass-through cameras to blend virtual things with video of the real world. In Apple’s case, its own devices could act as spatially linked accessories, lending their keyboards and touchscreens and showing virtual screens springing from real ones.
Apple’s expected headset is supposed to be self-contained, a standalone device like the Quest 2 and Quest Pro. But that interconnectivity, and its position in Apple’s continuity-handoff connected ecosystem, is a big opportunity and a big question mark.
However, Apple does have a big AR head start: Its iOS ecosystem has supported AR for years, and the iPhone and iPad Pro already have depth-sensing lidar scanners that can map out rooms in ways that Apple’s headset should replicate. Apple could emphasize making its existing AR tools on other devices more usable and visible through a new interface.
Apple’s head of AR, Mike Rockwell – the person expected to be leading this new headset’s development – told me in a conversation about AR in 2020 that “AR has enormous potential to be helpful to folks in their lives across devices that exist today, and devices that may exist tomorrow, but we’ve got to make sure that it is successful. For us, the best way to do that is to enable our device ecosystem, so that it is a healthy and profitable place for people to invest their time and effort.”

The Quest Pro and other headsets already support hand tracking. Will Apple refine the technology?
How do we control it?
I’m less curious about the Apple headset display – which sounds extremely promising with a possible 4K resolution per eye and a Micro OLED display – and more focused on how Apple solves what we do with our hands.
Interfaces in VR and AR are very much a work in progress. VR has tended to lean on split game controllers for most inputs, with optional (and steadily improving) hand tracking that still isn’t perfected.
Apple isn’t expected to have any controller at all with its Reality Pro headset. Instead, it’ll likely use both eye tracking and hand tracking to create a more accurate and possibly streamlined style of interface that could make targeting intended actions feel faster. Eye tracking already works this way, sometimes, in headsets that use it: The PlayStation VR 2 has some games that use eye tracking for controlling menus.
Accessibility is a big question here. Apple’s design choices are often very accessibility-conscious, and VR and AR headsets often rely on eye movement or physical hand movements that aren’t always easy for everyone. Voice control is a possible option here, or maybe some Apple Watch-connected functions that improve gesture accuracy and offer some touch controls could be in the cards, too. I don’t know. Apple already added some gesture controls for accessibility purposes on the Apple Watch, so the door’s open.
A lot of hand gestures in VR feel complicated to me, and involve lots of movement. Can Apple make a gesture language that feels as intuitive and as easy as multitouch on iPhones and iPads? It’s a big hurdle.

Supernatural has been a popular VR fitness app for the Meta Quest 2 for years.
Fitness focus
VR has already been a surprisingly effective fitness tool for years. Apple could seize a whole bunch of opportunities to open that landscape further, though.
I’ve used Beat Saber and Supernatural on the Quest 2 for years as home exercise options, but the Quest 2 (and most VR headsets) aren’t designed with fitness in mind. Foam and silicone face pieces get sweaty, hardware can feel weirdly balanced, and no company has really spent targeted effort yet on making headgear that’s aimed at breathability and comfort like a piece of athletic equipment. There are plenty of third-party Quest accessories that help, but it still feels like an imperfect situation.
That’s Apple’s wheelhouse. Having designed the Apple Watch, AirPods and, most recently, the Watch Ultra’s new straps, Apple should find it achievable to conceive of materials and designs that feel better during workouts. If the Reality Pro feels like a better piece of workout gear, it could inspire others to invest in better designs, too.
Apple should, and could, integrate the Apple Watch and fitness and health tracking into the headset’s functions. The Quest 2 can do this too to some degree, but most smartwatches and fitness trackers, like Fitbit, don’t have deep connections with VR headsets yet. They should, and again, introducing a clear wearable relationship between watch and headset feels like an overdue bridge.
Of all the things I’m trying to imagine Apple positioning an expensive headset to be in people’s lives, a fitness device keeps coming to mind as a much more likely proposition than a gaming gadget. Not that many people own gym equipment, or have space for it. Could headsets fill that role? I think they could. For me, they already do, sometimes.
Will Apple just focus on making it a great wearable display?
I’m starting to wonder if maybe Apple’s first goal with Reality Pro is just to nail a great audio/video experience. I’ve thought of VR/AR glasses as eventually needing to be “earbuds for your eyes,” as easy to use and as good as headphones are now. The VR and AR headsets I’ve used all fall short of being perfect displays, with the exception of the highly expensive Varjo XR-3. Could Apple make the Reality Pro a headset that looks and sounds good enough to truly want to watch movies in?
Some reports that the Apple headset runs iPad apps, and that perhaps the iPad Pro with its lidar/camera array is in fact the “developer kit” for the headset, make me wonder if the headset will feel like a wearable extension of iOS rather than a whole new experience.

The inside of the Vive XR Elite: prescription adjustments allow a wide range of vision to fit… but not as wide as mine.
What about my glasses?
VR and AR headsets aren’t making it easy for me to live with my own eyewear. Some hardware fits right over my own chunky glasses, and some doesn’t. As headsets get smaller, a lot of them are trying to add vision-adjustment diopters right into the hardware – like the Vive XR Elite – or add optional prescription inserts.
Maybe someday we’ll have AR glasses that double as our own everyday glasses, and Apple can morph into a Warby Parker optical shop for its retail glasses fittings. In the meantime, these sometimes-on headsets also need to work without being annoying. Am I going to have to order prescription lenses? And how? And will they fit my needs? It’s a big responsibility for VR/AR manufacturers, and I’ve found that some of the insert options don’t meet my heavily near-sighted needs.
What are the killer apps?
Finally, of course, I’m curious about how this headset is defined. The Quest 2 is a game console with benefits. The Quest Pro was aimed at work. The PlayStation VR 2 is a PS5 extension.
The iPhone was a browser, an iPod and an email device at first. The iPad was pitched as an easy way to read and browse the web. The Apple Watch was a fitness device, an iPod and a wrist communicator. What will version one of the Apple mixed reality headset be positioned as?
Apple did pepper a ton of extras into the Apple Watch at first, almost to test the waters with possibilities: a camera remote, a virtual way to send love taps and scribbles, voice memos. Reports of an avatar-based FaceTime, multiscreen immersive sports, and maybe 3D immersive versions of Apple’s already 3D-enabled Maps are clear starts. Apple’s collaborative Freeform app could be pitched as a mixed reality workplace, and movies could be watched in a virtual theater, in a way that VR headsets have enabled for years (but maybe here with an even better display and audio). AR-enabled iPhone and iPad home improvement apps, 3D scanning apps, and games could be ported over, leaning on similar lidar-scanning AR functions in-headset. Apple fitness workouts, clearly, could be big. Gaming? With Arcade, or some early partners, sure.
Will any of these be enough? Will Apple define a category that has so far struggled to define itself beyond gaming? This first headset may not be the one most people buy, but it could be the one that maps out some clear directions for development. With Samsung and Google’s headset on the horizon, and possibly many more after that, these devices will keep reinventing themselves as they become more phone-connected and portable. Apple has an early chance at shaping that narrative… or, if it doesn’t, others will. We’ll likely know more, or at least get an early glimpse, at WWDC.
Technologies
Turns Out Perplexity Might Be the Sleeper Feature on Samsung’s Galaxy S26
Having Perplexity’s AI and models on devices from the world’s biggest phone-maker puts the company under a brighter light.
There were plenty of references to AI at today’s Galaxy Unpacked event. But Samsung isn’t alone; nearly every major smartphone launch in recent years has included new AI features or partnerships with AI companies.
Samsung launched its latest iteration of Galaxy AI, debuting it alongside Galaxy S26 phones. This follows weekend news that the company plans to integrate Perplexity’s AI agent — and even support a “Hey Plex” wake word — on its new phones. But the partnership appears to go beyond simply giving Samsung users another AI option.
Since late 2023, phone-makers have been leapfrogging one another to add generative AI features and integrate AI agents. Nearly every new Android phone supports Google’s Gemini assistant. Apple’s iPhones integrate OpenAI’s ChatGPT into the phone’s Visual Intelligence feature and its Siri overhaul will incorporate Google’s Gemini AI models.
While Perplexity has partnered with phone-makers such as Motorola to preload its app — and has been integrated into devices for Deutsche Telekom — having its AI and models built directly into phones from the world’s largest manufacturer puts the company on a much bigger stage. It marks a shift toward AI agents being just another tool people choose to use, much like a phone app.
“The first step toward an agentic mobile ecosystem is the user getting to choose whatever agent they want,” Dmitry Shevelenko, Perplexity’s chief business officer, told CNET. “I think this is where Samsung is taking a big, big leap forward.”
Perplexity’s Sonar API powers aspects of Samsung’s Galaxy AI ecosystem. Shevelenko said that the company’s engineers worked closely with Samsung’s team to revamp its Bixby assistant at the framework level, getting deep system access. He noted that it’s the first time a third-party AI company has achieved parity on a major mobile OS. The Galaxy S26 phones that Samsung announced support the new “Hey Plex” wake word, putting Perplexity shoulder-to-shoulder with Google’s Gemini AI assistant, which is integrated into Android on Samsung devices.
“What’s unique is the only other company that has it is Google, right?” said Shevelenko. “It’s a real paradigm shift for Samsung to be going into a multi-AI direction, where they are giving their users choice. And I think they see this as a strategic differentiator.”
Samsung’s inclusion of Perplexity touches many of the company’s own apps, including Calendar, Clock, Gallery, Notes and Reminders. The benefit of building Perplexity’s AI deeply into Samsung’s software is a lighter interaction with the phone: Instead of unlocking the device, navigating the home screen, opening an app and entering a query, people can simply press a button, say “Hey Plex” and start their search within seconds.
But the integration of Perplexity isn’t limited to Bixby. Shevelenko said Samsung’s browser, aptly named Internet, includes agentic browsing using Perplexity’s Comet technology as well.
Such a significant moment for Perplexity naturally draws parallels to Apple and its partnership with OpenAI, which has separately partnered with former Apple designer Jony Ive on its own hardware efforts. When I asked Shevelenko about the possibility of Perplexity making its own phone or hardware, he responded emphatically: “No.”
“We are laser-focused on working with all the best OEMs,” he said. “Our thing we’re world-class at is building accurate AI that is easy to use and delightful to use and growing that curiosity.”
Now that Samsung has announced its new phones, it’ll be interesting to see how Galaxy phone owners use their phones’ AI agents. Soon, people could say “Hey Google” into their Samsung devices to prompt Gemini, or “Hey Plex” to trigger a query with Perplexity. And options are usually a good thing.
Samsung did not immediately respond to a request for comment.
Technologies
ADT Acquires AI Company for Sensing People and Activity in Your Home
ADT’s acquisition of Origin AI brings presence-sensing technology under the home security company’s umbrella.
ADT on Tuesday announced an interesting new acquisition for anyone looking to the future of home security — and it’s no surprise AI is a part of the story. In a $170 million deal, ADT has purchased Origin AI, which specializes in people detection in spaces like the inside of your home, something the security company is calling AI-sensing technology.
ADT has not disclosed specific plans for AI technology, but this comes at a time when concerns about corporate surveillance by companies like Ring and Flock have reached a fever pitch.
«ADT has been testing and evaluating Origin’s technology pre-acquisition,» ADT Chief Business Officer Omar Kahn told me. «In 2026, the focus is on integrating the technology into ADT’s platform, with commercialization expected to begin in 2027.»
Presence sensing doesn’t sound like the chatty, summary-creating large language models we consider AI these days, nor the person and car recognition features companies like Flock use. It’s a system that analyzes home Wi-Fi frequencies for disruptions. The AI is trained in pattern recognition to identify which disruptions indicate that humans are at home (ignoring pets) and what they may be doing.
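ADT hasn’t published the details of Origin’s algorithms, but the general idea — watching a wireless channel for human-caused disturbance — can be sketched in a few lines. This toy Python example (the function name, threshold and sample data are all hypothetical, not Origin’s actual method) flags presence when Wi-Fi signal-strength readings in a sliding window vary more than a quiet room would:

```python
from statistics import pstdev

def detect_presence(rssi_samples, window=10, threshold=2.0):
    """Toy presence detector: a body moving through a room perturbs
    Wi-Fi signals, so high variance in a sliding window of received
    signal strength (dBm) suggests someone is moving nearby."""
    flags = []
    for i in range(len(rssi_samples) - window + 1):
        chunk = rssi_samples[i:i + window]
        flags.append(pstdev(chunk) > threshold)
    return flags

# A steady signal (empty room) vs. a fluctuating one (movement).
quiet = [-50.0 + 0.1 * (i % 2) for i in range(20)]
busy = [-45.0 if i % 3 == 0 else -53.0 for i in range(20)]

print(any(detect_presence(quiet)))  # prints False
print(any(detect_presence(busy)))   # prints True
```

Real systems reportedly analyze far richer channel-state data and use trained models to distinguish people from pets, but the principle is the same: classify disturbance patterns rather than capture images.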
The technology has cropped up in many spots over the past couple of years. I’ve seen it before with aging-in-place technology and Philips Hue’s newest smart bulbs, but most recently with Aqara’s sensor at CES 2026, which can detect when multiple people are congregating, standing, sitting or lying down.
How does presence sensing affect people’s privacy?
It’s not clear how ADT will use Origin’s presence sensing in its home security systems, though the company did mention smart automation, personalization and reducing false alarms. In one example, it could automatically adjust an ADT-supported thermostat when multiple people are detected moving around a house. But that also raises privacy questions.
Presence sensing, like Origin’s tech, has certain privacy benefits. It doesn’t use cameras to film anyone or save video recordings of people, and it doesn’t create identity profiles based on someone’s face or other data. It can’t tell who is in a house, only where they are and how/when they are moving around (or not moving).
That allows for capabilities such as notifying a nursing home that a resident hasn’t gotten out of bed when they usually do, without invasive investigation. But the technology also raises privacy concerns: A company could know when people in their own home are in bed, watching TV, or sitting to eat dinner, even if it can’t identify them by name.
ADT calls features like these home awareness, but also mentions municipal compliance and coordination with first responders. That could mean giving firefighters information on how many people are in a burning building. But there are concerns. Recent news reports indicate that some local law enforcement agencies have shared information with US Immigration and Customs Enforcement for use in home and apartment raids, raising the possibility that the technology could be applied in similar contexts.
The technology’s implications may ultimately hinge on how ADT chooses to implement and regulate it. Until those details are clearer, its promise and its risks remain closely intertwined.
Technologies
New York Times Debuts the Midi Crossword, Its In-Between Puzzle
Is the Mini Crossword too easy, but the original one just too time-consuming? Here’s your new puzzle.
The daily New York Times Mini Crossword can be solved in a minute or so, while the newspaper’s iconic original crossword puzzle might take hours. Now, puzzlers who want an in-between diversion can try a new puzzle from the Times, introduced this week — the Midi Crossword puzzle. (And CNET readers can get daily answers for five Times puzzles — Wordle, Connections, Strands, Connections: Sports Edition and the Mini Crossword.)
New York Times Games subscribers can play the Midi in the New York Times Games app for iOS and Android devices, or on mobile or desktop web. It’s online-only, not in the print newspaper.
“We’re really leaning into the digital-first nature of the puzzle,” NYT Games Puzzle Editor Ian Livengood said in a Times article about the new puzzle. “About once a week, the puzzle will have a visual effect — an extra flourish when you start or after you solve. This could be a cool animation or colorful shading.”
As the name “Midi” suggests, this is a mid-size crossword puzzle. Where the Mini Crossword usually has only 5 Across and 5 Down clues, the Midi is usually a 9-by-9 grid, sometimes as large as 11-by-11.
“If you feel like the Mini is not enough but the Daily is too much, this will be the perfect puzzle for you,” Livengood said.
Each Midi Crossword has a theme that hints at the topics of the clues and answers. Livengood says that, unlike the other puzzles, the Midi might occasionally include two-letter words and repeated answers.
I tried the Midi Crossword
I tried Wednesday’s Midi Crossword and solved it in just over 3 minutes. That’s much longer than I spend on the Mini Crossword, but much faster than the original New York Times crossword puzzle takes me.
I thought most of the clues were pretty simple, and the few tricky ones filled themselves in once I moved from Across to Down.
If you’re a New York Times Games subscriber, this is a nice addition to your daily puzzle stable. It tests your mind a bit more than the Mini, but you can also solve it while watching TV or waiting for someone to text you back.