Apple’s Mixed Reality Headset: What to Expect From WWDC’s Big Reveal
Long-awaited and still mysterious, Apple’s VR headset could be the spark for a whole new wave of hardware and software.
Apple’s next big product looks like it’ll cost $3,000, rest on your face and need to be tethered to a battery pack. Whatever this expected VR headset ends up being, it isn’t immediately clear what it’ll do or who it’s for. The Reality Pro headset, as it’s expected to be called when it’s likely unveiled at Apple’s WWDC developer conference on June 5, is Apple’s biggest new product in nearly a decade. It’s also totally different from anything Apple has made before.
VR headsets have been a standard consumer tech thing for years, and your family, or families you know, may already have one lying in a corner. They’re used for games, fitness, creative collaboration, even theater. Still, VR and AR have been outlier technologies, not deeply connected enough to the phones, tablets and laptops most of us use every day.
Apple could change that. And of course, don’t expect the word "metaverse" to be uttered even once. The metaverse became Meta’s buzzword for its envisioned future of AR and VR. Apple will have its own parallel, possibly unique, pitch.
A connection to everything?
I pair my Quest 2, from Meta, to my phone, and it gets my texts and notifications. I connect it to my Mac to cast extra monitors around my desk using an app called Immersed. But VR and AR don’t often feel deeply intertwined with the devices I use. They aren’t seamless in the way my watch feels when used with an iPhone, or AirPods feel when used with an iPad or Mac.
Apple needs this headset to bridge all of its devices, or at least make a good starting effort. Reports say the headset will run iPad apps on its built-in 4K displays, suggesting a common app ecosystem. It’s also possible that the Apple Watch could be a key peripheral, tracking fitness and also acting as a vibrating motion-control accessory.
VR is a self-contained experience, but mixed reality – which Apple’s headset should lean on heavily – uses pass-through cameras to blend virtual things with video of the real world. In Apple’s case, its own devices could act as spatially linked accessories, lending their keyboards and touchscreens and offering ways to show virtual screens springing from real ones.
Apple’s expected headset is supposed to be self-contained, a standalone device like the Quest 2 and Quest Pro. But that interconnectivity, and its place in Apple’s Continuity- and Handoff-connected ecosystem, is a big opportunity and a big question mark.
However, Apple does have a big AR head start: Its iOS ecosystem has supported AR for years, and the iPhone and iPad Pro already have depth-sensing lidar scanners that can map out rooms in ways that Apple’s headset should replicate. Apple could emphasize making its existing AR tools on other devices more usable and visible through a new interface.
Apple’s head of AR, Mike Rockwell – the person expected to be leading this new headset’s development – told me in a conversation about AR in 2020 that "AR has enormous potential to be helpful to folks in their lives across devices that exist today, and devices that may exist tomorrow, but we’ve got to make sure that it is successful. For us, the best way to do that is to enable our device ecosystem, so that it is a healthy and profitable place for people to invest their time and effort."

The Quest Pro and other headsets already support hand tracking. Will Apple refine the technology?
How do we control it?
I’m less curious about the Apple headset’s display – which sounds extremely promising, with a possible 4K resolution per eye on Micro OLED screens – and more focused on how Apple solves what we do with our hands.
Interfaces in VR and AR are very much a work in progress. VR has tended to lean on split game controllers for most inputs, with optional (and steadily improving) hand tracking that still isn’t perfected.
Apple isn’t expected to have any controller at all with its Reality Pro headset. Instead, it’ll likely use both eye tracking and hand tracking to create a more accurate and possibly streamlined style of interface that could make targeting intended actions feel faster. Eye tracking already works this way, sometimes, in headsets that use it: The PlayStation VR 2 has some games that use eye tracking for controlling menus.
Accessibility is a big question here. Apple’s design choices are often very accessibility-conscious, and VR and AR headsets often rely on eye movement or physical hand movements that aren’t always easy for everyone. Voice control is a possible option here, or maybe some Apple Watch-connected functions that improve gesture accuracy and offer some touch controls could be in the cards, too. I don’t know. Apple already added some gesture controls for accessibility purposes on the Apple Watch, so the door’s open.
A lot of hand gestures in VR feel complicated to me, and involve lots of movement. Can Apple make a gesture language that feels as intuitive and as easy as multitouch on iPhones and iPads? It’s a big hurdle.

Supernatural has been a popular VR fitness app for the Meta Quest 2 for years.
Fitness focus
VR has already been a surprisingly effective fitness tool for years. But there’s a whole bunch of opportunities Apple could address that would open the landscape a lot further.
I’ve used Beat Saber and Supernatural on the Quest 2 for years as home exercise options, but the Quest 2 (and most VR headsets) aren’t designed with fitness in mind. Foam and silicone face pieces get sweaty, hardware can feel weirdly balanced, and no company has really spent targeted effort yet on making headgear that’s aimed at breathability and comfort like a piece of athletic equipment. There are plenty of third-party Quest accessories that help, but it still feels like an imperfect situation.
That’s Apple’s wheelhouse. Having designed the Apple Watch, AirPods and, most recently, the Watch Ultra’s new straps, Apple should be able to come up with materials and designs that feel better during workouts. If the Reality Pro feels like a better piece of workout gear, it could inspire others to invest in better designs, too.
Apple should, and could, integrate the Apple Watch and its fitness and health tracking into the headset’s functions. The Quest 2 can do this to some degree, too, but most smartwatches and fitness trackers, like Fitbit, don’t have deep connections with VR headsets yet. They should, and introducing a clear relationship between watch and headset feels like an overdue bridge.
Of all the things I’m trying to imagine Apple positioning an expensive headset to be in people’s lives, a fitness device keeps coming to mind as a much more likely proposition than a gaming gadget. Not that many people own gym equipment, or have space for it. Could headsets fill that role? I think they could. For me, they already do, sometimes.
Will Apple just focus on making it a great wearable display?
I’m starting to wonder if maybe Apple’s first goal with Reality Pro is just to nail a great audio/video experience. I’ve thought of VR/AR glasses as eventually needing to be "earbuds for your eyes," as easy to use and as good as headphones are now. The VR and AR headsets I’ve used all fall short of being perfect displays, with the exception of the highly expensive Varjo XR-3. Could Apple make the Reality Pro a headset that looks and sounds good enough that you truly want to watch movies in it?
Some reports that the Apple headset runs iPad apps, and that perhaps the iPad Pro with its lidar/camera array is in fact the «developer kit» for the headset, make me wonder if the headset will feel like a wearable extension of iOS rather than a whole new experience.

The inside of the Vive XR Elite: prescription adjustments allow a wide range of vision to fit… but not as wide as mine.
What about my glasses?
VR and AR headsets aren’t making it easy for me to live with my own eyewear. Some hardware fits right over my own chunky glasses, and some doesn’t. As headsets get smaller, a lot of them are trying to add vision-adjustment diopters right into the hardware – like the Vive XR Elite – or add optional prescription inserts.
Maybe someday we’ll have AR glasses that double as our own everyday glasses, and Apple can morph into a Warby Parker optical shop for its retail glasses fittings. In the meantime, these sometimes-on headsets also need to work without being annoying. Am I going to have to order prescription lenses? And how? And will they fit my needs? It’s a big responsibility for VR/AR manufacturers, and I’ve found that some of the insert options don’t meet my heavily near-sighted needs.
What are the killer apps?
Finally, of course, I’m curious about how this headset is defined. The Quest 2 is a game console with benefits. The Quest Pro was aimed at work. The PlayStation VR 2 is a PS5 extension.
The iPhone was a browser, an iPod and an email device at first. The iPad was pitched as an easy way to read and browse the web. The Apple Watch was a fitness device, iPod and wrist communicator. What will version one of the Apple mixed reality headset be positioned as?
Apple did pepper a ton of extras into the Apple Watch at first, almost to test the waters with possibilities: a camera remote, a virtual way to send love taps and scribbles, voice memos. Reports of an avatar-based FaceTime, multiscreen immersive sports, and maybe 3D immersive versions of Apple’s already 3D-enabled Maps are clear starts. Apple’s collaborative Freeform app could be pitched as a mixed reality workplace, and movies could be watched in a virtual theater, in a way that VR headsets have enabled for years (but maybe here with an even better display and audio). AR-enabled iPhone and iPad home improvement apps, 3D scanning apps, and games could be ported over, leaning on similar lidar-scanning AR functions in-headset. Apple fitness workouts, clearly, could be big. Gaming? With Arcade, or some early partners, sure.
Will any of these be enough? Will Apple define a territory that, so far, has had a hard time defining itself beyond gaming? This first headset may not be the one most people buy, but it could be the one that tries to map out some clear directions for development beyond gaming. With Samsung and Google’s headset on the horizon, and possibly a lot more after that, these devices will start to reinvent themselves as they become more phone-connected and portable. Apple has an early chance at shaping that narrative… or, if it doesn’t, others will get their chance after Apple. We’ll likely know more, or at least get an early glimpse, at WWDC.
Today’s NYT Connections Hints, Answers and Help for Dec. 14, #917
Here are some hints and the answers for the NYT Connections puzzle for Dec. 14, #917.
Looking for the most recent Connections answers? Click here for today’s Connections hints, as well as our daily answers and hints for The New York Times Mini Crossword, Wordle, Connections: Sports Edition and Strands puzzles.
Today’s NYT Connections puzzle is an odd one in that the purple category, usually the toughest, was the easiest — if you know a certain group of fictional animals. If you need help sorting them into groups, you’re in the right place. Read on for clues and today’s Connections answers.
The Times now has a Connections Bot, like the one for Wordle. Go there after you play to receive a numeric score and to have the program analyze your answers. Players who are registered with the Times Games section can now nerd out by following their progress, including the number of puzzles completed, win rate, number of times they nabbed a perfect score and their win streak.
Read more: Hints, Tips and Strategies to Help You Win at NYT Connections Every Time
Hints for today’s Connections groups
Here are four hints for the groupings in today’s Connections puzzle, ranked from the easiest yellow group to the tough (and sometimes bizarre) purple group.
Yellow group hint: Butter up.
Green group hint: Like The Little Match Girl.
Blue group hint: Letter that makes no sound.
Purple group hint: Oink!
Answers for today’s Connections groups
Yellow group: Lay it on thick.
Green group: Hans Christian Andersen figures.
Blue group: Silent "L."
Purple group: Fictional pigs.
Read more: Wordle Cheat Sheet: Here Are the Most Popular Letters Used in English Words
What are today’s Connections answers?
The yellow words in today’s Connections
The theme is lay it on thick. The four answers are fawn, flatter, gush and praise.
The green words in today’s Connections
The theme is Hans Christian Andersen figures. The four answers are duckling, emperor, mermaid and princess.
The blue words in today’s Connections
The theme is silent "L." The four answers are calf, chalk, colonel and would.
The purple words in today’s Connections
The theme is fictional pigs. The four answers are Babe, Napoleon, Piglet and Porky.
Today’s NYT Strands Hints, Answers and Help for Dec. 14 #651
Here are hints and answers for the NYT Strands puzzle for Dec. 14, No. 651.
Looking for the most recent Strands answer? Click here for our daily Strands hints, as well as our daily answers and hints for The New York Times Mini Crossword, Wordle, Connections and Connections: Sports Edition puzzles.
Today’s NYT Strands puzzle may leave you wanting to make a reservation at a fancy restaurant. Some of the answers are difficult to unscramble, so if you need hints and answers, read on.
I go into depth about the rules for Strands in this story.
If you’re looking for today’s Wordle, Connections and Mini Crossword answers, you can visit CNET’s NYT puzzle hints page.
Read more: NYT Connections Turns 1: These Are the 5 Toughest Puzzles So Far
Hint for today’s Strands puzzle
Today’s Strands theme is: Pricy pairing.
If that doesn’t help you, here’s a clue: May I see the menu?
Clue words to unlock in-game hints
Your goal is to find hidden words that fit the puzzle’s theme. If you’re stuck, find any words you can. Every time you find three words of four letters or more, Strands will reveal one of the theme words. These are the words I used to get those hints, but any words of four or more letters that you find will work:
- FLOP, POLL, POLLS, RARE, CARE, HARE, SURE, SPAT, SPATS, PATS, CRUST, RUST
Answers for today’s Strands puzzle
These are the answers that tie into the theme. The goal of the puzzle is to find them all, including the spangram, a theme word that reaches from one side of the puzzle to the other. When you have all of them (I originally thought there were always eight but learned that the number can vary), every letter on the board will be used. Here are the nonspangram answers:
- CRAB, RIBEYE, SHRIMP, LOBSTER, SCALLOP, SIRLOIN
Today’s Strands spangram
Today’s Strands spangram is SURFANDTURF. To find it, start with the S that’s the far-left letter on the top row, and wind down.
Can My iPhone 17 Pro Match a 6K Cinema Camera? I Teamed Up With a Pro to Find Out
I put a video shoot together to see just how close an iPhone can get to a pro cinema setup.
The iPhone 17 Pro packs a powerful video setup with a trio of cameras, large image sensors (for a phone), the ProRes RAW codec and Log color profiles for advanced editing. That makes it one of the most powerful and dependable video shooters among today’s smartphones.
Apple often boasts about famous directors using the iPhone to shoot films and music videos. The company even records its event videos for new products with the iPhone.
But is the iPhone really good enough at shooting video to replace a traditional cinema camera? To see how good the iPhone 17 Pro is for professional use, I gave it a proper test.
I put together a video shoot where I pitted the $1,000 iPhone against a full professional cinema camera rig, worth thousands of dollars, to see just how well Apple’s phone can hold its own. I planned a video production at my favorite coffee roaster in Edinburgh, called Santu, which is based in a stunning building that I knew would look amazing on camera.
To give both cameras the best chance, I worked with Director of Photography Cal Hallows, who has been responsible for production on major shoots around the world, working with brands including Aston Martin, the BBC, IBM and Hilton Hotels.
Here’s what happened.
Our filming equipment
We didn’t use any external lenses with the iPhone; instead, we relied on the built-in main, ultrawide or telephoto cameras. I shot my footage using the Blackmagic Camera app, recording to a Crucial X10 external SSD since Apple’s ProRes RAW codec creates large files.
I also had a variable neutral density filter to achieve a consistent shutter speed. For some shots, I used Moment’s SuperCage to help give me a better grip — and therefore smoother footage. But for other shots, I just used the phone by itself to make it easier to get into tight spaces. More on that later.
The iPhone’s competition was the $3,300 Blackmagic Pyxis 6K, a professional cinema camera with a full-frame 6K image sensor and raw video capabilities. I paired it with some stunning pro cine lenses, including a set of Arles Primes and the XTract probe lens from DZO Film, plus a couple of choice cine primes from Sigma. It’s a formidable and pricey setup for any cinematographer.
The shoot day
We shot over the course of a single day. I’d already created a rough storyboard of the shots I wanted to get, which helped me plan my angles and lens choices. I wanted to try and replicate some angles directly with both cameras.
This shot of the store room being opened (above), for example, was a lovely scene, and I didn’t see much difference in quality between the iPhone’s video and the Blackmagic’s. That was the case with a few of the scenes we replicated. Apple’s ProRes RAW codec on the iPhone provided a lot of scope for adjusting the color, allowing us to create beautiful color grades that looked every bit as striking as the footage from the Blackmagic camera.
Sure, you could tell that they were different, but I couldn’t honestly say if one was better than the other.
Other shots were more difficult to replicate. I love this low-angle shot of the roastery owner, Washington, pulling his trolley through the scene. On the iPhone, the main lens wasn’t wide enough to capture everything we wanted, but switching to the ultrawide went too far the other way, and we ended up with spare gear and other people in the frame.
This made several shots a challenge to replicate, as the fixed zoom ranges of the iPhone simply didn’t translate to the same fields of view offered by our lenses on the Blackmagic camera. As a result, getting the right framing from the iPhone was trickier than I expected. But focal length wasn’t the only reason using "real" lenses was better.
The DZO Arles Primes are awesome cinema lenses with wide apertures that allowed us to shoot with gorgeous natural bokeh. We used this to our advantage on several shots where we really wanted the subject isolated against an out-of-focus background.
Secret weapons
That was especially the case when we used our secret weapon: the XTract probe lens from DZO Film. This bizarre-looking, long, thin lens offers a wide-angle perspective coupled with a very close focusing distance.
I loved using the probe lens for this shot in particular, where we focused on exactly the spot where Washington was using the bean grinder. I tried to replicate it on the iPhone using the close-focusing ultrawide lens, and the shot looks good, but it lacks the visual sophistication I can get from a big, professional camera, especially because the lack of background blur makes it easier to see distracting items stored under the counter that are otherwise "hidden" in the blur from the cinema camera.
But the iPhone has its own secret weapon, too: its size. Even with a filter and the SSD crudely taped to it, the iPhone is so small that we were able to get shots we simply couldn’t have achieved with the big cinema camera.
In particular, there’s the shot where I rigged the iPhone to an arm inside the cooling machine so that it traveled around as the beans were churned. I love this shot, along with a top-down view of the arms turning beneath. Both angles give the film an incredible energy, and I think they’re my favorite scenes of the whole production. It wasn’t easy to see the phone screen in these positions, but SmallRig’s wireless iPhone monitor made it much easier to get my angles just right. Trying to rig up a large, heavy camera and lens to get the same shots was simply out of the question.
How well did the iPhone compare?
I’m really impressed with both cameras on this project, but my expert Director of Photography, Cal, had some thoughts, too.
"The thing I really found with the iPhone," Cal explained, "was simply the creative freedom to get shots that I’d have never had time to set up. There’s only so long in a day and only so long you have access to filming locations or actors, so the fact that you can just grab your iPhone and get these shots is amazing."
"I have used my iPhone on professional shoots before. One time in particular was when I was driving away from set and I saw this great sunset. If I’d have spent time rigging up my regular camera, I’d have missed the sunset. So I shot it on my phone and the client loved it — it ended up being the final shot of the film. At the end of the day, a good shot is a good shot and it doesn’t matter what you shot it with," said Cal.
So was it all good for the iPhone?
"The depth of field and the overall look of the cinema lenses still come out on top — you’re just not going to get that on a phone," explained Cal. "When it came to grading the footage, I had to use a lot of little workarounds to get the iPhones to match. The quality quickly started to fall apart in certain challenging scenes that just weren’t a problem with the Blackmagic."
So it’s not a total win for the iPhone, but then, I never expected it to be. The iPhone was never going to replace the pro camera on this shoot, but it instead allowed us to augment our video with shots that we would otherwise never have gotten.
I love the creative angles we found using just the phone, and while Cal couldn’t balance its colors as easily, the footage fits in nicely with the rest of the video and makes it more dynamic and engaging as a result.
And that’s not to say the iPhone shots we didn’t end up using were bad. I’m actually impressed with how the iPhone handled most of the things we threw at it.
So don’t assume that if you want to get into filmmaking, you need to drop tens of thousands on a pro cinema camera and a set of cine primes. Your iPhone has everything you need to get started, and it’ll let you flex your creativity much more easily.
Our days of shooting, editing and grading have proven that the iPhone isn’t yet ready to be the only camera you need on a professional set. But mix it in with your other cameras, taking advantage of its small size, and you’ve got yourself a truly powerful production setup.