Technologies

Apple’s Mixed Reality Headset: What to Expect From WWDC’s Big Reveal

Long-awaited and still mysterious, Apple’s VR headset could be the spark for a whole new wave of hardware and software.

Apple’s next big product looks like it’ll cost $3,000, rest on your face and need to be tethered to a battery pack. Whatever this expected VR headset ends up being, it isn’t immediately clear what it’ll do or who it’s for. The Reality Pro, as the headset is expected to be called, will likely be unveiled at Apple’s WWDC developer conference on June 5, and it’s Apple’s biggest new product in nearly a decade. It’s also totally different from anything Apple has made before.

VR headsets have been a standard consumer tech thing for years, and your family, or families you know, may already have one lying in a corner. They’re used for games, fitness, creative collaboration, even theater. Still, VR and AR have been outlier technologies, not deeply connected enough to the phones, tablets and laptops most of us use every day.

Apple could change that. And of course, don’t expect the word “metaverse” to be uttered even once. The metaverse became Meta’s buzzword for its envisioned future of AR and VR; Apple will have its own parallel, possibly unique, pitch.

A connection to everything?

I pair my Quest 2, from Meta, to my phone, and it gets my texts and notifications. I connect it to my Mac to cast extra monitors around my desk using an app called Immersed. But VR and AR don’t often feel deeply intertwined with the devices I use. They aren’t seamless in the way my watch feels when used with an iPhone, or AirPods feel when used with an iPad or Mac.

Apple needs this headset to bridge all of its devices, or at least make a good starting effort. Reports say the headset will run iPad apps on its built-in 4K displays, suggesting a common app ecosystem. It’s also possible that the Apple Watch could be a key peripheral, tracking fitness and also acting as a vibrating motion-control accessory. 

VR is a self-contained experience, but mixed reality – which Apple’s headset should lean on heavily – uses pass-through cameras to blend virtual objects with video of the real world. In Apple’s case, its own devices could act as spatially linked accessories, lending their keyboards and touchscreens or anchoring virtual screens that spring from real ones.

Apple’s expected headset is supposed to be self-contained, a standalone device like the Quest 2 and Quest Pro. But that interconnectivity, and its position in Apple’s continuity-handoff connected ecosystem, is a big opportunity and a big question mark.

However, Apple does have a big AR head start: Its iOS ecosystem has supported AR for years, and the iPhone and iPad Pro already have depth-sensing lidar scanners that can map out rooms in ways that Apple’s headset should replicate. Apple could emphasize making its existing AR tools on other devices more usable and visible through a new interface.

Apple’s head of AR, Mike Rockwell – the person expected to be leading this new headset’s development – told me in a conversation about AR in 2020 that “AR has enormous potential to be helpful to folks in their lives across devices that exist today, and devices that may exist tomorrow, but we’ve got to make sure that it is successful. For us, the best way to do that is to enable our device ecosystem, so that it is a healthy and profitable place for people to invest their time and effort.”

Quest Pro VR headset, being worn while sitting at a desk

The Quest Pro and other headsets already support hand tracking. Will Apple refine the technology?

Meta

How do we control it?

I’m less curious about the Apple headset’s display – which sounds extremely promising, with a possible 4K micro-OLED panel per eye – and more focused on how Apple solves what we do with our hands.

Interfaces in VR and AR are very much a work in progress. VR has tended to lean on split game controllers for most inputs, with optional (and steadily improving) hand tracking that still isn’t perfected. 

Apple isn’t expected to have any controller at all with its Reality Pro headset. Instead, it’ll likely use both eye tracking and hand tracking to create a more accurate and possibly streamlined style of interface that could make targeting intended actions feel faster. Eye tracking already works this way, sometimes, in headsets that use it: The PlayStation VR 2 has some games that use eye tracking for controlling menus.

Accessibility is a big question here. Apple’s design choices are often very accessibility-conscious, and VR and AR headsets often rely on eye movement or physical hand movements that aren’t always easy for everyone. Voice control is a possible option here, or maybe some Apple Watch-connected functions that improve gesture accuracy and offer some touch controls could be in the cards, too. I don’t know. Apple already added some gesture controls for accessibility purposes on the Apple Watch, so the door’s open.

A lot of hand gestures in VR feel complicated to me, and involve lots of movement. Can Apple make a gesture language that feels as intuitive and as easy as multitouch on iPhones and iPads? It’s a big hurdle.

A woman wearing a VR headset punches an object, shattering it

Supernatural has been a popular VR fitness app for the Meta Quest 2 for years.

Within

Fitness focus

VR has already been a surprisingly effective fitness tool for years, and Apple could open that landscape a lot further.

I’ve used Beat Saber and Supernatural on the Quest 2 for years as home exercise options, but the Quest 2 (and most VR headsets) aren’t designed with fitness in mind. Foam and silicone face pieces get sweaty, hardware can feel weirdly balanced, and no company has really spent targeted effort yet on making headgear that’s aimed at breathability and comfort like a piece of athletic equipment. There are plenty of third-party Quest accessories that help, but it still feels like an imperfect situation.

That’s Apple’s wheelhouse. Having designed the Apple Watch, AirPods and, most recently, the Watch Ultra’s new straps, Apple seems well positioned to come up with materials and a design that feel better during workouts. If the Reality Pro feels like a better piece of workout gear, it could inspire others to invest in better designs, too.

Apple should, and could, integrate the Apple Watch and its fitness and health tracking into the headset’s functions. The Quest 2 can do this to some degree, but most smartwatches and fitness trackers, like Fitbit, don’t have deep connections with VR headsets yet. They should, and a clear wearable relationship between watch and headset feels like an overdue bridge.

Of all the things I’m trying to imagine Apple positioning an expensive headset to be in people’s lives, a fitness device keeps coming to mind as a much more likely proposition than a gaming gadget. Not that many people own gym equipment, or have space for it. Could headsets fill that role? I think they could. For me, they already do, sometimes.

Will Apple just focus on making it a great wearable display?

I’m starting to wonder if maybe Apple’s first goal with Reality Pro is just to nail a great audio/video experience. I’ve thought of VR/AR glasses as eventually needing to be “earbuds for your eyes,” as easy to use and as good as headphones are now. The VR and AR headsets I’ve used all fall short of being perfect displays, with the exception of the highly expensive Varjo XR-3. Could Apple make the Reality Pro a headset that looks and sounds good enough to truly want to watch movies in?

Some reports that the Apple headset runs iPad apps, and that perhaps the iPad Pro with its lidar/camera array is in fact the «developer kit» for the headset, make me wonder if the headset will feel like a wearable extension of iOS rather than a whole new experience.

Inside a pair of VR goggles, showing lenses and a dial with numbers on the outside

The inside of the Vive XR Elite: prescription adjustments allow a wide range of vision to fit… but not as wide as mine.

Scott Stein/CNET

What about my glasses?

VR and AR headsets aren’t making it easy for me to live with my own eyewear. Some hardware fits right over my own chunky glasses, and some doesn’t. As headsets get smaller, a lot of them are trying to add vision-adjustment diopters right into the hardware – like the Vive XR Elite – or add optional prescription inserts.

Maybe someday we’ll have AR glasses that double as our own everyday glasses, and Apple can morph into a Warby Parker optical shop for its retail glasses fittings. In the meantime, these sometimes-on headsets also need to work without being annoying. Am I going to have to order prescription lenses? And how? And will they fit my needs? It’s a big responsibility for VR/AR manufacturers, and I’ve found that some of the insert options don’t meet my heavily near-sighted needs.

What are the killer apps?

Finally, of course, I’m curious about how this headset is defined. The Quest 2 is a game console with benefits. The Quest Pro was aimed at work. The PlayStation VR 2 is a PS5 extension. 

The iPhone was a browser, an iPod and an email device at first. The iPad was pitched as an easier way to read and browse the web. The Apple Watch was a fitness device, an iPod and a wrist communicator. What will version one of the Apple mixed reality headset be positioned as?

Apple did pepper a ton of extras into the Apple Watch at first, almost to test the waters with possibilities: a camera remote, a virtual way to send love taps and scribbles, voice memos. Reports of an avatar-based FaceTime, multiscreen immersive sports, and maybe 3D immersive versions of Apple’s already 3D-enabled Maps are clear starts. Apple’s collaborative Freeform app could be pitched as a mixed reality workplace, and movies could be watched in a virtual theater, in a way that VR headsets have enabled for years (but maybe here with an even better display and audio). AR-enabled iPhone and iPad home improvement apps, 3D scanning apps, and games could be ported over, leaning on similar lidar-scanning AR functions in-headset. Apple fitness workouts, clearly, could be big. Gaming? With Arcade, or some early partners, sure.

Will any of these be enough? Will Apple define a territory that so far has had a hard time defining itself beyond gaming? This first headset may not be the one most people buy, but it could be the one that maps out clear directions for development beyond games. With Samsung and Google’s headset on the horizon, and possibly a lot more after that, these devices will start to reinvent themselves as they become more phone-connected and portable. Apple has an early chance to shape that narrative… and if it doesn’t, others will. We’ll likely know more, or at least get an early glimpse, at WWDC.

The Fastest Way to Open Any App Is Hiding on the Back of Your iPhone

Your iPhone’s Back Tap feature can be customized to open any app.

Tapping the screen on an iPhone opens an app. What does tapping on the back of your phone do? A number of things, it turns out. It’s a genuinely useful trick you’ve likely been missing out on: the fastest way to launch the camera or open a specific app without hunting through folders, and a simple way to make your hardware work harder for you without touching the display.

The feature is part of the Back Tap tool in your iPhone’s accessibility settings. Once enabled, it can trigger almost anything your phone can do, from turning on the flashlight to opening Shazam before a song ends. You can even set it to open the Control Center, take a screenshot or run a custom Shortcut with two or three quick taps. It’s fast, discreet and surprisingly powerful once you set it up.

Like the Action Button on newer iPhones, Back Tap gives you one more way to use your device without touching the screen. You can activate it by tapping anywhere on the back of your phone, including on the camera module. Best of all, it works even if you have a fairly thick case on your iPhone.

Back Tap is available on iPhones as old as the iPhone 8, as long as they’re running iOS 14 or later. We’ll show you how to enable it and how to use it with your Shortcuts app for nearly endless possibilities.

Read more: All the Ways the iPhone 16’s Camera Control Button Will Change Your iPhone Photography

What is the iPhone Back Tap feature?

Back Tap is an iPhone feature introduced in iOS 14. It lets you perform shortcuts on your iPhone by double- or triple-tapping on the back of the device.

You can customize Back Tap on your iPhone to easily perform common actions like pulling up the Control Center or Notification Center, especially useful if you have a larger phone and can’t swipe down from the top of the screen without some complex finger gymnastics. You can even have two separate functions enabled at the same time: Back Tap can distinguish between a Double Tap and a Triple Tap.

Depending on the number of times you touch the back of your iPhone, you can set Double Tap to open your Notification Center and Triple Tap to take a screenshot. Or, you can make Double Tap open the Control Center and Triple Tap launch the Magnifier app. Experiment with Back Tap to find the right combinations of taps and functions that best fit your needs.

And you aren’t limited to just the Back Tap options that are available by default. Thanks to the Shortcuts app, you can set up Back Tap to perform specific functions or launch any app. For example, you can create a simple shortcut that opens Shazam or starts a voice recording, then activate it with a quick Double Tap or Triple Tap. You can also use Back Tap to trigger a more elaborate shortcut, such as automatically sending photos and videos to specific photo albums.

How do I set up Back Tap on my iPhone?

To enable Back Tap, open the Settings app, then go to Accessibility > Touch > Back Tap. There, you’ll find a list of options for configuring Double Tap and Triple Tap.

Here is the full list of functions that you can map to a Double Tap or Triple Tap:

  • None
  • Accessibility Shortcut

System

  • App Switcher
  • Camera
  • Control Center
  • Flashlight
  • Home
  • Lock Rotation
  • Lock Screen
  • Mute
  • Notification Center
  • Reachability
  • Screenshot
  • Shake
  • Spotlight
  • Volume Down
  • Volume Up

Accessibility

  • AssistiveTouch
  • Background Sounds
  • Classic Invert
  • Color Filters
  • Control Nearby Devices
  • Dim Flashing Lights
  • Live Captions
  • Live Speech
  • Magnifier
  • Smart Invert
  • Speak Screen
  • VoiceOver
  • Zoom
  • Zoom Controller

Scroll Gestures

  • Scroll Down
  • Scroll Up

At the bottom of the menu, you’ll also see a list of Shortcuts. These options will vary depending on what’s available in your Shortcuts app.

The one potential downside to Back Tap is that there’s no tactile feedback when you use it, so you might trigger it accidentally and not realize it until later. For instance, you might double-tap without meaning to and set off your flashlight. In that case, you might want to remap your Double Tap to a less disruptive function. Or, you can leave Double Tap off and use only Triple Tap, which you’re less likely to trigger by accident.

How do I use Back Tap to take a quick photo?

One way to set up Back Tap is to map Double Tap to the Camera and Triple Tap to Volume Up or Volume Down. Because you can press either of the volume buttons to instantly take a picture, you can get the same effect if your volume buttons are mapped to Back Tap. With this combination, you can capture a photo with five quick taps on the back of your iPhone (though you’ll have to pause briefly between performing the Double Tap and Triple Tap, so that your phone can distinguish between the two actions).

This Back Tap combination even works if your phone is locked. Again, spend some time trying out different combinations of taps and features to find which ones are most useful for you.

Social Media and AI Want Your Attention at All Times. This New Documentary Says That’s Bad

Your Attention Please, a documentary premiering this week at SXSW in Austin, Texas, explores how we live in the attention economy.

“Do you remember the world before cellphones?”

The question comes early in Your Attention Please, a documentary premiering this week at South by Southwest in Austin, Texas. And it hit me harder than I expected. As a 27-year-old tech reporter, I realized I don’t have too many clear memories of life before smartphones. My adolescence unfolded alongside the rise of smartphones, social media, push notifications and the routine of endless scrolling. Like many people my age, I’ve spent most of my life inside the attention economy — without ever really stepping outside it.

That’s the uneasy territory the documentary explores. 

CNET was given exclusive early access to the film’s trailer, embedded below.

Exploring how tech shapes our behavior

Director Sara Robin said she originally set out to make something smaller: a documentary about people trying to reclaim their attention by breaking unhealthy phone habits. In an interview with CNET, Robin described the idea as a personal story about focus and self-control in an age of constant distraction.

As Robin interviewed researchers, technologists and families affected by social media and cyberbullying, the film’s scope widened. What started as a question about individual habits quickly became a larger investigation into how modern technology systems are designed to shape human behavior. The story stretches from the rise of social media to the emerging influence of AI. 

Along the way, Robin and her collaborators kept hearing the same observation from different corners of the digital world: Social media didn’t just change how people communicate; it quietly rewired what we value. Experiences that were once private or emotional — friendship, affection, belonging — began to acquire numerical equivalents. Followers, likes, comments, views and shares became measures of our own self-worth. In the architecture of social platforms, those numbers function as a kind of social currency.

Trisha Prabhu, a digital-safety advocate and inventor of the anti-cyberbullying technology ReThink, argues that social platforms did more than create new online spaces. She says they fundamentally reshaped how social validation works. The metrics that define popularity often reward attention-seeking behavior and amplify conflict, while genuine connection is now harder to quantify and, therefore, easier to overlook.

Prabhu warns that the same dynamics already driving problems like cyberbullying could accelerate as automated systems become more capable. AI tools can generate abusive messages at scale, produce convincing impersonations or create deepfakes that spread rapidly online. In some cases, the technology may even blur the line between human interaction and machine-generated communication, which could deepen loneliness or encourage harmful behavior.

“There’s AI exacerbating existing harms [like automating cyberbullying], but then I also think that there’s AI creating completely new harms,” Prabhu told CNET. “There are reports of AI tools encouraging users, including minor users, to commit self-harm… Even for the everyday user who’s not experiencing the extreme outcome, I think we have to ask ourselves how much of our time and connection we want spent with an AI tool as opposed to a fellow human being.”

Bringing attention to attention

What struck Robin while filming the documentary was how universal these anxieties felt. Across conversations with families, educators and advocates around the world, the themes were remarkably consistent: overstimulated attention, declining focus in classrooms, rising anxiety among young people and a persistent sense of dread that comes from always being plugged in.

Those shared concerns have helped spark a coordinated moment around the film’s release.

On March 11, more than 25 organizations focused on digital well-being will simultaneously release the trailer for Your Attention Please as part of an initiative called Stand for Their Attention. What began as a small collaboration among five groups quickly grew as word spread through advocacy networks. The coalition now includes organizations such as Common Sense Media, Protect Young Eyes, Mothers Against Media Addiction, the Center for Humane Technology, Smartphone Free Childhood and Scrolling to Death. 

The idea behind the synchronized launch is simple: Use the attention surrounding the documentary to highlight the growing movement that’s already working to reshape digital culture. 

Many people feel overwhelmed by the scale of the problem, Robin says, but behind the scenes, a widening ecosystem of advocates is experimenting with ways to build healthier digital environments, from redesigning products to changing norms around screen use.

The campaign also arrives at a moment of growing scrutiny around the attention economy. Lawmakers in the US and abroad are increasingly debating how social platforms affect youth mental health and childhood development. Boycotts around AI use are taking off. Researchers are studying how these algorithms and chatbots influence behavior. Individuals are trying to figure out how much technology belongs in everyday life.

What can we do about it? 

Despite the weight of those conversations, Robin says the goal of the film isn’t to leave audiences feeling powerless. In fact, the rapid rise of public awareness around AI has made her more optimistic than she was during the early days of social media. The systems shaping digital life, she argues, are built by people, which means they can also be rebuilt.

«We have more power than we think,» Robin said. «And there are a lot of different ways to get involved in this, from changing individual habits to changing the culture in your own family and in your community, designing technology differently, getting engaged in these conversations, all the way to pushing for legislative change.»

The film intentionally avoids presenting a single solution.

Instead, Your Attention Please asks a broader question: What happens when attention, one of the most human parts of our lives, becomes one of the most valuable commodities in the global economy? And perhaps more importantly, what kind of digital world do we want to build next?

Today’s NYT Connections: Sports Edition Hints and Answers for March 12, #535

Here are hints and the answers for the NYT Connections: Sports Edition puzzle for March 12, No. 535.

Looking for the most recent regular Connections answers? Click here for today’s Connections hints, as well as our daily answers and hints for The New York Times Mini Crossword, Wordle and Strands puzzles.


Today’s Connections: Sports Edition is a tough one, with some very unusual categories. The blue one is pretty fun, actually. If you’re struggling with today’s puzzle but still want to solve it, read on for hints and the answers.

Connections: Sports Edition is published by The Athletic, the subscription-based sports journalism site owned by The Times. It doesn’t appear in the NYT Games app, but it does appear in The Athletic’s own app, and you can also play it for free online.

Read more: NYT Connections: Sports Edition Puzzle Comes Out of Beta

Hints for today’s Connections: Sports Edition groups

Here are four hints for the groupings in today’s Connections: Sports Edition puzzle, ranked from the easiest yellow group to the tough (and sometimes bizarre) purple group.

Yellow group hint: City of Brotherly Love.

Green group hint: NBA star.

Blue group hint: Grr! Meow! Roar!

Purple group hint: Think alphabet.

Answers for today’s Connections: Sports Edition groups

Yellow group: Philadelphia teams.

Green group: Associated with Larry Bird.

Blue group: Sports figures with animal names.

Purple group: Sports figures whose first names sound like two letters.

Read more: Wordle Cheat Sheet: Here Are the Most Popular Letters Used in English Words

What are today’s Connections: Sports Edition answers?

The yellow words in today’s Connections

The theme is Philadelphia teams. The four answers are 76ers, Flyers, Penn and Temple.

The green words in today’s Connections

The theme is associated with Larry Bird. The four answers are Celtics, French Lick, Pacers and Sycamores.

The blue words in today’s Connections

The theme is sports figures with animal names. The four answers are Bear Bryant, Cat Osterman, Catfish Hunter and Tiger Woods.

The purple words in today’s Connections

The theme is sports figures whose first names sound like two letters. The four answers are Casey Stengel (KC), CeeDee Lamb (CD), Katie Ledecky (KT) and Vijay Singh (VJ).

Copyright © Verum World Media