

Apple’s Vision Pro Hands-On: This Is the VR Headset I’d Use to Watch 3D Avatar

I experienced incredible fidelity, surprising video quality and a really smooth interface. Apple’s first VR headset nails it.

I was in a movie theater last December watching Avatar: The Way of Water in 3D and I said to myself: "Wow, this is an immersive film I'd love to watch in next-gen VR." That's exactly what I just experienced in Apple's Vision Pro headset, and yeah, it's amazing.

I just tried out Vision Pro in a series of carefully picked demos at WWDC, at Apple's Cupertino headquarters. I've been using cutting-edge VR devices for years, and I found all sorts of augmented reality memories bubbling up in my brain. Apple's compact – but still not small – headset reminds me of an Apple-designed Meta Quest Pro. The back strap was comfy yet stretchy, with a dial to adjust the rear fit, plus a second strap over the top of my head.


I couldn't wear my glasses during the demo. Apple's headset doesn't support glasses, instead relying on custom Zeiss inserts to correct wearers' vision. Even so, Apple managed to quickly find lenses that matched my vision well enough that everything seemed crystal-clear, which is no easy task. We also adjusted the fit and tuned spatial audio for my head, a process that will be finessed when the headset releases in 2024.

From there, I did my demos mostly seated, and found myself surprised from the start. The passthrough video camera quality of this headset is good. Really, really good. Not as good as my own vision, but good enough that I could see the room well, see the people in it with me and easily read my watch notifications on my wrist. The only other headset that's come close is the Varjo XR-3, and Apple's display and cameras may rival even that one.

Apple's floating grid of apps appears when I press the top digital crown, which auto-centers the home screen to wherever I'm looking. I set up eye tracking, which worked like it does on many other VR headsets I've used: I looked at glowing dots as musical notes played, and heard a chime when calibration succeeded.

A list of apps as they would appear inside the Apple Vision Pro headset. (Apple)

From there, the interface was surprisingly fluid. Looking at icons or interface options slightly enlarges them, or changes their boldness. Tapping with my fingers while looking at something opens an app up. 

I’ve used tons of hand tracking technology on headsets like the Hololens 2 and Meta Quest 2 and Pro, and usually there’s a lot of hand motion. Here, I could be really lazy. I pinched to open icons even as my hand was resting in my lap, and it worked. 

Scrolling involves pinching and pulling with my fingers, again pretty easy to do. I resized windows by moving my hand to throw a window across the room or pin it closer to me. I opened multiple apps at once, including Safari, Messages, and Photos. It was easy enough to scroll around, although sometimes my eye tracking needed a bit of extra concentration to pull off.
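For the developers reading: the striking part is that none of this appears to require special gaze-and-pinch code. On visionOS, standard SwiftUI controls respond to the system's indirect input automatically. Here's a minimal sketch of a launcher-style window under that assumption; the view name and app list are my own invention, not Apple's sample code.

```swift
import SwiftUI

// Minimal visionOS window. Standard SwiftUI controls pick up the
// headset's indirect input on their own: looking at a button
// highlights it, and a pinch (even with a hand resting in your lap)
// registers as the tap. Names here are hypothetical.
struct LauncherView: View {
    @State private var opened: [String] = []

    var body: some View {
        VStack(spacing: 20) {
            ForEach(["Safari", "Messages", "Photos"], id: \.self) { name in
                Button(name) {
                    opened.append(name)   // fired by look + pinch
                }
                .hoverEffect()            // subtle gaze highlight, like the home grid
            }
            Text("Opened: \(opened.joined(separator: ", "))")
        }
        .padding()
    }
}
```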

I don't know how the Vision Pro will feel with keyboards and trackpads, since I didn't get to demo the headset that way, but it works with Apple's Magic Keyboard and Magic Trackpad, as well as Macs. It doesn't work with iPhone, iPad or Watch touchscreens…not now, at least.

Dialing in reality

I scrolled through some photos in Apple’s preset photo album, plus a few 3D photos and video clips shot with the Vision Pro’s 3D camera. All the images looked really crisp, and a panoramic photo that spread around me looked almost like it was a window on a landscape that extended just beyond the room I was in. 

Apple has volumetric 3D landscapes on the Vision Pro that are immersive backgrounds like 3D wallpaper, but looking at one really shows off how nice that Micro OLED display looks. A lake looked like it was rolling up to a rocky shore that ended right where the real coffee table in front of me was. 

A man works with a keyboard while wearing the Apple Vision Pro headset. (Apple/Screenshot by James Martin/CNET)

Raising my hands to my face, I saw how the headset separates my hands out from VR, a trick that’s already in ARKit. It’s a little rough around the edges but good enough. Similarly, there’s a wild new trick where anyone else in the room can ghost into view if you look at them, a fuzzy halo with their real passthrough video image slowly materializing. It’s meant to help create meaningful contact with people while wearing the headset. I also wondered how you could turn that off or tune it to be less present, but it’s a very new idea in mixed reality.

Apple’s digital crown, a small dial borrowed from the Apple Watch, handles reality blend. I could turn the dial to slowly extend the 3D panorama until it surrounded me everywhere, or dial it back so it just emerged a little bit like a 3D window. 
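Developers get a hook into this reality dial, too. In visionOS, an ImmersiveSpace opened with the progressive immersion style starts as a partial portal that the Digital Crown expands or shrinks, which matches what I experienced. The sketch below assumes that public API; the app and environment views are hypothetical stand-ins.

```swift
import SwiftUI

@main
struct LakesideApp: App {
    // The crown-adjustable style; .progressive starts as a portal.
    @State private var style: ImmersionStyle = .progressive

    var body: some Scene {
        WindowGroup {
            ContentView()
        }

        ImmersiveSpace(id: "lakeside") {
            LakesideEnvironmentView()
        }
        // With .progressive, the Digital Crown dials the environment
        // from a 3D-window sliver all the way to full surround.
        .immersionStyle(selection: $style, in: .progressive, .full)
        // Keep real hands visible over the virtual scene, as described above.
        .upperLimbVisibility(.visible)
    }
}

// Hypothetical 2D companion window.
struct ContentView: View {
    var body: some View {
        Text("Turn the Digital Crown to blend reality")
    }
}

// Hypothetical environment; a real app would load a RealityKit scene here.
struct LakesideEnvironmentView: View {
    var body: some View {
        EmptyView()
    }
}
```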

Cinematic fidelity that wowed me

The cinema demo was what really shocked me, though. I played a 3D clip of Avatar: The Way of Water in-headset, on a screen in various viewing modes, including a virtual cinema. Apple's mixed reality passthrough can also dim the rest of the world down a bit, somewhat like the Magic Leap 2 does with its AR. But the scenes of Way of Water sent little chills through me. It was vivid. This felt like a movie experience. I don't feel that way in other VR headsets.

Avatar: The Way of Water looked great in Vision Pro. (20th Century Studios)

Apple also demonstrated its Immersive Video format, which is coming as an extension to Apple TV Plus. It's a 180-degree video format, similar to what I've seen before in concept, but with really strong resolution and video quality. A splashy demo reel of Alicia Keys singing, sports events, documentary footage and more reeled off in front of me, a teaser of what's to come. 180-degree video is never quite as crisp to me as big-screen film content, but the sports clips I saw made me wonder how good virtual Jets games could be in the future. Things have come a long way.

Would I pay $3,499 for a head-worn cinema? No, but it's clearly one of this device's greatest unique strengths. The resolution and brightness of the display were surprising.

Convincing avatars (I mean personas)

Apple's Personas are 3D-scanned avatars generated by using the Vision Pro to scan your face, creating a version of yourself that can show up in FaceTime chats if you want, and also on the outside of the Vision Pro's curved OLED display to show whether you're "present" or in an app. I didn't see how that outer display worked, but I had a FaceTime call with someone in their Persona form, and it was good. Again, it looked surprisingly good.

I've seen Meta's ultra-realistic Codec Avatars, which aim for lifelike representations of people in VR. Those are stunning, and I also saw Meta's phone-scanned, step-down version in an early form last year, where a talking head spoke to me in VR. Apple's Persona looked better than Meta's phone-scanned avatar, although a bit fuzzy around the edges, like a dream. The woman whose Persona was scanned appeared in her own window, not in a full-screen form.

I also wondered how expressive the emotions can be with the Vision Pro's scanning cameras. The headset can track jaw movement, similar to the Quest Pro, and the Persona I chatted with was friendly and smiling. How would it look for someone I know, like my mom? Here, it was good enough that I forgot it was a scan.

We demoed a bit of Apple's Freeform app, where a collaboration window opened up while my Persona friend chatted in another window. 3D objects popped up in the Freeform app, including a full home scan. It looked realistic enough.

Dinosaurs in my world

The final demo was an app experience called Encounter Dinosaurs, which reminded me of early VR app demos I had way back: an experience emphasizing just the immersive wow factor of dinosaurs appearing in a 3D window that seemed to open up in the back wall of my demo room. Creatures that looked like carnotauruses slowly walked through the window and into my space. 

All my demos were seated except for this one, where I stood up and walked around a bit. This sounds like it wouldn’t be an impressive demo, but again, the quality of the visuals and how they looked in relation to the room’s passthrough video capture was what made it feel so great. As the dinosaur snapped at my hand, it felt pretty real. And so did a butterfly that danced through the room and tried to land on my extended finger.

I smiled. But even more so, I was impressed when I took off the headset. My own everyday vision wasn’t that much sharper than what Apple’s passthrough cameras provided. The gap between the two was closer than I would have expected, and it’s what makes Apple’s take on mixed reality in VR work so well.

But, the battery pack. There’s a corded battery that’s needed to power the headset, instead of a built-in battery like most others have. That meant I had to make sure to grab the battery pack as I started to move around, which is probably a reason why so many of Apple’s demos were seated.

What about fitness, and everything else?

Apple didn’t emphasize fitness much at all, a surprise to me. VR is already a great platform for fitness, although no one’s finessed headset design for fitness comfort. Maybe having that battery pack right now will limit movement in active games and experiences. Maybe Apple will announce more plans here later. The only taste I got of health and wellness was a one-minute micro meditation, which was similar to the one on the Apple Watch. Pretty, and again a great showcase of the display quality, but I want more.

2024 is still a while away, and Apple's headset is priced way out of range for most people. And I have no idea how functional this current headset will feel for everyday work. But Apple did show off a display, and an interface, that are far better than I was ready for. If Apple can build on that, and Vision Pro finds ways of expanding its mixed reality capabilities, then who knows what else is possible?

This was only my speedy reaction to a quick set of demos one day in Cupertino. There are a lot more questions to come.


Want New iPhone Controls? Here’s the Latest From iOS 18.4

One control in particular brings Apple’s Visual Intelligence to more iPhones.

Apple released iOS 18.4 on March 31, and the update brought bug fixes, new emoji and a new recipes section in Apple News to all iPhones. The update also brought a handful of new controls to your iPhone’s Control Center, including a control that brings Visual Intelligence to the iPhone 15 Pro and Pro Max.

When Apple released iOS 18 in September, the update remodeled the Control Center and gave you more, uh… control over how the feature functions. With iOS 18, you can resize controls, assign some controls to their own dedicated page and adjust the placement of controls to your liking. Apple has also introduced more controls to the feature, making it a central hub for all your most-used iPhone features. 

With iOS 18.4, Apple continues to expand the number of controls you can add to the Control Center. If you have the update on your iPhone, you can add ambient music controls, and Apple Intelligence-enabled iPhones get a few AI controls in the menu, too.

Read more: Everything You Need to Know About iOS 18

Here’s what to know about the new controls and how to add them to your Control Center.

Ambient Music controls

Apple gave everyone four new controls in the Control Center library under the Ambient Music category. These controls are Sleep, Chill, Productivity and Wellbeing. Each control activates a playlist filled with music that corresponds to it: Sleep, for example, plays ambient music to help lull you to sleep.

Some studies suggest white noise could help adults learn words and improve learning in environments full of distractions. According to the mental health company Calm, certain kinds of music can help you fall asleep faster and improve the quality of your sleep. So these new controls can help you learn, fall asleep and more.

Here’s how to find these controls. 

1. Swipe down from the top-right corner of your Home Screen to open your Control Center. 
2. Tap the plus (+) sign in the top-left corner of your screen. 
3. Tap Add a Control. 

You should see a section of controls called Ambient Music. You can also search for "Ambient Music" in the search bar at the top of the control library.

Under Ambient Music, you’ll see all four controls. Tap one (or all) of them to add them to your Control Center. Once you’ve added one or all the controls to your Control Center, go back to your Control Center and tap one to start playing music.

You can also change the playlist for each control. Here’s how.

1. Swipe down from the top-right corner of your Home Screen to open your Control Center. 
2. Tap the plus (+) sign in the top-left corner of your screen. 
3. Tap the Ambient Music control you want to edit.
4. Tap the playlist name to the right of Playlist.

A dropdown menu will appear with additional playlists for each control. So if you’re in the Sleep control, you’ll see other playlists like Restful Notes and Lo-Fi Snooze. If you have playlists in your Music app, you’ll also see the option From Library, which pulls music from your library. Tap whichever playlist you want and it will be assigned to that control. 

Apple already lets you transform your iPhone into a white noise machine with Background Sounds like ocean and rain. But Ambient Music is actual music, as opposed to the more static sounds of that feature.

Both of these features feel like a way for Apple to present itself as the first option when you want some background audio to help you fall asleep or be productive. Other services, like Spotify and YouTube, already have ambient music playlists like these, so this could be Apple's way of taking some of those services' audiences.
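For the developers reading: Control Center's expanding library isn't just for Apple's own toggles. iOS 18 opened it to third-party apps through WidgetKit's controls API. Below is a rough sketch of how an app might ship its own ambient-sound toggle. The app, intent and player names are hypothetical, and this shows the general shape of the API rather than production code.

```swift
import AppIntents
import SwiftUI
import WidgetKit

// Hypothetical shared player state for the example.
final class SoundPlayer {
    static let shared = SoundPlayer()
    var isPlaying = false
}

// Hypothetical intent that flips ambient sound on or off.
struct ToggleSleepSoundsIntent: SetValueIntent {
    static let title: LocalizedStringResource = "Sleep Sounds"

    @Parameter(title: "Playing")
    var value: Bool

    func perform() async throws -> some IntentResult {
        SoundPlayer.shared.isPlaying = value
        return .result()
    }
}

// The control itself: this is what would appear in the Control Center
// gallery alongside Apple's Sleep, Chill, Productivity and Wellbeing controls.
struct SleepSoundsControl: ControlWidget {
    var body: some ControlWidgetConfiguration {
        StaticControlConfiguration(kind: "com.example.sleepsounds") {
            ControlWidgetToggle(
                "Sleep Sounds",
                isOn: SoundPlayer.shared.isPlaying,
                action: ToggleSleepSoundsIntent()
            ) { isOn in
                Label(isOn ? "Playing" : "Off", systemImage: "moon.zzz")
            }
        }
    }
}
```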

Apple Intelligence controls

Only people with an iPhone 15 Pro, Pro Max or the iPhone 16 lineup can access Apple Intelligence features for now, and those people got three new dedicated Apple Intelligence controls with iOS 18.4. Those controls are Talk to Siri, Type to Siri and Visual Intelligence. 

Here’s how to find these controls. 

1. Swipe down from the top-right corner of your Home Screen to open your Control Center. 
2. Tap the plus (+) sign in the top-left corner of your screen. 
3. Tap Add a Control. 

Then you can use the search bar near the top of the screen to search for "Apple Intelligence," or you can scroll through the menu to find the Apple Intelligence & Siri section. Tap any (or all) of these controls to add them to your Control Center.

While Talk to Siri and Type to Siri controls can be helpful if you have trouble accessing the digital assistant, the Visual Intelligence control is important because it brings the Apple Intelligence feature to the iPhone 15 Pro and Pro Max.

Originally, Visual Intelligence was accessible only on the iPhone 16 lineup, through those devices' Camera Control button.

With iOS 18.4, Visual Intelligence is now accessible to more devices and people, thanks to the titular control in Control Center. But remember, Visual Intelligence is like any other AI tool, so it won't always be accurate. You should double-check results and any important information it shows you.

For more on iOS 18, here are all the new emoji you can use now and what to know about the recipes section in Apple News. You can also check out everything included in iOS 18.4 and our iOS 18 cheat sheet.



Gemini Live Now Has Eyes. We Put the New Feature to the Test

The new feature gives Gemini Live eyes to "see." I put it through a series of tests. Here are the results.

There I was, walking around my apartment, taking a video with my phone and talking to Google's Gemini Live. I was giving the AI a tour – and a quiz, asking it to name specific objects it saw. After it identified the flowers in a vase in my living room (chamomile and dianthus, by the way), I tried a curveball: I asked it to tell me where I'd left a pair of scissors. "I just spotted your scissors on the table, right next to the green package of pistachios. Do you see them?"

It was right, and I was wowed. 

Gemini Live will recognize a whole lot more than household odds and ends. Google says it’ll help you navigate a crowded train station or figure out the filling of a pastry. It can give you deeper information about artwork, like where an object originated and whether it was a limited edition.

It’s more than just a souped-up Google Lens. You talk with it and it talks to you. I didn’t need to speak to Gemini in any particular way – it was as casual as any conversation. Way better than talking with the old Google Assistant that the company is quickly phasing out.

Google and Samsung are just now starting to formally roll out the feature to all Pixel 9 phones (including the new Pixel 9a) and Galaxy S25 phones. It's available for free on those devices, and other Pixel phones can access it via a Google AI Premium subscription. Google also released a new YouTube video for the April 2025 Pixel Drop showcasing the feature, and there's now a dedicated page on the Google Store for it.

All you have to do to get started is go live with Gemini, enable the camera and start talking.
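The consumer Live feature isn't something you call directly as an API, but the underlying idea, sending a camera frame plus a question to a multimodal Gemini model, is available to developers through Google's generateContent REST endpoint. Here's a rough Swift sketch under that assumption; treat the model name and the response parsing as illustrative rather than definitive.

```swift
import Foundation

// Sketch: send one camera frame and a question to a multimodal Gemini
// model over the public REST endpoint, then read back the first text
// answer. Error handling and model choice are simplified.
func askGeminiAboutFrame(jpegData: Data, question: String, apiKey: String) async throws -> String {
    let url = URL(string:
        "https://generativelanguage.googleapis.com/v1beta/models/gemini-1.5-flash:generateContent?key=\(apiKey)")!

    // One "content" with two parts: the text prompt and the inline image.
    let body: [String: Any] = [
        "contents": [[
            "parts": [
                ["text": question],
                ["inline_data": [
                    "mime_type": "image/jpeg",
                    "data": jpegData.base64EncodedString()
                ]]
            ]
        ]]
    ]

    var request = URLRequest(url: url)
    request.httpMethod = "POST"
    request.setValue("application/json", forHTTPHeaderField: "Content-Type")
    request.httpBody = try JSONSerialization.data(withJSONObject: body)

    let (data, _) = try await URLSession.shared.data(for: request)

    // Pull the first text part out of the first candidate.
    let json = try JSONSerialization.jsonObject(with: data) as? [String: Any]
    let candidates = json?["candidates"] as? [[String: Any]]
    let content = candidates?.first?["content"] as? [String: Any]
    let parts = content?["parts"] as? [[String: Any]]
    return parts?.first?["text"] as? String ?? "(no answer)"
}

// Usage, with a hypothetical captured frame:
// let answer = try await askGeminiAboutFrame(jpegData: frame,
//     question: "Where did I leave my scissors?", apiKey: key)
```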

Gemini Live follows on from Google's Project Astra, first revealed last year as possibly the company's biggest "we're in the future" feature: an experimental next step for generative AI capabilities, beyond simply typing or speaking prompts into a chatbot like ChatGPT, Claude or Gemini. It arrives as AI companies continue to dramatically increase the skills of their tools, from video generation to raw processing power. Somewhat similar to Gemini Live is Apple's Visual Intelligence, which the iPhone maker released in beta form late last year.

My big takeaway is that a feature like Gemini Live has the potential to change how we interact with the world around us, melding our digital and physical worlds together just by holding a camera in front of almost anything.

I put Gemini Live to a real test

Somehow Gemini Live showed up on my Pixel 9 Pro XL a few days early, so I’ve already had a chance to play around with it. 

The first time I tried it, Gemini was shockingly accurate when I placed a very specific gaming collectible of a stuffed rabbit in my camera’s view. The second time, I showed it to a friend when we were in an art gallery. It not only identified the tortoise on a cross (don’t ask me), but it also immediately identified and translated the kanji right next to the tortoise, giving both of us chills and leaving us more than a little creeped out. In a good way, I think.

In the tour of my apartment, I was following the lead of the demo that Google did last summer when it first showed off these Live video AI capabilities. I tried random objects in my apartment (fruit, books, Chapstick), many of which it easily identified. 

Then I got to thinking about how I could stress-test the feature. I tried to screen-record it in action, but it consistently fell apart at that task. And what if I went off the beaten path with it? I'm a huge fan of the horror genre — movies, TV shows, video games — and have countless collectibles, trinkets and what have you. How well would it do with more obscure stuff, like my horror-themed collectibles?

First, let me say that Gemini can be both absolutely incredible and ridiculously frustrating in the same round of questions. I had roughly 11 objects that I was asking Gemini to identify, and it would sometimes get worse the longer the live session ran, so I had to limit sessions to only one or two objects. My guess is that Gemini attempted to use contextual information from previously identified objects to guess new objects put in front of it, which sort of makes sense, but ultimately neither I nor it benefited from this.

Sometimes, Gemini was just on point, easily landing the correct answers with no fuss or confusion, but this tended to happen with more recent or popular objects. For example, I was pretty surprised when it immediately guessed one of my test objects was not only from Destiny 2, but was a limited edition from a seasonal event from last year. 

At other times, Gemini would be way off the mark, and I would need to give it more hints to get into the ballpark of the right answer. And sometimes, it seemed as though Gemini was taking context from my previous live sessions to come up with answers, identifying multiple objects as coming from Silent Hill when they were not. I have a display case dedicated to the game series, so I could see why it would want to dip into that territory quickly.

Gemini can get full-on bugged out at times. On more than one occasion, Gemini misidentified one of the items as a made-up character from the unreleased Silent Hill: f game, clearly merging pieces of different titles into something that never was. The other consistent bug I experienced was when Gemini would produce an incorrect answer, and I would correct it and hint closer at the answer — or straight up give it the answer, only to have it repeat the incorrect answer as if it was a new guess. When that happened, I would close the session and start a new one, which wasn’t always helpful.

One trick I found was that some conversations did better than others. If I scrolled through my Gemini conversation list, tapped an old chat that had gotten a specific item correct and then went live again from that chat, it was able to identify the items without issue. While that's not necessarily surprising, it was interesting to see that some conversations worked better than others, even when I used the same language.

Google didn’t respond to my requests for more information on how Gemini Live works.

I wanted Gemini to successfully answer my sometimes highly specific questions, so I provided plenty of hints to get there. The nudges were often helpful, but not always. Below are a series of objects I tried to get Gemini to identify and provide information about. 



Today’s NYT Strands Hints, Answers and Help for April 11, #404

Do you want to play a game? Here are hints and answers for the NYT Strands puzzle No. 404 for April 11.

Looking for the most recent Strands answer? Click here for our daily Strands hints, as well as our daily answers and hints for The New York Times Mini Crossword, Wordle, Connections and Connections: Sports Edition puzzles.


Today’s NYT Strands puzzle might be tricky. You’ll do well if you watch a lot of a certain kind of TV competition. (I don’t, so I did horribly today.) If you need hints and answers, read on.

I go into depth about the rules for Strands in this story. 

If you’re looking for today’s Wordle, Connections and Mini Crossword answers, you can visit CNET’s NYT puzzle hints page.

Read more: NYT Connections Turns 1: These Are the 5 Toughest Puzzles So Far

Hint for today’s Strands puzzle

Today’s Strands theme is: Buzzing in

If that doesn’t help you, here’s a clue: Win big.

Clue words to unlock in-game hints

Your goal is to find hidden words that fit the puzzle’s theme. If you’re stuck, find any words you can. Every time you find three words of four letters or more, Strands will reveal one of the theme words. These are the words I used to get those hints, but any words of four or more letters that you find will work:

  • READ, READY, DIME, CHOP, CHOPS, PASS, DREAD, DOME, DOMES, WORD, SHOP, SHOW, WORDS, PARE, SWORD.

Answers for today’s Strands puzzle

These are the answers that tie into the theme. The goal of the puzzle is to find them all, including the spangram, a theme word that reaches from one side of the puzzle to the other. When you’ve got all of them (I originally thought there were always eight but learned that the number can vary), every letter on the board will be used. Here are the nonspangram answers:

  • LINGO, PYRAMID, JEOPARDY, PASSWORD, CATCHPHRASE.

Today’s Strands spangram

Today's Strands spangram is GAMESHOWS. To find it, start with the G that's five letters down in the leftmost column, and wind across.

Toughest Strands puzzles

Here are some of the Strands topics I’ve found to be the toughest in recent weeks.

#1: Dated slang, Jan. 21. Maybe you didn’t even use this lingo when it was cool. Toughest word: PHAT.

#2: Thar she blows! Jan. 15. I guess marine biologists might ace this one. Toughest word: BALEEN or RIGHT.

#3: Off the hook, Jan. 9. Similar to the Jan. 15 puzzle in that it helps to know a lot about sea creatures. Sorry, Charlie. Toughest word: BIGEYE or SKIPJACK.

Continue Reading
