
Technologies

Apple Vision Pro Hands-On: Far Better Than I Was Ready For

I experienced incredible fidelity, surprising video quality and a really smooth interface. Apple’s first mixed-reality headset nails those, but lots of questions remain.

I was in a movie theater last December watching Avatar: The Way of Water in 3D, and I said to myself: "Wow, this is an immersive film I'd love to watch in next-gen VR." That's exactly what I experienced in Apple's Vision Pro headset, and yeah, it's amazing.

On Monday, I tried out the Vision Pro in a series of carefully picked demos during WWDC at Apple's Cupertino, California, headquarters. I've been using cutting-edge VR devices for years, and I found all sorts of augmented reality memories bubbling up in my brain. Apple's compact, but still not small, headset reminds me of an Apple-designed Meta Quest Pro. The back strap was comfy yet stretchy, with a dial to adjust the rear fit and a top strap for stability. The headset's sleek design, and even its glowing front faceplate, also gave me an instant Ready Player One vibe.


I couldn't wear my glasses during the demo, though, and neither will you. Apple's headset doesn't support glasses, instead relying on custom Zeiss inserts to correct wearers' vision. Through a setup process, Apple managed to find lenses that corrected my vision well enough that everything seemed crystal clear, which is no small feat. We also adjusted the fit and tuned spatial audio for my head using an iPhone, a system that will be refined before the headset's release in 2024.

From there, I did my demos mostly seated, and found myself surprised from the start. The passthrough video camera quality of this headset is good. Really, really good. Not as good as my own vision, but good enough that I could see the room well, see people in it with me, and easily read my watch notifications on my wrist. The only headset that's pulled this off previously was the extremely impressive but PC-connected Varjo XR-3, and Apple's display and cameras feel even better.

Apple’s floating grid of apps appears when I press the top digital crown, which autocenters the home screen to wherever I’m looking. I set up eye tracking, which worked like on many other VR headsets I’ve used: I looked at glowing dots as musical notes played, and got a chime when it all worked.


A list of apps as they would appear inside of the Apple Vision Pro headset.

Apple/Screenshot by CNET

From there, the interface was surprisingly fluid. Looking at icons or interface options slightly enlarges them, or changes how bold they appear. Tapping with my fingers while looking at something opens an app. 

I’ve used tons of hand-tracking technology on headsets like the HoloLens 2 and the Meta Quest 2 and Pro, and usually there’s a lot of hand motion required. Here, I could be really lazy. I pinched to open icons even while my hand was resting in my lap, and it worked. 

Scrolling involves pinching and pulling with my fingers; again, pretty easy to do. I resized windows by moving my hand to throw a window across the room or pin it closer to me. I opened multiple apps at once, including Safari, Messages and Photos. It was easy enough to scroll around, although sometimes my eye tracking needed a bit of extra concentration to pull off.

Apple’s headset uses eye tracking constantly in its interface, something Meta’s Quest Pro and even the PlayStation VR 2 don’t do. That might be part of the reason for the external battery pack. The emphasis on eye tracking as a major part of the interface felt transformative, in a way I expected might be the case for VR and AR years ago. What I don’t know is how it will feel in longer sessions.

I don't know how the Vision Pro will work with keyboards and trackpads, since I didn't get to demo the headset that way. It works with Apple's Magic Keyboard and Magic Trackpad, and with Macs, but not with iPhone, iPad or Watch touchscreens, at least not yet.

Dialing in reality

I scrolled through some photos in Apple’s preset photo album, plus a few 3D photos and video clips shot with the Vision Pro’s 3D camera. All the images looked really crisp, and a panoramic photo that spread around me looked almost like it was a window on a landscape that extended just beyond the room I was in. 

Apple has volumetric 3D landscapes on the Vision Pro that are immersive backgrounds like 3D wallpaper, but looking at one really shows off how nice that Micro OLED display looks. A lake looked like it was rolling up to a rocky shore that ended right where the real coffee table was in front of me. 


Raising my hands to my face, I saw how the headset separates my hands from VR, a trick that’s already in Apple’s ARKit. It’s a little rough around the edges but good enough. Similarly, there’s a wild new trick where anyone else in the room can ghost into view if you look at them, a fuzzy halo with their real passthrough video image slowly materializing. It’s meant to help create meaningful contact with people while wearing the headset. I wondered how you could turn that off or tune it to be less present, but it’s a very new idea in mixed reality.

Apple’s digital crown, a small dial borrowed from the Apple Watch, handles reality blend. I could turn the dial to slowly extend the 3D panorama until it surrounded me everywhere, or dial it back so it just emerged a little bit like a 3D window. 

Mixed reality in Apple's headset looks so casually impressive that I almost didn't appreciate how great it was. Again, I've seen mixed reality in VR headsets before (Varjo XR-3, Quest Pro), and I've understood its capabilities. Apple's execution of mixed reality felt much more immersive, rich and effortless on most fronts, with a field of view that felt expansive. I can't wait to see more experiences in it.

Cinematic fidelity that wowed me

The cinema demo was what really shocked me, though. I played a 3D clip of Avatar: The Way of Water in-headset, on a screen in various viewing modes including a cinema. Apple’s mixed-reality passthrough can also dim the rest of the world down a bit, in a way similar to how the Magic Leap 2 does with its AR. But the scenes of Way of Water sent little chills through me. It was vivid. This felt like a movie experience. I don’t feel that way in other VR headsets.


Avatar: The Way of Water looked great in the Vision Pro.

20th Century Studios

Apple also demonstrated its Immersive Video format that's coming as an extension to Apple TV Plus. It's a 180-degree video format, similar to what I've seen before in concept, but with really strong resolution and video quality. A demo reel of Alicia Keys singing, Apple Sports events, documentary footage and more played out in front of me, a teaser of what's to come. 180-degree video never appears quite as crisp to me as big-screen film content, but the sports clips I saw made me wonder how good virtual Jets games could be in the future. Things have come a long way.

Would I pay $3,499 for a head-worn cinema? No, but it’s clearly one of this device’s greatest unique strengths. The resolution and brightness of the display were surprising.


Convincing avatars (I mean, Personas)

Apple's Personas are 3D-scanned avatars generated by using the Vision Pro to scan your face, making a version of yourself that shows up in FaceTime chats if you want, or on the outside of the Vision Pro's curved OLED display to show whether you're "present" or in an app. I didn't see how that outer display worked, but I had a FaceTime with someone in their Persona form, and it was good. Again, it looked surprisingly good.

I've chatted with Meta's ultra-realistic Codec Avatars, which aim for realistic representations of people in VR. Those are stunning. I also saw an early form of Meta's phone-scanned, step-down version last year, where a talking head spoke to me in VR. Apple's Persona looked better than Meta's phone-scanned avatar, although a bit fuzzy around the edges, like a dream. The woman whose Persona was scanned appeared in her own window, not in a full-screen form.

And I wondered how expressive Personas can be with the Vision Pro's scanning cameras. The Pro can track jaw movement much like the Quest Pro does, and the Persona I chatted with was friendly and smiling. How would it look for someone I know, like my mom? Here, it was good enough that I forgot it was a scan.

We demoed a bit of Apple's Freeform app, where a collaboration window opened up while my Persona friend chatted in another window. 3D objects popped up in the Freeform app, including a full home scan. It looked realistic enough.

Dinosaurs in my world

The final demo was an app experience called Encounter Dinosaurs, which reminded me of early VR app demos I had years ago: an experience emphasizing just the immersive "wow" factor of dinosaurs appearing in a 3D window that seemed to open up in the back wall of my demo room. Creatures that looked like carnotauruses slowly walked through the window and into my space.

All my demos were seated except for this one, where I stood up and walked around a bit. This sounds like it wouldn’t be an impressive demo, but again, the quality of the visuals and how they looked in relation to the room’s passthrough video capture was what made it feel so great. As the dinosaur snapped at my hand, it felt pretty real. And so did a butterfly that danced through the room and tried to land on my extended finger.

I smiled. But even more so, I was impressed when I took off the headset. My own everyday vision wasn’t that much sharper than what Apple’s passthrough cameras provided. The gap between the two was closer than I would have expected, and it’s what makes Apple’s take on mixed reality in VR work so well.

Then there’s the battery pack. There’s a corded battery that’s needed to power the headset, instead of a built-in battery like most others have. That meant I had to make sure to grab the battery pack as I started to move around, which is probably a reason why so many of Apple’s demos were seated.


What about fitness and everything else?

Apple didn’t emphasize fitness much at all, a surprise to me. VR is already a great platform for fitness, although no one’s finessed headset design for fitness comfort. Maybe having that battery pack right now will limit movement in active games and experiences. Maybe Apple will announce more plans here later. The only taste I got of health and wellness was a one-minute micro meditation, which was similar to the one on the Apple Watch. It was pretty, and again a great showcase of the display quality, but I want more.

2024 is still a while away, and Apple’s headset is priced way out of range for most people. And I have no idea how functional this current headset would feel if I were doing everyday work. But Apple did show off a display, and an interface, that are far better than I was ready for. If Apple can build on that, and the Vision Pro finds ways of expanding its mixed-reality capabilities, then who knows what else is possible?

This was just my fast-take reaction to a quick set of demos on one day in Cupertino. There are a lot more questions to come, but this first set of demos resonated with me. Apple showed what it can do, and we’re not even at the headset’s launch yet.


YouTubers Sue Amazon, Claim AI Tool Was Trained on Scraped Videos

The lawsuit alleges that Amazon bypassed YouTube protections to collect content for its generative AI video system.

A group of YouTube creators is suing Amazon, accusing the tech giant of secretly scraping their videos to train its AI video model without permission.

The proposed class action lawsuit, filed in federal court in Seattle, alleges Amazon used automated tools to download and extract data from millions of YouTube videos to build and improve its Nova Reel generative AI system — a model that can create short videos from text prompts and images. 

At the center of the complaint is how that data was obtained. The plaintiffs claim that Amazon bypassed YouTube's protections using virtual machines and rotating IP addresses to avoid detection, effectively sidestepping the platform's safeguards against bulk downloading.

The lawsuit was brought by several creators, including Ted Entertainment (the company behind the H3 Podcast and h3h3 Productions), as well as individual YouTubers and channel operators. They argue that the alleged scraping violated copyright law and the Digital Millennium Copyright Act, and are seeking damages as well as an injunction to stop the practice. 

Amazon did not respond to a request for comment.

The case lands at a pivotal moment for generative AI, as courts weigh whether training on copyrighted material qualifies as fair use and how much control creators retain once their work is used to build these systems. The disputes have often centered on written material, which has been at the center of the AI revolution for several years, while AI video generators such as OpenAI’s Sora and Google’s Veo have emerged more recently.

The lawsuit is one of dozens testing the boundaries of AI training practices, alongside high-profile cases from authors, artists and news organizations, including lawsuits against OpenAI and Meta, all circling the same unresolved question: Where does fair use end and infringement begin?



The Galaxy Z TriFold Is Back. You Can Buy It From Samsung Soon

The $2,899 phone paused its sales in March after selling through its inventory, but Samsung is bringing it back to its online store.

Samsung's $2,899 Galaxy Z TriFold goes back on sale on Friday, following a halt to its sales in March after the foldable phone sold through its inventory. Samsung announced the TriFold's return with a countdown clock on the phone's online store page and in a newsletter email sent to customers on Wednesday.

The initial pause, which Samsung said at the time was related to the TriFold being a "super-premium device in limited quantities," happened after just three months of availability. The TriFold first went on sale in South Korea on Dec. 12 and then arrived in Samsung's US store on Jan. 30. The TriFold sold out in the US within minutes of going on sale, which I know personally after joining my colleagues that morning in an attempt to buy it. Thankfully, Senior Reporter Abrar Al-Heeti succeeded, and then reviewed the TriFold.

It's unclear whether the Galaxy Z TriFold is now permanently returning to Samsung's online store or if it will again be on sale only until its stock sells through. Given that the phone is very expensive and unfolds to reveal a large, 10-inch display, it wouldn't be surprising if its stock is again limited. We've asked a Samsung representative to clarify and will update if we hear more.

The Galaxy Z TriFold's return also comes ahead of the summer season, when we expect a slew of other foldable phones: Samsung typically refreshes its Galaxy Z Fold and Z Flip line in July or August, and Motorola has announced that its first book-style Razr Fold phone will also debut during the season. And Apple's rumored iPhone Fold (or perhaps iPhone Ultra, based on the latest rumors) could also be teased later this year.



Help Us Crown the Most Loved Headphones and Earbuds of 2026

Got a pair you swear by? Take our People’s Picks survey to help us find a winner.

CNET just launched People’s Picks, a series of surveys where actual humans like you vote for the products and services you use. Starting in April, we want you to weigh in on your favorite headphones and earbuds. We’ll pick a winner based on which ones you love the most. 

Why we want to hear from you

Our writers and editors test hundreds of products each year, but your real-world experience with these devices is something we can’t replicate in our labs. You’ve used these headphones at the gym, on your commute to work and on long flights, and that perspective is invaluable. Your voice helps others know about the headphones or earbuds you love, too.

«I review a lot of headphones and earbuds for CNET, and there are plenty of great models from the top brands in this survey that I rate highly. I’m always curious about what models people ultimately choose and why, so I’m excited to get your feedback and learn the results of this survey,» says David Carnoy, CNET’s executive editor and headphones expert.

With our survey, we'll collect answers from real-world users like you. The headphones and earbuds chosen through our 3-minute survey will be featured in our People's Picks roundup, based on your recommendations.

Make your voice heard

Whether you swear by a pair of $25 earbuds or love a pair of high-end headphones, your pick counts. The survey takes just a few minutes to complete, and after we gather enough information, we’ll tally the results and publish the winners.

Not sure what to pick? Check out our Best Headphones to revisit your favorites before voting.



Copyright © Verum World Media