
Apple Vision Pro Hands-On: Far Better Than I Was Ready For

I experienced incredible fidelity, surprising video quality and a really smooth interface. Apple’s first mixed-reality headset nails those, but lots of questions remain.

I was in a movie theater last December watching Avatar: The Way of Water in 3D, and I said to myself: "Wow, this is an immersive film I'd love to watch in next-gen VR." That's exactly what I experienced in Apple's Vision Pro headset, and yeah, it's amazing.

On Monday, I tried out the Vision Pro in a series of carefully picked demos during WWDC at Apple's Cupertino, California, headquarters. I've been using cutting-edge VR devices for years, and I found all sorts of augmented reality memories bubbling up in my brain. Apple's compact (but still not small) headset reminds me of an Apple-designed Meta Quest Pro. The back strap was comfy and stretchy, with a dial to adjust the rear fit and a top strap for stability. The headset's sleek design, and even its glowing front faceplate, gave me an instant Ready Player One vibe.


I couldn't wear my glasses during the demo, though, and neither will you. Apple's headset doesn't support glasses, instead relying on custom Zeiss inserts to correct wearers' vision. Through a setup process, Apple found lenses that matched my vision well enough that everything looked crystal clear, which is no small feat. We also adjusted the fit and tuned spatial audio for my head using an iPhone, a system that will be finessed when the headset is released in 2024.

From there, I did my demos mostly seated, and found myself surprised from the start. The passthrough video camera quality of this headset is good: really, really good. Not as good as my own vision, but good enough that I could see the room well, see the people in it with me, and easily read my watch notifications on my wrist. The only headset that's done this previously was the extremely impressive but PC-connected Varjo XR-3, and Apple's display and cameras feel even better.

Apple’s floating grid of apps appears when I press the top digital crown, which autocenters the home screen to wherever I’m looking. I set up eye tracking, which worked like on many other VR headsets I’ve used: I looked at glowing dots as musical notes played, and got a chime when it all worked.


A list of apps as they would appear inside the Apple Vision Pro headset.

Apple/Screenshot by CNET

From there, the interface was surprisingly fluid. Looking at icons or interface options slightly enlarges them, or changes how bold they appear. Tapping with my fingers while looking at something opens an app. 

I’ve used tons of hand-tracking technology on headsets like the HoloLens 2 and the Meta Quest 2 and Pro, and usually there’s a lot of hand motion required. Here, I could be really lazy. I pinched to open icons even while my hand was resting in my lap, and it worked. 

Scrolling involves pinching and pulling with my fingers; again, pretty easy to do. I resized windows by moving my hand to throw a window across the room or pin it closer to me. I opened multiple apps at once, including Safari, Messages and Photos. It was easy enough to scroll around, although sometimes my eye tracking needed a bit of extra concentration to pull off.

Apple’s headset uses eye tracking constantly in its interface, something Meta’s Quest Pro and even the PlayStation VR 2 don’t do. That might be part of the reason for the external battery pack. The emphasis on eye tracking as a major part of the interface felt transformative, in a way I expected might be the case for VR and AR years ago. What I don’t know is how it will feel in longer sessions.

I don't know how the Vision Pro will work with keyboards and trackpads, since I didn't get to demo the headset that way. It works with Apple's Magic Keyboard and Magic Trackpad, and with Macs, but not with iPhone, iPad or Watch touchscreens, at least not yet.

Dialing in reality

I scrolled through some photos in Apple’s preset photo album, plus a few 3D photos and video clips shot with the Vision Pro’s 3D camera. All the images looked really crisp, and a panoramic photo that spread around me looked almost like it was a window on a landscape that extended just beyond the room I was in. 

Apple has volumetric 3D landscapes on the Vision Pro that are immersive backgrounds like 3D wallpaper, but looking at one really shows off how nice that Micro OLED display looks. A lake looked like it was rolling up to a rocky shore that ended right where the real coffee table was in front of me. 


Raising my hands to my face, I saw how the headset separates my hands from VR, a trick that’s already in Apple’s ARKit. It’s a little rough around the edges but good enough. Similarly, there’s a wild new trick where anyone else in the room can ghost into view if you look at them, a fuzzy halo with their real passthrough video image slowly materializing. It’s meant to help create meaningful contact with people while wearing the headset. I wondered how you could turn that off or tune it to be less present, but it’s a very new idea in mixed reality.

Apple’s digital crown, a small dial borrowed from the Apple Watch, handles reality blend. I could turn the dial to slowly extend the 3D panorama until it surrounded me everywhere, or dial it back so it just emerged a little bit like a 3D window. 

Mixed reality in Apple's headset looks so casually impressive that I almost didn't appreciate how great it was. Again, I've seen mixed reality in VR headsets before (Varjo XR-3, Quest Pro), and I understand its capabilities. Apple's execution felt much more immersive, rich and effortless on most fronts, with a field of view that felt expansive. I can't wait to see more experiences in it.

Cinematic fidelity that wowed me

The cinema demo was what really shocked me, though. I played a 3D clip of Avatar: The Way of Water in-headset, on a screen in various viewing modes including a cinema. Apple’s mixed-reality passthrough can also dim the rest of the world down a bit, in a way similar to how the Magic Leap 2 does with its AR. But the scenes of Way of Water sent little chills through me. It was vivid. This felt like a movie experience. I don’t feel that way in other VR headsets.


Avatar: The Way of Water looked great in the Vision Pro.

20th Century Studios

Apple also demonstrated its Immersive Video format that’s coming as an extension to Apple TV Plus. It’s a 180-degree video format, similar to what I’ve seen before in concept, but with really strong resolution and video quality. A splash demo reel of Alicia Keys singing, Apple Sports events, documentary footage and more reeled off in front of me, a teaser of what’s to come. One-eighty-degree video never appears quite as crisp to me as big-screen film content, but the sports clips I saw made me wonder how good virtual Jets games could be in the future. Things have come a long way.

Would I pay $3,499 for a head-worn cinema? No, but it’s clearly one of this device’s greatest unique strengths. The resolution and brightness of the display were surprising.


Convincing avatars (I mean, Personas)

Apple's Personas are 3D-scanned avatars generated by using the Vision Pro to scan your face, creating a version of yourself that can show up in FaceTime chats if you want, and also on the outside of the Vision Pro's curved OLED display to show whether you're "present" or in an app. I didn't see how that outer display worked, but I had a FaceTime call with someone in their Persona form, and again, it looked surprisingly good.

I've chatted with Meta's ultra-realistic Codec Avatars, which aim for lifelike representations of people in VR. Those are stunning, and I also saw Meta's phone-scanned, step-down version in an early form last year, where a talking head spoke to me in VR. Apple's Persona looked better than Meta's phone-scanned avatar, although a bit fuzzy around the edges, like a dream. The woman whose Persona was scanned appeared in her own window, not in full-screen form.

And I wondered how expressive the emotions would be with the Vision Pro's scanning cameras. The Vision Pro can scan jaw movement, similar to the Quest Pro, and the Persona I chatted with was friendly and smiling. How would it look for someone I know, like my mom? Here, it was good enough that I forgot it was a scan.

We demoed a bit of Apple's Freeform app, where a collaboration window opened up while my Persona friend chatted in another window. 3D objects popped up in the Freeform app, including a full home scan. It looked realistic enough.

Dinosaurs in my world

The final demo was an app experience called Encounter Dinosaurs, which reminded me of early VR app demos I had years ago: an experience emphasizing the immersive "wow" factor of dinosaurs appearing in a 3D window that seemed to open up in the back wall of my demo room. Creatures that looked like carnotauruses slowly walked through the window and into my space.

All my demos were seated except for this one, where I stood up and walked around a bit. This sounds like it wouldn’t be an impressive demo, but again, the quality of the visuals and how they looked in relation to the room’s passthrough video capture was what made it feel so great. As the dinosaur snapped at my hand, it felt pretty real. And so did a butterfly that danced through the room and tried to land on my extended finger.

I smiled. But even more so, I was impressed when I took off the headset. My own everyday vision wasn’t that much sharper than what Apple’s passthrough cameras provided. The gap between the two was closer than I would have expected, and it’s what makes Apple’s take on mixed reality in VR work so well.

Then there’s the battery pack. There’s a corded battery that’s needed to power the headset, instead of a built-in battery like most others have. That meant I had to make sure to grab the battery pack as I started to move around, which is probably a reason why so many of Apple’s demos were seated.


What about fitness and everything else?

Apple didn’t emphasize fitness much at all, a surprise to me. VR is already a great platform for fitness, although no one’s finessed headset design for fitness comfort. Maybe having that battery pack right now will limit movement in active games and experiences. Maybe Apple will announce more plans here later. The only taste I got of health and wellness was a one-minute micro meditation, which was similar to the one on the Apple Watch. It was pretty, and again a great showcase of the display quality, but I want more.

2024 is still a while away, and Apple’s headset is priced way out of range for most people. And I have no idea how functional this current headset would feel if I were doing everyday work. But Apple did show off a display, and an interface, that are far better than I was ready for. If Apple can build on that, and the Vision Pro finds ways of expanding its mixed-reality capabilities, then who knows what else is possible?

This was just my fast-take reaction to a quick set of demos on one day in Cupertino. There are a lot more questions to come, but this first set of demos resonated with me. Apple showed what it can do, and we’re not even at the headset’s launch yet.


Watch a Robot Stuff Cash Into a Wallet Just Like You Do

Generalist AI's Gen-1 model is all about "teaching robots physical common sense."

In 2026, we’re seeing robots progress by leaps and bounds with markedly improved dexterity, the kind of progress long needed in the quest for truly useful household helpers. Now a new AI model has arrived to power robots through activities, including folding laundry, constructing boxes, fixing other robots and even filling wallets with flimsy paper money.

Earlier this month, California-based company Generalist AI released Gen-1, a new physical AI model that makes robots capable of performing all of these tasks (and more) successfully. It's a big step forward for robots designed for the real world, based on intelligence born from the real world, Pete Florence, co-founder and CEO of Generalist AI, told me.

In most of the example videos published by the company, Gen-1 is seen running on a pair of robotic arms, but that's not all it's built for. "Gen-1 is designed to be the brain of any robot, meaning the same model can run on a humanoid, an industrial arm or other robotic systems," said Florence.

Already, this has proved to be a breakthrough year for general-purpose humanoid robots, with companies including Boston Dynamics and Honor unveiling cutting-edge bots capable of uncannily humanlike movements. The market for robots is expected to explode, with one estimate from Morgan Stanley predicting growth to a $5 trillion market by 2050. Predictions see robots coming for industry, retail, hospitality and care environments before eventually landing in our homes. To get us there, we need to see further advances in AI.

Training robots to live alongside humans

Over the past few years we’ve seen large language models, such as ChatGPT, Gemini and Claude, evolve at lightning speed. The same hasn’t been true of the physical AI models required to power robots, in large part because of a lack of data to train those models on. Robots — and especially humanoid robots — must learn to navigate a world built for humans just as a human would.

Often this data is collected from robots performing tasks while being teleoperated by humans, but not Gen-1. Instead, the dataset used to train Generalist AI’s models has been assembled by humans completing millions of different tasks using wearable technology.

"We built our own lightweight 'data hands' and distributed them globally to learn how people actually interact with objects, with all the subtle force feedback, tactile feel, slips, corrections and recoveries that define human dexterity in the real world," said Florence. "That kind of data is critical for teaching robots physical common sense, the intuitive understanding and ability to adapt in real time rather than execute rigid instructions."

Generalist AI has released a series of videos showing the model running on robots repetitively performing a range of different tasks, with the most compelling, perhaps, being a robot drawing cash out of a wallet before reinserting it into the same pocket. This is a fiddly task that many humans fumble over. It’s clearly not easy for the robot, either, given the flimsiness of the paper money and the fabric of the wallet — and yet it completes the task.

Another video shows a robot sorting socks by color, folding them in neat piles and counting the number of pairs using a touchscreen. Other tricky tasks the model can complete include unzipping and filling a pencil case with pens, stacking oranges in a neat pyramid and plugging in an Ethernet cable.

These videos show the breadth of Gen-1’s capabilities, but more impressive is the success rate with which it can complete certain tasks. Generalist AI measured the model’s hit rate against the previous version and found Gen-1 could successfully service a robot vacuum cleaner in 99% of cases (up from 50% for Gen-0), fold boxes in 99% of cases (up from 81% for Gen-0) and package up phones in 99% of cases (up from 62% for Gen-0).

Robots do improv

Most robots are programmed to complete a task in a specific and orderly way. But what happens when a curve ball gets thrown? "The smallest changes in the environment can cause failures," said Florence.

An important skill robots need, which humans innately possess, is the ability to think on their feet. This is why Gen-1 has been designed with improvisation in mind, so it can come up with strategies to complete tasks. Florence gave me an example of a robot using two hands to reposition an awkwardly placed part for an automotive task, even though it had only been trained to use one.

«This kind of creativity has been largely absent from robotics until now,» he said.

Significant work still needs to be done when it comes to beefing up robots' improv chops, but early progress shows glimpses of a positive impact on both reliability and speed, said Florence. "We're beginning to see real progress and are excited to push the boundaries of embodied intelligence."

After all, there may come a day when you need a robot in your house that can fix all your other smaller robots.  



iPhone 17 Pro Camera Battles the Galaxy S26 Ultra: Let the Fun Begin

They’re both top-end flagship phones, but which one takes better photos? I wanted to find out.

Both Apple's iPhone 17 Pro and Samsung's Galaxy S26 Ultra earned coveted CNET Editors' Choice awards in their full reviews, and they damned well earned them, too, thanks to their stellar overall performance and wealth of top-end tech on board. They also garnered praise for their camera quality, with both able to take great-looking photos in a variety of conditions. But which does it better?

As a professional photographer myself, I was keen to find out, so I took them on a series of photo walks around Scotland to put them to the test in the same conditions. 

Before we dive in, a few notes from me. First, all images were captured in JPEG format using the standard camera app on each phone. On some images on the iPhone, Apple’s Gold Photographic Style was activated; on others, it was set to Standard, and I’ll be highlighting which is which. The images have been imported into Adobe Lightroom for comparison purposes and exported at smaller file sizes to better suit online viewing. No edits to the images themselves were made, and no sharpening was applied on the export. 


Crucially, though, it's important to keep in mind that the analysis here is my opinion. Photography is largely subjective, and what looks good to one person might not to another. Personally, I love a more natural-looking image with accurate tones that I can edit further later if I want to. You may like a punchy, vibrant tone straight out of the camera, and that's fine. You'll just need to take my results here with a slight pinch of salt.

All that said, let’s dive in.

This was an image I took with the Gold filter accidentally enabled on the iPhone. So its warmer color tones are to be expected to an extent, but what I liked more here is the depth of shadow that the iPhone has maintained. The S26 Ultra has done a fair bit of processing here to lift those shadows and create a more balanced exposure overall, but I think it’s killed some of the evening drama as a result. I see this in a lot of Android phones, to be fair. 

Taken earlier in the day, there’s much less difference to be seen here. The iPhone’s colors are a bit warmer, thanks to the Gold filter, but they actually look more natural as a result. The shot doesn’t look warm in its white balance; it just has a richness to it, while the S26 Ultra’s shot looks quite cold. 

I switched the iPhone to Standard Photographic Style here, and as a result, the shot it took looks pretty similar to that taken by the Galaxy S26 Ultra. The exposures are pretty much the same, and while the green plants on the steps definitely look more vivid in the Galaxy’s shot, the colors elsewhere are broadly on par. 

If I’m nitpicking — which I really have to when the phones cost this much money — the S26 Ultra appears to have done a neater job rendering the details on the front of the VW Camper’s spare wheel. I also noticed more detail in some of the small twigs on the tree, especially where they’re visible against the sky. Is that a difference you’d ever notice without a side-by-side comparison? Definitely not. But this whole article is basically an exercise in pedantry, so I will continue to pick away at even the tiniest of things in these photos.

I’m back on the Gold Photographic Style with the iPhone here, so again, those warmer tones are to be expected, but I will say again that I much prefer the deeper shadows seen on the house in the Apple phone’s image. It looks much more natural, while the S26 Ultra’s shot looks a bit too HDR and oversaturated for my tastes. But that’s not the most important thing here…

What took me more by surprise was what happened when I put each phone into the ultrawide camera mode. The iPhone’s color tones stay almost exactly the same, but the Galaxy’s image has shifted quite dramatically between the main and ultrawide lenses.

The blue sky has shifted its hue into a much more teal-toned color, and I’m surprised by just how different it looks from the main camera. I usually expect to see these sorts of color shifts on cheaper phones, where there’s less effort put into ensuring consistent colors across the lenses. So I’m a bit disappointed to see Samsung’s phones producing such a noticeable shift here. 

The iPhone 17 Pro also displays a color shift, but it’s far less pronounced than the S26 Ultra’s.

Next I tried the zoom on both phones. With its 10x optical zoom, the S26 Ultra has a longer reach than the 8x on the iPhone 17 Pro, but in terms of detail within those images, there's honestly nothing to choose between them. Again, the iPhone had the Gold style applied, so it looks warmer, and again, the S26 Ultra has gone further in lightening those shadows. I can't really say either one is better than the other in this example.

But there’s a much bigger difference in this example. The colors are much richer in the iPhone’s shot, even though the Photographic Style is set to Standard. The S26 Ultra’s shot looks like the phone’s white balance has been tricked by the warm orange tones of the brickwork, and produced a colder-looking image as a result. 

But I also don’t like what the S26 Ultra has done with the details here. It’s oversharpened the scene, giving a weird, crunchy look to the subject that looks extremely unnatural. The iPhone, despite not having the same zoom range on paper, has delivered a much better-looking image, even when viewed at the same scale. 

But here the opposite seems to have happened. The iPhone has looked at this warm, sun-drenched scene and automatically set its white balance to cool it, while the S26 Ultra has maintained those warmer tones. Sure, the greens of the leaves in the S26’s image look almost neon, but the image overall is the nicer of the two in my view. 

The iPhone has done a much better job here of capturing the warmer tones that I loved so much when I took these images. I do think the S26 Ultra has gone too far in its hyper-saturation of the green leaves. Sure, it's a punchy look, but if I wanted that much saturation, I'd add it back in at the editing stage. I'd much rather have a more natural image as a starting point, so the iPhone takes the win here for me.

There’s so little to pick out between the images here. The greens are a little more vibrant in the S26 Ultra’s shot, but the tones overall in the iPhone’s are a bit more natural. Neither one is a spectacular photo, and honestly, you may as well toss a coin to decide which one is better. 

Switching to the ultrawide lenses on both phones, the S26 Ultra has again gone quite hard on the saturation, delivering a much more vibrant blue sky than it did in its image from the main camera. As before, I’m not a fan of this sort of high-contrast, high-saturation photo. As a result, the iPhone 17 Pro is my preferred shot here.

I think the S26 Ultra’s tendency towards vibrancy has helped here, however, with this shot of spring blossom looking more joyful than the almost drab-looking image from the iPhone. 

And sure, the colors are a little overbaked from the S26 Ultra’s ultrawide image, but it still screams «spring» more than the iPhone’s shot, which again looks pretty dull and lifeless by comparison.

I was thrilled to find these fishermen hanging out in Edinburgh, and I think the iPhone has done the better job of capturing the moment. The Gold Photographic Style hasn't produced an overly warm image here; it's more that it applied just the right white balance, with the S26 Ultra's shot looking quite cold. That's especially the case with the pink paintwork on the base of the building, which looks richer and much more true to life in the iPhone's image.

At night, both phones have done a good job of capturing this complex image. The bright moon has been kept under control, and there’s plenty of detail still visible in some of the more shadowy areas. The exposures are also broadly similar (the iPhone’s is a touch brighter), and even when peering up close, there’s not much to choose from in terms of detail. 

It’s a slightly different story here, though. The iPhone’s shot is much brighter, but that results in some detail being lost in the highlights inside the phone booth. The S26 Ultra has retained that highlight detail, though its overall shot is darker. Personally, I prefer the darker version, especially as it’s much more in line with the moody nighttime aesthetic I was going for. 

What I don’t love is how much the S26 Ultra has oversharpened its image. Like the earlier image of the figure sitting on the wall, this image has been digitally sharpened to the point that the details look crunchy, high-contrast and ultimately quite unnatural. Which image would I choose — properly exposed but oversharpened, or natural details with blown-out highlights? Ideally, I’d simply take the photo again on the iPhone and lower the exposure a tad. But between the two images above, I’d probably go for the one shot on the Samsung phone.

iPhone 17 Pro vs. Galaxy S26 Ultra: Which has the better camera?

I always complain that these camera comparison stories are really close and therefore difficult to make into compelling articles, but this one felt especially close. In some shots, the iPhone's more natural shadow rendering and lighter reliance on oversharpening and other digital processing made its images look better to my eye. But in other examples, especially the image with the tree trunks surrounded by ivy, the S26 Ultra has done a much better job with its color balancing.

Overall, Samsung’s phone leans harder into contrast and saturation, which is literally the same thing we’ve said about Samsung’s phones since it first started putting cameras in them. Buying a Samsung camera phone has always meant getting more vibrant, punchy images out of it, and that’s exactly the case here. If you want quick images of your friends and family that look good enough to share straight to your family WhatsApp group, the S26 Ultra will serve you well. 

The iPhone 17 Pro tends to be more neutral in its color and contrast adjustments, which typically gives a more natural base for you to then add any extra edits of your own. It’s why Apple’s phones have typically always been the device of choice for more enthusiast or pro photographers and video creators. I count myself among that crowd, and it’s why the iPhone 17 Pro remains my preferred model of the two. But really, these are both excellent phones with superb cameras, and you can’t go far wrong with either.



Verum Messenger Expands Its Capabilities: Verum Finance Card Can Now Be Topped Up via Apple Pay


In its latest update, Verum Messenger takes a major step toward integrating communication and financial services. Users can now enjoy a long-awaited feature — topping up their Verum Finance card directly through Apple Pay.

A New Level of Convenience

The integration with Apple Pay significantly simplifies the top-up process. Users no longer need to go through complex transfer steps or rely on third-party services. Just a few taps — and the funds are instantly credited to the card.

This is especially valuable for those who use Verum Messenger not only for communication but also for managing their finances within the ecosystem.

Finance and Messaging in One App

This update reinforces Verum’s strategy to combine in a single product:

  • secure communication
  • cryptocurrency operations
  • everyday financial tools

Verum Messenger is no longer just a messaging app; it is evolving into a full-fledged fintech platform.

Security and Speed

Apple Pay is known for its high level of security thanks to:

  • biometric authentication
  • payment tokenization
  • no sharing of card details

By integrating these technologies, Verum Messenger ensures that financial operations are not only convenient but also as secure as possible.

What This Means for Users

The update brings several key benefits:

  • instant card top-ups
  • simplified user experience
  • reduced reliance on third-party payment services
  • deeper integration of finance into everyday communication

Looking Ahead

The addition of Apple Pay is just one step in the evolution of the Verum ecosystem. It’s clear the team is moving toward creating a unified digital environment where users can handle most of their needs — from communication to capital management — within a single app.

