

Apple Vision Pro Hands-On: Far Better Than I Was Ready For

I experienced incredible fidelity, surprising video quality and a really smooth interface. Apple’s first mixed-reality headset nails those, but lots of questions remain.

I was in a movie theater last December watching Avatar: The Way of Water in 3D, and I said to myself: “Wow, this is an immersive film I’d love to watch in next-gen VR.” That’s exactly what I experienced in Apple’s Vision Pro headset, and yeah, it’s amazing.

On Monday, I tried out the Vision Pro in a series of carefully picked demos during WWDC at Apple’s Cupertino, California, headquarters. I’ve been using cutting-edge VR devices for years, and I found all sorts of augmented reality memories bubbling up in my brain. Apple’s compact — but still not small — headset reminds me of an Apple-designed Meta Quest Pro. The back strap was comfy and stretchy, with a dial to adjust the rear fit and a top strap for stability. The headset’s sleek design, and even its glowing front faceplate, also gave me an instant Ready Player One vibe.


I couldn’t wear my glasses during the demo, though, and neither will you. Apple’s headset doesn’t support glasses, relying instead on custom Zeiss inserts to correct wearers’ vision. Through a setup process, Apple managed to find lenses that fit my prescription well enough that everything looked crystal clear, which is no small feat. We also adjusted the fit and tuned spatial audio for my head using an iPhone, a system that will be finessed before the headset is released in 2024.

From there, I did most of my demos seated, and found myself surprised from the start. The passthrough video camera quality of this headset is good — really, really good. Not as good as my own vision, but good enough that I could see the room well, see the people in it with me and easily read my watch notifications on my wrist. The only headset that’s done this previously was the extremely impressive but PC-connected Varjo XR-3, and Apple’s display and cameras feel even better.

Apple’s floating grid of apps appears when I press the top digital crown, which autocenters the home screen to wherever I’m looking. I set up eye tracking, which worked like on many other VR headsets I’ve used: I looked at glowing dots as musical notes played, and got a chime when it all worked.


A list of apps as they would appear inside of the Apple Vision Pro headset.

Apple/Screenshot by CNET

From there, the interface was surprisingly fluid. Looking at icons or interface options slightly enlarges them, or changes how bold they appear. Tapping with my fingers while looking at something opens an app. 

I’ve used tons of hand-tracking technology on headsets like the HoloLens 2 and the Meta Quest 2 and Pro, and usually there’s a lot of hand motion required. Here, I could be really lazy. I pinched to open icons even while my hand was resting in my lap, and it worked. 

Scrolling involves pinching and pulling with my fingers; again, pretty easy to do. I resized and repositioned windows with my hands, throwing a window across the room or pinning it closer to me. I opened multiple apps at once, including Safari, Messages and Photos. It was easy enough to scroll around, although sometimes my eye tracking needed a bit of extra concentration to pull off.

Apple’s headset uses eye tracking constantly in its interface, something Meta’s Quest Pro and even the PlayStation VR 2 don’t do. That might be part of the reason for the external battery pack. The emphasis on eye tracking as a major part of the interface felt transformative, in a way I expected might be the case for VR and AR years ago. What I don’t know is how it will feel in longer sessions.

I don’t know how the Vision Pro will work with keyboards and trackpads, since I didn’t get to demo the headset that way. It works with Apple’s Magic Keyboard and Magic Trackpad, and Macs, but not with iPhone and iPad or Watch touchscreens — not now, at least.

Dialing in reality

I scrolled through some photos in Apple’s preset photo album, plus a few 3D photos and video clips shot with the Vision Pro’s 3D camera. All the images looked really crisp, and a panoramic photo that spread around me looked almost like it was a window on a landscape that extended just beyond the room I was in. 

Apple has volumetric 3D landscapes on the Vision Pro that are immersive backgrounds like 3D wallpaper, but looking at one really shows off how nice that Micro OLED display looks. A lake looked like it was rolling up to a rocky shore that ended right where the real coffee table was in front of me. 

A man uses a keyboard while wearing the Apple Vision Pro headset.

Raising my hands to my face, I saw how the headset separates my hands from VR, a trick that’s already in Apple’s ARKit. It’s a little rough around the edges but good enough. Similarly, there’s a wild new trick where anyone else in the room can ghost into view if you look at them, a fuzzy halo with their real passthrough video image slowly materializing. It’s meant to help create meaningful contact with people while wearing the headset. I wondered how you could turn that off or tune it to be less present, but it’s a very new idea in mixed reality.

Apple’s digital crown, a small dial borrowed from the Apple Watch, handles reality blend. I could turn the dial to slowly extend the 3D panorama until it surrounded me everywhere, or dial it back so it just emerged a little bit like a 3D window. 

Mixed reality in Apple’s headset looks so casually impressive that I almost didn’t appreciate how great it was. Again, I’ve seen mixed reality in VR headsets before (Varjo XR-3, Quest Pro), and I’ve understood its capabilities. Apple’s execution felt much more immersive, rich and effortless on most fronts, with a field of view that felt expansive. I can’t wait to see more experiences in it.

Cinematic fidelity that wowed me

The cinema demo was what really shocked me, though. I played a 3D clip of Avatar: The Way of Water in-headset, on a screen in various viewing modes including a cinema. Apple’s mixed-reality passthrough can also dim the rest of the world down a bit, in a way similar to how the Magic Leap 2 does with its AR. But the scenes of Way of Water sent little chills through me. It was vivid. This felt like a movie experience. I don’t feel that way in other VR headsets.


Avatar: The Way of Water looked great in the Vision Pro.

20th Century Studios

Apple also demonstrated its Immersive Video format that’s coming as an extension to Apple TV Plus. It’s a 180-degree video format, similar to what I’ve seen before in concept, but with really strong resolution and video quality. A splash demo reel of Alicia Keys singing, Apple Sports events, documentary footage and more reeled off in front of me, a teaser of what’s to come. One-eighty-degree video never appears quite as crisp to me as big-screen film content, but the sports clips I saw made me wonder how good virtual Jets games could be in the future. Things have come a long way.

Would I pay $3,499 for a head-worn cinema? No, but it’s clearly one of this device’s greatest unique strengths. The resolution and brightness of the display were surprising.


Convincing avatars (I mean, Personas)

Apple’s Personas are 3D-scanned avatars generated by using the Vision Pro to scan your face, making a version of yourself that shows up in FaceTime chats if you want, or also on the outside of the Vision Pro’s curved OLED display to show whether you’re “present” or in an app. I didn’t see how that outer display worked, but I had a FaceTime with someone in their Persona form, and it was good. Again, it looked surprisingly good.

I’ve chatted with Meta’s ultra-realistic Codec Avatars, which aim for lifelike representations of people in VR. Those are stunning, and I also saw Meta’s phone-scanned step-down version in an early form last year, where a talking head spoke to me in VR. Apple’s Persona looked better than Meta’s phone-scanned avatar, although a bit fuzzy around the edges, like a dream. The woman whose Persona was scanned appeared in her own window, not in a full-screen form.

And I wondered how expressive the emotions can be with the Vision Pro’s scanning cameras. The Vision Pro can track jaw movement, similar to the Quest Pro, and the Persona I chatted with was friendly and smiling. How would it look for someone I know, like my mom? Here, it was good enough that I forgot it was a scan.

We demoed a bit of Apple’s Freeform app, where a collaboration window opened up while my Persona friend chatted in another window. 3D objects popped up in the Freeform app, including a full home scan. It looked realistic enough.

Dinosaurs in my world

The final demo was an app experience called Encounter Dinosaurs, which reminded me of early VR app demos I had years ago: An experience emphasizing just the immersive “wow” factor of dinosaurs appearing in a 3D window that seemed to open up in the back wall of my demo room. Creatures that looked like carnotauruses slowly walked through the window and into my space.

All my demos were seated except for this one, where I stood up and walked around a bit. This sounds like it wouldn’t be an impressive demo, but again, the quality of the visuals and how they looked in relation to the room’s passthrough video capture was what made it feel so great. As the dinosaur snapped at my hand, it felt pretty real. And so did a butterfly that danced through the room and tried to land on my extended finger.

I smiled. But even more so, I was impressed when I took off the headset. My own everyday vision wasn’t that much sharper than what Apple’s passthrough cameras provided. The gap between the two was closer than I would have expected, and it’s what makes Apple’s take on mixed reality in VR work so well.

Then there’s the battery pack. There’s a corded battery that’s needed to power the headset, instead of a built-in battery like most others have. That meant I had to make sure to grab the battery pack as I started to move around, which is probably a reason why so many of Apple’s demos were seated.


What about fitness and everything else?

Apple didn’t emphasize fitness much at all, a surprise to me. VR is already a great platform for fitness, although no one’s finessed headset design for fitness comfort. Maybe having that battery pack right now will limit movement in active games and experiences. Maybe Apple will announce more plans here later. The only taste I got of health and wellness was a one-minute micro meditation, which was similar to the one on the Apple Watch. It was pretty, and again a great showcase of the display quality, but I want more.

2024 is still a while away, and Apple’s headset is priced way out of range for most people. And I have no idea how functional this current headset would feel if I were doing everyday work. But Apple did show off a display, and an interface, that are far better than I was ready for. If Apple can build on that, and the Vision Pro finds ways of expanding its mixed-reality capabilities, then who knows what else is possible?

This was just my fast-take reaction to a quick set of demos on one day in Cupertino. There are a lot more questions to come, but this first set of demos resonated with me. Apple showed what it can do, and we’re not even at the headset’s launch yet.


Today’s NYT Connections Hints, Answers and Help for May 24, #713

Hints and answers for Connections for May 24, #713.

Looking for the most recent Connections answers? Click here for today’s Connections hints, as well as our daily answers and hints for The New York Times Mini Crossword, Wordle, Connections: Sports Edition and Strands puzzles.


Today’s Connections puzzle has a fun variety of categories. The purple one appeals to my English major heart. Read on for clues and today’s Connections answers.

The Times now has a Connections Bot, like the one for Wordle. Go there after you play to receive a numeric score and to have the program analyze your answers. Players who are registered with the Times Games section can now nerd out by following their progress, including number of puzzles completed, win rate, number of times they nabbed a perfect score and their win streak.

Read more: Hints, Tips and Strategies to Help You Win at NYT Connections Every Time

Hints for today’s Connections groups

Here are four hints for the groupings in today’s Connections puzzle, ranked from the easiest yellow group, to the tough (and sometimes bizarre) purple group.

Yellow group hint: Goo-goo.

Green group hint: Not shirts.

Blue group hint: City that never sleeps.

Purple group hint: Acclaimed writers.

Answers for today’s Connections groups

Yellow group: Baby gear.

Green group: Kinds of pants minus “s.”

Blue group: New York sports team members.

Purple group: Black women authors.

Read more: Wordle Cheat Sheet: Here Are the Most Popular Letters Used in English Words

What are today’s Connections answers?

The yellow words in today’s Connections

The theme is baby gear. The four answers are bib, bottle, monitor and stroller.

The green words in today’s Connections

The theme is kinds of pants minus “s.” The four answers are capri, jean, jogger and slack.

The blue words in today’s Connections

The theme is New York sports team members. The four answers are Jet, Met, Net and Ranger.

The purple words in today’s Connections

The theme is Black women authors. The four answers are Butler, Gay, Hooks and Walker.



Today’s NYT Mini Crossword Answers for Tuesday, May 20

Here are the answers for The New York Times Mini Crossword for May 20.

Looking for the most recent Mini Crossword answer? Click here for today’s Mini Crossword hints, as well as our daily answers and hints for The New York Times Wordle, Strands, Connections and Connections: Sports Edition puzzles.


Today’s NYT Mini Crossword is a fun one, and now I’m singing the song from 1-Across in my head. Need some help with today’s Mini Crossword? Read on. And if you could use some hints and guidance for daily solving, check out our Mini Crossword tips.

The Mini Crossword is just one of many games in the Times’ games collection. If you’re looking for today’s Wordle, Connections, Connections: Sports Edition and Strands answers, you can visit CNET’s NYT puzzle hints page.

Read more: Tips and Tricks for Solving The New York Times Mini Crossword

Let’s get at those Mini Crossword clues and answers.

Mini across clues and answers

1A clue: “Pink ___ Club” (Chappell Roan hit)
Answer: PONY

5A clue: Instrument that might be made with a comb and wax paper
Answer: KAZOO

6A clue: How bedtime stories are often read
Answer: ALOUD

7A clue: On edge
Answer: TENSE

8A clue: Short Instagram video
Answer: REEL

Mini down clues and answers

1D clue: Less colorful
Answer: PALER

2D clue: Layer of the upper atmosphere
Answer: OZONE

3D clue: Totally pointless
Answer: NOUSE

4D clue: Hit a high note in a high place, perhaps
Answer: YODEL

5D clue: Kit ___ bar
Answer: KAT

How to play more Mini Crosswords

The New York Times Games section offers a large number of online games, but only some of them are free for all to play. You can play the current day’s Mini Crossword for free, but you’ll need a subscription to the Times Games section to play older puzzles from the archives.



Want to Speak to Dolphins? Researchers Won $100,000 AI Prize Studying Their Whistling

The scientists studied a bottlenose dolphin community in Sarasota, Florida, uncovering evidence of language-like communications.

If any dolphins are reading this: hello!

A team of scientists studying a community of Florida dolphins has been awarded the first $100,000 Coller Dolittle Challenge prize, set up to reward research into interspecies communication algorithms.

The US-based team, led by Laela Sayigh of the Woods Hole Oceanographic Institution, found that one type of whistle dolphins employ is used as an alarm, while another is used to respond to unexpected or unfamiliar situations. The team used non-invasive hydrophones to perform the research, which provides evidence that dolphins may use whistles like words, shared with multiple members of their communities.

Capturing the sounds is just the beginning. Researchers will use AI to continue deciphering the whistles to try to find more patterns. 

“The main thing stopping us cracking the code of animal communication is a lack of data. Think of the 1 trillion words needed to train a large language model like ChatGPT. We don’t have anything like this for other animals,” said Jonathan Birch, a professor at the London School of Economics and Political Science and one of the judges for the prize.

“That’s why we need programs like the Sarasota Dolphin Research Program, which has built up an extraordinary library of dolphin whistles over 40 years. The cumulative result of all that work is that Laela Sayigh and her team can now use deep learning to analyse the whistles and perhaps, one day, crack the code,” he said.

The award was part of a ceremony honoring the work of four teams from across the world. In addition to the dolphin project, researchers studied ways in which nightingales, marmoset monkeys and cuttlefish communicate.

The challenge is a collaboration between the Jeremy Coller Foundation and Tel Aviv University. Submissions for next year open up in August. 

Dolphins are just the beginning

Researching animals and trying to learn the secrets of their communication is nothing new, but AI is speeding up the creation of larger and larger datasets.

“Breakthroughs are inevitable,” says Kate Zacarian, CEO and co-founder of Earth Species Project, a California-based nonprofit that also works on breaking down language barriers with the animal world.

“Just as AI has revolutionized the fields of medicine and material science, we see a similar opportunity to bring those advances to the study of animal communication and empower researchers in this space with entirely new capabilities,” Zacarian said.

Zacarian applauded Sayigh’s team and their win and said it will help bring broader recognition to the study of non-human animal communication. It could also bring more attention to ways that AI can change the nature of this type of research.

“The AI systems aren’t just faster — they allow for entirely new types of inquiry,” she said. “We’re moving from decoding isolated signals to exploring communication as a rich, dynamic and structured phenomenon — which is a task that’s simply too big for our human brains, but possible for large-scale AI models.”

Earth Species recently released an open-source large audio language model for analyzing animal sounds called NatureLM-audio. The organization is currently working with biologists and ethologists to study species including carrion crows, orcas, jumping spiders and others and plans to release some of their findings later this year, Zacarian said.

