

The Revelation I Got From Experiencing HaptX Is Wild

I tested gloves and buzzing things in Las Vegas to see where the future points.

I put my hands out flat and slid them into a pair of gloves packed with joints, cables, pumps and tightening straps. All of this was connected to a backpack-size box that pumped pressure around my fingers to create the sensation of touching things. I was about to play Jenga in VR using an $80,000 pair of haptic gloves made by HaptX.

The future of the metaverse, or however we end up dipping into virtual worlds, seems likely to involve VR and AR at least some of the time. If it does, it will also mean solving what we do with our hands. Companies like Meta are already researching ways that neural input bands and haptic gloves could replace controllers, but none of that is coming for years. In the meantime, is there anything better than the VR game controllers already out there, or basic camera-based hand tracking? I'd tried a couple of haptic gloves before, but I was ready to try more.

I poked around CES 2023 in Las Vegas to try devices I hadn't experienced before, and it hit me that there's already a spectrum of options. Each one was a little revelation.

High end: Massive power gloves

HaptX has been recognized for years as making some of the best haptic gloves on the market, but I'd never had a chance to experience them. The hardware is highly specialized, and also extremely large and expensive. I wish I'd gotten to see them at the last CES I attended before this one, in 2020. Finally, in 2023, I got my chance.

The gloves use microfluidics, pumping air into small bladders that create touch sensations in 133 zones per hand across the fingers and palm. At the same time, cables on the backs of the fingers pull back to simulate up to 8 pounds of force feedback. Used with apps that support them, you can reach out, grab things and actually feel them.

I’ve tried lower-cost haptic gloves at home that didn’t have the air bladders but did have cables to apply resistance. The HaptX gloves are a big step beyond those, and the most eerily realistic I’ve ever tried. I wouldn’t say everything “felt real,” but the poking sensations in my fingers and palms let me feel the shapes of things, while the resistance gave me a sense of grabbing and holding stuff.

The most amazing moments were when I placed objects on my palm and seemed to feel their weight, or when another person’s finger virtually touched mine. Another journalist, wearing a second headset and pair of haptic gloves, was playing Jenga next to me. We never made physical contact, but occasionally we shook hands virtually or exchanged high-fives. Our fingers touching felt… well, oddly real, like sensing someone’s finger touching your glove.

HaptX is making another pair of smaller, more mobile gloves later this year that will cost less (about $5,000) while still promising the same level of feedback, plus tactile vibrations like the haptic buzzes you might feel from game controllers. I didn’t get to demo those, but I can’t wait.

While HaptX’s tech is wild, it’s meant for industrial purposes and simulations. It reproduces reality convincingly, but it’s so massive that it wouldn’t let me do anything other than live in its simulated world. For instance, how would I type or pull out my phone? Still, I’ll dream of interfaces that feel as immersive as these gloves.

Budget gloves: bHaptics’ TactGloves

At $300, bHaptics’ yellow haptic gloves are far, far less expensive than HaptX’s. They’re also completely different. Instead of creating pressure or resistance, they simply have various zones inside that buzz, like your phone, watch or game controller, synced to the moments when your fingers would virtually touch something in VR. Strangely, it’s very effective. In a few demos I tried, pushing buttons and touching objects provided enough feedback to feel like I was really “clicking” a thing. Another demo, which had me hug a virtual avatar mirroring my movements or shake its hand, gave enough contact to fool me into feeling I was touching someone.

bHaptics also makes a haptic vest I tried, called the TactSuit, that vibrates with feedback in supported games and apps. There aren’t many apps that work ideally with haptic gloves right now, because almost no one is using haptic gloves. But bHaptics’ support for the standalone Meta Quest 2, and the gloves’ wireless Bluetooth pairing, means they’re actually portable… even if they look like giant janitorial cleaning gloves. The tradeoff for being so small and wireless is short range: I had to keep the gloves within about two feet of the headset, otherwise they’d lose connection.

The buzzing feedback didn’t prove to me that I could absolutely reach into other worlds, but it offered enough sensation to make hand tracking feel more precise. Instead of wondering whether my hand gestures had actually contacted a virtual object, I got a buzzing confirmation. The whole experience reminded me of game controller feedback I could wear on my fingers, in a good way.
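
To picture how that kind of confirmation works, here’s a minimal sketch of the general idea in Python: when a tracked fingertip enters a virtual object’s bounds, the app fires a short buzz on the matching glove zone. Every name here (HapticGlove, buzz, the zone labels) is a hypothetical stand-in, not bHaptics’ actual SDK.

```python
# Hypothetical sketch: buzz a glove zone when a tracked fingertip
# touches a virtual object. Names are illustrative, not a real SDK.
from dataclasses import dataclass


@dataclass
class Sphere:
    x: float
    y: float
    z: float
    radius: float

    def contains(self, px: float, py: float, pz: float) -> bool:
        # True if the point lies inside the sphere
        return (px - self.x) ** 2 + (py - self.y) ** 2 + (pz - self.z) ** 2 <= self.radius ** 2


class HapticGlove:
    """Stand-in for a glove driver with one vibration zone per fingertip."""

    def buzz(self, zone: str, intensity: float, duration_ms: int) -> None:
        print(f"buzz {zone} at {intensity:.1f} for {duration_ms}ms")


def update(glove: HapticGlove, fingertips: dict, button: Sphere) -> None:
    # fingertips: zone name -> (x, y, z) position from the headset's hand tracking
    for zone, (px, py, pz) in fingertips.items():
        if button.contains(px, py, pz):
            glove.buzz(zone, intensity=0.8, duration_ms=40)


update(HapticGlove(),
       {"index_tip": (0.02, 0.01, 0.30), "thumb_tip": (0.10, 0.00, 0.25)},
       Sphere(0.02, 0.01, 0.30, radius=0.02))
```

In this toy run, only the index fingertip sits inside the virtual button, so only that zone buzzes.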

No gloves at all: Ultraleap’s Ultrasonics

Ultraleap, a company that’s specialized in hand tracking for years, has a different approach to haptics: sensations you can feel in the air. I waved my hand above a large rectangular panel and felt ripples and buzzes beneath my fingers. The feelings are created with ultrasonic waves, high-powered sound bursts that move air almost like super-precise fans against your fingers. I tried Ultraleap’s tech back in 2020, but trying the latest and more compact arrays this year made me think about a whole new use case. It was easy to make this logic leap, since Ultraleap’s booth also demonstrated hand tracking (without haptic feedback) on Pico Neo 3 and Lynx R1 VR and mixed reality headsets.

What if… this air vibration could be used with headsets? Ultraleap is already dreaming about and planning for that, but right now ultrasonic tech is too power-hungry, and the panels are too large, for headgear. The tech is mainly being used in car interface concepts, where hand gestures with feedback could make adjusting controls while driving easier and less dangerous or awkward. The range of the sensations, at least several feet, seems ideal for the arm’s-length radius of most existing camera-based hand tracking on devices like the Meta Quest 2.

I tried a demo where I adjusted a virtual volume slider by pinching and moving my hand up and down, feeling discrete clicks that let me know I was doing something. I could sense a virtual “bar” in the air that I could find and even move. The rippling, subtle buzzes are far fainter than those from haptic gloves or game controllers (or your smartwatch), but they could be just enough to give that extra sense that a virtual button press actually succeeded, or that a gesture to turn something on or off was registered.
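
To make the mechanics of that demo a little more concrete, here’s a rough sketch in Python of the control logic as I understand it: map the pinch point’s height to a volume level, and fire a brief pulse each time the value crosses a detent. The detent count and the emit_click function are my own hypothetical framing, not Ultraleap’s actual API.

```python
# Hypothetical sketch of a mid-air volume slider with haptic detents.
# Not Ultraleap's API; just the general control logic.

SLIDER_BOTTOM, SLIDER_TOP = 0.10, 0.40   # usable pinch-height range, in meters
DETENTS = 10                             # number of discrete "clicks" along the slider


def emit_click() -> None:
    """Stand-in for telling the ultrasonic array to pulse against the fingertips."""
    print("  click")


def height_to_volume(height_m: float) -> float:
    # Normalize the pinch height into a 0.0-1.0 volume level
    t = (height_m - SLIDER_BOTTOM) / (SLIDER_TOP - SLIDER_BOTTOM)
    return max(0.0, min(1.0, t))


last_detent = None
for height in (0.10, 0.13, 0.16, 0.19, 0.25, 0.33, 0.40):   # fake pinch samples
    volume = height_to_volume(height)
    detent = round(volume * DETENTS)
    if last_detent is not None and detent != last_detent:
        emit_click()                     # discrete feedback as the value steps
    last_detent = detent
    print(f"height={height:.2f}m -> volume={volume:.2f}")
```

The point is only that the buzz arrives at the moments the value actually changes, which is what made the demo’s invisible slider feel like a physical control.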

If these interfaces move to VR and AR, Ultraleap’s representatives said they’d likely end up in larger installations first: maybe theme park rides. Ultraleap’s tech is already in experiences like the hands-free Ninjago ride at Legoland, which I’ve tried with my kids. The 3D hand-tracking ride lets me throw stars at enemies, but sometimes I’m not sure my gestures were registered. What if buzzing let me know I was making successful hits?

Haptics are likely to come from stuff we already wear

Of course, I skipped the most obvious step for AR and VR haptic feedback: smartwatches and rings. We wear buzzing things on our wrists already. Apple’s future VR/AR device might work with the Apple Watch this way, and Meta, Google, Samsung, Qualcomm and others could follow a similar path with dovetailing products. I didn’t come across any wearable watch or ring VR/AR haptics at CES 2023 (unless I missed them). But I wouldn’t be surprised if they’re coming soon. If AR and VR are ever going to get small enough to wear more often, we’re going to need controls that are far smaller than game controllers… and ways to make gesture inputs feel far less weird. Believe the buzz: Haptics is better than you think.


Google Making AI-Powered Glasses With Warby Parker, Gentle Monster

Google revealed its first two partnerships with eyeglass brands, with more to come.

The tech world has rarely been called stylish. But at Google’s annual I/O developers conference on Tuesday, the company took a step into the fashion world — kind of. It revealed that the first eyeglass brands to carry Android XR AI-powered glasses will be Warby Parker and Gentle Monster, with more brand partners to be revealed in the future. Android XR is Google’s upcoming platform for VR, AR and AI on glasses and headsets.

Yes, there was a Superman joke: the company quipped that unlike Clark Kent, who hid his superpowers behind nerdy glasses, the Android XR glasses will give you superpowers. That remains to be seen, although NBA star Giannis Antetokounmpo did show up at Google I/O wearing the XR glasses.

Warby Parker, founded in 2010, was originally an online eyeglass retailer that gained fame for its home try-on program, where customers could order five frames sent to their home to try on and then return. It also allowed customers to upload photos to see how they would look wearing different frames.

South Korean eyeglass brand Gentle Monster, founded in 2011, is known for its luxury eyeglasses and sunglasses. The company’s celebrity customers include Beyoncé, Rihanna, Kendrick Lamar and Billie Eilish.


Google I/O Announcements: The Latest AI Upgrades Coming to Gemini, XR and More

From its new Project Aura XR glasses to Chrome’s wants-to-be-more-helpful AI mode, Gemini Live and new Flow generative video tool, Google puts AI everywhere.

As you’d expect, this year’s Google I/O developer’s conference focused almost exclusively on AI — where the company’s Gemini AI platform stands, where it’s going and how much it’s going to cost you now for its new AI Ultra subscription plan (spoiler: $250 per month). Meanwhile, a new Flow app expands the company’s video-generation toolset, and its Android XR glasses make their debut. 

Plus, all AI usage and performance numbers are up! (Given that Ironwood, a new Tensor Processing Unit that Google says will deliver 42.5 exaflops of compute per pod, is coming to Google Cloud later this year, they’ll continue to rise.)

Google’s Project Aura, a developer kit for Android XR that includes new AR glasses from Xreal, is the next step in the company’s roadmap toward glasses-based, AI-driven extended reality. CNET’s Scott Stein goes in-depth on that future in an exclusive interview with Shahram Izadi, Google’s VP and GM for Android XR. Meanwhile, the headset-based Project Moohan, developed in conjunction with Samsung, is now available, and Google is working with Samsung to extend beyond headsets.

For a play-by-play of the event, you can read the archive of our live blog.

Google already held a separate event for Android, where it launched Android 16, debuting its new Material 3 Expressive interface, security updates and news on Gemini integration and features.

A lot of the whizzy new AI features are only available via one of its subscription levels. AI Pro is essentially a rebranding of Google’s $20-per-month Gemini Advanced plan (with some new features added), but Google AI Ultra is a pricier new option — $250 per month, with half off the first three months for now — that provides access to the latest, spiffiest and least usage-limited of all its tools and models, as well as a prototype for managing AI agents and the 30 terabytes of storage you’re going to need to store it all. Both are available today.

Google also wants to make your automated replies sound smarter with Personalized Smart Replies, which makes your generated answers sound more like you by drawing on relevant details from the information on your device. It’ll be in Gmail this summer for subscribers. Eventually, it’ll be everywhere.

There are also lots of better models, better coding tools and other developer-friendly details you’d expect from a developer conference. The announcements included conversational Gemini Live, formerly part of Project Astra, Google’s interactive, agentic, kitchen-sink voice AI app. (As Managing Editor Patrick Holland says, “Astra is a rehearsal of features that, when they’re ready for the spotlight, get added to Gemini Live.”) And for researchers, NotebookLM incorporates Gemini Live to improve its… everything.

It’s available now in the US. 

Chrome AI Mode

People (that is, those over 18) who pony up for the subscriptions, plus users on the Chrome Beta, Dev and Canary tracks, will be able to try out the company’s expanded Gemini integration with Chrome — summaries, research and agentic chat based on the contents of your screen, somewhat like Gemini Live does on phones (where, by the way, it’s available for free on Android and iOS as of today). But the Chrome version is more suited to the kinds of things you do at a computer rather than on a phone. (Microsoft already does this with Copilot in its own Edge browser.)

Eventually, Google plans for Gemini in Chrome to be able to synthesize information across multiple tabs and support voice navigation.

The company is also expanding how you can interact with AI Overviews in Google Search as part of AI Mode, which adds more agentic shopping help. It appears as a new tab in Search, or on the search bar, and it’s available now. It includes deeper searches and Personal Context — which uses all the information Google knows about you, and that’s a lot — to make suggestions and customize replies.

The company detailed its new AI Mode for shopping, which has an improved conversational shopping experience, a checkout that monitors for the best pricing, and an updated “try on” interface that lets you upload a photo of yourself rather than modeling it on a generic body.

Google plans to launch it soon, though the updated “try on” feature is now available in the US via Search Labs.

Google Beam

Formerly known as Project Starline, Google Beam is the updated version of the company’s 3D videoconferencing tech, now with AI. It uses a six-camera array to capture you from multiple angles; AI stitches those views together, head tracking follows your movements, and the result is sent at up to 60 frames per second.

The platform uses a light field display that doesn’t require wearing any special equipment, though that technology tends to be sensitive to off-angle viewing. HP, which is partnering with Google to build the first Beam devices, is an old hand in the large-scale scanning business, including 3D scanning, so the collaboration isn’t a big surprise.

Flow and other generative creative tools

Google Flow is a new tool that builds on Imagen 4 and Veo 3 to perform tasks like creating AI video clips, stitching them into longer sequences or extending them with a single prompt, while keeping them consistent from scene to scene. It also provides editing tools like camera controls. It’s available as part of Google AI Ultra.

Imagen 4 image generation is more detailed, with improved tonality and better text and typography. And it’s faster. Meanwhile, Veo 3, also available today, has a better understanding of physics and native audio generation — sound effects, background sounds and dialogue.

Of course, all this is available under the AI Pro plan. Google’s SynthID gen-AI detection tool is also available today.

