Technologies

Qualcomm’s New AR Chips Point to a New Generation of Smart Glasses

The less power-hungry chips support Wi-Fi 7 and eye tracking. They’re expected to arrive in glasses between 2023 and 2025.

Amid a recent uptick in VR headsets, Qualcomm’s latest chip announcement hints that the next product wave could be AR glasses. At the company’s recent chip-focused event, alongside its newest Snapdragon phone processors, Qualcomm announced a brand-new line of chips optimized for AR glasses. They point to a next wave of advanced smart glasses expected to arrive between 2023 and 2025, with possible features including eye tracking, hand tracking and wireless streaming to phones or from the cloud.

The Snapdragon AR2 Gen 1 is a different type of platform than the company’s top-end XR2 processor, which is already in standalone VR headsets like the Meta Quest 2 and Pico 4. The AR2 focuses more on camera and sensor-based processing than on graphics, aiming to improve battery life on smaller glasses. The design is split into three co-processors, meant to live in each arm of a pair of smart glasses and above the bridge. The arrangement is intended to cut down on wires and reduce overheating in future glasses designs.

Glasses using the AR2 Gen 1 may be a lot faster at using cameras for scanning and depth sensing: Qualcomm is promising faster AI for things like object recognition and hand tracking than even the XR2 chip found in headsets such as the Quest 2, while drawing half as much power. There’s nowhere to hide a big battery on a normal-ish pair of glasses, which is why the AR2 Gen 1 aims for the kind of efficiency usually demanded of wearables like smartwatches.

The AR2 Gen 1 chip won’t be used for traditional VR headsets. According to Qualcomm, the resolution and field of view in AR glasses using these new chips won’t be as good as what current VR is capable of. Existing AR glasses and headsets tend to have smaller viewing areas and rely on occasional pop-up graphics, versus the expansive full-field graphics and displays VR needs.

Qualcomm is leaning heavily on phones, computers and the cloud to do a lot of the heavy lifting for these future glasses. The chipset includes Wi-Fi 7, and a range of phones running Qualcomm’s Snapdragon chips and the Snapdragon Spaces software platform could be used to wirelessly process AR graphics for these glasses. Essentially they’re wearable peripherals, although the glasses could do some things on their own, too.

Eye tracking on the glasses comes with support for iris authentication, which is handled on-glasses with a dedicated security chip. How that gets used by other manufacturers, however, remains to be seen.

Qualcomm’s already announced a wave of familiar tech names that are onboard to make AR glasses with the AR2 chip, including Lenovo, LG, Niantic, NReal, Oppo, Pico, NTT Qonoq, Rokid, Sharp, TCL, Vuzix and Mi. Microsoft and Adobe are also working on making their software platforms cross-compatible, mirroring similar partnership news with Meta earlier this year.

Partnerships are necessary, especially for devices like smart glasses that are trying to be useful tools in a world of already well-connected phones, computers, wearables and smart home gear. Microsoft announced a partnership with Qualcomm on future AR glasses chips earlier this year, and the AR2 Gen 1 looks like it’ll be part of that evolution beyond the expensive, business-focused HoloLens 2.

Qualcomm previously worked on chips for existing AR headsets and smart glasses, including the NReal Light, Lenovo’s ThinkReality A3 and Meta’s Ray-Ban Stories. However, Qualcomm’s head of XR, Hugo Swart, indicated in a briefing with reporters that current efforts haven’t been good enough at running long enough on a single battery charge to be useful. (Battery life on nearly all existing VR and AR headsets tends to be under 2 hours at best.)

Dreams of the metaverse are, for the moment, held back equally by hardware and software. While VR headsets are slowly adding AR-like features using passthrough cameras, like in the Meta Quest Pro, there aren’t any all-day AR glasses that are actually any good, although some headsets like the Magic Leap 2 are trying to get closer to being useful for practical business uses. Perhaps Meta, which has been promising its own AR glasses for years, will lean on the AR2 Gen 1 as well for a future product.

There’s nothing available yet that resembles the eyeglass tech sci-fi writers have been dreaming of for decades. Qualcomm’s new chips may not lead to perfect AR glasses, but they could enable improved wireless glasses of a type that hasn’t existed before. Maybe this wave of AR2 Gen 1-enabled glasses could be the start of the true AR eyewear we’ve been waiting for.

Google Making AI-Powered Glasses With Warby Parker, Gentle Monster

Google revealed its first two partnerships with eyeglass brands, with more to come.

The tech world has rarely been called stylish. But at Google’s annual I/O developers conference on Tuesday, the company took one step into the fashion world — kind of. Google revealed that the first eyeglass brands to carry Android XR AI-powered glasses will be Warby Parker and Gentle Monster, with more brand partners to be announced in the future. Android XR is Google’s upcoming platform for VR, AR and AI on glasses and headsets.

Yes, there was a Superman joke: the company quipped that unlike Clark Kent, who hid his superpowers behind nerdy glasses, the Android XR glasses will give you superpowers. That remains to be seen, although NBA star Giannis Antetokounmpo did show up at Google I/O wearing the XR glasses.

Warby Parker, founded in 2010, was originally an online eyeglass retailer that gained fame for its home try-on program, where customers could order five frames sent to their home to try on and then return. It also allowed customers to upload photos to see how they would look wearing different frames.

South Korean eyeglass brand Gentle Monster, founded in 2011, is known for its luxury eyeglasses and sunglasses. The company’s celebrity customers include Beyoncé, Rihanna, Kendrick Lamar and Billie Eilish.

Google I/O Announcements: The Latest AI Upgrades Coming to Gemini, XR and More

From its new Project Aura XR glasses to Chrome’s wants-to-be-more-helpful AI mode, Gemini Live and new Flow generative video tool, Google puts AI everywhere.

As you’d expect, this year’s Google I/O developers conference focused almost exclusively on AI — where the company’s Gemini AI platform stands, where it’s going and how much it’s going to cost you now for its new AI Ultra subscription plan (spoiler: $250 per month). Meanwhile, a new Flow app expands the company’s video-generation toolset, and its Android XR glasses make their debut.

Plus, all AI usage and performance numbers are up! (Given that a new 42.5-exaflop Ironwood Tensor processing unit is coming to Google Cloud later this year, they’ll continue to rise.) 

Google’s Project Aura, a developer kit for Android XR that includes new AR glasses from Xreal, is the next step in the company’s roadmap toward glasses-based, AI-driven extended reality. CNET’s Scott Stein goes in-depth in an exclusive interview with Shahram Izadi, Google’s VP and GM for Android XR, about that future. And headset-based Project Moohan, developed in conjunction with Samsung, is now available, and Google is working with Samsung to extend the platform beyond headsets.

For a play-by-play of the event, you can read the archive of our live blog.

Google already held a separate event for Android, where it launched Android 16, debuting its new Material 3 Expressive interface, security updates and news on Gemini integration and features.

A lot of the whizzy new AI features are only available via one of its subscription levels. AI Pro is essentially a rebranding of Google’s $20-per-month Gemini Advanced plan (with some new features added), while Google AI Ultra is a pricier new option at $250 per month (half off for the first three months, for the moment) that provides access to the latest, spiffiest and least usage-limited of all its tools and models, as well as a prototype for managing AI agents and the 30 terabytes of storage you’re going to need to store it all. Both are available today.

Google also wants to make your automation sound smarter with Personalized Smart Replies, which makes your generated answers sound more like you, as well as plowing through pieces of information on your device to provide relevant information. It’ll be in Gmail this summer for subscribers. Eventually, it’ll be everywhere. 

The announcements also included lots of better models, better coding tools and other developer-friendly details you’d expect from a developer conference. Among them was conversational Gemini Live, formerly part of Project Astra, Google’s interactive, agentic, kitchen-sink voice AI app. (As Managing Editor Patrick Holland says, "Astra is a rehearsal of features that, when they’re ready for the spotlight, get added to Gemini Live.") And for researchers, NotebookLM incorporates Gemini Live to improve its… everything.

It’s available now in the US. 

Chrome AI Mode

People (that is, those over 18) who pony up for the subscriptions, plus users on the Chrome Beta, Dev and Canary tracks, will be able to try out the company’s expanded Gemini integration with Chrome — summary, research and agentic chat based on the contents of your screen, somewhat like Gemini Live does for phones (which, by the way, is available for free on Android and iOS as of today). But the Chrome version is more suited to the type of things you do at a computer rather than a phone. (Microsoft already does this with Copilot in its own Edge browser.)

Eventually, Google plans for Gemini in Chrome to be capable of synthesizing using multiple tabs and voice navigation. 

The company is also expanding how you can interact with its AI Overviews in Google Search as part of AI Mode, with richer AI Overviews interactions and more agentic shopping help. It appears as a new tab in search, or on the search bar, and it’s available now. It includes deeper searches and Personal Context, which uses all the information Google knows about you (and that’s a lot) to make suggestions and customize replies.

The company detailed its new AI Mode for shopping, which has an improved conversational shopping experience, a checkout that monitors for the best pricing, and an updated "try on" interface that lets you upload a photo of yourself rather than modeling clothes on a generic body.

Google plans to launch it soon, though the updated "try on" feature is now available in the US via Search Labs.

Google Beam

Formerly known as Project Starline, Google Beam is the updated version of the company’s 3D videoconferencing, now with AI. It uses a six-camera array to capture you from all angles; AI stitches the views together, tracks your head movements, and streams the result at up to 60 frames per second.

The platform uses a light field display that doesn’t require wearing any special equipment, though that technology tends to be sensitive to off-angle viewing. HP is an old hand in the large-scale scanning business, including 3D scanning, so its hardware partnership with Google isn’t a big surprise.

Flow and other generative creative tools

Google Flow is a new tool that builds on Imagen 4 and Veo 3 to perform tasks like creating AI video clips and stitching them into longer sequences, or extending them, with a single prompt while keeping them consistent from scene to scene. It also provides editing tools like camera controls. It’s available as part of Gemini AI Ultra. 

Imagen 4 image generation is more detailed, with improved tonality and better text and typography. And it’s faster. Meanwhile, Veo 3, also available today, has a better understanding of physics and adds native audio generation: sound effects, background sounds and dialogue.

Of course, all this is available under the AI Pro plan. Google’s SynthID gen AI detection tool is also available today.

Copyright © Verum World Media