
Apple’s Flagship AirPods Pro 3 Could Launch This Year: Here’s What I’d Like to See

Rumors suggest Apple will upgrade its flagship noise-canceling earbuds later this year. Here’s everything I know about the AirPods Pro 3.

With Apple typically updating one of its AirPods models every year and the AirPods Pro due for an upgrade, we’re seeing plenty of chatter that Apple will release the third-generation AirPods Pro sometime in 2025. Last year, we got the all-new AirPods 4 and AirPods 4 with Active Noise Cancellation. However, those of us anticipating the arrival of the AirPods Max 2 were disappointed when Apple only refreshed its premium over-ear headphones with USB-C charging and new colors. So, we can only wait and speculate. Here’s a look at the latest AirPods Pro 3 rumors and what improvements I’d like to see.

Read more: Best wireless earbuds of 2025

Rumored AirPods Pro 3 release date

MacRumors recently spotted a post on X from a tipster called Kosutami, who said Apple was planning to launch the AirPods Pro 3 and AirTag 2 in May or June of this year. That seems unlikely, given that Apple has previously launched next-generation AirPods alongside new iPhones in the fall. One exception was the AirPods Max, which was announced on Dec. 8, 2020, about six weeks after the iPhone 12 was released.

While anything is possible, it seems more likely that the AirPods Pro 3 will arrive alongside the iPhone 17 in September. Bloomberg’s Mark Gurman, who’s usually a more reliable Apple whisperer, has said the AirPods Pro 3 will have a new design and feature heart-rate monitoring like Apple’s new Beats Powerbeats Pro 2 — but they’re still months away from being launched.

New temperature and heart-rate sensors

With the Powerbeats Pro 2 getting the aforementioned heart-rate sensors, it now seems more likely that the AirPods Pro 3 will get them, too. I’m not sure how useful that feature is, especially if you already own an Apple Watch. However, Apple likely has grander plans for its buds’ heart-rate monitoring and is still fleshing everything out, using the Powerbeats Pro 2 as a bit of a guinea pig. 

Several tech outlets picked up on another Gurman report mentioning the possibility of temperature sensors and other physiological measurements coming to the AirPods. Both he and Apple analyst Ming-Chi Kuo have also reported that Apple is working on adding infrared cameras to future AirPods. Kuo’s report suggested the IR cameras could be used for everything from in-air hand-gesture detection to enhancing spatial audio to detecting environmental changes for software, including Apple Intelligence. Don’t expect to see any of that camera tech in the AirPods Pro 3, but maybe the AirPods Pro 4 will get it.

AirPods case with touchscreen display

The rumored feature I find most intriguing is an interactive touch display in the AirPods Pro 3’s charging case that acts as a remote control. The rumor has been kicking around for a while — Apple filed a patent for it back in 2022 — and several AirPods knockoffs with touchscreens have shown up on Amazon in recent months. Also, last year JBL released three new Live 3 earbuds, including the Live Beam 3, as a follow-up to 2023’s Tour Pro 2 earbuds, which featured a color touchscreen in their case. All the new Live 3 models feature a 1.5-inch LED touch display in their charging cases, so the feature has already made its way into competing earbuds.

I don’t know what the odds are that the AirPods Pro 3 will get a charging case with a touchscreen, but some changes to the charging case are likely, with some saying the case may shrink a bit. If nothing else, the physical Bluetooth pairing and reset button should get swapped out for a hidden touch-capacitive “button” like the one found in the AirPods 4’s case.

Given that Apple made two versions of the AirPods 4, I could see it making two versions of the AirPods Pro 3 — a more premium model with some extra features like a touchscreen in the charging case and a step-down version that cuts them out.

Improved AirPods Pro 3 performance with H3 chip

The AirPods Pro 2, AirPods 4 and Powerbeats Pro 2 are all powered by Apple’s H2 chip. Rumor has it that the AirPods Pro 3 could get the new H3 chip, presuming Apple sticks with its current earbuds/headphones chip nomenclature. Adding a more powerful, energy-efficient chip along with tweaks to the design of the buds’ acoustic architecture and microphones could lead to several performance improvements, including enhanced sound quality, upgraded active noise canceling and better voice-calling performance. We could also see slightly better battery life.

I don’t expect a huge jump in performance, but the AirPods Pro 3 could sound a little clearer with better bass definition than their predecessor. Their noise canceling may be more proficient and able to muffle a wider range of frequencies. Also, when it comes to voice calling, the buds will likely do an even better job of picking up your voice while reducing background noise.

The AirPods Pro 2 and Powerbeats Pro 2 support ultralow-latency audio and can even do lossless audio when paired with Apple’s pricey Vision Pro headset. From what I’ve been told, that’s possible because the buds and headset sit only a few inches apart, making for an extremely short wireless connection that can reliably carry a lossless stream. There’s been talk of Apple coming up with a solution to bring lossless audio to next-gen AirPods when paired with your iPhone. I hope the AirPods Pro 3 has a lossless audio option when connected to the latest iPhones, iPads and Macs, but I’m not counting on it.

Live translation feature for AirPods Pro 3 (and maybe AirPods Pro 2 and AirPods 4)

Lately I’ve encountered several no-name Chinese earbuds on Amazon with live translation features, so it wasn’t a huge surprise when Bloomberg recently reported that the AirPods Pro 3 may add live translation via Apple’s Translate app with the release of iOS 19 this fall. The iOS Translate app already has fairly robust translation capabilities, but the report describes how Apple plans to simplify the translation experience by building the feature into its earbuds, as well as improve the Translate app itself. (For those who can’t get past Bloomberg’s paywall, MacRumors has a synopsis of the report.)

Since this is more of an iPhone/iOS 19 feature, with the translation of what you say played through your iPhone’s speakers for others to hear, live translation seems pretty likely to come to the AirPods Pro 2 and AirPods 4, both of which are equipped with Apple’s H2 chip and have plenty of processing power. The Powerbeats Pro 2 might also get the live translation feature.

My AirPods Pro 3 wishlist: Better sound quality and one key feature 

I don’t care too much about some of these rumored extra features, like heart-rate monitoring and temperature sensing. I’m more excited about any enhancements to the buds’ design and performance upgrades across the board. 

I’ve previously written about how I’d like to see all new AirPods get a case that turns into a Bluetooth transmitter to wirelessly stream the audio from inflight entertainment to the buds. A few true wireless earbuds, including the Jabra Elite 8 Active Gen 2 and Elite 10 Gen 2, Bowers & Wilkins Pi7 S2 and Poly Voyager Free 60 Plus, have charging cases that act as Bluetooth transmitters. With the included cable, you simply connect the case to the 3.5mm port in your seat’s console or armrest and you’re good to go. I don’t expect the AirPods Pro 3 will get this feature, but I sure wish they would.

I’d also like to see Apple add a set of extra-large ear tips. I barely get a tight seal with the AirPods Pro 2’s current large tips and could really use an XL tip for my left ear, which is slightly different from my right (I have tested a variety of third-party foam tips). When Apple released the AirPods Pro 2, it added a fourth, extra-small ear tip for those with smaller ears. The challenge in adding a fifth, XL tip is that the charging case would have to accommodate slightly larger ear tips. But since a tight seal is so important for optimizing sound quality and noise-canceling performance, it would behoove Apple to offer an XL option.

Read more: The One Feature I Wish Apple Would Add to All New AirPods

Google Making AI-Powered Glasses With Warby Parker, Gentle Monster

Google revealed its first two partnerships with eyeglass brands, with more to come.

The tech world has rarely been called stylish. But at Google’s annual I/O developer conference on Tuesday, the company took a step into the fashion world — kind of. Google revealed that the first eyeglass brands to carry Android XR AI-powered glasses will be Warby Parker and Gentle Monster, with more brand partners to be revealed in the future. Android XR is Google’s upcoming platform for VR, AR and AI on glasses and headsets.

Yes, there was a Superman joke: Unlike Clark Kent, who hid his superpowers behind nerdy glasses, the Android XR glasses will give you superpowers, the company quipped. That remains to be seen, although NBA star Giannis Antetokounmpo did show up at Google I/O wearing the XR glasses.

Warby Parker, founded in 2010, was originally an online eyeglass retailer that gained fame for its home try-on program, where customers could order five frames sent to their home to try on and then return. It also allowed customers to upload photos to see how they would look wearing different frames.

South Korean eyeglass brand Gentle Monster, founded in 2011, is known for its luxury eyeglasses and sunglasses. The company’s celebrity customers include Beyoncé, Rihanna, Kendrick Lamar and Billie Eilish.

Google I/O Announcements: The Latest AI Upgrades Coming to Gemini, XR and More

From its new Project Aura XR glasses to Chrome’s wants-to-be-more-helpful AI Mode, Gemini Live and the new Flow generative video tool, Google is putting AI everywhere.

As you’d expect, this year’s Google I/O developer’s conference focused almost exclusively on AI — where the company’s Gemini AI platform stands, where it’s going and how much it’s going to cost you now for its new AI Ultra subscription plan (spoiler: $250 per month). Meanwhile, a new Flow app expands the company’s video-generation toolset, and its Android XR glasses make their debut. 

Plus, all AI usage and performance numbers are up! (Given that a new 42.5-exaflop Ironwood Tensor Processing Unit is coming to Google Cloud later this year, they’ll continue to rise.)

Google’s Project Aura, a developer kit for Android XR that includes new AR glasses from Xreal, is the next step in the company’s roadmap toward glasses-based, AI-driven extended reality. CNET’s Scott Stein goes in-depth on that future in an exclusive interview with Shahram Izadi, Google’s VP and GM for Android XR. Meanwhile, the headset-based Project Moohan, developed in conjunction with Samsung, is now available, and Google is working with Samsung to extend the platform beyond headsets.

For a play-by-play of the event, you can read the archive of our live blog.

Google already held a separate event for Android, where it launched Android 16, debuting its new Material 3 Expressive interface, security updates and news on Gemini integration and features.

A lot of the whizzy new AI features are available only via one of its subscription levels. AI Pro is just a rebranding of Google’s $20-per-month Gemini Advanced plan (adding some new features), but Google AI Ultra is a pricier new option — $250 per month, with half off the first three months for the moment — that provides access to the latest, spiffiest and least usage-limited of all its tools and models, as well as a prototype for managing AI agents and the 30 terabytes of storage you’re going to need to store it all. Both are available today.

Google also wants to make your automation sound smarter with Personalized Smart Replies, which makes your generated answers sound more like you and plows through the information on your device to surface relevant details. It’ll be in Gmail this summer for subscribers. Eventually, it’ll be everywhere.

The event also included lots of better models, better coding tools and the other developer-friendly details you expect from a developer conference. Among the announcements was the conversational Gemini Live, formerly part of Project Astra, Google’s interactive, agentic, kitchen-sink voice AI app. (As Managing Editor Patrick Holland says, “Astra is a rehearsal of features that, when they’re ready for the spotlight, get added to Gemini Live.”) And for researchers, NotebookLM incorporates Gemini Live to improve its… everything.

It’s available now in the US. 

Chrome AI Mode

People (that is, those over 18) who pony up for the subscriptions, plus users on the Chrome Beta, Dev and Canary tracks, will be able to try out the company’s expanded Gemini integration with Chrome — summaries, research and agentic chat based on the contents of your screen, somewhat like Gemini Live does on phones (and that, by the way, is available for free on Android and iOS as of today). The Chrome version, though, is more suited to the kinds of things you do at a computer rather than on a phone. (Microsoft already does this with Copilot in its own Edge browser.)

Eventually, Google plans for Gemini in Chrome to be capable of synthesizing information across multiple tabs and supporting voice navigation.

The company is also expanding how you can interact with AI Overviews in Google Search as part of AI Mode, which adds more agentic shopping help. AI Mode lives in a new tab in Search, or on the search bar, and it’s available now. It includes deeper searches and Personal Context — which uses all the information Google knows about you, and that’s a lot — to make suggestions and customize replies.

The company detailed its new AI Mode for shopping, which has an improved conversational shopping experience, a checkout that monitors for the best pricing and an updated “try on” interface that lets you upload a photo of yourself rather than modeling clothes on a generic body.

Google plans to launch it soon, though the updated “try on” feature is now available in the US via Search Labs.

Google Beam

Formerly known as Project Starline, Google Beam is the updated version of the company’s 3D videoconferencing platform, now with AI. It uses a six-camera array to capture you from every angle; AI then stitches the feeds together into a 3D image, tracks your head movements and streams the result at up to 60 frames per second.

The platform uses a light field display that doesn’t require any special equipment to view, though that technology tends to be sensitive to off-angle viewing. HP, which is partnering with Google on Beam, is an old hand in the large-scale scanning biz, including 3D scanning, so the collaboration isn’t a big surprise.

Flow and other generative creative tools

Google Flow is a new tool that builds on Imagen 4 and Veo 3 to create AI video clips from a single prompt, then stitch them into longer sequences or extend them while keeping scenes consistent throughout. It also provides editing tools like camera controls. It’s available as part of Gemini AI Ultra.

Imagen 4 image generation is more detailed, with improved tonality and better text and typography. And it’s faster. Meanwhile, Veo 3, also available today, has a better understanding of physics and offers native audio generation — sound effects, background sounds and dialogue.

Of course, all this is available under the AI Pro plan. Google’s SynthID gen-AI detection tool is also available today.
