
Technologies

Apple AirPods Pro 2 vs. AirPods 3: The Biggest Differences

Active noise cancellation is the biggest benefit you’ll get from buying the AirPods Pro 2 over the AirPods 3.

If you’re trying to decide between Apple’s AirPods 3 and its AirPods Pro 2, the biggest questions are whether you want active noise cancellation in a noise-isolating design or open earbuds that don’t require you to jam silicone ear tips into your ears. Yes, there’s a price difference — the AirPods Pro 2 sell for about $200 online while the AirPods 3 cost about $150. But with only about $50 separating the two AirPods models, it’s probably more important to focus on those key differences rather than dwelling too much on their price. 

Apple has bridged the gap between its Pro and regular AirPods by upgrading the AirPods 3’s design, which now looks more like the Pro’s minus the silicone ear tips, and giving it the same IPX4 splash-proof water resistance rating. And like the AirPods Pro and Pro 2, the AirPods 3 have Apple’s spatial audio with head-tracking feature.

Read more: Best Wireless Earbuds for 2023

But there are still certain benefits you can only get on the $249 AirPods Pro 2, the biggest being active noise cancellation and transparency mode. Multiple ear tip sizes, the ability to swipe up and down to control music volume and ultra wideband support are also exclusive to the Pro 2. But noise cancellation will likely make the biggest impact in everyday use, and it’s the most important factor to consider.

AirPods Pro 2 vs. AirPods 3

AirPods Pro 2 | AirPods 3
Price (US): $249 | $169 (Lightning case)
Price (UK): £249 | £179 (Lightning case)
Price (AU): AU$399 | AU$279 (Lightning case)
Weight (earbuds): 0.19 ounce (5.3 grams) | 0.15 ounce (4.3 grams)
Audio features: Active noise cancellation, adaptive transparency, spatial audio with dynamic head tracking | Spatial audio with dynamic head tracking
Audio technology: Adaptive EQ, custom high-excursion Apple driver, custom high dynamic range amplifier, vent system for pressure equalization | Adaptive EQ, custom high-excursion Apple driver, custom high dynamic range amplifier
Durability: IPX4 sweat and water resistance | IPX4 sweat and water resistance
Charging: MagSafe or Lightning | MagSafe or Lightning (extra $10 for MagSafe case)
Multiple ear tips: Yes | No
Chip: H2 chip (U1 chip in charging case) | H1 chip
Battery life (earbuds): 6 hours of listening time | 6 hours of listening time
Battery life (case): 30 hours of listening time | 30 hours of listening time
Microphones: Dual beamforming microphones; inward-facing microphone | Dual beamforming microphones; inward-facing microphone
Sensors: Skin-detect sensor, motion-detecting accelerometer, speech-detecting accelerometer, touch control | Skin-detect sensor, motion-detecting accelerometer, speech-detecting accelerometer, force sensor
Controls: Hey Siri, touch controls | Hey Siri, force sensor

AirPods Pro 2 vs. AirPods 3: Design and case


The AirPods Pro 2.

David Carnoy/CNET

The biggest difference in terms of design is that the $169 AirPods 3 don’t have the interchangeable silicone tips of the AirPods Pro 2, which come with four sizes to choose from. The AirPods 3 are also lighter than the AirPods Pro 2 at 0.15 ounce (4.3 grams) versus 0.19 ounce (5.3 grams).

The AirPods 3 and new AirPods Pro share some similarities when it comes to design, although it’s very easy to tell them apart. The AirPods Pro 2’s stems, for example, are noticeably shorter than those on the AirPods 3. But both models are sweat and water resistant, which could make them more appealing than the $129 regular AirPods for those who want to wear them during exercise. 


The third-generation AirPods.

David Carnoy/CNET

The case for the AirPods 3 sort of looks like a cross between the case for the standard AirPods and that of the AirPods Pro. It’s much shorter and wider than the entry-level AirPods case, but it’s not as wide as the holster for the AirPods Pro. You can also charge the case for the AirPods 3 or the AirPods Pro via Apple’s wireless MagSafe charger, or by plugging it in with a Lightning cable. But you’ll have to pay an extra $10 to get the MagSafe wireless charging case bundled with the AirPods 3. The MagSafe-compatible case for the AirPods Pro 2 also has a lanyard loop, unlike the AirPods 3’s case. 

The second-generation AirPods Pro’s case also has another capability: ultra wideband support. That essentially means the case has a built-in AirTag for easier location tracking.


AirPods Pro 2 vs. AirPods 3: Audio 


The AirPods Pro 2 have active noise cancellation and transparency mode.

David Carnoy/CNET

You’ll still have to splurge on Apple’s top-of-the-line earbuds to get active noise cancellation and transparency mode. Since the second-generation AirPods Pro have Apple’s new H2 chip, they can cancel up to twice as much noise as the previous AirPods Pro, according to Apple’s claims. Transparency Mode has also gotten an upgrade on the second-generation model. The new chip can reduce loud noises from your surroundings when in Transparency Mode, which should make sounds like a passing vehicle seem less jarring.

That new H2 chip also brings improved audio to the AirPods Pro 2, further distinguishing them from the AirPods 3. As my colleague David Carnoy wrote in his review, the H2’s computational power helps the AirPods Pro process a broader range of frequencies.

You’ll also get swipe controls for managing volume levels on the AirPods Pro 2. The AirPods 3 just have Apple’s force sensors, which you can press to skip ahead, pause music or answer calls. 

But both the AirPods Pro 2 and AirPods 3 have dynamic spatial audio and adaptive EQ. The former is essentially virtual surround sound, while the latter adjusts the sound to your ears.

AirPods Pro 2 vs. AirPods 3: Battery life


The AirPods 3 (pictured) and AirPods Pro 2 offer similar battery life.

David Carnoy/CNET

Battery life is similar for both models, although there are some slight differences. Both earbuds should provide up to 6 hours of listening time, according to Apple’s claims. But you’ll get 5.5 hours when using spatial audio and head tracking on the AirPods Pro 2, while the AirPods 3 offer a slightly shorter 5 hours with that surround sound feature enabled. Apple also claims the AirPods Pro 2 provide 4.5 hours of talk time, while the AirPods 3 offer up to 4 hours.

The case for both earbuds should provide up to 30 hours of listening time, says Apple. But when it comes to talk time, you can expect to get 24 hours from the AirPods Pro 2’s case and 20 hours from the AirPods 3’s case. Five minutes in either case is expected to replenish around one hour of listening or talk time.
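Apple’s quick-charge claim is easy to translate into rough numbers: about 1 hour of listening per 5 minutes in the case, capped at the buds’ 6-hour maximum. Here’s a quick back-of-the-envelope sketch based solely on Apple’s quoted figures, not our own measurements:

```python
# Back-of-the-envelope math from Apple's quoted figures (not measured):
# ~5 minutes in the case restores ~1 hour of listening, and a full
# bud charge tops out at ~6 hours.
BUD_MAX_HOURS = 6
MINUTES_PER_HOUR_RESTORED = 5

def listening_after_quick_charge(minutes_in_case: float) -> float:
    """Approximate hours of listening restored by a short stint in the case."""
    return min(minutes_in_case / MINUTES_PER_HOUR_RESTORED, BUD_MAX_HOURS)

print(listening_after_quick_charge(5))   # 1.0 hour
print(listening_after_quick_charge(15))  # 3.0 hours
```

In other words, a 15-minute coffee break in the case should buy you roughly a 3-hour flight’s worth of listening on either model.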

AirPods Pro 2 vs. AirPods 3: How to choose

The AirPods Pro 2 are for those who want active noise cancellation, better audio and a more customizable fit. You’ll also get some other perks, like the ability to track them down more easily should they get lost, thanks to the U1 chip. The AirPods 3 are a more suitable choice if you don’t care about noise cancellation and prefer earbuds with an open design (and yes, they cost about $50 less, so they do offer some appeal to those on tighter budgets). At the same time, the AirPods 3 still have more to offer than the AirPods 2, which lack features like water resistance, adaptive EQ and spatial audio with head-tracking.


David Carnoy/CNET

Battery life: Rated up to 6 hours | Noise canceling: Yes (ANC) | Multipoint: No | Headphone type: Wireless earbuds | Water-resistant: Yes (IPX4, splash-proof)

The new AirPods Pro (2nd generation) are powered by Apple’s new H2 chip, which delivers more processing power while being more energy efficient, according to Apple. The new chip, combined with new low-distortion drivers, allows for improved sound with better clarity and depth. The noise canceling is also improved; Apple says the new AirPods have “double” the noise canceling of the original AirPods Pro. Additionally, the new AirPods add an extra hour of battery life, up from five to six hours with noise canceling on. Plus, the case has a speaker that emits a sound to help you locate your buds via Find My should they decide to hide from you.

Note that while Apple has discontinued the original AirPods Pro, they’ll remain on sale until supplies are exhausted. However, most people should get this newer model if they can afford it. The AirPods Pro 2 continue to see small discounts, dipping to as low as $223 during Amazon’s Early Access Prime event in October.

Read our Apple AirPods Pro 2 review.


David Carnoy/CNET

Battery life: Rated up to 6 hours | Noise canceling: No | Multipoint: No | Headphone type: Wireless earbuds | Water-resistant: Yes (IPX4, splash-proof)

Take one look at the new design of the third-gen AirPods, and the first thing you’ll probably think is: “Those look like the AirPods Pro without ear tips.” You wouldn’t be wrong. While they’re more fraternal than identical twins, the AirPods 3 are shaped like the AirPods Pro, with the same shorter stems and the same pinch controls as the Pro. Aside from the design change, which should fit most ears better than the second-generation AirPods (though not very small ears), the biggest change is to the sound quality: It’s much improved. Also, battery life is better, and the AirPods 3 are officially water-resistant.

Read our Apple AirPods 3 review.

More headphone recommendations


Watch a Robot Stuff Cash Into a Wallet Just Like You Do

Generalist AI’s Gen-1 model is all about “teaching robots physical common sense.”

In 2026, robots are progressing by leaps and bounds, showing the markedly improved dexterity long needed in the quest for truly useful household helpers. Now a new AI model has arrived to power robots through activities including folding laundry, constructing boxes, fixing other robots and even filling wallets with flimsy paper money.

Earlier this month, California-based company Generalist AI released Gen-1, a new physical AI model that makes robots capable of performing all of these tasks (and more) with success. It’s a big step forward for robots designed for the real world, based on intelligence born from the real world, Pete Florence, co-founder and CEO of Generalist AI, told me.

In most of the example videos published by the company, Gen-1 is seen running on a pair of robotic arms, but that’s not all it’s built for. “Gen-1 is designed to be the brain of any robot, meaning the same model can run on a humanoid, an industrial arm or other robotic systems,” said Florence.

Already, this has proved to be a breakthrough year for general-purpose humanoid robots, with companies including Boston Dynamics and Honor unveiling cutting-edge bots capable of uncannily humanlike movements. The market for robots is expected to explode, with one estimate from Morgan Stanley predicting growth to a $5 trillion market by 2050. Predictions see robots coming for industry, retail, hospitality and care environments before eventually landing in our homes. To get us there, we need to see further advances in AI.

Training robots to live alongside humans

Over the past few years we’ve seen large language models, such as ChatGPT, Gemini and Claude, evolve at lightning speed. The same hasn’t been true of the physical AI models required to power robots, in large part because of a lack of data to train those models on. Robots — and especially humanoid robots — must learn to navigate a world built for humans just as a human would.

Often this data is collected from robots performing tasks while being teleoperated by humans, but not Gen-1. Instead, the dataset used to train Generalist AI’s models has been assembled by humans completing millions of different tasks using wearable technology.

“We built our own lightweight ‘data hands’ and distributed them globally to learn how people actually interact with objects, with all the subtle force feedback, tactile feel, slips, corrections and recoveries that define human dexterity in the real world,” said Florence. “That kind of data is critical for teaching robots physical common sense, the intuitive understanding and ability to adapt in real time rather than execute rigid instructions.”

Generalist AI has released a series of videos showing the model running on robots repetitively performing a range of different tasks, with the most compelling, perhaps, being a robot drawing cash out of a wallet before reinserting it into the same pocket. This is a fiddly task that many humans fumble over. It’s clearly not easy for the robot, either, given the flimsiness of the paper money and the fabric of the wallet — and yet it completes the task.

Another video shows a robot sorting socks by color, folding them in neat piles and counting the number of pairs using a touchscreen. Other tricky tasks the model can complete include unzipping and filling a pencil case with pens, stacking oranges in a neat pyramid and plugging in an Ethernet cable.

These videos show the breadth of Gen-1’s capabilities, but more impressive is the success rate with which it can complete certain tasks. Generalist AI measured the model’s hit rate against the previous version and found Gen-1 could successfully service a robot vacuum cleaner in 99% of cases (up from 50% for Gen-0), fold boxes in 99% of cases (up from 81% for Gen-0) and package up phones in 99% of cases (up from 62% for Gen-0).
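Restated as failure rates, those jumps look even starker. Here’s a quick illustrative calculation using the company’s reported figures (the framing is ours, not Generalist AI’s):

```python
# Reported Gen-0 vs. Gen-1 success rates, per Generalist AI's figures.
rates = {
    "service robot vacuum": (0.50, 0.99),
    "fold boxes": (0.81, 0.99),
    "package phones": (0.62, 0.99),
}

for task, (gen0, gen1) in rates.items():
    # A 99% success rate means failures drop to roughly 1 in 100 attempts,
    # so the interesting number is how much the failure rate shrank.
    reduction = (1 - gen0) / (1 - gen1)
    print(f"{task}: failures cut roughly {reduction:.0f}x")
```

On the vacuum-servicing task, for instance, a move from 50% to 99% success means failures fall from 1 in 2 attempts to about 1 in 100.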

Robots do improv

Most robots are programmed to complete a task in a specific and orderly way. But what happens when a curve ball gets thrown? «The smallest changes in the environment can cause failures,» said Florence.

An important skill robots need, which humans innately possess, is the ability to think on their feet. This is why Gen-1 has been designed with improvisation in mind so it can come up with strategies to complete tasks. Florence gives me an example of a robot using two hands to reposition an awkwardly placed part for an automotive task, even though it has only been trained to use one. 

“This kind of creativity has been largely absent from robotics until now,” he said.

Significant work still needs to be done when it comes to beefing up robots’ improv chops, but early progress shows glimpses of a positive impact on both reliability and speed, says Florence. “We’re beginning to see real progress and are excited to push the boundaries of embodied intelligence.”

After all, there may come a day when you need a robot in your house that can fix all your other smaller robots.  



iPhone 17 Pro Camera Battles the Galaxy S26 Ultra: Let the Fun Begin

They’re both top-end flagship phones, but which one takes better photos? I wanted to find out.

Both Apple’s iPhone 17 Pro and Samsung’s Galaxy S26 Ultra earned coveted CNET Editors’ Choice awards in their full reviews. And they damned well earned them, too, thanks to their stellar overall performance and the wealth of top-end tech on board. Both also garnered praise for their camera quality, with each able to take great-looking photos in a variety of conditions. But which does it better?

As a professional photographer myself, I was keen to find out, so I took them on a series of photo walks around Scotland to put them to the test in the same conditions. 

Before we dive in, a few notes from me. First, all images were captured in JPEG format using the standard camera app on each phone. On some images on the iPhone, Apple’s Gold Photographic Style was activated; on others, it was set to Standard, and I’ll be highlighting which is which. The images have been imported into Adobe Lightroom for comparison purposes and exported at smaller file sizes to better suit online viewing. No edits to the images themselves were made, and no sharpening was applied on the export. 

Read more: These Are the Best Phone Cameras That We’ve Tested

Crucially, though, it’s important to keep in mind that the analysis here is my opinion. Photography is largely subjective, and what might look good to one person might not to another. For me, I love a more natural-looking image with accurate tones that I could then edit further later if I want to. You may like a punchy, vibrant tone straight out of the camera, and that’s fine. You’ll just need to take my results here with a slight pinch of salt. 

All that said, let’s dive in.

This was an image I took with the Gold filter accidentally enabled on the iPhone. So its warmer color tones are to be expected to an extent, but what I liked more here is the depth of shadow that the iPhone has maintained. The S26 Ultra has done a fair bit of processing here to lift those shadows and create a more balanced exposure overall, but I think it’s killed some of the evening drama as a result. I see this in a lot of Android phones, to be fair. 

Taken earlier in the day, there’s much less difference to be seen here. The iPhone’s colors are a bit warmer, thanks to the Gold filter, but they actually look more natural as a result. The shot doesn’t look warm in its white balance; it just has a richness to it, while the S26 Ultra’s shot looks quite cold. 

I switched the iPhone to Standard Photographic Style here, and as a result, the shot it took looks pretty similar to that taken by the Galaxy S26 Ultra. The exposures are pretty much the same, and while the green plants on the steps definitely look more vivid in the Galaxy’s shot, the colors elsewhere are broadly on par. 

If I’m nitpicking — which I really have to when the phones cost this much money — the S26 Ultra appears to have done a neater job rendering the details on the front of the VW Camper’s spare wheel. I also noticed more detail in some of the small twigs on the tree, especially where they’re visible against the sky. Is that a difference you’d ever notice without a side-by-side comparison? Definitely not. But this whole article is basically an exercise in pedantry, so I will continue to pick away at even the tiniest of things in these photos.

I’m back on the Gold Photographic Style with the iPhone here, so again, those warmer tones are to be expected, but I will say again that I much prefer the deeper shadows seen on the house in the Apple phone’s image. It looks much more natural, while the S26 Ultra’s shot looks a bit too HDR and oversaturated for my tastes. But that’s not the most important thing here…

What took me more by surprise was what happened when I put each phone into the ultrawide camera mode. The iPhone’s color tones stay almost exactly the same, but the Galaxy’s image has shifted quite dramatically between the main and ultrawide lenses.

The blue sky has shifted its hue into a much more teal-toned color, and I’m surprised by just how different it looks from the main camera. I usually expect to see these sorts of color shifts on cheaper phones, where there’s less effort put into ensuring consistent colors across the lenses. So I’m a bit disappointed to see Samsung’s phones producing such a noticeable shift here. 

The iPhone 17 Pro also displays a color shift, but it’s far less pronounced than the S26 Ultra’s.

Next, I tried the zoom on both phones. With its 10x optical zoom, the S26 Ultra has a longer reach than the 8x on the iPhone 17 Pro, but in terms of details within those images, there’s honestly nothing to choose between them. Again, the iPhone had the Gold style applied, so it looks warmer, and again, the S26 Ultra has gone further in lightening the shadows. I can’t really say either one is better in this example.

But there’s a much bigger difference in this example. The colors are much richer in the iPhone’s shot, even though the Photographic Style is set to Standard. The S26 Ultra’s shot looks like the phone’s white balance has been tricked by the warm orange tones of the brickwork, and produced a colder-looking image as a result. 

But I also don’t like what the S26 Ultra has done with the details here. It’s oversharpened the scene, giving a weird, crunchy look to the subject that looks extremely unnatural. The iPhone, despite not having the same zoom range on paper, has delivered a much better-looking image, even when viewed at the same scale. 

But here the opposite seems to have happened. The iPhone has looked at this warm, sun-drenched scene and automatically set its white balance to cool it, while the S26 Ultra has maintained those warmer tones. Sure, the greens of the leaves in the S26’s image look almost neon, but the image overall is the nicer of the two in my view. 

The iPhone has done a much better job here of capturing the warmer tones that I loved so much when I took these images. I do think the S26 Ultra has gone too far in its hyper-saturation of the green leaves. Sure, it’s a punchy look, but if I wanted that much saturation, I’d maybe add a bit more back in in the editing stage. I’d much rather have a more natural image as a starting point, so the iPhone takes the win here for me.

There’s so little to pick out between the images here. The greens are a little more vibrant in the S26 Ultra’s shot, but the tones overall in the iPhone’s are a bit more natural. Neither one is a spectacular photo, and honestly, you may as well toss a coin to decide which one is better. 

Switching to the ultrawide lenses on both phones, the S26 Ultra has again gone quite hard on the saturation, delivering a much more vibrant blue sky than it did in its image from the main camera. As before, I’m not a fan of this sort of high-contrast, high-saturation photo. As a result, the iPhone 17 Pro is my preferred shot here.

I think the S26 Ultra’s tendency towards vibrancy has helped here, however, with this shot of spring blossom looking more joyful than the almost drab-looking image from the iPhone. 

And sure, the colors are a little overbaked from the S26 Ultra’s ultrawide image, but it still screams «spring» more than the iPhone’s shot, which again looks pretty dull and lifeless by comparison.

I was thrilled to find these fishermen hanging out in Edinburgh, and I think the iPhone has done the better job of capturing the moment. The Gold Photographic Style hasn’t produced an overly warm image here. It’s more like it applied just the correct white balance, with the S26 Ultra’s shot looking quite cold. It’s especially the case on the pink paintwork on the base of the building, which looks richer and much more true-to-life on the iPhone’s image.

At night, both phones have done a good job of capturing this complex image. The bright moon has been kept under control, and there’s plenty of detail still visible in some of the more shadowy areas. The exposures are also broadly similar (the iPhone’s is a touch brighter), and even when peering up close, there’s not much to choose from in terms of detail. 

It’s a slightly different story here, though. The iPhone’s shot is much brighter, but that results in some detail being lost in the highlights inside the phone booth. The S26 Ultra has retained that highlight detail, though its overall shot is darker. Personally, I prefer the darker version, especially as it’s much more in line with the moody nighttime aesthetic I was going for. 

What I don’t love is how much the S26 Ultra has oversharpened its image. Like the earlier image of the figure sitting on the wall, this image has been digitally sharpened to the point that the details look crunchy, high-contrast and ultimately quite unnatural. Which image would I choose — properly exposed but oversharpened, or natural details with blown-out highlights? Ideally, I’d simply take the photo again on the iPhone and lower the exposure a tad. But between the two images above, I’d probably go for the one shot on the Samsung phone.

iPhone 17 Pro vs. Galaxy S26 Ultra: Which has the better camera?

I always complain that these photo-capturing comparison stories are really close and therefore difficult to make into compelling articles, but this one felt especially close. In some shots, the iPhone’s more natural shadow rendering and less reliance on over-sharpening and other digital processing factors make them look better to my eye. But in other examples — especially the image with the tree trunks surrounded by ivy — the S26 Ultra has done a much better job with its color balancing. 

Overall, Samsung’s phone leans harder into contrast and saturation, which is the same thing we’ve said about Samsung’s phones since the company first started putting cameras in them. Buying a Samsung camera phone has always meant getting more vibrant, punchy images, and that’s exactly the case here. If you want quick shots of your friends and family that look good enough to share straight to your family WhatsApp group, the S26 Ultra will serve you well.

The iPhone 17 Pro tends to be more neutral in its color and contrast adjustments, which typically gives a more natural base for you to then add any extra edits of your own. It’s why Apple’s phones have typically always been the device of choice for more enthusiast or pro photographers and video creators. I count myself among that crowd, and it’s why the iPhone 17 Pro remains my preferred model of the two. But really, these are both excellent phones with superb cameras, and you can’t go far wrong with either.



Verum Messenger Expands Its Capabilities: Verum Finance Card Can Now Be Topped Up via Apple Pay


In its latest update, Verum Messenger takes a major step toward integrating communication and financial services. Users can now enjoy a long-awaited feature — topping up their Verum Finance card directly through Apple Pay.

A New Level of Convenience

The integration with Apple Pay significantly simplifies the top-up process. Users no longer need to go through complex transfer steps or rely on third-party services. Just a few taps — and the funds are instantly credited to the card.

This is especially valuable for those who use Verum Messenger not only for communication but also for managing their finances within the ecosystem.

Finance and Messaging in One App

This update reinforces Verum’s strategy to combine in a single product:

  • secure communication
  • cryptocurrency operations
  • everyday financial tools

Verum Messenger is no longer just a messaging app: it is evolving into a full-fledged fintech platform.

Security and Speed

Apple Pay is known for its high level of security thanks to:

  • biometric authentication
  • payment tokenization
  • no sharing of card details

By integrating these technologies, Verum Messenger ensures that financial operations are not only convenient but also highly secure.

What This Means for Users

The update brings several key benefits:

  • instant card top-ups
  • simplified user experience
  • reduced reliance on third-party payment services
  • deeper integration of finance into everyday communication

Looking Ahead

The addition of Apple Pay is just one step in the evolution of the Verum ecosystem. It’s clear the team is moving toward creating a unified digital environment where users can handle most of their needs — from communication to capital management — within a single app.



Copyright © Verum World Media