Technologies
Want to Get the Most Battery Life From Your Wireless Earbuds? Here’s How
Wireless earbuds have more features than ever, but that comes at a cost: power. Here’s how to maximize your battery life.

The saddest sound coming from your wireless earbuds isn’t Billie Eilish’s song, What Was I Made For? It’s the low-battery warning tone, especially when you’ve just sat down for your commute home. After a day of music, calls and the occasional TikTok video (on your break, ahem), can you reasonably expect to have some juice left for the ride home? The answer should be yes. Or it could be, depending on a few things.
Most active noise-canceling, or ANC, wireless earbud manufacturers publish battery life specs with varying degrees of specificity, ranging from basic playback time with and without ANC to a virtual dissertation on battery life using different codecs and features. Some also provide talk-time specifications, but none offer guidance on what to expect from mixed-use scenarios or using all the latest advanced features. So which of those features is depleting your battery the most?
Don’t miss any of our unbiased tech content and lab-based reviews. Add CNET as a preferred Google source.
Real-world battery life
Most common battery drainers:
- ANC and transparency modes
- Listening volume
- Hi-res audio streaming
Less common but potential battery drainers:
- Distance between phone and earbuds
- Spatial audio processing/head tracking
The average mid- to top-tier earbuds, such as the Apple AirPods Pro, are rated for 8 hours of ANC-powered audio streaming, with some models from Samsung and Bose claiming far less (4 to 5 hours). Talk time with ANC for the AirPods is also less, around 5 to 6 hours. That means if you’re doing a combination of media streaming and phone calls, with many models, you’re looking at about 5 to 7 hours of battery life at best per charge while ANC is on.
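The mixed-use estimate above follows from some quick arithmetic. This minimal Python sketch assumes battery drain is linear over time and uses spec figures like the ones mentioned (8 hours of streaming and roughly 5.5 hours of talk, both with ANC on); the 30% call fraction is an illustrative assumption, not a published spec:

```python
# Rough mixed-use battery estimate from published specs.
# Assumes linear drain: each hour of an activity uses 1/spec of the battery.

def mixed_use_hours(streaming_spec, talk_spec, talk_fraction):
    """Estimated total hours when talk_fraction of use is calls
    and the rest is media streaming (both with ANC on)."""
    # Weighted average drain rate, as a fraction of the battery per hour
    drain_per_hour = talk_fraction / talk_spec + (1 - talk_fraction) / streaming_spec
    return 1 / drain_per_hour

# AirPods Pro-style specs: 8 hours streaming, 5.5 hours talk
print(round(mixed_use_hours(8, 5.5, 0.3), 1))  # prints 7.0
```

With about a third of your listening time spent on calls, the model lands at roughly 7 hours, consistent with the 5-to-7-hour range above; more calls pushes the result toward the lower talk-time spec.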
The most common battery-draining features are ANC and transparency modes, as well as listening volume and high-resolution audio streaming. These are the places to start squeezing some extra life out of the batteries.
Turn off ANC or transparency, and you’ll see a noticeable increase in listening/talk time. For example, if a spec is 8 hours with ANC on, it’s usually 10 to 12 hours with it off. Other strategies include keeping your volume below 40% (for both battery life and your hearing), and if your earbuds support multiple codecs, choosing a lower-quality one can help. This may not be possible with iPhones, which have more limited codec options compared to Android devices.
There’s also an ever-expanding universe of other power-hungry features out there, including pulse monitoring, fitness tracking, spatial audio, live translation, sound equalization (EQ) and Find My (or the equivalent). These can add up to a death by a thousand cuts for your earbuds’ battery life. You’ll need to take a little tour through various settings to manage all of these, but if you’re running low on juice, you can turn some of them off to avoid being banished to the land of silence.
TL;DR: Turn the volume down, turn off any features you don’t absolutely need and keep your phone nearby.
Fast charging to the rescue
If conserving power by compromising on features isn’t your style, your other option is to take advantage of fast charging via your earbuds’ charging case. Most earbuds offer an extra hour or two from just 3 to 15 minutes of charge time. Hopefully, this will help you finish out your commute or long flight.
Earbud cases generally give you two to four full charges, but what if the case’s battery is low, too? If that happens to you regularly, and assuming wall charging or connecting to your laptop isn’t feasible, consider a small portable power bank. We have recommendations for Android and Apple products. For your next pair of earbuds, some models have cases with enough capacity to charge your phone as well. We like the Anker Soundcore P41i, which has a 3,000-mAh battery in its case.
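To see why a typical case yields only a handful of full recharges while a power-bank-style case like the P41i's can do far more, here's a rough sketch. The capacities and the 80% conversion efficiency are illustrative assumptions (earbud batteries are tiny, often around 50 to 60 mAh each), not published specs for any particular model:

```python
# Rough estimate of how many full earbud recharges a case can provide.
# All capacities below are illustrative assumptions, not published specs;
# efficiency approximates conversion losses while charging.

def case_recharges(case_mah, bud_mah, efficiency=0.8):
    """Approximate number of full recharges for a pair of earbuds."""
    pair_mah = 2 * bud_mah  # two earbuds per charge cycle
    return int(case_mah * efficiency / pair_mah)

print(case_recharges(500, 60))   # typical case: prints 3
print(case_recharges(3000, 60))  # power-bank-style case: prints 20
```

A hypothetical 500-mAh case lands in the familiar two-to-four-recharge range, while a 3,000-mAh case has capacity to spare for topping up a phone.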
One last note: To preserve the long-term health of your earbuds’ battery, avoid letting them reach 0% before recharging. Keeping the charge level between 20% and 80% is generally considered best practice for battery longevity.
Adobe Firefly’s New AI Editing Tools Are a Step Toward More Precise AI Video
In an exclusive interview, Adobe shares how the company is building Firefly to be your forever partner for AI creation.
Anyone who’s used an AI image or video generator knows that these tools are often the opposite of precise. If you’re using an AI generator with a specific idea in mind, you’ll likely need to do a lot of work to bring that vision to life.
Adobe is convinced that it can make its AI hub, Firefly, a place where AI creation is customizable and precise, an aim behind the new AI video editing tools the company released on Tuesday.
Over the course of 2025, Adobe has quietly emerged as one of the best places to use generative AI tools. Firefly subscriptions start at $10 a month, making it an affordable program that provides integration with top models from Google, OpenAI, Runway, Luma and several other leading AI companies. It’s expanding its roster with Topaz Labs’ Astra (available in Firefly Boards) and Flux 2.1 from Black Forest Labs, available in Firefly and Photoshop desktop.
The partnerships are helping to make Firefly an all-in-one hub for creators to leverage AI, said Steve Newcomb, vice president of product for Firefly, in an exclusive interview. Just as Photoshop is the “career partner” of photographers, Firefly aims to become a partner for AI video and image creators.
“If you’re a photographer, [Photoshop] has everything that you could ever want. It has all the tooling, all the plugins, in one spot with one subscription. You don’t have to subscribe to 25 different photo things,” Newcomb said. “So for us, Firefly, our philosophy is, how do we be that home?”
One way is through partnerships with AI companies, similar to Photoshop plug-ins. Precise editing tools are another, he said.
That’s why Adobe is trying to make it easier to edit AI-generated content. Hallucinations are common in AI-generated images and videos, such as disappearing and reappearing objects, weird blurs and other inaccuracies. For professional creators who use Adobe, the inability to edit out hallucinations makes AI almost unusable for final projects.
In my own testing, I’ve often found that editing tools are basic, at best. At worst, they’re entirely absent, particularly for newer AI video technologies. Firefly’s new prompt-based editing for AI videos, announced on Tuesday, is a way to get that hands-on control.
If you’ve edited images in Firefly via prompting, the video setup will feel familiar. Even if you haven’t, prompt-based editing is essentially a fancy term for asking AI to modify things as you would when talking with a chatbot. Google’s Nano Banana Pro in Gemini is one example of an AI tool that allows you to edit through prompts.
Firefly’s video prompt editing has the added bonus of allowing you to switch between models for edits: You can generate with Firefly and edit with Runway’s Aleph, for example.
As with any AI chatbot or tool, prompt-based editing isn’t always accurate. But it’s a nice option that doesn’t require leaving Firefly for Premiere Pro.
The plan is to go beyond just prompt-based editing, Newcomb said. More AI-based precision editing tools for Firefly will be important, allowing you to make even more minute changes. What makes it possible is something called layer-based editing, a behind-the-scenes technology that enables easier, detailed changes in AI-generated images and videos.
Adobe plans to implement layer-based editing down the road, and it will likely form the foundation for future AI video editing tools. The goal is to make it easier to stay working in Firefly “until the last mile” of editing, Newcomb said.
“We can run the gamut of the precision continuum all the way to the end, and just think of prompting as being one of many tools. But it is absolutely not the only tool,” said Newcomb.
For now, there is another piece of video editing news that could help you build more precise AI videos.
AI video editing without Premiere Pro expertise
Adobe is also bringing its full AI video editor into beta on Tuesday, the next step toward making editable and, therefore, usable AI video.
Debuted at the company’s annual Max conference in October, the video editor is now launching in a public beta. It sits between basic video editors and the feature-stuffed Premiere Pro. It’ll be great for AI enthusiasts who want more editing firepower than they get with OpenAI or Google, without needing expertise in Premiere Pro.
The video editor is meant to help you put all the pieces of your project together in one place. It has a multitrack timeline for you to compile all your clips and audio tracks. That’s especially important because, while you can create your own AI speech and soundtracks, Firefly AI videos don’t natively generate with sound. (You can use Veo 3 or Sora 2 in Firefly to generate those initial clips with audio, though.) You can also export in a variety of aspect ratios.
“Think of the video editor as being one of our cornerstone releases that is helping us move toward being one place, one home, where you can have one subscription and get to every model you ever needed to get the job done,” Newcomb said.
AI Slop for Christmas: Why McDonald’s and Coca-Cola’s AI Holiday Ads Missed the Mark
Commentary: Two billion-dollar companies using AI for holiday ads isn’t giving me that holly jolly feeling.
I am completely exhausted by huge corporations like McDonald’s and Coca-Cola choosing to rely so heavily on AI for their holiday ads. McDonald’s made $25.9 billion in revenue in 2024, and Coca-Cola made $47.1 billion. Do these companies expect us to be OK with AI slop garbage when they could’ve spent a tiny fraction of that to hire a real animator or videographer?
In case you haven’t been inundated with these AI commercials, I’ll back up a bit. Both McDonald’s and Coca-Cola have launched holiday-themed commercials that are undeniably made with AI. Each bragged about its use of AI, a choice both companies have probably come to regret. The two ads are very different, showing the full range of what’s possible with AI in advertising. But the backlash against both proves we don’t have the appetite for AI slop.
The McDonald’s commercial features a series of holiday-themed mishaps, set to a parody of the song It’s the Most Wonderful Time of the Year, about how it’s actually the most terrible time of the year. The spot is just 30 seconds long and was intended only for the Netherlands, but it has already garnered so much hate online that the company removed the video from its pages. The marketing agency behind the spot, The Sweetshop Film, still has the video up on its website.
The McDonald’s ad is very clearly AI, with short clips stitched together with a bunch of hard jump cuts. The text is barely legible, fine details are off and it just has that AI look I’ve come to quickly recognize as an AI reporter. In a now-deleted social media post, the marketing agency’s CEO talked about how it used various AI tools to create it. By contrast, the Coca-Cola commercial is a little more put-together. A Coca-Cola truck drives through a wintry landscape and into a snowy town, and forest animals awaken to follow the truck and its soda bottle contents to a lit Christmas tree in a town square. But even this video has clearly AI-generated elements.
While disappointed, I wasn’t surprised when I saw the ad and the resulting backlash. There has been a surge in creative generative AI tools, especially in the past year, with numerous AI tools built specifically for marketers. They promise to help create content, automate workflows and analyze data. A huge proportion (94%) of marketers have a dedicated AI budget, and three-quarters of them expect that budget to grow, according to Canva’s 2025 Marketing and AI report. That’s partly why we’ve seen a massive increase in AI-generated content in our social media feeds. It’s no wonder Merriam-Webster selected ‘slop’ as its word of the year.
McDonald’s and Coca-Cola’s feel-good, festive commercials manage to hit upon every single controversial issue in AI, which is why they’re inspiring such strong reactions from viewers. AI content is becoming — has already become — normalized. We can’t escape chatbots online and AI slop in our feeds. McDonald’s and Coca-Cola’s use of AI is yet another sign that companies are plowing ahead with AI without truly considering how we’ll react. Like advertisements, AI is inescapable.
If AI in advertising is here to stay, it’s worth breaking down how it’s used and where we, as media consumers, don’t want to see it used.
Spotting the AI in Coca-Cola’s ad
McDonald’s now-removed ad was clearly AI, with its plastic-y people and jerky motions. Its format, a series of short clips stitched together with hard jump cuts, is another telltale sign since most AI video generators can only generate clips up to 10 or so seconds long. Coca-Cola’s ad was a little different, but the AI use was just as obvious.
The Holidays Are Coming ad is a remake of Coca-Cola’s popular 1995 ad. In a behind-the-scenes video, Coca-Cola breaks down how it was created. It’s obvious where AI was used to create the animals. But I’m not sure I believe the company went “pixel by pixel” to create its fuzzy friends.
Coca-Cola’s AI animals don’t look realistic; they look like AI. Their fur has some detail, but those finer elements aren’t as defined as they could be. They also aren’t consistent across the animal’s body. You can see the fur gets less detailed further back on the animal. That kind of detailed work is something AI video generators struggle with, but it’s something a (human) animator likely would’ve caught and corrected.
The animals make overexaggerated surprised faces when the truck drives past them, their mouths forming perfect circles. That’s another sign of AI. You can see in the behind-the-scenes video that someone clicks through different AI variations of a sea lion’s nose, which is a common feature of AI programs. There’s also a glimpse of a feature that looks an awful lot like Photoshop’s generative fill, and Google’s Veo video generator is clearly used at least once.
The company has been all-in on AI for a while, starting with a 2023 partnership with OpenAI. Even Coca-Cola’s advertising agency, Publicis Groupe, bragged about snatching Coca-Cola’s business with an AI-first strategy. It seems clear that the company won’t be swayed by its customers’ aversion to AI. (Disclosure: Ziff Davis, CNET’s parent company, in April filed a lawsuit against OpenAI, alleging it infringed Ziff Davis copyrights in training and operating its AI systems.)
All I want for Christmas is AI labels
There is exactly one thing Coca-Cola got right, and that’s the AI disclosure at the beginning of the video. It’s one thing to use AI in your content creation; it’s entirely another to lie about it. Labels are one of the best tools we have to help everyone who encounters a piece of content decipher whether it’s real or AI. Many social media apps let you simply toggle a setting before you post.
It’s so easy to be clear, yet so many brands and creators don’t disclose their AI use because they’re afraid of getting hate for it. If you don’t want to get hate for using AI, don’t use it! But letting people sit and debate about whether you did or didn’t is a waste of everyone’s time. The fact that AI-generated content is becoming indistinguishable from real photos and videos is exactly why we need to be clear when it’s used.
It’s our collective responsibility as a society to be transparent with how we’re using AI. Social media platforms try to flag AI-generated content, but those systems aren’t perfect. We should appreciate that Coca-Cola didn’t lie to us about this AI-generated content. It’s a very, very low bar, but many others don’t pass it. (I’m looking at you, Mariah Carey and Sephora. Did you use AI? Just tell us.)
AI in advertising
In June, Vogue readers were incensed when the US magazine ran a Guess ad featuring an AI-generated model. Models at the time spoke out about how AI was making it harder to get work on campaigns. Eagle-eyed fans caught J.Crew using “AI photography” a month later. Toys R Us made headlines last year when it ran a weird ad with an AI giraffe, though it did share that it was made with an early version of OpenAI’s Sora.
Something that really stung about the use of AI by Guess and J.Crew is how obvious it was that AI was used in place of real models and photographers. While Coca-Cola and Toys R Us’s use of AI was equally clear, the AI animals didn’t hit quite the same. As the Toys R Us president put it, “We weren’t going to hire a giraffe.” Points for honesty?
Even so, it’s more than likely that real humans lost out on jobs in the creation of these AI ads. Both commercials could’ve been created, and probably improved, if they had used animators, designers and illustrators. Job loss due to AI worries Americans, and people working in creative industries are certainly at risk. It’s not because AI image and video generators are ready to wholly replace workers. It’s because, for businesses, AI’s allure of cutting-edge efficiency offers executives an easy rationale. It’s exactly what just happened at Amazon as it laid off thousands of workers.
It’s easy to look at Coca-Cola’s and McDonald’s AI holiday ads and brush them off as another tone-deaf corporate blunder, especially when there are so many other things to worry about. But in our strange new AI reality, it’s important to highlight the quiet moments that normalize this consequential, controversial technology just as much as the breakthrough moments.
So this holiday season, I think I’ll drink a Pepsi-owned Poppi cranberry fizz soda instead of a Coke Zero.
I’m Happy the 2026 Moto G Power Is $300 but Bummed It Lacks Wireless Charging
Motorola switches up some of the features on its lower-cost phone while adding an improved selfie camera and a larger battery.
Motorola added several notable improvements to the $300 Moto G Power that the 2025 model lacked. But the company did so at the expense of one of the prior phone’s best features: wireless charging.
I’m bummed about this news. I liked the 2025 Moto G Power because I could recharge it with a cable or wirelessly and was delighted to find such flexibility on such an affordable phone. But perhaps one reason Motorola removed the feature is that it upgraded the new Moto G Power’s battery to a 5,200-mAh capacity. That’s 200 mAh more than the 2025 Moto G Power. Maybe Motorola is thinking you won’t need to recharge the phone as frequently. The Moto G (2026) and Moto G Play (2026) also got larger batteries.
Despite lacking wireless charging, the phone supports a respectable 30W wired charging speed.
The new Moto G Power runs on the MediaTek Dimensity 6300 chip and 8GB of RAM, the same processor and memory that the 2025 edition had. The 2026 Moto G Power has RAM Boost to help it run more apps and tasks. We’ll have to test it to see if there’s a noticeable difference between the new Moto G Power’s performance and its cheaper 2026 siblings, the Moto G Play and Moto G. The phone runs Android 16, and has Google’s Circle to Search along with the Gemini AI assistant.
Many of the 2026 Moto G Power’s other features carry over from the 2025 edition, which is largely a good thing. It has IP68 and IP69 ratings for dust and water resistance, so it should handle a spilled drink, or even a brief dunk, with no problem. It has a 6.8-inch 1,080-pixel resolution display and a pair of lenses on the back: a 50-megapixel wide-angle camera and an 8-megapixel ultrawide. The Moto G Power also gets a new 32-megapixel front-facing camera, which should improve photos and video calls over the 2025 model’s 16-megapixel selfie camera.
The Moto G Power comes in white (Pantone pure cashmere) and dark blue (Pantone evening blue) and will go on sale Jan. 8 in the US, with initial availability at Motorola’s website, Best Buy and Amazon. It will also be available at wireless carriers over the coming months.