

Galaxy S24 Ultra: What’s Changed From the S23 Ultra (and Should You Upgrade)

Here’s what’s new in the Samsung Galaxy S24 Ultra and whether it’s worth swapping out last year’s premium S23 Ultra.

Samsung unveiled its newest Galaxy S24 phones at its Unpacked event. The most premium of the lot is the Galaxy S24 Ultra, which has several new features over last year’s model. What’s changed from the S23 Ultra, and should you upgrade? Probably not, but let me explain. 

First, here’s what’s changed. The S24 Ultra offers a modest set of upgrades over its predecessor (mainly under the hood) but is pricier, starting at $1,300, while the S23 Ultra had a $1,200 price tag at launch (and can likely be found for less now).

Read more: Best Android Phone of 2024

The S24 Ultra looks nearly identical to its predecessor, with broadly the same design and rear camera layout, but there are subtle differences. For instance, Samsung’s new high-end phone has a titanium frame, which should be tougher than the aluminum frame on the S23 Ultra.

For the S24 Ultra, Samsung dispensed with the curved edges of the display found on the S23 Ultra, leaving a flat front (Samsung claims it has 47% less bezel on the sides). That new screen is also brighter, with a maximum of 2,600 nits (the S23 Ultra maxed out at 1,750 nits), and thus easier to see in bright daylight. The S24 Ultra’s screen is also made of Corning’s newest and toughest material, Gorilla Glass Armor, while the S23 Ultra has Corning’s Gorilla Glass Victus 2.

In short: it looks the same but should be tougher, with a brighter screen. The S24 Ultra also has an improved cooling system with a vapor chamber that’s twice as large and two added layers of thermal insulation, so we’d expect it to maintain better framerates and temperatures when running performance-intensive operations, like gaming.

As for cameras, the S24 Ultra inherits most of its predecessor’s lenses and sensors but swaps in a 50-megapixel 5x optical telephoto camera for the 10-megapixel 10x optical camera in the S23 Ultra. While the new camera has a shorter optical reach, its higher megapixel count should mean photos taken with it are sharper than those shot with the S23 Ultra’s 10x optical camera. (We’ll know for sure when we can compare both phones’ photo capabilities side by side.) To reach 10x, the S24 Ultra shoots with its 5x optical camera and crops in on the sensor.
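Some back-of-the-envelope math shows why the crop approach need not mean a downgrade: cropping from 5x to 10x halves the field of view in each dimension, leaving a quarter of the sensor’s pixels. The sketch below is a simplified illustration (a pure center-crop model, not Samsung’s actual image pipeline):

```python
# Simple sensor-crop model: pixel count falls with the square of the
# zoom ratio, because each zoom doubling halves the field of view in
# both width and height. Illustrative only, not Samsung's real pipeline.

def cropped_megapixels(native_mp: float, native_zoom: float, target_zoom: float) -> float:
    """Effective megapixels left after cropping from native_zoom to target_zoom."""
    ratio = target_zoom / native_zoom
    return native_mp / (ratio ** 2)

# S24 Ultra: 50MP sensor behind a 5x lens, cropped to reach 10x
print(cropped_megapixels(50, 5, 10))  # 12.5
```

By this simple model, a 10x crop from the new 50-megapixel camera still yields about 12.5 megapixels, more than the native 10 megapixels the S23 Ultra’s dedicated 10x camera captures, though real-world sharpness also depends on optics and processing.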

Read more: Samsung Galaxy S24 Phones Have a New Zoom Trick to Get That Close-Up Photo

The S24 Ultra is powered by a Snapdragon 8 Gen 3 chipset, which is faster than the Snapdragon 8 Gen 2 in the S23 Ultra. Configurations have also been simplified in the new premium phone: you can only get 12GB of RAM and 256GB, 512GB or 1TB of storage (no 8GB RAM option, as there was in the S23 Ultra). The rest of the hardware remains relatively unchanged, although the S24 Ultra should get a minor battery efficiency boost thanks to its newer chipset.

The new Circle to Search feature being shown on the Galaxy S24 Ultra

The big difference between the new phones, at least at the launch of the S24 Ultra, is the new phone’s generative AI capabilities, called Galaxy AI. The most noteworthy is Circle to Search, which lets you trace an area on your screen with your stylus or finger to have your phone look up what you’ve circled without ever having to leave the app you’re in. It’s conceivably great for looking things up, like identifying landmarks in the travel photos your friend sends over or trying to track down fashion items in someone’s outfit.

The S24 Ultra (and the rest of the S24 lineup, to be clear) has other generative AI capabilities, like summarizing notes and live translations during phone calls. It can also suggest different tones for text messages depending on who you’re talking to, recommending more formal tones for bosses or casual tones for friends. Generative AI photo tricks let users do things like move or delete some elements, expand photos beyond their original boundaries or correct the tilt of an image, then use AI to fill in the backgrounds and empty areas.

Some of these generative AI features can be processed locally, like live translation, while others require sending requests to the cloud. The S24 Ultra does have a toggle in its settings to require generative AI requests to be performed on the device, which helps keep your requests (and your phone’s responses) private.

The kicker? Galaxy AI is coming to some older Samsung phones, including the S23 series, at some point in the future. While Samsung hasn’t said when to expect them, these features should come to the S23 Ultra in time.

There are other non-generative AI upgrades the S24 Ultra has over its predecessor, like making photos more stable during movement and improving low-light photography. The S Pen accessory is more or less unchanged from last year. 

There is one more upgrade that’s worth mentioning: Samsung expanded how long it’s pledging to support its newest phones. The S24 Ultra comes with Android 14 and will get seven years of Android software and security patches, up from four years in the S23 Ultra (which comes with Android 13). That’s big. For sustainability, the S24 Ultra has more recycled parts, including cobalt in its battery and rare earth elements in its speakers.


Should S23 Ultra owners upgrade to the new S24 Ultra?

You can count the hardware improvements on one hand, and they don’t meaningfully change how owners use the S24 Ultra compared with last year’s S23 Ultra. Ultimately, if you own Samsung’s premium phone from 2023, the only reason to consider upgrading to the new one is to access generative AI today, or if you want a phone that could conceivably last you seven years.

As previously mentioned, all the Galaxy AI features are expected to come to the S23 Ultra at some point in the future. It’s unclear when that will happen, and though Samsung said all of the S24 Ultra’s generative AI features will come to its predecessor, we’re skeptical about whether last year’s premium phone can pull that off.

Last October, Qualcomm unveiled its Snapdragon 8 Gen 3 as the first smartphone chipset designed to run generative AI on its silicon. That’s the chip powering the S24 Ultra, and presumably the new phone needs that silicon to process some of its generative AI features. Either the S23 Ultra will have to run more of those Galaxy AI features through the cloud, or the Snapdragon 8 Gen 2 powering the S23 Ultra is actually capable of running on-device generative AI, but those features weren’t ready when that chipset launched in late 2022.

Whatever the case, how much you want generative AI weighs more heavily than any other factor in deciding whether it’s worth upgrading from the S23 Ultra to the new S24 Ultra. It’s worth noting that we haven’t gotten to thoroughly experience Galaxy AI ourselves and can’t make a summary judgment of its usefulness until we do. Once our full review comes out, we’ll be better positioned to say whether the S24 Ultra provides a superior experience to its predecessor. For now, we recommend waiting, unless you want to be on the absolute cutting edge of mobile technology.

Samsung’s Galaxy S24 Ultra Now Has a Titanium Design


Samsung Galaxy S24 Ultra specs vs. Samsung Galaxy S23 Ultra

Spec | Samsung Galaxy S24 Ultra | Samsung Galaxy S23 Ultra
Display size, tech, resolution, refresh rate | 6.8-inch AMOLED; 3,120×1,440 pixels; 1-120Hz adaptive refresh rate | 6.8-inch AMOLED; 3,088×1,440 pixels; 120Hz adaptive refresh rate
Pixel density | 501 ppi | 500 ppi
Dimensions (inches) | 6.40 x 3.11 x 0.34 in | 6.43 x 3.07 x 0.35 in
Dimensions (millimeters) | 163 x 79 x 8.6 mm | 163.3 x 78 x 8.9 mm
Weight (grams, ounces) | 233 g (8.22 oz) | 234 g (8.25 oz)
Mobile software | Android 14 | Android 13
Camera | 200-megapixel (wide), 12-megapixel (ultrawide), 10-megapixel (3x telephoto), 50-megapixel (5x telephoto) | 200-megapixel (wide), 12-megapixel (ultrawide), 10-megapixel (3x telephoto), 10-megapixel (10x telephoto)
Front-facing camera | 12-megapixel | 12-megapixel
Video capture | 8K | 8K
Processor | Qualcomm Snapdragon 8 Gen 3 | Qualcomm Snapdragon 8 Gen 2 for Galaxy
RAM/storage | 12GB RAM + 256GB, 512GB, 1TB | 8GB RAM + 256GB; 12GB RAM + 256GB, 512GB, 1TB
Expandable storage | None | None
Battery | 5,000 mAh | 5,000 mAh
Fingerprint sensor | Under display | Under display
Connector | USB-C | USB-C
Headphone jack | None | None
Special features | Titanium frame; 2,600-nit peak brightness; 7 years of OS and security updates; 5G (mmWave); IP68 water resistance; wireless PowerShare to charge other devices; integrated S Pen; UWB for finding other devices; 45W wired charging (charger not included); Galaxy AI; Wi-Fi 7; Gorilla Glass Armor cover glass | 4 years of OS updates; 5G (Sub6, mmWave); IP68 water resistance; wireless PowerShare to charge other devices; integrated S Pen; 100x Space Zoom; 10x optical zoom; UWB for finding other devices; 45W wired charging
US price starts at | $1,300 (256GB) | $1,200 (256GB)
UK price starts at | £1,249 (256GB) | £1,249 (256GB)
Australia price starts at | AU$2,199 (256GB) | AU$1,949 (256GB)

Editors’ note: CNET is using an AI engine to help create some stories. For more, see this post.


The Ultimate AI Wearable Is a Piece of Tech You Already Own

Commentary: Tech companies are trying to give us dedicated AI devices. There’s no need — we all have them already.

In some quarters, the rise of AI has sparked the urge to invent all-new devices: products built around that technology that look and function differently from anything we’ve owned before.

These range from head-mounted XR devices, such as headsets and glasses, to pins, necklaces, phone accessories and whatever mystery product former Apple designer Jony Ive and OpenAI are developing in secret.

But what if, in pursuit of these new devices, we overlook the fact that the ultimate AI form factor is something we all already own? It could even be that the best way to deploy AI is through tech that dates back to the 19th century. 

I’m talking about headphones.

There hasn’t been a lack of evolution in personal audio over the years, but integrating AI into headphones is giving them a new lease on life, says Dino Bekis, vice president of wearables at chipmaker Qualcomm. We’re starting to see this with devices like Apple’s new AirPods Pro 3.




The impact of AI on headphones will be twofold, says Bekis. First, it will build on improvements we’ve already seen, such as the ability to easily switch among active noise cancellation, transparency and other listening modes. 

Instead of that being something we need to control manually, the headphones themselves will increasingly handle it all dynamically. Sensors on board, layered with AI, become more adept at reading and understanding our immediate surroundings.

Bekis says your headphones could, for example, alert you to someone trying to get your attention by recognizing your name being called, even if you’re listening to music with ANC enabled. If you’re on a call while walking along a busy street, they could alert you to traffic dangers, sirens or someone walking close behind you.

But where he really sees AI headphones coming into their own is in the interactions you’ll have with AI agents. These personal assistant-like versions of artificial intelligence will operate autonomously with our devices and services on our behalf.

There’s no more “natural way” than conversation to interact with them, he says, and the high-quality mics and speakers in your headphones will allow for clear and effective communication.

“Earbuds or headphones are really yesterday’s technology that’s suddenly been reinvented and is becoming the primary way we’re going to be interfacing with agents moving forward,” says Bekis.

Headphone-makers, meet AI

Not all headphones are on the verge of transforming into wearable AI assistants, and the situation is not the same across the board. Many legacy headphone companies are “entrenched in their core focus of audio quality and audiophile capability,” says Bekis.

At the same time, Bekis says Harman-owned high-end audio brand Mark Levinson is one headphone maker Qualcomm is working with on integrating AI into its products. And smartphone manufacturers who also have audio products in their lineup are at the forefront of the charge.

You only need to look at the new capabilities that Samsung, Google and Apple have bolstered their headphones with over the past few years. In addition to adaptive audio, the companies are starting to add AI-specific features. Google’s Pixel Buds 2 are engineered not just as an audio device but as hardware with the company’s Gemini AI assistant at the core (you could say «Hey, Google» to activate Gemini and ask it to summarize your emails, for example).

In September, Apple introduced AI-powered live translation with the AirPods Pro 3. The AirPods will parse what someone is saying to you and play it in your chosen language in your ear. They will also pick up your speech and translate it so that you can show the other person a transcript in their language on your phone screen. 

Apple also seems to be searching for ways to further tap the AI potential of its headphones range. A report from Bloomberg earlier this month suggested that the company might introduce AI-powered infrared cameras with the next version of the AirPods Pro, which could be activated by and respond to gestures.

It’s clear that smartphone-makers can see the potential in headphones to be more than just audio products, in the same way they once recognized that the phone could be more than simply a device for making calls. They might even turn headphones and earbuds into what I think could be the ultimate AI wearable.

Why headphones?

The biggest argument for headphones over other emerging AI-focused wearable tech is their popularity: Who doesn’t own at least one pair? (My feeling is that everyone should own at least three different styles, each with its own strengths.) It’s just not the same with glasses or watches.

Yes, glasses and watches are common and familiar, but if you don’t already wear them regularly, the addition of AI is unlikely to persuade you. Glasses, in particular, have drawbacks, including battery life, the difficulty of combining the tech with prescription lenses, and privacy concerns over built-in cameras.

After well over a decade of effort, tech companies are also still struggling to make smart glasses as sleek and comfortable to wear as their non-smart counterparts (the Meta Ray-Bans perhaps being the one exception to the rule here). 

Smartwatches and fitness bands, meanwhile, have become more comfortable, but many people still find them cumbersome for sleeping. The sensors in them are too far away from our faces, where we receive the majority of our sensory inputs, to comprehend the world around us with forensic detail. They cannot relay sensory feedback to us without us having to look at a screen. The same is true for rings and other smart jewelry.

There are no devices that rival headphones, and earbuds in particular, for sheer proximity to a major sensory organ capable of both inputting and outputting complex sensory data. They have been and remain discreet, easy to take on and off, and not overly power hungry or demanding when it comes to charging frequency. 

«Critically, there’s the social acceptance level of this as well, where, ultimately, headphones have become incredibly commonplace,» says CCS Insight Analyst Leo Gebbie. 

They don’t insert a noticeable barrier between you and the world you’re experiencing. Plus, even when they’re obvious, they don’t tend to put people on edge over concerns you could be capturing their image, and you don’t need to learn how to use them, Gebbie says.

“Contrast that with something like smart glasses, where I think there is a whole new set of user behaviors that would need to be learned in terms of exactly how to interact with that device,” he says. “Also, there’s kind of a social contract, which, for me, at least with smart glasses, has always been one of the biggest stumbling blocks.”

What’s more, headphones have been getting gradually smarter all this time without most of us even noticing.

This invisible evolution is the closest tangible expression I’ve seen of the widespread belief among tech leaders that AI should be a subtle, ambient force that permeates our lives as inconspicuously as possible.

Headphones are an established product that shows consistent growth, making them the safest bet for companies that want as many people as possible to engage with AI through wearable tech. 

Multiple forecasts, including from SNS Insider and Mordor Intelligence, estimate the global market for headphones will grow to over $100 billion by the early 2030s. By contrast, Mordor forecasts the smart glasses market will grow to $18.4 billion in the same period, one of the higher estimates I found.

Companies are always searching out new revenue streams, hence their determination to explore new kinds of AI devices, says Gebbie. But, he adds, “headphones definitely feel like a safer bet, because it’s a form factor that people are familiar with.”

It may well be that no single wearable device will define our coexistence with AI, and if one does, it will be a device of our choosing.

But rather than reinvent the wheel, I strongly suspect the companies embracing the potential of headphones will see these formerly audio-focused devices fly in the age of AI. And perhaps it’s just personal preference, but I’m on board.



Phone Plugged in 24/7? Experts Reveal the Science Behind Battery Damage

Phone batteries degrade over time, but heat and use habits are a larger danger than keeping your phone plugged in.

There was a time when smartphone users were warned not to leave their phones plugged in for too long, or it could do damage to the battery. While modern smartphones now have overcharge protection that keeps them safe, many people still have questions about whether keeping their phone perpetually plugged in will damage the battery.

The short answer is no. Keeping your phone plugged in all the time won’t ruin your battery. Modern smartphones are built with smart charging systems that cut off or taper power once they’re full, preventing the kind of “overcharging damage” that was common in older devices. So if you’re leaving your iPhone or Android on the charger overnight, you can relax.

That said, «won’t ruin your battery» doesn’t mean it has no effect. Batteries naturally degrade with age and use, and how you charge plays a role in how fast that happens. Keeping a phone perpetually at 100% can add extra stress on the battery, especially when paired with heat, which is the real enemy of longevity. 

Understanding when this matters (and when it doesn’t) can help you make small changes to extend your phone’s lifespan.




The science behind battery wear

Battery health isn’t just about how many times you charge your phone; it’s also about how the phone manages voltage, temperature and charge levels. Lithium-ion batteries age fastest when they’re held at the extremes: 0% and 100%.

Keeping them near full charge for long stretches puts additional voltage stress on the cathode and electrolyte. That’s why many devices use “trickle charging” or temporarily pause at 100%, topping up only when needed.

Still, the biggest threat isn’t overcharging — it’s heat. When your phone is plugged in and running demanding apps, it produces heat that accelerates chemical wear inside the battery. If you’re gaming, streaming or charging on a hot day, that extra warmth does far more harm than leaving the cable plugged in overnight.

Apple’s take

Apple’s battery guide describes lithium-ion batteries as «consumable components» that naturally lose capacity over time. To slow that decline, iPhones use Optimized Battery Charging, which learns your daily routine and pauses charging at about 80% until just before you typically unplug, reducing time spent at high voltage.

Apple also advises keeping devices between 0 and 35 degrees Celsius (32 and 95 degrees Fahrenheit) and removing certain cases while charging to improve heat dissipation. You can read more on Apple’s official battery support page.

What Samsung (and other Android makers) do

Samsung offers a similar feature called Battery Protect, found in One UI’s battery and device care settings. When enabled, it caps charging at 85%, which helps reduce stress during long charging sessions.

Other Android makers like Google, OnePlus and Xiaomi include comparable options — often called Adaptive Charging, Optimized Charging or Battery Care — that dynamically slow power delivery or limit charge based on your habits. These systems make it safe to leave your phone plugged in for extended periods without fear of overcharging.
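The behavior these features share boils down to a simple rule: charge normally well below a cap, taper as the battery approaches it, and hold once it’s reached. The Python sketch below is a hedged illustration of that idea; the 85% cap mirrors Samsung’s Battery Protect, but real charging firmware also weighs temperature, your routine and charge rate.

```python
# Simplified sketch of a charge-limiting policy like Battery Protect or
# Adaptive Charging. Real firmware is far more sophisticated; this only
# illustrates the cap-and-taper idea described above.

def charging_action(battery_pct: float, cap_pct: float = 85.0) -> str:
    """Decide what the charger should do at a given state of charge."""
    if battery_pct >= cap_pct:
        return "hold"         # pause at the cap to avoid high-voltage stress
    if battery_pct >= cap_pct - 5:
        return "trickle"      # taper current as the cell nears the cap
    return "fast_charge"      # normal charging well below the cap

print(charging_action(50))  # fast_charge
print(charging_action(83))  # trickle
print(charging_action(95))  # hold
```

The payoff of a policy like this is that a phone left on the charger spends the night holding below full rather than sitting at 100%, which is exactly the high-voltage state the section above describes as stressful.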

When constant charging can hurt

Even with these safeguards, some conditions can accelerate battery wear. As mentioned before, the most common culprit is high temperature. Even for a short period of time, leaving your phone charging in direct sunlight, in a car or under a pillow can push temperatures into unsafe zones.

Heavy use while charging, like gaming or 4K video editing, can also cause temperature spikes that degrade the battery faster. And cheap, uncertified cables or adapters may deliver unstable current that stresses cells. If your battery is already several years old, it’s naturally more sensitive to this kind of strain.

How to charge smarter

You don’t need to overhaul your habits, but a few tweaks can help your battery age gracefully.

Start by turning on your phone’s built-in optimization tools: Optimized Battery Charging on iPhones, Battery Protect on Samsung devices and Adaptive Charging on Google Pixels. These systems learn your routine and adjust charging speed so your phone isn’t sitting at 100% all night.

Keep your phone cool while charging. According to Apple, phone batteries perform best between 62 and 72 degrees Fahrenheit (16 to 22 degrees Celsius). If your phone feels hot, remove its case or move it to a better-ventilated or shaded spot. Avoid tossing it under a pillow or too close to other electronics, like your laptop, and skip wireless chargers that trap heat overnight.

Use quality chargers and cables from your phone’s manufacturer or trusted brands. Those cheap «fast-charge» kits you find online often deliver inconsistent current, which can cause long-term issues.

Finally, don’t obsess over topping off. It’s perfectly fine to plug in your phone during the day for short bursts. Lithium-ion batteries actually prefer frequent, shallow charges rather than deep, full cycles. You don’t need to keep it between 20% and 80% all the time, but just avoid extremes when possible.

The bottom line

Keeping your phone plugged in overnight or on your desk all day won’t destroy its battery. That’s a leftover myth from a different era of tech. Modern phones are smart enough to protect themselves, and features like Optimized Battery Charging or Battery Protect do most of the heavy lifting for you.

Still, no battery lasts forever. The best way to slow the inevitable is to manage heat, use quality chargers and let your phone’s software do its job. Think of it less as “babying” your battery and more as charging with intention. A few mindful habits today can keep your phone running strong for years.



Magic Cue Might Be Pixel 10’s Most Helpful Feature. Here’s How To Use It.

With AI, Magic Cue can instantly pull up flight information, reservation details and photos in calls and texts, so you don’t have to dig for them.

You might be sick of hearing about all the AI features loaded on your phone. But if you have a Pixel 10, there’s one key capability that may be worth tapping into.

Magic Cue is one of Google’s latest AI flexes. It can surface information related to what’s on your phone’s screen, so you don’t have to dig for it yourself. For example, if you’re calling your airline, Magic Cue will automatically show your upcoming flight information on the call screen. Or if your friend texts to ask about what time dinner is, those details will appear within Messages without you having to look for them. 




The Pixel 10 series is loaded with other impressive AI features, like a Voice Translate feature that can mimic the sound of a caller’s voice while translating what they’re saying. AI can also sharpen your zoomed-in photos and help you take better pictures with Camera Coach. And Circle to Search remains one of my favorite mobile tools. But Magic Cue is one of the few capabilities that succinctly delivers on the promise of AI to simplify tasks and act as a helpful mobile assistant. 

Like many AI features, Magic Cue can be hit-or-miss, and in many ways it’s still finding its footing. But it stands out as one of the more practical and helpful AI features you can use on the Pixel 10, 10 Pro, 10 Pro XL and 10 Pro Fold.  

Which devices can use Magic Cue?

Only Google Pixel 10 phones can tap into Magic Cue. It’s powered by the Google Tensor G5 chip and the latest version of the Gemini Nano AI model. So if you have an older Pixel phone or a different Android phone, this won’t be available to you.

How to use Magic Cue

To use Magic Cue, you’ll first need to allow access to the capability in your Pixel 10’s settings. 

When you open Settings, you’ll see Magic Cue listed near the bottom. Tap that and hit the toggles to allow suggestions and information to pop up based on what’s on your screen. 

You’ll also see an option to choose which specific apps you want Magic Cue to pull data from, like Gmail, Messages and Calendar. That way if you have a flight reservation in your email or a dinner blocked off in your calendar, Magic Cue can surface that information when it relates to a conversation on your screen. Google’s support page for Magic Cue also notes that suggestions can show up on “select third-party messaging apps,” though I personally haven’t seen it appear in WhatsApp just yet, for example.

Within Magic Cue’s settings, you’ll also see whether an update is needed for the feature to work properly. Under the Magic Cue updates tab, it should say “Up to date.”

You’ll be able to use Magic Cue 24 hours after you set it up on your Pixel 10. It may take some time for it to process data across your apps and show relevant suggestions, but it’ll get better at providing information and actions as you continue to use your phone.

Magic Cue processes everything on-device, so you shouldn’t worry about your personal information being compromised.

How Magic Cue works

Once Magic Cue is enabled, it’ll suggest actions and surface information related to what you’re doing on your Pixel. 

For instance, if you’re calling an airline, your flight details, including departure and arrival time and confirmation number, will appear on the call screen. That way, when a customer service agent asks for those details, you’ll have them readily available.

Similarly, if a friend texts to ask when your flight lands, those details will pop up automatically within Messages, and you can just tap to send. Or if someone asks where you’re having dinner tonight, Magic Cue can find that information from your calendar so you don’t have to drop it in yourself. 

Magic Cue also works with Google Photos, so if someone asks for a picture of someone or something, you can tap the Share Photos button that pops up in Messages and select which suggested image is the right fit. 

In my experience, Magic Cue has been helpful but not perfect. It does a good job of showing flight or reservation information from my email or calendar. But there are also times it’ll just say «View calendar» when someone asks what time something is happening. In those instances, Magic Cue isn’t really saving me any time or effort, since I can easily swipe to my calendar myself. But I have hope it’ll get better with time and more consistently feel like a magic trick.



Copyright © Verum World Media