
Technologies

Galaxy S24 Ultra: One Day With Samsung’s New Phone

Circle to Search and Instant Slow-Mo are my favorite new features so far.

The Galaxy S24 Ultra may look a lot like the Galaxy S23 Ultra at first glance. But Samsung’s newest phones are the first to come with Galaxy AI, an umbrella term for tools and features powered by generative AI: technology that, after training on large amounts of data, produces conversational-sounding (but not always accurate) content and responses. It’s the same flavor of AI that fuels ChatGPT, and the Galaxy S24 lineup is an example of how the tech is being applied to new smartphones.

I’ve been using the Galaxy S24 Ultra for a day, and one Galaxy AI feature has stood out to me in that short time: Circle to Search. I just press and hold the home button and draw a circle around anything I see on screen to launch a Google search for that object. It works intuitively and reliably so far, and it feels practically useful in everyday life, unlike other AI-powered additions to the Galaxy S24.

Read more: Samsung’s Galaxy Ring Will Need Less of Your Attention Than a Smartwatch

I need more time with the S24 Ultra to truly assess the usefulness of Galaxy AI and to test the new 50-megapixel telephoto camera, among other updates.

As I wrote in my first impressions story, Samsung’s new AI features don’t feel strikingly different from the generative AI features Microsoft and Google have introduced. Instead, the Galaxy S24 Ultra feels like a statement about how generative AI features are becoming table stakes on new phones.

Circle to Search is the standout Galaxy AI feature so far

The Galaxy S24 Ultra's Circle to Search feature being shown on screen

Galaxy AI is a collection of features that spans everything from photo editing to texting, phone calls and note-taking. There’s a tool for moving and removing unwanted objects from photos and refilling the scene so that it looks natural, for example. The Samsung Notes app can organize notes into bullet points and phone calls can be translated between languages in real time. (Check out my first impressions story for a list of some of the top Galaxy AI features.)

But Circle to Search is the one that stood out to me the most. The feature, which was developed in partnership with Google, allows you to search for almost anything on your phone’s screen just by circling it. Based on the time I’ve had with it so far, Circle to Search seems fairly accurate in determining the type of content I’m looking for based on what I’ve circled. 

For example, when I circled an image of the character Siobhan Roy from the HBO drama series Succession in a news article, the Galaxy S24 pulled up results that showed more information about the actress Sarah Snook, who plays her in the series. But when I just circled her outfit, I got results showing where to buy cream-colored blazers and slacks similar to those she was wearing in the image. 

I’ve also been using the Galaxy S24 Ultra to organize my notes and transcribe meetings while writing my review. I appreciated being able to have the phone turn my list of tests I’d like to run on the Galaxy S24 Ultra into neat and tidy bullet points. Samsung’s Recorder app also transcribed a meeting and summarized the key points into bullet points. While I wouldn’t rely on those bullet points alone for work-related tasks, they were a handy way to see which topics were discussed at specific timestamps in the conversation.

That feature isn’t unique to Samsung’s Recorder app; Google’s app can also do this, as can the transcription service Otter.ai. But combined with other features like the ability to automatically format notes, I’m beginning to see how generative AI could make phones more capable work devices. 

Galaxy S24 Ultra’s new telephoto camera and slow motion

The Galaxy S24 Ultra's camera interface shown on screen

The biggest difference between the Galaxy S23 Ultra’s camera and the Galaxy S24 Ultra’s is the latter’s new 50-megapixel telephoto camera with a 5x optical zoom. That replaces the Galaxy S23 Ultra’s 10-megapixel telephoto camera with a 10x optical zoom, a choice that Samsung made after hearing feedback that users generally preferred to zoom between 2x and 5x.

Read more: Samsung Galaxy S24 Phones Have a New Zoom Trick to Get That Close-Up Photo

I haven’t had too much time to test this extensively, but I’m already seeing a difference. Take a look at the 5x zoom photos below of a wooden sign I came across at a San Jose, California, park. The photos may look similar at first, but you can see the changes when enlarging the images. The text is sharper in the Galaxy S24 Ultra’s photo, and there’s less image noise. 

Galaxy S24 Ultra

A photo of a wooden sign in a park

Galaxy S23 Ultra

A photo of a wooden sign in a park

Image quality aside, Samsung also introduced some new camera tricks on the Galaxy S24 Ultra. While Generative Edit may have gotten a lot of attention following Samsung’s announcement, Instant Slow-Mo has impressed me the most so far. I just hold down on a video clip I captured, and the phone converts it into a slow-motion video by generating extra frames. I can preview how the clip will look in slow motion by pressing and lifting my finger to switch between the regular and slowed-down footage.
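Samsung hasn’t published how Instant Slow-Mo generates its extra frames, but the general idea of slowing a clip by synthesizing in-between frames can be sketched with a naive linear blend. This is purely illustrative (real systems use motion-aware AI models, not per-pixel averaging), and the frame data here is made up for the demo:

```python
# Toy sketch: slow a clip by inserting blended frames between each
# captured pair. NOT Samsung's actual algorithm, which uses AI-based
# frame generation rather than simple pixel blending.

def interpolate_frames(frames, factor=2):
    """Insert (factor - 1) blended frames between consecutive pairs.

    `frames` is a list of frames, each a list of pixel intensities.
    The returned list plays back `factor`x slower at the same
    frame rate.
    """
    if factor < 2 or len(frames) < 2:
        return list(frames)
    out = []
    for a, b in zip(frames, frames[1:]):
        out.append(a)
        for step in range(1, factor):
            t = step / factor
            # Linear blend: each pixel moves a fraction t from a to b.
            out.append([(1 - t) * pa + t * pb for pa, pb in zip(a, b)])
    out.append(frames[-1])
    return out

# Two tiny 3-pixel "frames" slowed 2x -> one blended frame in between.
clip = [[0, 0, 0], [10, 20, 30]]
print(interpolate_frames(clip, factor=2))
# [[0, 0, 0], [5.0, 10.0, 15.0], [10, 20, 30]]
```

Linear blending like this produces ghosting on fast-moving subjects, which is exactly why phone makers lean on motion-aware AI models for slow-motion generation instead.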

Taken together, it seems like Galaxy AI has the potential to make Samsung’s phones more useful and helpful. Most of the features that are currently available, like Circle to Search and note summaries, feel practical rather than gimmicky. But the bigger question is whether Samsung will be able to meaningfully differentiate its offerings moving forward, especially since Google’s Pixel phones provide similar functionality and Samsung plans to bring Galaxy AI to the Galaxy S23 lineup as well. 

Editors’ note: CNET is using an AI engine to help create some stories. For more, see this post.



The Ultimate AI Wearable Is a Piece of Tech You Already Own

Commentary: Tech companies are trying to give us dedicated AI devices. There’s no need — we all have them already.

In some quarters, the rise of AI has sparked the urge to invent all-new devices that are built around the technology but look and function differently from any products we’ve owned before.

These range from head-mounted XR devices, such as headsets and glasses, to pins, necklaces, phone accessories and whatever mystery product former Apple designer Jony Ive and OpenAI are developing in secret.

But what if, in pursuit of these new devices, we overlook the fact that the ultimate AI form factor is something we all already own? It could even be that the best way to deploy AI is through tech that dates back to the 19th century. 

I’m talking about headphones.

Personal audio has hardly stood still over the years, but integrating AI into headphones is giving them a new lease on life, says Dino Bekis, vice president of wearables at chipmaker Qualcomm. We’re starting to see this with devices like Apple’s new AirPods Pro 3.


The impact of AI on headphones will be twofold, says Bekis. First, it will build on improvements we’ve already seen, such as the ability to easily switch among active noise cancellation, transparency and other listening modes. 

Instead of that being something we need to control manually, the headphones themselves will increasingly handle it all dynamically. Onboard sensors, layered with AI, will become more adept at reading and understanding our immediate surroundings.

Bekis says that maybe your headphones could alert you to someone trying to get your attention by recognizing your name being called, even if you’re listening to music with ANC enabled. If you’re on a call, walking along a busy street, they could alert you to traffic dangers, sirens or someone who might be walking close behind you.

But where he really sees AI headphones coming into their own is in the interactions you’ll have with AI agents. These personal assistant-like versions of artificial intelligence will operate autonomously with our devices and services on our behalf.

There’s no more “natural way” than conversation to interact with them, he says, and the high-quality mics and speakers in your headphones will allow for clear and effective communication.

“Earbuds or headphones are really yesterday’s technology that’s suddenly been reinvented and is becoming the primary way we’re going to be interfacing with agents moving forward,” says Bekis.

Headphone-makers, meet AI

Not all headphones are on the verge of transforming into wearable AI assistants, and the situation is not the same across the board. Many legacy headphone companies are “entrenched in their core focus of audio quality and audiophile capability,” says Bekis.

At the same time, Bekis says Harman-owned high-end audio brand Mark Levinson is one headphone maker Qualcomm is working with on integrating AI into its products. And smartphone manufacturers that also have audio products in their lineups are leading the charge.

You only need to look at the new capabilities Samsung, Google and Apple have added to their headphones over the past few years. In addition to adaptive audio, the companies are starting to add AI-specific features. Google’s Pixel Buds Pro 2 are engineered not just as an audio device but as hardware with the company’s Gemini AI assistant at the core (you could say “Hey, Google” to activate Gemini and ask it to summarize your emails, for example).

In September, Apple introduced AI-powered live translation with the AirPods Pro 3. The AirPods will parse what someone is saying to you and play it in your chosen language in your ear. They will also pick up your speech and translate it so that you can show the other person a transcript in their language on your phone screen. 

Apple also seems to be searching for ways to further tap the AI potential of its headphones range. A report from Bloomberg earlier this month suggested that the company might introduce AI-powered infrared cameras with the next version of the AirPods Pro, which could be activated by and respond to gestures.

It’s clear that smartphone-makers can see the potential in headphones to be more than just audio products, in the same way they once recognized that the phone could be more than simply a device for making calls. They might even turn headphones and earbuds into what I think could be the ultimate AI wearable.

Why headphones?

The biggest argument for headphones over other emerging AI-focused wearable tech is their popularity: Who doesn’t own at least one pair? (My feeling is that everyone should own at least three different styles, each with its own strengths.) It’s just not the same with glasses or watches.

Yes, glasses and watches are common and familiar, but if you don’t already wear them regularly, the addition of AI is unlikely to persuade you to start. Glasses, in particular, have drawbacks, including battery life. There’s also the difficulty of combining the tech with prescription lenses, as well as privacy concerns over the addition of cameras.

After well over a decade of effort, tech companies are also still struggling to make smart glasses as sleek and comfortable to wear as their non-smart counterparts (the Meta Ray-Bans perhaps being the one exception to the rule here). 

Smartwatches and fitness bands, meanwhile, have become more comfortable, but many people still find them cumbersome for sleeping. The sensors in them are too far away from our faces, where we receive the majority of our sensory inputs, to comprehend the world around us with forensic detail. They cannot relay sensory feedback to us without us having to look at a screen. The same is true for rings and other smart jewelry.

There are no devices that rival headphones, and earbuds in particular, for sheer proximity to a major sensory organ capable of both inputting and outputting complex sensory data. They have been and remain discreet, easy to take on and off, and not overly power hungry or demanding when it comes to charging frequency. 

“Critically, there’s the social acceptance level of this as well, where, ultimately, headphones have become incredibly commonplace,” says CCS Insight analyst Leo Gebbie.

They don’t insert a noticeable barrier between you and the world you’re experiencing. Plus, even when they’re obvious, they don’t tend to put people on edge over concerns you could be capturing their image, and you don’t need to learn how to use them, Gebbie says.

“Contrast that with something like smart glasses, where I think there is a whole new set of user behaviors that would need to be learned in terms of exactly how to interact with that device,” he says. “Also, there’s kind of a social contract, which, for me, at least with smart glasses, has always been one of the biggest stumbling blocks.”

What’s more, headphones have been getting gradually smarter all this time without most of us even noticing.

This invisible evolution is the closest tangible expression I’ve seen of the widespread belief among tech leaders that AI should be a subtle, ambient force that permeates our lives as inconspicuously as possible.

Headphones are an established product that shows consistent growth, making them the safest bet for companies that want as many people as possible to engage with AI through wearable tech. 

Multiple forecasts, including from SNS Insider and Mordor Intelligence, estimate the global market for headphones will grow to over $100 billion by the early 2030s. By contrast, Mordor forecasts the smart glasses market will grow to $18.4 billion in the same period, one of the higher estimates I found.

Companies are always searching out new revenue streams, hence their determination to explore new kinds of AI devices, says Gebbie. But, he adds, «headphones definitely feel like a safer bet, because it’s a form factor that people are familiar with.»

It may well be the case that no single wearable device will define our coexistence with AI; if one does, it will be a device of our choosing.

But rather than reinvent the wheel, I strongly suspect the companies embracing the potential of headphones will see these formerly audio-focused devices fly in the age of AI. And perhaps it’s just personal preference, but I’m on board.



Phone Plugged in 24/7? Experts Reveal the Science Behind Battery Damage

Phone batteries degrade over time, but heat and use habits are a larger danger than keeping your phone plugged in.

There was a time when smartphone users were warned not to leave their phones plugged in for too long, or it could do damage to the battery. While modern smartphones now have overcharge protection that keeps them safe, many people still have questions about whether keeping their phone perpetually plugged in will damage the battery.

The short answer is no. Keeping your phone plugged in all the time won’t ruin your battery. Modern smartphones are built with smart charging systems that cut off or taper power once they’re full, preventing the kind of “overcharging damage” that was common in older devices. So if you’re leaving your iPhone or Android on the charger overnight, you can relax.

That said, “won’t ruin your battery” doesn’t mean it has no effect. Batteries naturally degrade with age and use, and how you charge plays a role in how fast that happens. Keeping a phone perpetually at 100% can add extra stress on the battery, especially when paired with heat, which is the real enemy of longevity.

Understanding when this matters (and when it doesn’t) can help you make small changes to extend your phone’s lifespan.



The science behind battery wear

Battery health isn’t just about how many times you charge your phone. It’s also about how the phone manages voltage, temperature and charge maintenance. Lithium-ion batteries age fastest when they’re held at the extremes: 0% and 100%.

Keeping them near full charge for long stretches puts additional voltage stress on the cathode and electrolyte. That’s why many devices use “trickle charging” or temporarily pause at 100%, topping up only when needed.
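The pause-and-top-up behavior described here can be sketched as a simple controller loop. This is a toy model, assuming made-up charge and drain rates and a hypothetical resume threshold, not any vendor’s actual charging firmware:

```python
# Toy model of "pause at full, top up when needed" charging.
# The 20%/hour charge rate, 1%/hour idle drain and 95% resume
# threshold are invented for illustration only.

def simulate_overnight(level, hours, resume_below=95, drain_per_hour=1):
    """Simulate a plugged-in phone hour by hour.

    The controller charges toward 100%, then pauses so the cell isn't
    held at peak voltage, letting the level drift down with idle drain
    and only resuming below `resume_below`.
    Returns (final_level, hours_spent_charging).
    """
    charging = True
    hours_charging = 0
    for _ in range(hours):
        if charging:
            level = min(100, level + 20)  # assumed ~20%/hour charge rate
            hours_charging += 1
            if level >= 100:
                charging = False  # pause: stop holding the cell at peak
        else:
            level = max(0, level - drain_per_hour)  # idle self-discharge
            if level < resume_below:
                charging = True  # top back up
    return level, hours_charging

# Plug in at 60% for an 8-hour night: only 2 of 8 hours are spent
# actually charging; the rest of the time the cell sits below peak.
print(simulate_overnight(60, 8))
# (94, 2)
```

The point of the hysteresis gap (pausing at 100%, resuming below 95%) is that the battery spends most of the night slightly below full rather than pinned at its most stressful voltage.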

Still, the biggest threat isn’t overcharging — it’s heat. When your phone is plugged in and running demanding apps, it produces heat that accelerates chemical wear inside the battery. If you’re gaming, streaming or charging on a hot day, that extra warmth does far more harm than leaving the cable plugged in overnight.

Apple’s take

Apple’s battery guide describes lithium-ion batteries as «consumable components» that naturally lose capacity over time. To slow that decline, iPhones use Optimized Battery Charging, which learns your daily routine and pauses charging at about 80% until just before you typically unplug, reducing time spent at high voltage.

Apple also advises keeping devices between 0 and 35 degrees Celsius (32 to 95 degrees Fahrenheit) and removing certain cases while charging to improve heat dissipation. You can read more on Apple’s official battery support page.

What Samsung (and other Android makers) do

Samsung offers a similar feature called Battery Protect, found in One UI’s battery and device care settings. When enabled, it caps charging at 85%, which helps reduce stress during long charging sessions.

Other Android makers like Google, OnePlus and Xiaomi include comparable options — often called Adaptive Charging, Optimized Charging or Battery Care — that dynamically slow power delivery or limit charge based on your habits. These systems make it safe to leave your phone plugged in for extended periods without fear of overcharging.

When constant charging can hurt

Even with these safeguards, some conditions can accelerate battery wear. As mentioned before, the most common culprit is high temperature. Leaving your phone charging in direct sunlight, in a car or under a pillow, even for a short period, can push temperatures into unsafe zones.

Heavy use while charging, like gaming or 4K video editing, can also cause temperature spikes that degrade the battery faster. And cheap, uncertified cables or adapters may deliver unstable current that stresses cells. If your battery is already several years old, it’s naturally more sensitive to this kind of strain.

How to charge smarter

You don’t need to overhaul your habits, but a few tweaks can help your battery age gracefully.

Start by turning on your phone’s built-in optimization tools: Optimized Battery Charging on iPhones, Battery Protect on Samsung devices and Adaptive Charging on Google Pixels. These systems learn your routine and adjust charging speed so your phone isn’t sitting at 100% all night.

Keep your phone cool while charging. According to Apple, phone batteries perform best between 62 and 72 degrees Fahrenheit (16 to 22 degrees Celsius). If your phone feels hot, remove its case or move it to a better-ventilated or shaded spot. Avoid tossing it under a pillow or too close to other electronics, like your laptop, and skip wireless chargers that trap heat overnight.

Use quality chargers and cables from your phone’s manufacturer or trusted brands. Those cheap “fast-charge” kits you find online often deliver inconsistent current, which can cause long-term issues.

Finally, don’t obsess over topping off. It’s perfectly fine to plug in your phone during the day for short bursts. Lithium-ion batteries actually prefer frequent, shallow charges rather than deep, full cycles. You don’t need to keep it between 20% and 80% all the time, but just avoid extremes when possible.

The bottom line

Keeping your phone plugged in overnight or on your desk all day won’t destroy its battery. That’s a leftover myth from a different era of tech. Modern phones are smart enough to protect themselves, and features like Optimized Battery Charging or Battery Protect do most of the heavy lifting for you.

Still, no battery lasts forever. The best way to slow the inevitable is to manage heat, use quality chargers and let your phone’s software do its job. Think of it less as «babying» your battery and more as charging with intention. A few mindful habits today can keep your phone running strong for years.



Magic Cue Might Be Pixel 10’s Most Helpful Feature. Here’s How To Use It.

With AI, Magic Cue can instantly pull up flight information, reservation details and photos in calls and texts, so you don’t have to dig for them.

You might be sick of hearing about all the AI features loaded on your phone. But if you have a Pixel 10, there’s one key capability that may be worth tapping into.

Magic Cue is one of Google’s latest AI flexes. It can surface information related to what’s on your phone’s screen, so you don’t have to dig for it yourself. For example, if you’re calling your airline, Magic Cue will automatically show your upcoming flight information on the call screen. Or if your friend texts to ask about what time dinner is, those details will appear within Messages without you having to look for them. 



The Pixel 10 series is loaded with other impressive AI features, like a Voice Translate feature that can mimic the sound of a caller’s voice while translating what they’re saying. AI can also sharpen your zoomed-in photos and help you take better pictures with Camera Coach. And Circle to Search remains one of my favorite mobile tools. But Magic Cue is one of the few capabilities that succinctly delivers on the promise of AI to simplify tasks and act as a helpful mobile assistant. 

Like many AI features, Magic Cue can be hit-or-miss, and in many ways it’s still finding its footing. But it stands out as one of the more practical and helpful AI features you can use on the Pixel 10, 10 Pro, 10 Pro XL and 10 Pro Fold.  

Which devices can use Magic Cue?

Only Google Pixel 10 phones can tap into Magic Cue. It’s powered by the Google Tensor G5 chip and the latest version of the Gemini Nano AI model. So if you have an older Pixel phone or a different Android phone, this won’t be available to you.

How to use Magic Cue

To use Magic Cue, you’ll first need to allow access to the capability in your Pixel 10’s settings. 

When you open Settings, you’ll see Magic Cue listed near the bottom. Tap that and hit the toggles to allow suggestions and information to pop up based on what’s on your screen. 

You’ll also see an option to choose which specific apps you want Magic Cue to pull data from, like Gmail, Messages and Calendar. That way if you have a flight reservation in your email or a dinner blocked off in your calendar, Magic Cue can surface that information when it relates to a conversation on your screen. Google’s support page for Magic Cue also notes that suggestions can show up on “select third-party messaging apps,” though I personally haven’t seen it appear in WhatsApp just yet, for example.

Within Magic Cue’s settings, you’ll also see whether an update is needed for the feature to work properly. Under the Magic Cue updates tab, it should say “Up to date.”

You’ll be able to use Magic Cue 24 hours after you set it up on your Pixel 10. It may take some time for it to process data across your apps and show relevant suggestions, but it’ll get better at providing information and actions as you continue to use your phone.

Magic Cue processes everything on-device, so you don’t have to worry about your personal information leaving your phone.

How Magic Cue works

Once Magic Cue is enabled, it’ll suggest actions and surface information related to what you’re doing on your Pixel. 

For instance, if you’re calling an airline, your flight details, including departure and arrival time and confirmation number, will appear on the call screen. That way, when a customer service agent asks for those details, you’ll have them readily available.

Similarly, if a friend texts to ask when your flight lands, those details will pop up automatically within Messages, and you can just tap to send. Or if someone asks where you’re having dinner tonight, Magic Cue can find that information from your calendar so you don’t have to drop it in yourself. 

Magic Cue also works with Google Photos, so if someone asks for a picture of someone or something, you can tap the Share Photos button that pops up in Messages and select which suggested image is the right fit. 

In my experience, Magic Cue has been helpful but not perfect. It does a good job of showing flight or reservation information from my email or calendar. But there are also times it’ll just say “View calendar” when someone asks what time something is happening. In those instances, Magic Cue isn’t really saving me any time or effort, since I can easily swipe to my calendar myself. But I have hope it’ll get better with time and more consistently feel like a magic trick.



Copyright © Verum World Media