Technologies

A Decade Later, Your Phone Still Can’t Replace a Pro Camera

Commentary: Phone cameras are getting better and better, but they still aren't much closer to replacing DSLRs and professional mirrorless cameras.

On a chilly Saturday afternoon in San Francisco, I was under a patio heater with a group of friends when someone said we should get a group photo. What happened next was surprising. Instead of using his phone to take a commemorative photo, my friend pulled out a point-and-shoot camera. I thought to myself, "Wait. The phone killed the point-and-shoot camera years ago. Why didn't he just use his iPhone?" Granted, it was the high-end Sony RX100 VII, which is an excellent compact camera and one of the few point-and-shoots still made today.

Phones from Apple, Samsung and Google include some of the best phone cameras you can buy, like the iPhone 14 Pro, Google Pixel 7 Pro and Samsung Galaxy S22 Ultra. But for professional photographers and filmmakers, that’s not always enough. The holy grail is being able to have a truly large image sensor like the one you’d find in a high-end mirrorless camera and a lens mount that could attach to your phone. Sounds simple enough right? Wrong.

Everyone from Samsung to Panasonic, Sony and Motorola has tried to make this dream a reality in some way. Now Xiaomi, the world’s third largest phone-maker (behind Samsung and Apple) is the latest to rekindle the quest for the phone camera holy grail. The company has a new prototype phone that lets you mount a Leica M lens on it.

But this is just a concept. If you’re wondering whether phones will ever make dedicated pro cameras obsolete the way they did with point-and-shoots, the answer is a resounding no. The past decade has shown us why.

Why phone cameras are limited

First, it's important to understand how your phone's camera works. Behind the lens is a tiny image sensor, smaller than a single Lego brick. Sometimes there are headlines about Sony, Sharp or, years ago, Panasonic putting a 1-inch sensor in a phone. Sadly, that name doesn't refer to the actual dimensions: a 1-inch image sensor is about 0.6 inch diagonally or, for the sake of approximation, two Lego bricks. The 1-inch sensor is the hoverboard of cameras, but it's still one of the largest to be put into a phone.

Dedicated cameras have sensors that are closer to 12 Lego bricks (positioned side by side in a four-by-three rectangle) and most come with a lens mount that lets you change lenses. The "holy grail" is to put one of these larger sensors into a phone.
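To put rough numbers on the Lego analogy: sensor "type" names like 1-inch don't describe real dimensions, but typical published measurements do, and the gap is easy to compute. A quick sketch, using approximate published sizes (about 13.2 by 8.8 millimeters for a 1-inch-type sensor, 36 by 24 millimeters for full frame; these are generic figures, not the specs of any particular phone or camera):

```python
import math

def sensor_stats(width_mm, height_mm):
    """Return (diagonal in inches, area in square mm) for a sensor."""
    diagonal_mm = math.hypot(width_mm, height_mm)
    return diagonal_mm / 25.4, width_mm * height_mm

# Approximate dimensions: a phone "1-inch" sensor vs. a full-frame sensor.
one_inch_diag, one_inch_area = sensor_stats(13.2, 8.8)
ff_diag, ff_area = sensor_stats(36.0, 24.0)

print(f'"1-inch" sensor diagonal: {one_inch_diag:.2f} in')   # about 0.62 in
print(f"Full-frame area advantage: {ff_area / one_inch_area:.1f}x")  # about 7.4x
```

The diagonal works out to roughly 0.62 inch, matching the "about 0.6 of an inch" figure above, and a full-frame sensor gathers light over an area more than seven times larger.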

But bigger sensors are more expensive than the little ones used in your iPhone and there are space considerations. A lens for a phone camera sensor is relatively small. But lenses for a full-frame sensor are larger and require more space between the back of the lens and the sensor. Phones simply lack this room without becoming significantly thicker.

Every year we see Apple, Samsung and the like take small steps toward improving phone photography. But phone camera hardware has largely hit a ceiling. Instead of radical camera improvements, we get modest upgrades. This could be a sign that companies have homed in on what consumers want. But it could also be a consequence of the space and size limitations of tiny sensors.

Instead, smartphone-makers use computational photography to overcome a tiny sensor's limitations, such as smaller dynamic range and lower light sensitivity. Google, Apple and Samsung all use machine learning algorithms and artificial intelligence to improve the photos you take with your phone.

But hardware is also important. Earlier this month Tim Cook, Apple's CEO, shared a photo on Twitter of a visit to Sony in Japan. While it's been widely assumed that Apple uses Sony's image sensors in the iPhone, this is the first time Cook formally acknowledged it. And as CNET readers already know, Sony phones like the Xperia 1 IV have some of the best camera hardware found on any phone sold today.

The Xperia 1 IV won a CNET Innovation award for its telephoto camera, which has miniature lens elements that actually move back and forth, like a real telephoto lens. The result is that you can use the lens to zoom without cropping digitally, which degrades the image. Can you imagine an iPhone 15 Pro with this lens?
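The penalty that moving-lens design avoids is easy to quantify: digital zoom crops the frame, discarding pixels, and the loss grows with the square of the zoom factor. A rough illustration (the 12-megapixel starting figure is just an example, not any specific phone's spec):

```python
def digital_zoom_megapixels(sensor_mp, zoom_factor):
    """Megapixels left after cropping the center of the frame for digital zoom."""
    # A 2x zoom crops to half the width and half the height: 1/4 of the pixels.
    return sensor_mp / (zoom_factor ** 2)

print(digital_zoom_megapixels(12, 2))  # 3.0: a 2x digital zoom keeps 1/4 of a 12MP frame
print(digital_zoom_megapixels(12, 3))  # ~1.33: at 3x, barely over a megapixel remains
```

An optical zoom like the Xperia's keeps the full sensor resolution at every focal length, which is why it avoids this degradation.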

The Xiaomi 12S Ultra Leica lens prototype is so 2013

That brings us to Xiaomi, which is the latest company attempting to merge pro-level cameras with your phone. In November, Xiaomi released a video of a phone camera concept that shows a Leica lens mounted on a 12S Ultra phone. This prototype is like a concept car: No matter how cool it is, you’ll never get to drive it.

The Chinese company took the 12S Ultra and added a removable ring around its circular camera bump. The ring covers a thread around the outside edge of the camera bump onto which you can attach an adapter that lets you mount Leica M lenses. The adapter’s thickness is the same distance that a Leica M lens needs to be positioned away from the sensor in order to focus.

A few caveats: The Xiaomi 12S Ultra concept uses an exposed 1-inch sensor, which, as I mentioned earlier, isn't actually 1 inch. Next, this is purely a concept. If something like this actually went on sale, it would cost thousands of dollars. For comparison, a nice dedicated camera like the Fujifilm X100V, which has a much bigger sensor, costs $1,399.

Xiaomi isn't the first phone-maker to try this. In 2013, Sony took an image sensor and put it on the back of a lens with a grip that attaches to the back of a phone. The idea was to use your phone's screen as the viewfinder for the camera system, which you controlled through an app. Essentially, you bypassed your phone's cameras.

Sony made several different versions of this "lens with a grip" and used sensors that were just a bit bigger than those found in phone cameras. Sony also made the QX-1 camera, which had an APS-C-sized sensor that in our Lego approximation is about six bricks positioned side by side in a three-by-two rectangle. That's not as large as a full-frame sensor, but vastly bigger than your phone's image sensors.

The Sony QX-1 has a Sony E-mount, meaning you can use various E-mount lenses or use adapters for Canon or Nikon lenses. Because the QX-1 is controlled with Bluetooth, you could either attach it to your phone or put it in different places to take photos remotely.

The QX-1 came out in 2014 and cost $350. Imagine having something like this today. I would definitely buy a 2022 version if Sony made it, but sadly the QX-1 was discontinued a few years after it went on sale. That's around the time that Red, the company that makes cinema cameras used to film shows and movies like The Hobbit, The Witcher, Midsommar and The Boys, made a phone called the Red Hydrogen One.

Despite being a phone made by one of the best camera companies in the world, the $1,300 Red Hydrogen One's cameras were on par with those from a $700 Android phone. The back of the phone had pogo pins designed to attach different modules (like Moto Mods), including a "cinema camera module" that housed a large image sensor and a lens mount, according to patent drawings. The idea was that you would use a Hydrogen One and the cinema mod to turn the phone into a mini Red cinema camera.

Well, that never happened.

The Red Hydrogen One was discontinued and now shows up as a phone prop in films like F9, on the dashboard of Dominic Toretto's car, or in the hands of Leonardo DiCaprio in Don't Look Up.

2023 will show that pro cameras won’t be killed off by our phones

There aren't any rumors that Apple is making an iPhone with a camera lens mount, nor are there murmurs of a Google mirrorless camera. But if Xiaomi made a prototype of a phone with a professional lens mount, you have to imagine that somewhere in the basement of Apple Park sits an old concept camera that runs an iOS-like interface, is powered by the iPhone's A-series chip and uses some of the same computational photography processing. Or at least that's what I'd like to believe.

How amazing would photos look from a pro-level dedicated camera that uses the same processing tricks that Apple or Google implement on their phones? And how nice would it be to have a phone-like OS to share those photos and videos to Instagram or TikTok?

Turns out, Samsung tried bringing an Android phone’s interface to a camera in 2012. Noticing a theme here? Most of these holy grail phone camera concepts were tried 10 years ago. A few of these, like the Sony QX-1, were truly ahead of their time.

I don't think Apple will ever release a standalone iOS-powered camera or make an iPhone with a Leica lens mount. The truth is that over the past decade, cameras have gotten smaller. The bulky DSLRs that signified professional cameras for years are quickly heading into the sunset. Mirrorless cameras have risen in popularity. They tend to be smaller, since they don't need the space for a DSLR mirror box.

If there is a takeaway from all of this, it’s just a reminder of how good the cameras on our phones have gotten in that time. Even if it feels like they’ve plateaued, they’re dependable for most everyday tasks. But they won’t be replacing professional cameras anytime soon.

If you want to step up to a professional camera, look for one like the Fujifilm X100V or Sony A7C, which pack a large image sensor and a sharp lens into a body that can fit in a coat pocket. And next time I'm at a dinner party with friends, I won't act so shocked when someone wants to take a picture with a camera instead of a phone.

Read more: Pixel 7 Pro Actually Challenges My $10,000 DSLR Camera Setup

The Ultimate AI Wearable Is a Piece of Tech You Already Own

Commentary: Tech companies are trying to give us dedicated AI devices. There’s no need — we all have them already.

In some quarters, the rise of AI has sparked the urge to invent all-new devices, which are deeply invested in that technology but which look and function differently from any products we’ve owned before.

These range from head-mounted XR devices, such as headsets and glasses, to pins, necklaces, phone accessories and whatever mystery product former Apple designer Jony Ive and OpenAI are developing in secret.

But what if, in pursuit of these new devices, we overlook the fact that the ultimate AI form factor is something we all already own? It could even be that the best way to deploy AI is through tech that dates back to the 19th century. 

I’m talking about headphones.

There hasn’t been a lack of evolution in personal audio over the years, but integrating AI into headphones is giving them a new lease on life, says Dino Bekis, vice president of wearables at chipmaker Qualcomm. We’re starting to see this with devices like Apple’s new AirPods Pro 3.


The impact of AI on headphones will be twofold, says Bekis. First, it will build on improvements we’ve already seen, such as the ability to easily switch among active noise cancellation, transparency and other listening modes. 

Instead of that being something we need to control manually, the headphones themselves will increasingly handle it all dynamically. On-board sensors, layered with AI, will become more adept at reading and understanding our immediate surroundings.

Bekis says that maybe your headphones could alert you to someone trying to get your attention by recognizing your name being called, even if you’re listening to music with ANC enabled. If you’re on a call, walking along a busy street, they could alert you to traffic dangers, sirens or someone who might be walking close behind you.

But where he really sees AI headphones coming into their own is in the interactions you’ll have with AI agents. These personal assistant-like versions of artificial intelligence will operate autonomously with our devices and services on our behalf.

There's no more "natural way" than conversation to interact with them, he says, and the high-quality mics and speakers in your headphones will allow for clear and effective communication.

"Earbuds or headphones are really yesterday's technology that's suddenly been reinvented and is becoming the primary way we're going to be interfacing with agents moving forward," says Bekis.

Headphone-makers, meet AI

Not all headphones are on the verge of transforming into wearable AI assistants, and the situation is not the same across the board. Many legacy headphone companies are "entrenched in their core focus of audio quality and audiophile capability," says Bekis.

At the same time, Bekis says Harman-owned high-end audio brand Mark Levinson is one headphone maker Qualcomm is working with on integrating AI into its products. And smartphone manufacturers who also have audio products in their lineup are at the forefront of the charge.

You only need to look at the new capabilities that Samsung, Google and Apple have bolstered their headphones with over the past few years. In addition to adaptive audio, the companies are starting to add AI-specific features. Google's Pixel Buds Pro 2 are engineered not just as an audio device but as hardware with the company's Gemini AI assistant at the core (you could say "Hey, Google" to activate Gemini and ask it to summarize your emails, for example).

In September, Apple introduced AI-powered live translation with the AirPods Pro 3. The AirPods will parse what someone is saying to you and play it in your chosen language in your ear. They will also pick up your speech and translate it so that you can show the other person a transcript in their language on your phone screen. 

Apple also seems to be searching for ways to further tap the AI potential of its headphones range. A report from Bloomberg earlier this month suggested that the company might introduce AI-powered infrared cameras with the next version of the AirPods Pro, which could be activated by and respond to gestures.

It’s clear that smartphone-makers can see the potential in headphones to be more than just audio products, in the same way they once recognized that the phone could be more than simply a device for making calls. They might even turn headphones and earbuds into what I think could be the ultimate AI wearable.

Why headphones?

The biggest argument for headphones over other emerging AI-focused wearable tech is their popularity: Who doesn’t own at least one pair? (My feeling is that everyone should own at least three different styles, each with its own strengths.) It’s just not the same with glasses or watches.

Yes, glasses and watches are common and familiar, but if you don't already wear them regularly, the addition of AI is unlikely to persuade you. Smart glasses, in particular, have drawbacks, including battery life. There's also the difficulty of combining the tech with prescription lenses, and privacy concerns over the addition of cameras.

After well over a decade of effort, tech companies are also still struggling to make smart glasses as sleek and comfortable to wear as their non-smart counterparts (the Meta Ray-Bans perhaps being the one exception to the rule here). 

Smartwatches and fitness bands, meanwhile, have become more comfortable, but many people still find them cumbersome for sleeping. The sensors in them are too far away from our faces, where we receive the majority of our sensory inputs, to comprehend the world around us with forensic detail. They cannot relay sensory feedback to us without us having to look at a screen. The same is true for rings and other smart jewelry.

There are no devices that rival headphones, and earbuds in particular, for sheer proximity to a major sensory organ capable of both inputting and outputting complex sensory data. They have been and remain discreet, easy to take on and off, and not overly power hungry or demanding when it comes to charging frequency. 

"Critically, there's the social acceptance level of this as well, where, ultimately, headphones have become incredibly commonplace," says CCS Insight analyst Leo Gebbie.

They don’t insert a noticeable barrier between you and the world you’re experiencing. Plus, even when they’re obvious, they don’t tend to put people on edge over concerns you could be capturing their image, and you don’t need to learn how to use them, Gebbie says.

"Contrast that with something like smart glasses, where I think there is a whole new set of user behaviors that would need to be learned in terms of exactly how to interact with that device," he says. "Also, there's kind of a social contract, which, for me, at least with smart glasses, has always been one of the biggest stumbling blocks."

What’s more, headphones have been getting gradually smarter all this time without most of us even noticing.

This invisible evolution is the closest tangible expression I’ve seen of the widespread belief among tech leaders that AI should be a subtle, ambient force that permeates our lives as inconspicuously as possible.

Headphones are an established product that shows consistent growth, making them the safest bet for companies that want as many people as possible to engage with AI through wearable tech. 

Multiple forecasts, including from SNS Insider and Mordor Intelligence, estimate the global market for headphones will grow to over $100 billion by the early 2030s. By contrast, Mordor forecasts the smart glasses market will grow to $18.4 billion in the same period, one of the higher estimates I found.

Companies are always searching out new revenue streams, hence their determination to explore new kinds of AI devices, says Gebbie. But, he adds, "headphones definitely feel like a safer bet, because it's a form factor that people are familiar with."

It may well be the case that no single wearable device will define our coexistence with AI, and if one does, it will be a device of our choosing.

But rather than reinvent the wheel, I strongly suspect the companies embracing the potential of headphones will see these formerly audio-focused devices fly in the age of AI. And perhaps it’s just personal preference, but I’m on board.

Phone Plugged in 24/7? Experts Reveal the Science Behind Battery Damage

Phone batteries degrade over time, but heat and use habits are a larger danger than keeping your phone plugged in.

There was a time when smartphone users were warned not to leave their phones plugged in for too long, or it could do damage to the battery. While modern smartphones now have overcharge protection that keeps them safe, many people still have questions about whether keeping their phone perpetually plugged in will damage the battery.

The short answer is no. Keeping your phone plugged in all the time won't ruin your battery. Modern smartphones are built with smart charging systems that cut off or taper power once they're full, preventing the kind of "overcharging damage" that was common in older devices. So if you're leaving your iPhone or Android on the charger overnight, you can relax.

That said, «won’t ruin your battery» doesn’t mean it has no effect. Batteries naturally degrade with age and use, and how you charge plays a role in how fast that happens. Keeping a phone perpetually at 100% can add extra stress on the battery, especially when paired with heat, which is the real enemy of longevity. 

Understanding when this matters (and when it doesn’t) can help you make small changes to extend your phone’s lifespan.


The science behind battery wear

Battery health isn't just about how many times you charge your phone. It's also about how the phone manages voltage, temperature and charge maintenance. Lithium-ion batteries age fastest when they're held at the extremes: 0% and 100%.

Keeping them near full charge for long stretches puts additional voltage stress on the cathode and electrolyte. That's why many devices use "trickle charging" or temporarily pause at 100%, topping up only when needed.

Still, the biggest threat isn’t overcharging — it’s heat. When your phone is plugged in and running demanding apps, it produces heat that accelerates chemical wear inside the battery. If you’re gaming, streaming or charging on a hot day, that extra warmth does far more harm than leaving the cable plugged in overnight.

Apple’s take

Apple’s battery guide describes lithium-ion batteries as «consumable components» that naturally lose capacity over time. To slow that decline, iPhones use Optimized Battery Charging, which learns your daily routine and pauses charging at about 80% until just before you typically unplug, reducing time spent at high voltage.
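The scheduling idea behind this can be sketched in a few lines. This is only an illustrative model, not Apple's implementation: the 80% hold level comes from Apple's own description, while the 45-minute top-up time is an assumption made up for the example.

```python
from datetime import datetime, timedelta

HOLD_LEVEL = 80  # percent: pause here to limit time spent at high voltage
TOP_UP_TIME = timedelta(minutes=45)  # assumed time to charge from 80% to 100%

def should_charge(level_percent, now, predicted_unplug):
    """Toy optimized-charging policy: charge freely up to the hold level,
    then wait so the battery reaches 100% just before the predicted unplug time."""
    if level_percent < HOLD_LEVEL:
        return True  # normal charging below the hold level
    # Above the hold level, resume only when finishing lines up with unplug time.
    return now >= predicted_unplug - TOP_UP_TIME

unplug = datetime(2024, 1, 1, 7, 0)  # say you usually unplug at 7:00 a.m.
print(should_charge(50, datetime(2024, 1, 1, 1, 0), unplug))   # True: below 80%
print(should_charge(80, datetime(2024, 1, 1, 1, 0), unplug))   # False: hold at 80%
print(should_charge(80, datetime(2024, 1, 1, 6, 30), unplug))  # True: finish by 7:00
```

The point of the hold is simply to minimize the hours the battery spends at full voltage while still being at 100% when you pick up the phone.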

Apple also advises keeping devices between 0 and 35 degrees Celsius (32 and 95 degrees Fahrenheit) and removing certain cases while charging to improve heat dissipation. You can read more on Apple's official battery support page.

What Samsung (and other Android makers) do

Samsung offers a similar feature called Battery Protect, found in One UI’s battery and device care settings. When enabled, it caps charging at 85%, which helps reduce stress during long charging sessions.

Other Android makers like Google, OnePlus and Xiaomi include comparable options — often called Adaptive Charging, Optimized Charging or Battery Care — that dynamically slow power delivery or limit charge based on your habits. These systems make it safe to leave your phone plugged in for extended periods without fear of overcharging.

When constant charging can hurt

Even with these safeguards, some conditions can accelerate battery wear. As mentioned before, the most common culprit is high temperature. Leaving your phone charging in direct sunlight, in a car or under a pillow, even for a short period of time, can push temperatures into unsafe zones.

Heavy use while charging, like gaming or 4K video editing, can also cause temperature spikes that degrade the battery faster. And cheap, uncertified cables or adapters may deliver unstable current that stresses cells. If your battery is already several years old, it’s naturally more sensitive to this kind of strain.

How to charge smarter

You don't need to overhaul your habits, but a few tweaks can help your battery age gracefully.

Start by turning on your phone’s built-in optimization tools: Optimized Battery Charging on iPhones, Battery Protect on Samsung devices and Adaptive Charging on Google Pixels. These systems learn your routine and adjust charging speed so your phone isn’t sitting at 100% all night.

Keep your phone cool while charging. According to Apple, phone batteries perform best between 62 and 72 degrees Fahrenheit (16 to 22 degrees Celsius). If your phone feels hot, remove its case or move it to a better-ventilated or shaded spot. Avoid tossing it under a pillow or too close to other electronics, like your laptop, and skip wireless chargers that trap heat overnight.

Use quality chargers and cables from your phone's manufacturer or trusted brands. Those cheap "fast-charge" kits you find online often deliver inconsistent current, which can cause long-term issues.

Finally, don’t obsess over topping off. It’s perfectly fine to plug in your phone during the day for short bursts. Lithium-ion batteries actually prefer frequent, shallow charges rather than deep, full cycles. You don’t need to keep it between 20% and 80% all the time, but just avoid extremes when possible.

The bottom line

Keeping your phone plugged in overnight or on your desk all day won’t destroy its battery. That’s a leftover myth from a different era of tech. Modern phones are smart enough to protect themselves, and features like Optimized Battery Charging or Battery Protect do most of the heavy lifting for you.

Still, no battery lasts forever. The best way to slow the inevitable is to manage heat, use quality chargers and let your phone's software do its job. Think of it less as "babying" your battery and more as charging with intention. A few mindful habits today can keep your phone running strong for years.

Magic Cue Might Be Pixel 10’s Most Helpful Feature. Here’s How To Use It.

With AI, Magic Cue can instantly pull up flight information, reservation details and photos in calls and texts, so you don’t have to dig for them.

You might be sick of hearing about all the AI features loaded on your phone. But if you have a Pixel 10, there’s one key capability that may be worth tapping into.

Magic Cue is one of Google’s latest AI flexes. It can surface information related to what’s on your phone’s screen, so you don’t have to dig for it yourself. For example, if you’re calling your airline, Magic Cue will automatically show your upcoming flight information on the call screen. Or if your friend texts to ask about what time dinner is, those details will appear within Messages without you having to look for them. 


The Pixel 10 series is loaded with other impressive AI features, like a Voice Translate feature that can mimic the sound of a caller’s voice while translating what they’re saying. AI can also sharpen your zoomed-in photos and help you take better pictures with Camera Coach. And Circle to Search remains one of my favorite mobile tools. But Magic Cue is one of the few capabilities that succinctly delivers on the promise of AI to simplify tasks and act as a helpful mobile assistant. 

Like many AI features, Magic Cue can be hit-or-miss, and in many ways it’s still finding its footing. But it stands out as one of the more practical and helpful AI features you can use on the Pixel 10, 10 Pro, 10 Pro XL and 10 Pro Fold.  

Which devices can use Magic Cue?

Only Google Pixel 10 phones can tap into Magic Cue. It’s powered by the Google Tensor G5 chip and the latest version of the Gemini Nano AI model. So if you have an older Pixel phone or a different Android phone, this won’t be available to you.

How to use Magic Cue

To use Magic Cue, you’ll first need to allow access to the capability in your Pixel 10’s settings. 

When you open Settings, you’ll see Magic Cue listed near the bottom. Tap that and hit the toggles to allow suggestions and information to pop up based on what’s on your screen. 

You'll also see an option to choose which specific apps you want Magic Cue to pull data from, like Gmail, Messages and Calendar. That way if you have a flight reservation in your email or a dinner blocked off in your calendar, Magic Cue can surface that information when it relates to a conversation on your screen. Google's support page for Magic Cue also notes that suggestions can show up on "select third-party messaging apps," though I personally haven't seen it appear in WhatsApp just yet, for example.

Within Magic Cue's settings, you'll also see whether an update is needed for the feature to work properly. Under the Magic Cue updates tab, it should say "Up to date."

You’ll be able to use Magic Cue 24 hours after you set it up on your Pixel 10. It may take some time for it to process data across your apps and show relevant suggestions, but it’ll get better at providing information and actions as you continue to use your phone.

Magic Cue processes everything on-device, so you don't have to worry about your personal information being sent off your phone.

How Magic Cue works

Once Magic Cue is enabled, it’ll suggest actions and surface information related to what you’re doing on your Pixel. 

For instance, if you’re calling an airline, your flight details, including departure and arrival time and confirmation number, will appear on the call screen. That way, when a customer service agent asks for those details, you’ll have them readily available.

Similarly, if a friend texts to ask when your flight lands, those details will pop up automatically within Messages, and you can just tap to send. Or if someone asks where you’re having dinner tonight, Magic Cue can find that information from your calendar so you don’t have to drop it in yourself. 
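Conceptually, what's happening here is that the context on screen gets matched against your personal data, and the relevant detail is surfaced. The real feature runs the Gemini Nano model on-device; the toy keyword matcher below is only an analogy for the idea, with made-up calendar entries:

```python
# Toy sketch of context-based suggestion: match an incoming message against
# calendar entries and surface the relevant detail. Magic Cue's real pipeline
# uses an on-device AI model; this simple keyword match is just an analogy.
calendar = [
    {"title": "Dinner at Nari", "time": "7:30 p.m."},
    {"title": "Flight UA 312", "time": "9:05 a.m."},
]

def suggest(message):
    """Return a suggested reply snippet if a calendar entry matches the message."""
    words = message.lower().split()
    for event in calendar:
        if any(word in event["title"].lower() for word in words):
            return f'{event["title"]} at {event["time"]}'
    return None  # no match: offer nothing rather than a wrong suggestion

print(suggest("What time is dinner tonight?"))  # Dinner at Nari at 7:30 p.m.
```

The interesting engineering is in doing this reliably and privately; when no confident match exists, the honest fallback is to suggest nothing, which is roughly the "View calendar" behavior described below.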

Magic Cue also works with Google Photos, so if someone asks for a picture of someone or something, you can tap the Share Photos button that pops up in Messages and select which suggested image is the right fit. 

In my experience, Magic Cue has been helpful but not perfect. It does a good job of showing flight or reservation information from my email or calendar. But there are also times it’ll just say «View calendar» when someone asks what time something is happening. In those instances, Magic Cue isn’t really saving me any time or effort, since I can easily swipe to my calendar myself. But I have hope it’ll get better with time and more consistently feel like a magic trick.

Copyright © Verum World Media