Technologies

The iPhone 17 Needs Amazing Cameras. Here’s What I Think Apple Should Do

Commentary: Apple’s rivals are catching up when it comes to camera skills. Here’s how the iPhone 17 can pull ahead.

The iPhone 16 Pro already packs one of the best camera setups found on any phone, but the iPhone 17 needs to take things even further when it launches in just a few weeks. Sure, Apple’s phones are capable of taking stunning photos, thanks to excellent software, the ProRaw format and a wealth of video skills, but Apple’s rivals have been doing big things, too. The Galaxy S25 Ultra, the Pixel 9 Pro and the Xiaomi 14 Ultra all pack amazing camera setups that have given the iPhone 16 Pro a run for its money and made it clear that Apple isn’t the only company innovating in the imaging arena.

Read more: Camera Champions Face Off: iPhone 16 Pro vs. Galaxy S25 Ultra

While early reports from industry insiders claim that the phone’s video skills will get a boost, it will take more to make the iPhone 17 an all-around photography powerhouse. As both an experienced phone reviewer and a professional photographer, I have exceptionally high expectations for top-end phone cameras. And, having used the iPhone 16 Pro since its launch, I have some thoughts on what needs to change.

Here are the main points I want to see improved on the iPhone 17 when it likely launches in September 2025.

An accessible Pro camera mode

At WWDC, Apple showed off the upcoming iOS 26, which includes a radical redesign of the interface with Liquid Glass. That simplified style extends to the camera app too, with Apple paring the interface down to the most basic functions: Photo, Video and zoom levels. Presumably, the idea is to make it easy for even beginner photographers to open the camera and start taking Instagram-worthy snaps.

And that’s fine, but what about those of us who buy the Pro models to take deeper advantage of features like exposure compensation, Photographic Styles and ProRaw formats? It’s not totally clear yet how these features will be accessed within the new camera interface, but they can’t be tucked away. Many photographers, myself very much included, want to use these tools as standard, using our powerful iPhones in much the same way we would a mirrorless camera from Canon or Sony.

That means relying on advanced settings to take control over the image-taking process to craft shots that go beyond simple snaps. If anything, Apple’s camera app has always been too simple, with even basic functions like white balance being unavailable. To see Apple take things to an even more simplistic level is concerning, and I want to see how the company will continue to make these phones usable for enthusiastic photographers. 

Larger image sensor

Though the 1/1.28-inch sensor found on the iPhone 16 Pro’s main camera is already a good size, and marginally larger than the S24 Ultra’s 1/1.33-inch sensor, I want to see Apple go bigger. A larger image sensor can capture more light and offer better dynamic range. It’s why pro cameras tend to have at least "full frame" image sensors, while really high-end cameras, like the amazing Hasselblad 907X, have enormous "medium format" sensors for pristine image quality.

Xiaomi understands this, equipping its 15 Ultra and previous 14 Ultra with 1-inch type sensors. That’s larger than the sensors found on almost any other phone, which allowed the 15 Ultra to take stunning photos all over Europe, while the 14 Ultra was heroic in capturing images at Taylor Swift concerts. I’m keen to see Apple at least match Xiaomi’s phone here with a similar 1-inch type sensor. Though if we’re talking pie-in-the-sky wishes, maybe the iPhone 17 could be the first smartphone with a full-frame image sensor. I won’t hold my breath on that one — the phone, and the lenses, would need to be immense to accommodate it, so it’d likely be more efficient just to let you make calls with your mirrorless camera.
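
To put those sensor sizes in perspective, the areas can be compared directly. Here’s a quick sketch using commonly published sensor dimensions (approximate figures, and note that sensor "type" fractions are legacy designations rather than literal physical diagonals):

```python
# Approximate active sensor areas, using commonly published dimensions.
# Note: sensor "type" fractions (1-inch, 1/1.28-inch) are legacy
# designations, not literal physical diagonals.

SENSORS_MM = {
    "1-inch type (Xiaomi 15 Ultra)": (13.2, 8.8),
    "full frame (pro mirrorless)": (36.0, 24.0),
    "medium format (Hasselblad 907X)": (43.8, 32.9),
}

for name, (w, h) in SENSORS_MM.items():
    print(f"{name}: {w * h:.0f} mm^2")

# A full-frame sensor has roughly 7.4x the area of a 1-inch type sensor,
# which is why a full-frame iPhone remains a pie-in-the-sky wish:
print(round((36.0 * 24.0) / (13.2 * 8.8), 1))  # → 7.4
```

That roughly sevenfold jump in area is why the lenses, and the phone around them, would balloon in size.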

Don’t lean on AI too much

AI has become a bigger part of the camera experience on many Android phones, from the Honor 400 Pro’s tool that brought my dad back to life to the Pixel 9 Pro’s wild generative AI functions. But iPhones have always emphasized the importance of real image quality, producing sharp, detailed images that remain faithful to the scene you actually saw when you pushed the shutter button. 

Apple’s dalliances in AI so far haven’t exactly been groundbreaking, and I worry that the company may want to be seen as making a bigger push for deeper, more ‘innovative’ uses of AI. And sure, maybe some of those could be useful in other parts of the phone, but the iPhone 17 cameras first and foremost still need to deliver truly superb-looking images, not simply use AI to compensate for any hardware shortcomings.

Variable aperture

One of the other reasons the Xiaomi 14 Ultra rocks so hard for photography is the variable aperture on its main camera. Its widest aperture is f/1.6 — significantly wider than the f/1.78 of the iPhone 16 Pro. That wider aperture lets in a lot more light in dim conditions and achieves more authentic out-of-focus bokeh around a subject.

But Xiaomi’s 14 Ultra aperture can also close down to f/4, and with that narrower aperture, it’s able to create starbursts around points of light. I love achieving this effect in nighttime imagery with the phone. It makes the resulting images look much more like they’ve been taken with a professional camera and lens, while the same points of light on the iPhone just look like roundish blobs. Disappointingly, Xiaomi actually removed this feature from the new 15 Ultra, so whether Apple sees value in implementing this kind of technology remains to be seen.
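
As a rough rule, the light a lens gathers scales with the inverse square of its f-number. A quick sketch of the arithmetic (a simplification that ignores lens transmission and sensor-size differences):

```python
# Rough comparison of light gathered at different f-numbers.
# Light throughput scales with aperture area, i.e. with 1/N^2 for
# f-number N (simplified: ignores lens transmission and sensor size).

def light_ratio(f_a: float, f_b: float) -> float:
    """How many times more light an f/f_a lens gathers than an f/f_b lens."""
    return (f_b / f_a) ** 2

# Xiaomi 14 Ultra wide open (f/1.6) vs. iPhone 16 Pro main camera (f/1.78)
print(round(light_ratio(1.6, 1.78), 2))  # → 1.24 (about 24% more light)

# Stopping the 14 Ultra down from f/1.6 to f/4 for starbursts
print(round(light_ratio(4.0, 1.6), 2))  # → 0.16 (far less light, pointier highlights)
```

So the 14 Ultra’s f/1.6 setting gathers roughly a quarter more light than the iPhone’s fixed f/1.78, while its f/4 setting trades most of that light away for those starburst effects.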

More Photographic Styles

Though Apple has had various styles and effects integrated into the iPhone’s cameras, the iPhone 16 range took it further, with more control over the effects and more toning options. It’s enough that former CNET Senior Editor Lisa Eadicicco even declared the Photographic Styles her "favorite new feature on Apple’s latest phone."

I think they’re great, too. Or rather, they’re a great start. The different color tones, like the ones you get with the Amber and Gold styles, add some lovely warmth to scenes, and the Quiet effect adds a vintage filmic fade, but there’s still not a whole lot to choose from and the interface is slow to work through. I’d love to see Apple introduce more Photographic Styles with different color toning options, or even with tones that mimic vintage film stocks from Kodak or Fujifilm. 

And sure, there are plenty of third-party apps like VSCO or Snapseed that let you play around with color filters all you want. But using Apple’s styles means you can take your images with the look already applied, and then change it afterward if you don’t like it — nothing is hard-baked into your image. 

I was recently impressed with Samsung’s new tool for creating custom color filters based on the look of other images. I’d love to see Apple bring that level of image customization to the iPhone.

Better ProRaw integration with Photographic Styles

I do think Apple has slightly missed an opportunity with its Photographic Styles, though, in that you can use them only when taking images in HEIF (high-efficiency image format). Unfortunately, you can’t use them when shooting in ProRaw. I love Apple’s use of ProRaw on previous iPhones, as it takes advantage of all of the iPhone’s computational photography — including things like HDR image blending — but still outputs a DNG raw file for easier editing. 

The DNG file typically also offers more latitude to brighten dark areas or tone down highlights in an image, making it extremely versatile. Previously, Apple’s color presets could be used when shooting in ProRaw, and I loved it. I frequently shot street-style photos using the high contrast black-and-white mode and then edited the raw file further. 

Now, using that same black-and-white look means shooting only in HEIF format, eliminating the benefits of Apple’s ProRaw. Oddly, while the older-style "Filters" are no longer available in the camera app when taking a raw image, you can still apply those filters to raw photos in the iPhone’s gallery app through the editing menu.

LUTs for ProRes video

And while we’re on the topic of color presets and filters, Apple needs to bring those to video, too. On the iPhone 15 Pro, Apple introduced the ability to shoot video in ProRes, which results in very low-contrast, almost gray-looking footage. The idea is that video editors will take this raw footage and then apply their edits on top, often applying contrast and color presets known as LUTs (look-up tables) that give footage a particular look — think dark and blue for horror films or warm, light tones for a romantic drama vibe.
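
To illustrate what a LUT actually does, here’s a toy sketch of a 1D look-up table that adds contrast back to flat footage. Real grading LUTs are typically 3D (mapping red, green and blue jointly) and higher precision; this is purely illustrative, not how Apple or any editing app implements them:

```python
# Toy 1D look-up table: maps each 8-bit input level to an output level.
# Grading software applies this kind of table to every pixel, turning
# flat "log-like" footage into a contrasty, finished look.

def build_contrast_lut(strength: float = 0.3) -> list[int]:
    """Build a simple S-curve LUT that adds contrast to flat footage."""
    lut = []
    for i in range(256):
        x = i / 255.0
        # blend the identity curve with a smoothstep S-curve
        s = x * x * (3 - 2 * x)
        y = (1 - strength) * x + strength * s
        lut.append(round(y * 255))
    return lut

lut = build_contrast_lut()
flat_pixels = [30, 128, 220]          # washed-out shadows, mids, highlights
graded = [lut[p] for p in flat_pixels]
print(graded)  # → [24, 128, 227]: shadows deepened, highlights lifted, mids kept
```

The appeal of LUT-based grading is exactly this: the look is a reusable table, so the same preset can be applied to any clip with a single per-pixel lookup.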

But Apple doesn’t offer any kind of LUT for editing ProRes video on the iPhone, beyond simply ramping up the contrast, which doesn’t really do the job properly. Sure, the point of ProRes is that you would take that footage off the iPhone, put it into software like DaVinci Resolve, and then properly color grade it so it looks sleek and professional.

But that still leaves the files on your phone, and I’d love to be able to do more with them. My gallery is littered with ungraded video files that I’ll do very little with because they need color grading externally. I’d love to share them to Instagram, or with my family over WhatsApp, after transforming those files from drab and gray to beautifully colorful.

With the iPhone 17, or even with the iPhone 16 as a software update, I want to see Apple create a range of its own LUTs that can be applied directly to ProRes video files on the iPhone. While we didn’t see this functionality discussed as part of the company’s June WWDC keynote, that doesn’t mean it couldn’t launch with the iPhone in September.

If Apple were able to implement all these changes — excluding, perhaps, the full-frame sensor which even I can admit is a touch ambitious — it would have an absolute beast of a camera on its hands. 


Technologies

My Camera Test: Comparing the $499 Pixel 10A With the Galaxy S25 FE, Motorola Edge

The Pixel 10A’s cameras are similar to those on the 9A, but it still performs quite well compared to other phones in its price range.

Google’s $499 Pixel 10A uses nearly the same cameras as last year’s Pixel 9A, but I wanted to see how its photos directly match up to its midrange Android rivals: the $650 Samsung Galaxy S25 FE and the $550 Motorola Edge.

I traveled with all three phones around St. Petersburg, Florida, checking how flexible each was in different environments, from bright outdoor settings to an indoor coffee shop and an evening brewery. All three environments can be challenging for the small image sensors on each phone. 

While I find the cameras on all three phones have different strengths and weaknesses depending on the setting, I’m quite impressed with how the Pixel 10A keeps up. In my tests, its photos include lots of detail, even if certain scenes appear to get a heavy dose of processing.

Wide and telephoto cameras

Starting with photos taken on the sidewalk in downtown St. Petersburg, I notice that all three phones handle bright sunlight slightly differently, especially how it’s depicted on the street.

For the Pixel 10A, the sun creates a slight flare over the Bay First sign at the top of the frame, but it remains fairly contained, keeping the focus on the rest of the streetscape. Zooming in, you can see the Century 21 location, but the street is captured in the most detail, with the phone’s camera maintaining its natural gray color.

For both the Galaxy S25 FE and the Motorola Edge, the sun has a more pronounced effect on the rest of the image. The pavement’s color is notably brighter. I also find both the S25 FE and the Edge have slightly more clarity on the business signs on the Bay First building, including the aforementioned Century 21 logo.

Since the S25 FE and the Edge each include a telephoto camera that supports 3x optical zoom, I took a photo at that zoom with each phone. The Pixel 10A uses digital zoom on the phone’s 48-megapixel wide camera, but a lot of the scene’s detail remains preserved.

The Pixel’s zoom photo provides a clear view of the 7th St N sign, the trees and the plants. However, if you look farther back at the next intersection, you’ll notice that the 7th St S sign and the Colony Grill are much harder to see. The S25 FE and the Edge, both aided by telephoto cameras, capture those smaller details much more visibly.

Of the three zoom photos, I feel the S25 FE has the best color reproduction while also retaining details like the signs farther back. Even though the photo was taken with the S25 FE’s 8-megapixel telephoto camera rather than its 50-megapixel wide camera, the colors remain consistent between the 1x and 3x shots. Meanwhile, the Edge’s 10-megapixel telephoto camera looks quite a bit different from its 50-megapixel wide camera — the whole image has a more yellowish hue.

Ultrawide cameras

Moving inside the Southern Grounds coffee shop, I decided to use the ultrawide cameras to capture my sausage, egg and cheese on toast. The three photos came out wildly different.

The Pixel 10A’s 13-megapixel ultrawide and S25 FE’s 12-megapixel ultrawide have a more balanced set of colors and details, in my opinion. The wheat toast appears lighter in the Pixel’s photo than in the darker hues captured by both the S25 FE and the Edge.

When zooming into my notebook, however, the Pixel and S25 FE captured more of the page markings, details that blur together in the photo taken by the Edge. While the Edge’s 50-megapixel ultrawide camera has the highest megapixel count of the three, I noticed it had a harder time distinguishing how toasted the bread was, giving more of it a darker look. If I hadn’t eaten it myself, I’d have thought it was burned based on the Edge’s photo.

Night photography

Moving over to a nighttime setting, I used the three phones to take photos outside of 3 Daughters Brewing. I felt like all three did a decent job of reproducing the colors of the building, but they differ in how they handle light sources.

Both the Pixel and the S25 FE tone down the glare produced by the various lighting fixtures. Meanwhile, the Edge’s photos show noticeable streaks that dominate the sky. When inspecting the photos more closely, I find that the Galaxy captured a sharper view of the furniture, such as the Connect 4 set next to the blue chairs in the center of the frame. The same details are visible in the Pixel’s and the Edge’s depictions of the scene, but they appear smudgy by comparison.

This type of scene needs a phone’s processing power to iron out visibility issues, and I find the Edge comes up short here, with a lot of noticeable image noise.

Selfies

Each phone takes selfies with noticeable differences in style and color choices. For this test example, I’m in a well-lit daytime room with natural light from a window. The 12-megapixel front-facing camera on Google’s Pixel 10A brightened up my face as if there was a light in front of me, and captured a decent amount of the details of my hair and face.

The front-facing camera on Samsung’s Galaxy S25 FE shows a noticeably darker color tone, but it still captures a similar shade of orange on the wall behind me. Of the three photos, I felt the S25 FE captured the most detail, including individual strands of hair, and it defaulted to a closer crop than the other two.

The photos taken by the 50-megapixel selfie camera on the Motorola Edge feel a bit smoothed out. The orange color on the wall is noticeably different from the Pixel and the S25 FE, though it does capture a lot of my face details, from hair strands to the fabric textures on my shirt.

The $499 Pixel 10A’s camera keeps up with and, in some cases, exceeds the detail captured by the slightly more expensive $550 Motorola Edge and $650 Galaxy S25 FE. I’m quite impressed by how the Pixel’s camera handles colors and low-light environments, but the phone’s processing sometimes makes scenes appear brighter than they are in real life.

The Galaxy S25 FE is no slouch either, with a telephoto lens as its third camera for capturing more detail farther away. While I did find the Motorola Edge to struggle in low light, it is one of the lowest-cost phone options currently available for someone who must have a 3x optical telephoto camera.

But if you can live without the telephoto lens, the Pixel 10A’s low cost and photography abilities will likely be a good fit for most people.


Technologies

Today’s NYT Strands Hints, Answers and Help for March 14 #741

Here are hints and answers for the NYT Strands puzzle for March 14, No. 741.

Looking for the most recent Strands answer? Click here for our daily Strands hints, as well as our daily answers and hints for The New York Times Mini Crossword, Wordle, Connections and Connections: Sports Edition puzzles.


Does today’s date seem memorable to you? If so, today’s NYT Strands puzzle might be easy. Some of the answers are difficult to unscramble, so if you need hints and answers, read on.

I go into depth about the rules for Strands in this story. 

If you’re looking for today’s Wordle, Connections and Mini Crossword answers, you can visit CNET’s NYT puzzle hints page.

Read more: NYT Connections Turns 1: These Are the 5 Toughest Puzzles So Far

Hint for today’s Strands puzzle

Today’s Strands theme is: A math teacher’s favorite dessert.

If that doesn’t help you, here’s a clue: 3.14

Clue words to unlock in-game hints

Your goal is to find hidden words that fit the puzzle’s theme. If you’re stuck, find any words you can. Every time you find three words of four letters or more, Strands will reveal one of the theme words. These are the words I used to get those hints, but any words of four or more letters that you find will work:

  • RITE, SPIT, TIPS, STAT, STATE, GIVE, RUST, FINE, LAZE, SURE, PEAL

Answers for today’s Strands puzzle

These are the answers that tie into the theme. The goal of the puzzle is to find them all, including the spangram, a theme word that reaches from one side of the puzzle to the other. When you have all of them (I originally thought there were always eight but learned that the number can vary), every letter on the board will be used. Here are the nonspangram answers:

  • VENT, CRUST, FRUIT, EDGES, GLAZE, FILLING, LATTICE

Today’s Strands spangram

Today’s Strands spangram is HAPPYPIDAY. To find it, start with the H that’s six rows down and three to the right from the upper-left corner, and make — well, a pie shape.

Toughest Strands puzzles

Here are some of the Strands topics I’ve found to be the toughest.

#1: Dated slang. Maybe you didn’t even use this lingo when it was cool. Toughest word: PHAT.

#2: Thar she blows! I guess marine biologists might ace this one. Toughest word: BALEEN or RIGHT. 

#3: Off the hook. Again, it helps to know a lot about sea creatures. Sorry, Charlie. Toughest word: BIGEYE or SKIPJACK.


Copyright © Verum World Media