
Technologies

I Need Apple to Make the iPhone 17 Cameras Amazing. Here’s What It Should Do

Commentary: After a lackluster WWDC, Apple needs to bring the razzle dazzle with the iPhone 17. Here’s how it can do just that.

Apple’s WWDC was a letdown for me, with no new hardware announced and few new features beyond a glassy interface for iOS 26. I’m pinning my hopes on the iPhone 17 to get my pulse racing, and the best way it can do that is with the camera. The iPhone 16 Pro already packs one of the best camera setups found on any phone; it’s capable of taking stunning images in any conditions. Throw in its ProRes video, Log recording and the neat 4K slow-motion mode and it’s a potent video shooter too. It even put up a strong fight against the other best camera phones around, including the Galaxy S25 Ultra, the Pixel 9 Pro and the Xiaomi 14 Ultra. 

Read more: Camera Champions Face Off: iPhone 16 Pro vs. Galaxy S25 Ultra

Despite that, it’s still not the perfect camera. While early reports from industry insiders claim that the phone’s video skills will get a boost, there’s more the iPhone 17 will need in order to become an all-round photography powerhouse. As both an experienced phone reviewer and a professional photographer, I have exceptionally high expectations for top-end phone cameras. And, having used the iPhone 16 Pro since its launch, I have some thoughts on what needs to change. 

Here are the main points I want to see improved on the iPhone 17 when it likely launches in September 2025.

An accessible Pro camera mode

At WWDC, Apple showed off the changes coming in iOS 26, which included a radical overhaul of the interface with Liquid Glass. But that simplified style extended to the camera app too, with Apple paring the interface down to the most basic functions of Photo, Video and zoom levels. Presumably, the idea is to make it super easy for even complete beginners to open the camera and start taking Instagram-worthy snaps. 

And that’s fine, but what about those of us who buy the Pro models in order to take deeper advantage of features like exposure compensation, Photographic Styles and ProRaw formats? It’s not totally clear yet how these features can be accessed within the new camera interface, but they need to not be tucked away. Many photographers — myself very much included — want to use these tools as standard, using our powerful iPhones in much the same way we would a mirrorless camera from Canon or Sony. 

That means relying on advanced settings to take control over the image-taking process to craft shots that go beyond simple snaps. If anything, Apple’s camera app has always been too simple, with even basic functions like white balance being unavailable. To see Apple take things to an even more simplistic level is disappointing, and I want to see how the company will continue to make these phones usable for enthusiastic photographers. 

Larger image sensor

Though the 1/1.28-inch sensor found on the iPhone 16 Pro’s main camera is already a good size — and marginally larger than the S24 Ultra’s 1/1.33-inch sensor — I want to see Apple go bigger. A larger image sensor can capture more light and offer better dynamic range. It’s why pro cameras tend to have at least "full frame" image sensors, while really high-end cameras, like the amazing Hasselblad 907X, have enormous "medium format" sensors for pristine image quality. 
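To put rough numbers on that size gap: sensor "type" fractions like 1/1.28-inch are nominal labels rather than literal dimensions, but comparing them still gives a ballpark sense of scale, since sensor area grows with the square of the diagonal. This little sketch (the function name and the use of nominal type fractions as a proxy are my own simplification) compares the sensors discussed here:

```python
# Back-of-the-envelope sensor-size comparison. Sensor "type" figures
# (e.g. 1/1.28-inch) are nominal, not literal measurements, but area
# still scales roughly with the square of the nominal diagonal.

def rough_area_ratio(type_a, type_b):
    """Approximate area advantage of a 1/type_a-inch sensor
    over a 1/type_b-inch sensor (each argument is the denominator)."""
    return (type_b / type_a) ** 2

print(f"{rough_area_ratio(1.28, 1.33):.2f}x")  # iPhone 16 Pro vs. 1/1.33-inch
print(f"{rough_area_ratio(1.0, 1.28):.2f}x")   # 1-inch type vs. iPhone 16 Pro
```

By this rough math, the iPhone's sensor edge over a 1/1.33-inch chip is only about 8%, while a 1-inch type sensor would collect on the order of 60% more light per shot.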

Xiaomi understands this, equipping its 15 Ultra and previous 14 Ultra with 1-inch type sensors. That’s larger than the sensors found on almost any other phone, which allowed the 15 Ultra to take stunning photos all over Europe, while the 14 Ultra was heroic in capturing a Taylor Swift concert. I’m keen to see Apple at least match Xiaomi’s phone here with a similar 1-inch type sensor. Though if we’re talking pie-in-the-sky wishes, maybe the iPhone 17 could be the first smartphone with a full-frame image sensor. I won’t hold my breath on that one — the phone, and the lenses, would need to be immense to accommodate it, so it’d likely be more efficient just to let you make calls with your mirrorless camera. 

Variable aperture

Speaking of the Xiaomi 14 Ultra, one of the other reasons that phone rocks so hard for photography is its variable aperture on the main camera. Its widest aperture is f/1.6 — significantly wider than the f/1.78 of the iPhone 16 Pro. That wider aperture lets in a lot of light in dim conditions and more authentically achieves out-of-focus bokeh around a subject. 
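That difference is easy to quantify: light throughput scales with the square of the inverse f-number, so a quick back-of-the-envelope calculation (the function name here is just illustrative) shows how much the wider aperture buys you:

```python
# How much more light does f/1.6 admit than f/1.78?
# Light throughput scales with the square of the inverse f-number.

def relative_light(f_wide, f_narrow):
    """How many times more light the wider aperture admits."""
    return (f_narrow / f_wide) ** 2

gain = relative_light(1.6, 1.78)  # Xiaomi's f/1.6 vs. the iPhone's f/1.78
print(f"{gain:.2f}x")             # roughly 1.24x more light
```

In other words, the Xiaomi's main camera gathers around a quarter more light per exposure, which is a meaningful edge in dim conditions.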

But Xiaomi’s 14 Ultra aperture can also close down to f/4, and with that narrower aperture, it’s able to create starbursts around points of light. I love achieving this effect in nighttime imagery with the phone. It makes the resulting images look much more like they’ve been taken with a professional camera and lens, while the same points of light on the iPhone just look like roundish blobs. Disappointingly, Xiaomi actually removed this feature from the new 15 Ultra, so whether Apple sees value in implementing this kind of technology remains to be seen. 

More Photographic Styles

Though Apple has had various styles and effects integrated into the iPhone’s cameras, the iPhone 16 range took it further, with more control over the effects and more toning options. It’s enough that CNET Senior Editor Lisa Eadicicco even declared the new Photographic Styles her "favorite new feature on Apple’s latest phone."

I think they’re great too. Or rather, they’re a great start. The different color tones, like the ones you get with the Amber and Gold styles, add some lovely warmth to scenes, and the Quiet effect adds a vintage filmic fade, but there’s still not a whole lot to choose from and the interface can be a little slow to work through. I’d love to see Apple introduce more Photographic Styles with different color toning options, or even with tones that mimic vintage film stocks from Kodak or Fujifilm. 

And sure, there are plenty of third-party apps like VSCO or Snapseed that let you play around with color filters all you want. But using Apple’s styles means you can take your images with the look already applied, and then change it afterward if you don’t like it — nothing is hard-baked into your image. 

I was recently impressed with Samsung’s new tool for creating custom color filters based off the look of other images. I’d love to see Apple bring that level of image customization to the iPhone.

Better ProRaw integration with Photographic Styles

I do think Apple has slightly missed an opportunity with its Photographic Styles, though, in that you can use them only when taking images in HEIF (high-efficiency image format). Unfortunately, you can’t use them when shooting in ProRaw. I love Apple’s use of ProRaw on previous iPhones, as it takes advantage of all of the iPhone’s computational photography — including things like HDR image blending — but still outputs a DNG raw file for easier editing. 

The DNG file typically also offers more latitude to brighten dark areas or tone down highlights in an image, making it extremely versatile. Previously, Apple’s color presets could be used when shooting in ProRaw, and I loved it. I frequently shot street-style photos using the high contrast black-and-white mode and then edited the raw file further. 

Now using that same black-and-white look means only shooting images in HEIF format, eliminating the benefits of using Apple’s ProRaw. Oddly, while the older-style «Filters» are no longer available in the camera app when taking a raw image, you can still apply those filters to raw photos in the iPhone’s gallery app through the editing menu.

LUTs for ProRes video

And while we’re on the topic of color presets and filters, Apple needs to bring those to video, too. On the iPhone 15 Pro, Apple introduced the ability to shoot video in ProRes, which results in very low-contrast, almost gray-looking footage. The idea is that video editors will take this raw footage and then apply their edits on top, often applying contrast and color presets known as LUTs (look-up tables) that give footage a particular look — think dark and blue for horror films or warm and light tones for a romantic drama vibe. 

But Apple doesn’t offer any kind of LUT for editing ProRes video on the iPhone, beyond simply ramping up the contrast, which doesn’t really do the job properly. Sure, the point of ProRes is that you would take that footage off the iPhone, put it into software like DaVinci Resolve, and then properly color grade the footage so it looks sleek and professional. 

But that still leaves the files on your phone, and I’d love to be able to do more with them. My gallery is littered with ungraded video files that I’ll do very little with because they need color grading externally. I’d love to share them to Instagram, or with my family over WhatsApp, after transforming those files from drab and gray to beautifully colorful.

With the iPhone 17, or even with the iPhone 16 as a software update, I want to see Apple creating a range of its own LUTs that can be directly applied to ProRes video files on the iPhone. While we didn’t see this software functionality discussed as part of the company’s June WWDC keynote, that doesn’t mean it couldn’t be launched with the iPhone in September.

If Apple were able to implement all these changes — excluding, perhaps, the full-frame sensor which even I can admit is a touch ambitious — it would have an absolute beast of a camera on its hands. 


Toy Story 5: The Big New Rival Is a Tablet

Woody, Buzz and Jessie will battle a tablet in Toy Story 5, which is scheduled to hit theaters just about a year from now.

Pixar is giving its old-school toys a decidedly modern antagonist: a tablet.

During the studio’s Friday showcase at Annecy’s International Animation Film Festival in France, Chief Creative Officer Pete Docter revealed that Toy Story 5’s villain is Lily Pad, a "sneaky" and "prickly" tablet that convinces 8-year-old Bonnie Anderson that friends and games on a device beat dusty ol’ toys in the closet.

The first concept art shows the frog-faced tablet looming over Buzz, Jessie and Bullseye like a touchscreen tyrant of the toy box. As you can probably foresee, this is a battle between analog toys and always-on tech. (You can stream Toy Story movies one through four on Disney Plus.)

The Lily Pad reveal topped a jam-packed Walt Disney Animation and Pixar showcase, where the company rolled out updated release dates, never-before-seen footage and a couple of all-new original films.

Toy Story 5: The enemy is tech

The premise of Toy Story 5 is that Bonnie gets a Lily Pad for school chat and online games. But the tablet decides that her toys, including Woody, Buzz and Jessie, are holding her back. Tom Hanks, Tim Allen and Joan Cusack are all returning, while Ernie Hudson steps in as Combat Carl, honoring the late Carl Weathers.

Docter shared the opening scene of Toy Story 5, which shows a crate full of stranded Buzz Lightyears trying to escape a desert island. Toy Story 5 is set for release on June 19, 2026.

Pixar announced two new original films

Daniel Chong’s Hoppers, scheduled for March 6, 2026, turns an eco-heist into a critter-powered caper: 14-year-old Mabel uploads her mind into a robotic beaver to save her local pond from a highway project. 

Gatto, slated for summer 2027, centers on Nero, a black cat in Venice who’s burned through most of his nine lives doing jobs for a feline mob boss. Now, questioning whether he’s wasted those lives, Nero stumbles into an unexpected friendship that could finally give him purpose. The film will be shot in a "living storybook" style, which is new to Pixar. 

And we got to see new footage from Zootopia 2 and Elio

Jared Bush, Walt Disney Animation Studios’ chief creative officer and director-writer of Zootopia 2, showed some new footage and images from the anticipated sequel, which should come out in November. The audience also got to watch a 27-minute sizzle reel from Elio, the cosmic coming-of-age adventure, which opens June 20.



Too Busy to Read? Google’s Audio Overviews Summarize Your Search Results Aloud

This new feature turns some Google queries into bite-size podcast clips so you can learn without reading.

The next time you wonder why school buses are yellow, you might not have to read a single word to get the answer. Google’s latest experimental feature can literally tell you the answer, in a tiny audio clip that loads right to your results page.

Launched Friday in Search Labs, Audio Overviews uses Google’s latest Gemini AI models to turn certain queries into 30- to 45-second, podcast-style explainers, complete with on-screen source links for fact-checking. 

The move pushes Google’s AI Overviews beyond text, positioning Search for a semi-hands-free, voice-first future, while also raising more questions about what this means for publishers who rely on clicks.

How you can try out Audio Overviews right now

If you’re interested, you can try out Google’s Audio Overviews right now. Go to the Google Labs website, opt in to the Search Labs program if you’re not already signed up, and toggle on Audio Overviews.

The next time you run a query, like "How do I stop apps from tracking my exact location on my iPhone?" Google might show you a button that says Generate Audio Overview, which you’ll have to scroll down a little to see. 

You can then tap that button to generate the clip, then press play. You can speed up the audio, mute the clip and rate it with a thumbs-up or thumbs-down to help train the feature.

Below the player, Google lists the web pages it drew from, so you can click through to fact-check the information or just dig deeper.

For those with visual impairments, this new feature offers a glimpse at what a voice-first Google might look like. But until Google expands language support and proves the summaries are dependable, consider this a nifty experiment for now, not a substitute for reading the full story.



Today’s NYT Strands Hints, Answers and Help for June 14, #468

Here are hints and answers for the NYT Strands puzzle No. 468 for June 14.

Looking for the most recent Strands answer? Click here for our daily Strands hints, as well as our daily answers and hints for The New York Times Mini Crossword, Wordle, Connections and Connections: Sports Edition puzzles.


Today’s NYT Strands puzzle honors Flag Day. If you need hints and answers, read on.

I go into depth about the rules for Strands in this story. 

If you’re looking for today’s Wordle, Connections and Mini Crossword answers, you can visit CNET’s NYT puzzle hints page.

Hint for today’s Strands puzzle

Today’s Strands theme is: It’s a banner day.

If that doesn’t help you, here’s a clue: O say can you see.

Clue words to unlock in-game hints

Your goal is to find hidden words that fit the puzzle’s theme. If you’re stuck, find any words you can. Every time you find three words of four letters or more, Strands will reveal one of the theme words. These are the words I used to get those hints, but any words of four or more letters that you find will work:

  • TRIP, TRIPS, TROT, TROTS, RATS, LEND, SEND, TRAIL, RAIL, NAIL, RANT, STRIP

Answers for today’s Strands puzzle

These are the answers that tie in to the theme. The goal of the puzzle is to find them all, including the spangram, a theme word that reaches from one side of the puzzle to the other. When you have all of them (I originally thought there were always eight but learned that the number can vary), every letter on the board will be used. Here are the nonspangram answers:

  • STAR, STRIPE, SHIELD, MOON, CROSS, TRIANGLE, CROWN

Today’s Strands spangram

Today’s Strands spangram is FLAGSYMBOL. To find it, start with the F that’s four letters to the right on the bottom row, and head up.



Copyright © Verum World Media