I Need Apple to Make the iPhone 17 Cameras Amazing. Here’s What It Should Do

Commentary: After a lackluster WWDC, Apple needs to bring the razzle dazzle with the iPhone 17. Here’s how it can do just that.

Apple’s WWDC was a letdown for me, with no new hardware announced and few new features beyond a glassy interface for iOS 26. I’m pinning my hopes on the iPhone 17 to get my pulse racing, and the best way it can do that is with the camera. The iPhone 16 Pro already packs one of the best camera setups found on any phone, capable of taking stunning images in almost any conditions. Throw in ProRes video, Log recording and the neat 4K slow-motion mode, and it’s a potent video shooter too. It even put up a strong fight against the other best camera phones around, including the Galaxy S25 Ultra, the Pixel 9 Pro and the Xiaomi 14 Ultra.

Read more: Camera Champions Face Off: iPhone 16 Pro vs. Galaxy S25 Ultra

Despite that, it’s still not the perfect camera. While early reports from industry insiders claim that the phone’s video skills will get a boost, the iPhone 17 will need more than that to become an all-round photography powerhouse. As both an experienced phone reviewer and a professional photographer, I have exceptionally high expectations for top-end phone cameras. And, having used the iPhone 16 Pro since its launch, I have some thoughts on what needs to change.

Here are the main points I want to see improved on the iPhone 17 when it likely launches in September 2025.

An accessible Pro camera mode

At WWDC, Apple showed off changes coming in iOS 26, including a radical overhaul of the interface with Liquid Glass. But that simplified style extended to the camera app too, with Apple paring the interface down to the most basic functions of Photo, Video and zoom levels. Presumably, the idea is to make it super easy for even total beginners to open the camera and start taking Instagram-worthy snaps.

And that’s fine, but what about those of us who buy the Pro models in order to take deeper advantage of features like exposure compensation, Photographic Styles and ProRaw formats? It’s not totally clear yet how these features can be accessed within the new camera interface, but they shouldn’t be tucked away. Many photographers — myself very much included — want these tools available as standard, treating our powerful iPhones in much the same way we would a mirrorless camera from Canon or Sony.

That means relying on advanced settings to take control of the image-taking process and craft shots that go beyond simple snaps. If anything, Apple’s camera app has always been too simple, with even basic functions like white balance being unavailable. To see Apple take things to an even more simplistic level is disappointing, and I want to see how the company will continue to make these phones usable for enthusiastic photographers.

Larger image sensor

Though the 1/1.28-inch sensor found on the iPhone 16 Pro’s main camera is already a good size — and marginally larger than the S24 Ultra’s 1/1.33-inch sensor — I want to see Apple go bigger. A larger image sensor can capture more light and offer better dynamic range. It’s why pro cameras tend to have at least “full frame” image sensors, while really high-end cameras, like the amazing Hasselblad 907X, have enormous “medium format” sensors for pristine image quality.
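To put rough numbers on why sensor size matters: for chips with the same aspect ratio, area (and with it light-gathering ability) scales with the square of the diagonal. Here’s a quick back-of-the-envelope sketch in Python; the diagonal figures are my approximate conversions of the nominal “type” sizes, not official spec-sheet numbers.

```python
# Back-of-the-envelope comparison of sensor sizes. Assumes the sensors share
# an aspect ratio, so area scales with the square of the diagonal.
# Diagonals are approximate conversions of the nominal "type" sizes.

SENSOR_DIAGONALS_MM = {
    "1/1.33-inch type (Galaxy S24 Ultra)": 12.0,
    "1/1.28-inch type (iPhone 16 Pro)": 12.4,
    "1-inch type (Xiaomi 15 Ultra)": 15.9,
    "Full frame": 43.3,
}

baseline = SENSOR_DIAGONALS_MM["1/1.28-inch type (iPhone 16 Pro)"]

for name, diagonal in SENSOR_DIAGONALS_MM.items():
    relative_area = (diagonal / baseline) ** 2
    print(f"{name}: ~{relative_area:.1f}x the iPhone's light-gathering area")
```

By that rough math, a 1-inch type sensor has around 60% more area than the iPhone 16 Pro’s main sensor, and a full-frame chip is more than 10 times larger, which is exactly why cramming one into a phone is such a stretch.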

Xiaomi understands this, equipping its 15 Ultra and the previous 14 Ultra with 1-inch type sensors. That’s larger than the sensors found on almost any other phone, which allowed the 15 Ultra to take stunning photos all over Europe, while the 14 Ultra was heroic in capturing a Taylor Swift concert. I’m keen to see Apple at least match Xiaomi’s phone here with a similar 1-inch type sensor. Though if we’re talking pie-in-the-sky wishes, maybe the iPhone 17 could be the first smartphone with a full-frame image sensor. I won’t hold my breath on that one — the phone, and the lenses, would need to be immense to accommodate it, so it’d likely be more efficient just to let you make calls with your mirrorless camera.

Variable aperture

Speaking of the Xiaomi 14 Ultra, one of the other reasons that phone rocks so hard for photography is the variable aperture on its main camera. Its widest aperture is f/1.6 — significantly wider than the f/1.78 of the iPhone 16 Pro. That wider aperture lets in a lot of light in dim conditions and produces more natural out-of-focus bokeh around a subject.
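How much does that aperture gap actually matter? A handy rule of thumb is that the light reaching the sensor in a given exposure time scales with the inverse square of the f-number. Here’s a rough sketch of the comparison, ignoring differences in sensor size and lens transmission.

```python
import math

# Rule-of-thumb comparison: light gathered per unit of exposure time scales
# with 1 / f_number ** 2 (ignoring sensor size and lens transmission).

iphone_f = 1.78  # iPhone 16 Pro main camera
xiaomi_f = 1.6   # Xiaomi 14 Ultra at its widest

ratio = (iphone_f / xiaomi_f) ** 2
stops = math.log2(ratio)

print(f"f/{xiaomi_f} gathers about {ratio:.2f}x the light of f/{iphone_f}")
print(f"That works out to roughly {stops:.2f} of a stop")
```

That’s about a quarter more light, or roughly a third of a stop. It’s not night and day, but it’s low-light headroom the iPhone currently gives up.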

But the 14 Ultra’s aperture can also close down to f/4, and with that narrower aperture, it’s able to create starbursts around points of light. I love achieving this effect in nighttime imagery with the phone. It makes the resulting images look much more like they’ve been taken with a professional camera and lens, while the same points of light on the iPhone just look like roundish blobs. Disappointingly, Xiaomi actually removed this feature from the new 15 Ultra, so whether Apple sees value in implementing this kind of technology remains to be seen.

More Photographic Styles

Though Apple has had various styles and effects integrated into the iPhone’s cameras, the iPhone 16 range took it further, with more control over the effects and more toning options. It’s enough that CNET Senior Editor Lisa Eadicicco even declared the new Photographic Styles her “favorite new feature on Apple’s latest phone.”

I think they’re great too. Or rather, they’re a great start. The different color tones, like the ones you get with the Amber and Gold styles, add some lovely warmth to scenes, and the Quiet effect adds a vintage filmic fade, but there’s still not a whole lot to choose from and the interface can be a little slow to work through. I’d love to see Apple introduce more Photographic Styles with different color toning options, or even with tones that mimic vintage film stocks from Kodak or Fujifilm. 

And sure, there are plenty of third-party apps like VSCO or Snapseed that let you play around with color filters all you want. But using Apple’s styles means you can take your images with the look already applied, and then change it afterward if you don’t like it — nothing is hard-baked into your image. 

I was recently impressed with Samsung’s new tool for creating custom color filters based on the look of other images. I’d love to see Apple bring that level of image customization to the iPhone.

Better ProRaw integration with Photographic Styles

I do think Apple has slightly missed an opportunity with its Photographic Styles, though, in that you can use them only when taking images in HEIF (high-efficiency image format). Unfortunately, you can’t use them when shooting in ProRaw. I love Apple’s use of ProRaw on previous iPhones, as it takes advantage of all of the iPhone’s computational photography — including things like HDR image blending — but still outputs a DNG raw file for easier editing. 

The DNG file typically also offers more latitude to brighten dark areas or tone down highlights in an image, making it extremely versatile. Previously, Apple’s color presets could be used when shooting in ProRaw, and I loved it. I frequently shot street-style photos using the high contrast black-and-white mode and then edited the raw file further. 

Now, using that same black-and-white look means shooting only in HEIF format, eliminating the benefits of Apple’s ProRaw. Oddly, while the older-style “Filters” are no longer available in the camera app when taking a raw image, you can still apply those filters to raw photos in the iPhone’s Photos app through the editing menu.

LUTs for ProRes video

And while we’re on the topic of color presets and filters, Apple needs to bring those to video, too. On the iPhone 15 Pro, Apple introduced the ability to shoot ProRes video with Log encoding, which results in very low-contrast, almost gray-looking footage. The idea is that video editors will take this flat footage and then apply their edits on top, often using contrast and color presets known as LUTs (look-up tables) that give footage a particular look — think dark and blue for horror films or warm and light tones for a romantic drama vibe.
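If you’ve never worked with one, a LUT is conceptually simple: it’s a table that remaps each incoming pixel value to a new one. Here’s a minimal, purely illustrative Python sketch using a single contrast curve applied per channel; real grading LUTs are 3D cubes that remap red, green and blue together, and they’re normally applied inside an editing app rather than in a script like this.

```python
import numpy as np

# Illustrative only: a 256-entry, per-channel LUT that adds contrast,
# standing in for the 3D LUTs used to grade flat Log footage.

levels = np.arange(256) / 255.0
# Smoothstep S-curve: darkens shadows and brightens highlights.
s_curve = (255 * levels ** 2 * (3 - 2 * levels)).astype(np.uint8)

def apply_lut(frame: np.ndarray, lut: np.ndarray) -> np.ndarray:
    """Remap an 8-bit RGB frame (H x W x 3) through a 256-entry LUT."""
    return lut[frame]

# A flat, gray-looking pixel gets pushed down the curve, adding contrast.
flat_frame = np.full((4, 4, 3), 110, dtype=np.uint8)
graded = apply_lut(flat_frame, s_curve)
print(flat_frame[0, 0], "->", graded[0, 0])
```

Conceptually, Apple’s Photographic Styles already do this kind of remapping for stills; built-in video LUTs would simply point the same idea at ProRes footage.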

But Apple doesn’t offer any kind of LUT for editing ProRes video on the iPhone, beyond simply ramping up the contrast, which doesn’t really do the job properly. Sure, the point of ProRes is that you would take that footage off the iPhone, put it into software like DaVinci Resolve and then properly color grade it so it looks sleek and professional.

But that still leaves the files on your phone, and I’d love to be able to do more with them. My gallery is littered with ungraded video files that I’ll do very little with because they need color grading externally. I’d love to share them to Instagram, or with my family over WhatsApp, after transforming those files from drab and gray to beautifully colorful.

With the iPhone 17, or even with the iPhone 16 as a software update, I want to see Apple creating a range of its own LUTs that can be directly applied to ProRes video files on the iPhone. While we didn’t see this software functionality discussed as part of the company’s June WWDC keynote, that doesn’t mean it couldn’t be launched with the iPhone in September.

If Apple were able to implement all these changes — excluding, perhaps, the full-frame sensor, which even I can admit is a touch ambitious — it would have an absolute beast of a camera on its hands.

Tim Cook Stepping Down? Apple CEO’s 65th Birthday Today Sparks Succession Talk

Apple is no doubt considering who it will choose to fill the chief executive role once Tim Cook decides to retire. Here are a few potential candidates reportedly being considered.

With Tim Cook turning 65 on Saturday, Nov. 1, speculation has been growing about who his successor as Apple CEO could be, should he choose to retire. Cook has made no announcement that he’ll be stepping down, but according to Bloomberg’s Mark Gurman, the tech giant is working behind the scenes to ensure a seamless transition when the time does come.

Cook replaced Steve Jobs in 2011 and, after a period of uncertainty, ushered Apple into its most profitable era. Stock-watching website Stocktwits reports that the company’s stock has risen by around 1,800% since Cook took over.


Jobs may have introduced devices like the iPhone that changed how we interact with technology, but Cook expanded the Apple experience. Under his guidance, the company built on the iPhone’s success by introducing subscription services and more mobile products, including earbuds and wearables.

Under Cook, Apple introduced Apple Pay, folded Beats headphones into its ecosystem, launched the Apple Watch 10 years ago and even entered the entertainment business, producing original Oscar-winning movies and Emmy-winning TV shows through Apple TV Plus.

Read more: Best iPhone in 2025: Here’s Which Apple Phone You Should Buy

We should reiterate that the notion of Cook stepping down is pure speculation at this point. We don’t know what Apple’s CEO is currently planning or what his thoughts about retirement may be. That said, there are a handful of contenders who have reportedly been part of the succession conversation. 

Potential Apple CEO contenders

Apple likely has “a solid bench of successors” that the company’s board has been developing, says Bryan Ma, VP of Devices Research at IDC.

“But the anxiety gets amplified when there isn’t clear visibility for such a valuable and iconic company,” Ma says. “Compounding the challenge is the fact that the bar has been set by big rock stars like Steve Jobs and Tim Cook. The next generation of leaders have very big shoes to fill.”

John Ternus, Apple’s current senior vice president of hardware engineering, was top of Gurman’s list. Ternus has been with the tech giant for more than two decades, so he has the knowledge and experience to step up to the chief executive role. There would be value in having an engineer behind the wheel.

Ternus appeared during the September Apple event to introduce the iPhone Air. At 50, he’s the same age Cook was when he took over as Apple CEO.

Other contenders are also reportedly being considered, including Craig Federighi, Apple’s senior vice president of software engineering; Greg Joswiak, Apple’s senior vice president of worldwide marketing; and Jeff Williams, the company’s former chief operating officer, according to a report by AppleInsider. On Oct. 10, Bloomberg reported that Federighi will also soon oversee watchOS, the Apple Watch operating system, while Ternus will oversee Apple Watch hardware engineering once Williams departs at the end of the year.

Federighi has been with Apple for a long time and has the public speaking experience — he frequently presents during Apple events — that would be vital if he replaced Cook as CEO. Given his current role, Joswiak brings more of a marketing perspective and a broader overview of the company, and may not be as hands-on with the tech as Ternus and Federighi. And according to Gurman, Williams was viewed as a shoo-in to be Cook’s replacement until Apple announced that his time as COO would be ending. (He’s now Apple’s senior vice president of design, watch and health.) Cook himself held the chief operating officer position before he replaced Jobs as CEO in 2011, and Sabih Khan will be stepping into that COO role, which also puts his name in the running.

When Cook steps down, Apple will undoubtedly have a pool of qualified talent to choose from to take up the leadership mantle. Exactly who that will be remains to be seen.

Apple didn’t immediately respond to a request for comment.

I Went Hands-On With the OnePlus 15’s Camera and You Need to See the Results

What better first test run than taking it on a neighborhood photo safari?

The OnePlus 15 is the next premium handset from the Chinese phone-maker, and I just got my hands on it. To give its cameras a whirl, I took it out for a quick spin through a hip corner of Los Angeles.

The OnePlus 15’s big advantage is that it’s one of the first to run the Snapdragon 8 Elite Gen 5, Qualcomm’s next-generation chip for high-end phones, which was launched in September. The system-on-a-chip has a big influence on how photos come out, processing every image captured through the rear cameras. 

The OnePlus 15 has three 50-megapixel rear cameras, along with a selfie shooter on the front, and I took photos of my neighborhood flora and fauna using them all. While there’s a certain level of polish expected of premium phone cameras, this phone has something new: it’s the first major OnePlus handset released since the company’s partnership with Hasselblad ended. For years, OnePlus incorporated the iconic Swedish camera maker’s color science and image calibration in its cameras.

With Hasselblad gone, the OnePlus 15 features the debut of the DetailMax Engine, a loftily titled computational processing system that aims to “present scenes as they truly are, without over-beautification or distortion,” as the company’s official blog post explained.

That means a new chapter for shooting photos on a OnePlus phone, which made me want to know what the OnePlus 15 is capable of. Join me on a casual tour of a vibrant Los Angeles neighborhood, taking the kinds of snapshots that make up the majority of everybody’s camera roll. I’ll need to spend a lot more time with the device to give it a comprehensive review.

Our first shot is of the outside of The Silver Lake House, a neighborhood Thai restaurant. While I clearly can’t resist a slight Dutch angle here, the blend of colors looks distinct and not oversaturated — a win for true-to-life processing. I like the way the OnePlus 15 captured the light and shadows filtering through the trees, and the camera has handled the lens flare well without overexposing that area. Also, notice the reflection in the chrome of the heat lamp.

Here’s a close-up of knick-knack plant vases on a windowsill overlooking the restaurant’s indoor tables. The light is really balanced, bright on the foreground outside the eatery and dimmer within — but colors and details are still visible inside. You can also pick out some detail in the reflections on the window of the street behind me.

I couldn’t resist this 1960s Ford Thunderbird sitting idly on the street, a cruising car from yesteryear resting in a hipper corner of LA. Note the texture of the dirt streaks over the paint contrasted against the shiny chromed metal surrounding the taillights. More importantly, despite the camera’s focus on the foreground, the OnePlus 15 still manages to capture the blue sky in the background, complete with details in the clouds.

I took this photo of a nearby dog park with the ultrawide lens, which preserves humdrum details in the brown dirt amid sprouted grass along the bottom.

Here’s an image of the same dog park that I took while zoomed in at 7x magnification. It has a lot of detail and color. But we can go further!

Here’s the dog park photographed at 120x magnification, the farthest this phone can zoom in. The image looked grainy as heck on the phone’s screen when I shot it, but that DetailMax Engine’s post-processing has done relative wonders, making this semi-recognizable despite a lot of smudging at the edges caused by noise reduction — look between the chain links. To be sure, this is not a great image — it’s nearly painterly — but the fact that it can zoom in this far and still serve up a photo with something recognizable is amazing.

Here’s a selfie featuring yours truly. I think this photo has good detail and shadow, but what most impresses me are the mountains in the distance, which can still be made out through the classic Los Angeles haze (marine layer, not smog). Any softness there is the atmosphere’s doing, not the OnePlus selfie camera’s.

For comparison, here’s a selfie I took at night. The color is fine, with decent details in the foreground, though they start to blur behind me — notice the bricks on the bottom right, the posters on the light pole on the mid-left, and especially the building over my shoulder.

Here’s the obligatory night shot of a Los Angeles street. While the city will never be dark enough to test the phone’s ability to capture constellations of stars in the night sky, this does show the contrast between warm streetlights and bright neon. The details of the stucco pockmarking the walls of the bowling alley are clear, even from across the street. Look closely at the texture of the street’s pavement. It’s a granular mix of grays flecked with white spots. All the grime of the city, preserved by the OnePlus 15’s shiny new cameras.

That’s it for this first look at the OnePlus 15’s camera capabilities. Happy Halloween! And keep an eye out for my full OnePlus 15 review.

A $20K Humanoid Robot to Help Around the House? The Price Isn’t the Only Caveat

The new Neo robot from 1X is designed to do chores. It’ll have to learn a lot from you — and about you.

It stands 5 feet, 6 inches tall, weighs about as much as a golden retriever and costs nearly as much as a brand-new budget car.

This is Neo, the humanoid robot. It’s billed as a personal assistant you can talk to and eventually rely on to take care of everyday tasks, such as loading the dishwasher and folding laundry. 

Neo doesn’t work cheap. It’ll cost you $20,000. And even then, you’ll still have to train this new home bot.

If that sounds enticing, preorders are now open (for a mere $200 down). You’ll be signing up as an early adopter for what Neo’s maker, a California-based company called 1X, is calling a “consumer-ready humanoid.” That’s in contrast to other humanoids under development from the likes of Tesla and Figure, which are, for the moment at least, more focused on factory environments.

Neo is a whole order of magnitude different from robot vacuums like those from Roomba, Eufy and Ecovacs, and embodies a long-running sci-fi fantasy of robot maids and butlers doing chores and picking up after us. If this is the future, read on for more of what’s in store.


What the Neo robot can do around the house

The pitch from 1X is that Neo can do all manner of household chores: fold laundry, run a vacuum, tidy shelves, bring in the groceries. It can open doors, climb stairs and even act as a home entertainment system.

Neo appears to move smoothly, with a soft, almost human-like gait, thanks to 1X’s tendon-driven motor system that gives it gentle motion and impressive strength. The company says it can lift up to 154 pounds and carry 55 pounds, yet it runs quieter than a refrigerator. It’s covered in soft materials and neutral colors, making it look less intimidating than the metallic prototypes from other companies.

The company says Neo has a 4-hour runtime. Its hands are IP68-rated, meaning they’re submersible in water. It can connect via Wi-Fi, Bluetooth and 5G. For conversation, it has a built-in large language model, the same sort of AI technology that powers ChatGPT and Gemini.

The primary way to control the Neo robot will be by speaking to it, just as if it were a person in your home.  

Still, Neo’s usefulness today depends heavily on how you define useful. The Wall Street Journal’s Joanna Stern got an up-close look at Neo at 1X’s headquarters and found that, at least for now, it’s largely teleoperated, meaning a human often operates it remotely using a virtual-reality headset and controllers. 

«I didn’t see Neo do anything autonomously, although the company did share a video of Neo opening a door on its own,» Stern wrote. 

1X CEO Bernt Børnich told her that Neo will do most things autonomously in 2026, though he also acknowledged that the quality “may lag at first.”

What you need to know about Neo and privacy

Part of what early adopters are signing up for is to let Neo learn from their environment so that future versions can operate more independently. 

That learning process raises privacy and trust questions. The robot uses a mix of visual, audio and contextual intelligence — meaning it can see, hear and remember interactions with users throughout their homes. 

“If you buy this product, it is because you’re OK with that social contract,” Børnich told the Journal. “It’s less about Neo instantly doing your chores and more about you helping Neo learn to do them safely and effectively.”

1X says it’s taking steps to protect your privacy: Neo listens only when it recognizes it’s being addressed, and its cameras will blur out humans. You can restrict Neo from entering or viewing specific areas of your home, and the robot will never be teleoperated without owner approval, the company says. 

But inviting an AI-equipped humanoid to observe your home life isn’t a small step.

The first units will ship to customers in the US in 2026. There is a $499 monthly subscription alternative to the $20,000 full-purchase price, though that will be available at an unspecified later date. A broader international rollout is promised for 2027.

Neo’s got a long road ahead of it to live up to the expectations set by Rosie the Robot in The Jetsons way back when. But this is no Hanna-Barbera cartoon. What we’re seeing now is a much more tangible harbinger of change.
