Technologies
The iPhone 17 Needs Amazing Cameras. Here’s What I Think Apple Should Do
Commentary: Apple’s rivals are catching up when it comes to camera skills. Here’s how the iPhone 17 can pull ahead.

The iPhone 16 Pro already packs one of the best camera setups found on any phone, but the iPhone 17 needs to take things even further when it launches in just a few weeks. Sure, Apple’s phones are capable of taking stunning photos, thanks to its awesome software, ProRaw format and its wealth of video skills, but Apple’s rivals have been doing big things, too. The Galaxy S25 Ultra, the Pixel 9 Pro and the Xiaomi 14 Ultra all pack amazing camera setups that have given the iPhone 16 Pro a run for its money and made it clear that Apple isn’t the only company innovating in the imaging arena.
Read more: Camera Champions Face Off: iPhone 16 Pro vs. Galaxy S25 Ultra
While early reports from industry insiders claim that the phone’s video skills will get a boost, there’s more the iPhone 17 will need if it’s to become an all-around photography powerhouse. As both an experienced phone reviewer and a professional photographer, I have exceptionally high expectations for top-end phone cameras. And, having used the iPhone 16 Pro since its launch, I have some thoughts on what needs to change.
Here are the main points I want to see improved on the iPhone 17 when it likely launches in September 2025.
An accessible Pro camera mode
At WWDC, Apple showed off changes coming in iOS 26, including a radical redesign of the interface with Liquid Glass. That simplified style extends to the camera app too, with Apple paring the interface down to the most basic functions: Photo, Video and zoom levels. Presumably, the idea is to make it easy for even complete beginners to open the camera and start taking Instagram-worthy snaps.
And that’s fine, but what about those of us who buy the Pro models to take fuller advantage of features like exposure compensation, Photographic Styles and ProRaw formats? It’s not totally clear yet how these features can be accessed within the new camera interface, but they mustn’t be tucked away. Many photographers — myself very much included — want to use these tools as standard, using our powerful iPhones in much the same way we would a mirrorless camera from Canon or Sony.
That means relying on advanced settings to take control over the image-taking process to craft shots that go beyond simple snaps. If anything, Apple’s camera app has always been too simple, with even basic functions like white balance being unavailable. To see Apple take things to an even more simplistic level is concerning, and I want to see how the company will continue to make these phones usable for enthusiastic photographers.
Larger image sensor
Though the 1/1.28-inch sensor found on the iPhone 16 Pro’s main camera is already a good size — and marginally larger than the S24 Ultra’s 1/1.33-inch sensor — I want to see Apple go bigger. A larger image sensor can capture more light and offer better dynamic range. It’s why pro cameras tend to have at least “full frame” image sensors, while really high-end cameras, like the amazing Hasselblad 907X, have enormous “medium format” sensors for pristine image quality.
Xiaomi understands this, equipping its 15 Ultra and previous 14 Ultra with 1-inch type sensors. It’s larger than the sensors found on almost any other phone, which allowed the 15 Ultra to take stunning photos all over Europe, while the 14 Ultra was heroic in capturing images at Taylor Swift concerts. I’m keen to see Apple at least match Xiaomi’s phone here with a similar 1-inch type sensor. Though if we’re talking pie-in-the-sky wishes, maybe the iPhone 17 could be the first smartphone with a full-frame image sensor. I won’t hold my breath on that one — the phone, and the lenses, would need to be immense to accommodate it, so it’d likely be more efficient just to let you make calls with your mirrorless camera.
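To put those “type” designations in perspective, a rough back-of-the-envelope calculation shows how much more surface area a 1-inch type sensor has. This is a sketch only: it assumes a nominal 16 mm diagonal for the 1-inch type and a 4:3 aspect ratio, and real sensors deviate a little from these nominal figures.

```python
def approx_sensor_area_mm2(type_divisor):
    """Rough area of a '1/N-inch type' sensor, assuming a nominal
    16 mm diagonal for the 1-inch type and a 4:3 aspect ratio."""
    diagonal = 16.0 / type_divisor   # mm
    width = diagonal * 4 / 5         # 4:3 aspect: width is 0.8x the diagonal
    height = diagonal * 3 / 5
    return width * height

iphone_16_pro = approx_sensor_area_mm2(1.28)   # about 75 mm^2
s24_ultra = approx_sensor_area_mm2(1.33)       # about 69 mm^2
one_inch_type = approx_sensor_area_mm2(1.0)    # about 123 mm^2
```

By this estimate, a 1-inch type sensor has roughly 1.6 times the light-gathering area of the iPhone 16 Pro’s main sensor — a meaningful jump, not a marginal one.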
Don’t lean on AI too much
AI has become a bigger part of the camera experience on many Android phones, from the Honor 400 Pro’s tool that brought my dad back to life to the Pixel 9 Pro’s wild generative AI functions. But iPhones have always emphasized the importance of real image quality, producing sharp, detailed images that remain faithful to the scene you actually saw when you pushed the shutter button.
Apple’s dalliances in AI so far haven’t exactly been groundbreaking and I worry that the company may want to be seen as making a bigger push for deeper, more ‘innovative’ uses for AI. And sure, maybe some of those could be useful in other parts of the phone, but the iPhone 17 cameras first and foremost still need to be able to deliver truly superb-looking images, not simply use AI to compensate for any hardware shortcomings.
Variable aperture
One of the other reasons the Xiaomi 14 Ultra rocks so hard for photography is the variable aperture on its main camera. Its widest aperture is f/1.6 — significantly wider than the f/1.78 of the iPhone 16 Pro. That wider aperture lets in a lot of light in dim conditions and achieves more authentic out-of-focus bokeh around a subject.
But Xiaomi’s 14 Ultra aperture can also close down to f/4, and with that narrower aperture, it’s able to create starbursts around points of light. I love achieving this effect in nighttime imagery with the phone. It makes the resulting images look much more like they’ve been taken with a professional camera and lens, while the same points of light on the iPhone just look like roundish blobs. Disappointingly, Xiaomi actually removed this feature from the new 15 Ultra, so whether Apple sees value in implementing this kind of technology remains to be seen.
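The f-numbers above translate directly into light-gathering differences: light admitted scales with aperture area, which goes as one over the f-number squared. A quick sketch, using the figures quoted in this article:

```python
def light_ratio(f_a, f_b):
    # Aperture area scales as 1 / f_number^2, so this returns how much
    # more light an f_a aperture gathers than an f_b aperture.
    return (f_b / f_a) ** 2

# Xiaomi 14 Ultra wide open (f/1.6) vs. the iPhone 16 Pro (f/1.78)
print(round(light_ratio(1.6, 1.78), 2))   # ~1.24, about 24% more light

# The same Xiaomi lens wide open vs. stopped down to f/4 for starbursts
print(light_ratio(1.6, 4.0))              # 6.25x less light at f/4
```

That 6.25x swing is why a variable aperture matters: one lens covers both dim scenes and the narrow-aperture starburst look.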
More Photographic Styles
Though Apple has had various styles and effects integrated into the iPhone’s cameras, the iPhone 16 range took it further, with more control over the effects and more toning options. It’s enough that former CNET Senior Editor Lisa Eadicicco even declared the Photographic Styles her “favorite new feature on Apple’s latest phone.”
I think they’re great, too. Or rather, they’re a great start. The different color tones, like the ones you get with the Amber and Gold styles, add some lovely warmth to scenes, and the Quiet effect adds a vintage filmic fade, but there’s still not a whole lot to choose from and the interface is slow to work through. I’d love to see Apple introduce more Photographic Styles with different color toning options, or even with tones that mimic vintage film stocks from Kodak or Fujifilm.
And sure, there are plenty of third-party apps like VSCO or Snapseed that let you play around with color filters all you want. But using Apple’s styles means you can take your images with the look already applied, and then change it afterward if you don’t like it — nothing is hard-baked into your image.
I was recently impressed with Samsung’s new tool for creating custom color filters based on the look of other images. I’d love to see Apple bring that level of image customization to the iPhone.
Better ProRaw integration with Photographic Styles
I do think Apple has slightly missed an opportunity with its Photographic Styles, though, in that you can use them only when taking images in HEIF (high-efficiency image format). Unfortunately, you can’t use them when shooting in ProRaw. I love Apple’s use of ProRaw on previous iPhones, as it takes advantage of all of the iPhone’s computational photography — including things like HDR image blending — but still outputs a DNG raw file for easier editing.
The DNG file typically also offers more latitude to brighten dark areas or tone down highlights in an image, making it extremely versatile. Previously, Apple’s color presets could be used when shooting in ProRaw, and I loved it. I frequently shot street-style photos using the high contrast black-and-white mode and then edited the raw file further.
Now using that same black-and-white look means only shooting images in HEIF format, eliminating the benefits of using Apple’s ProRaw. Oddly, while the older-style “Filters” are no longer available in the camera app when taking a raw image, you can still apply those filters to raw photos in the iPhone’s gallery app through the editing menu.
LUTs for ProRes video
And while we’re on the topic of color presets and filters, Apple needs to bring those to video, too. On the iPhone 15 Pro, Apple introduced the ability to shoot video in ProRes, which results in very low-contrast, almost gray-looking footage. The idea is that video editors will take this raw footage and then apply their edits on top, often applying contrast and color presets known as LUTs (look-up tables) that give footage a particular look — think dark and blue for horror films or warm and light tones for a romantic drama vibe.
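Under the hood, a LUT really is just a table: every input brightness value maps to a new output value. A minimal illustration in plain Python — this is a hypothetical contrast S-curve for demonstration, not how Apple or any grading software actually implements it:

```python
def build_contrast_lut(strength=0.2):
    """Build a 256-entry 1D LUT that adds gentle S-curve contrast.
    strength=0 is the identity (no change); higher values push
    shadows darker and highlights brighter."""
    lut = []
    for v in range(256):
        x = v / 255
        s = x * x * (3 - 2 * x)              # smoothstep S-curve
        y = (1 - strength) * x + strength * s  # blend with identity
        lut.append(round(y * 255))
    return lut

lut = build_contrast_lut()
pixel = [30, 128, 220]                 # shadow, midtone, highlight
graded = [lut[v] for v in pixel]       # shadows drop, highlights lift
```

Real video LUTs do the same thing per color channel (or in 3D across all three channels at once), which is why applying one is fast enough to preview live.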
But Apple doesn’t offer any kind of LUT for editing ProRes video on the iPhone, beyond simply ramping up the contrast, which doesn’t really do the job properly. Sure, the point of ProRes is that you would take that footage off the iPhone, put it into software like DaVinci Resolve, and then properly color grade the footage so it looks sleek and professional.
But that still leaves the files on your phone, and I’d love to be able to do more with them. My gallery is littered with ungraded video files that I’ll do very little with because they need color grading externally. I’d love to share them to Instagram, or with my family over WhatsApp, after transforming those files from drab and gray to beautifully colorful.
With the iPhone 17, or even with the iPhone 16 as a software update, I want to see Apple creating a range of its own LUTs that can be directly applied to ProRes video files on the iPhone. While we didn’t see this software functionality discussed as part of the company’s June WWDC keynote, that doesn’t mean it couldn’t be launched with the iPhone in September.
If Apple were able to implement all these changes — excluding, perhaps, the full-frame sensor which even I can admit is a touch ambitious — it would have an absolute beast of a camera on its hands.
An AWS Outage Broke the Internet While You Were Sleeping, and the Trouble Continues
Reddit, Roblox and Ring are just a tiny fraction of the 1,000-plus sites and services that were affected when Amazon Web Services went down, causing a major internet blackout.
The internet kicked off the week the way that many of us often feel like doing: by refusing to go to work. An outage at Amazon Web Services rendered huge portions of the internet unavailable on Monday morning. Sites and services including Snapchat, Fortnite, Venmo, the PlayStation Network and, predictably, Amazon, were unavailable off and on through the start of the day.
The outage began shortly after midnight PT, and took Amazon around 3.5 hours to fully resolve. Social networks and streaming services were among the 1,000-plus companies affected, and critical services such as online banking were also taken down.
The issues seemed to have been largely resolved as the US East Coast was coming online, but spiked again dramatically after 8 a.m. PT as work began on the West Coast.
AWS, a cloud services provider owned by Amazon, props up huge portions of the internet. So when it went down, it took many of the services we know and love with it. As with the Fastly and CrowdStrike outages over the past few years, the AWS outage shows just how much of the internet relies on the same infrastructure — and how quickly our access to the sites and services we rely on can be revoked when something goes wrong.
The reliance on a small number of big companies to underpin the web is akin to putting all of our eggs in a tiny handful of baskets. When it works, it’s great, but it takes only one small thing going wrong for the internet to be brought to its knees in a matter of minutes.
How widespread was the AWS outage?
Just after midnight PT on Oct. 20, AWS first registered an issue on its service status page, saying it was “investigating increased error rates and latencies for multiple AWS services in the US-East-1 Region.” Around 2 a.m. PT, it said it had identified a potential root cause of the issue. Within half an hour, it had started applying mitigations that were resulting in significant signs of recovery.
“The underlying DNS issue has been fully mitigated, and most AWS Service operations are succeeding normally now,” AWS said at 3:35 a.m. PT. The company didn’t respond to a request for further comment beyond pointing us back to the AWS health dashboard.
But as of 8:43 a.m. PT, many services were still impacted, and the AWS status page showed the severity as “degraded.” In a post at that time, AWS noted: “We are throttling requests for new EC2 instance launches to aid recovery and actively working on mitigations.”
Around the time that AWS says it first began noticing error rates, Downdetector saw reports begin to spike across many online services, including banks, airlines and phone carriers. As AWS resolved the issue, reports for some of those services dropped off, while others have yet to return to normal. (Disclosure: Downdetector is owned by the same parent company as CNET, Ziff Davis.)
Around 4 a.m. PT, Reddit was still down, while services including Ring, Verizon and YouTube were still seeing a significant number of reported issues. Reddit finally came back online around 4:30 a.m. PT, according to its status page, which was then verified by us.
In total, Downdetector saw over 6.5 million reports, with 1.4 million coming from the US, 800,000 from the UK and the rest largely spread across Australia, Japan, the Netherlands, Germany and France. Over 1,000 companies in total have been affected, Downdetector added.
“This kind of outage, where a foundational internet service brings down a large swath of online services, only happens a handful of times in a year,” Daniel Ramirez, Downdetector by Ookla’s director of product, told CNET. “They probably are becoming slightly more frequent as companies are encouraged to completely rely on cloud services and their data architectures are designed to make the most out of a particular cloud platform.”
What caused the AWS outage?
AWS didn’t immediately share full details about what caused the internet to fall off a cliff this morning. Then at 8:43 a.m. PT, it offered this brief description: “The root cause is an underlying internal subsystem responsible for monitoring the health of our network load balancers.”
Earlier in the day it had attributed the outage to a «DNS issue.» DNS stands for the Domain Name System and refers to the service that translates human-readable internet addresses (for example, CNET.com) into machine-readable IP addresses that connect browsers with websites.
When a DNS error occurs, the translation process cannot take place, interrupting the connection. DNS errors are common internet roadblocks, but they usually happen on a small scale, affecting individual sites or services. Because the use of AWS is so widespread, however, a DNS error there can have equally widespread results.
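You can see this name-to-address translation — and what failure looks like — with a few lines of Python. This is a simplified sketch of the lookup step; the hostname in the failure case uses the reserved `.invalid` domain, which is guaranteed never to resolve:

```python
import socket

def resolve(hostname):
    # Translate a human-readable name into an IPv4 address, which is
    # the job DNS performs. Returns None when resolution fails -- the
    # kind of error users hit across AWS-hosted services on Monday.
    try:
        return socket.gethostbyname(hostname)
    except socket.gaierror:
        return None

print(resolve("localhost"))              # e.g. 127.0.0.1
print(resolve("no-such-host.invalid"))   # None: the lookup failed
```

When the lookup returns nothing, the browser has an address book with a missing entry — the server may be running fine, but nothing can find it.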
According to Amazon, the issue is geographically rooted in its US-East-1 region, which refers to an area of Northern Virginia where many of its data centers are based. It’s a significant location for Amazon, as well as many other internet companies, and it props up services spanning the US and Europe.
“The lesson here is resilience,” said Luke Kehoe, industry analyst at Ookla. “Many organizations still concentrate critical workloads in a single cloud region. Distributing critical apps and data across multiple regions and availability zones can materially reduce the blast radius of future incidents.”
Was the AWS outage caused by a cyberattack?
DNS issues can be caused by malicious actors, but there’s no evidence at this stage to say that this is the case for the AWS outage.
Technical faults can, however, pave the way for hackers to look for and exploit vulnerabilities when companies’ backs are turned and defenses are down, according to Marijus Briedis, CTO at NordVPN. “This is a cybersecurity issue as much as a technical one,” he said in a statement. “True online security isn’t only about keeping hackers out, it’s also about ensuring you can stay connected and protected when systems fail.”
In the hours ahead, people should look out for scammers hoping to take advantage of people’s awareness of the outage, added Briedis. You should be extra wary of phishing attacks and emails telling you to change your password to protect your account.
Take Your Apple Watch Experience to the Next Level With These 8 Tips and Tricks
Get the most out of your Apple Watch with these expert-approved tips.
Apple’s smartwatch lineup is getting better year after year. This year is no exception with the new Apple Watch Series 11, Apple Watch SE 3 and Apple Watch Ultra 3. Whether you’ve got a brand-new model to get acquainted with or you’re trying out the new features in WatchOS 26, there are options to keep you productive, help you become more active and take control of your life. These are the features I love the most.
Swipe between watch faces (again)
Until WatchOS 10.0, you could swipe from the left or right edge of the screen to switch active watch faces, a great way to quickly go from an elegant workday face to an exercise-focused one, for example. Apple removed that feature, likely because people were accidentally switching faces by brushing the edges of the screen.
However, the regular method involves more steps (touch and hold the face, swipe to change, tap to confirm), and people realized that the occasional surprise watch face change wasn’t really so bad. Therefore, as of version 10.2, including the current WatchOS 26, you can turn the feature on by toggling a setting: Go to Settings > Clock and turn on Swipe to Switch Watch Face.
Stay on top of your heart health with Vitals
Wearing your Apple Watch while sleeping offers a trove of information — and not just about how you slept last night. If you don the timepiece overnight, it tracks a number of health metrics. The Vitals app gathers that data and reports on the previous night’s heart rate, respiration, body temperature (on supported models) and sleep duration. The Vitals app can also show data collected during the previous seven days — tap the small calendar icon in the top-left corner.
If you own a watch model sold before Jan. 29, 2024, you’ll also see a blood oxygen reading. On newer watches in the US, that feature works differently because of an intellectual property fight: The watch’s sensors take a reading, and then send the data to the Health app on your iPhone. You can check it there, but it doesn’t show up in the Vitals app.
How is this helpful? The software builds a baseline of what’s normal for you. When the values stray outside normal ranges, such as irregular heart or respiratory rates, the Vitals app reports them as atypical to alert you. It’s not a medical diagnosis, but it can prompt you to get checked out and catch any troubles early.
Make the Wrist Flick gesture second nature
WatchOS 26 adds a new gesture that has quickly become a favorite. On the Apple Watch Series 9 and later, and the Apple Watch Ultra 2 and Ultra 3, Wrist Flick is a quick motion to dismiss incoming calls, notifications or really anything that pops up on the screen. Wrist Flick joins Double Tap as a way to interact with a watch even if you’re not in a position to tap the screen.
But what I like most about the gesture is that it’s also a shortcut for jumping back to the watch face. For example, when a Live Activity is automatically showing up in the Smart Stack, a quick flick of the wrist hides the stack. Or let’s say you’re configuring a feature in the Settings app that’s buried a few levels deep. You don’t need to repeatedly tap the back (<) button — just flick your wrist.
Make the Smart Stack work for you
The Smart Stack is a place to access quick information that might not fit into what Apple calls a “complication” (the things on the watch face other than the time itself, such as your Activity rings or the current outside temperature). When viewing the clock face, turn the digital crown clockwise or swipe from the bottom of the screen to view a series of tiles that show information such as the weather or suggested photo memories. This turns out to be a great spot for accessing features when you’re using a minimal watch face that has no complications.
Choose which Live Activities appear automatically
The Smart Stack is also where Live Activities appear: If you order a food delivery, for example, the status of the order appears as a tile in the Smart Stack (and on the iPhone lock screen). And because it’s a timely activity, the Smart Stack becomes the main view instead of the watch face.
Some people find that too intrusive. To disable it, on your watch open the Settings app, go to Smart Stack > Live Activities and turn off the Auto-Launch Live Activities option. You can also turn off Allow Live Activities in the same screen if you don’t want them disrupting your watch experience.
Apple’s apps that use Live Activities are listed there if you want to configure the setting per app, such as making active timers appear but not media apps such as Music. For third-party apps, open the Watch app on your iPhone, tap Smart Stack and find the settings there.
Add and pin favorite widgets in the Smart Stack
When the Smart Stack first appeared, its usefulness seemed hit or miss. Since then, Apple seems to have improved the algorithms that determine which widgets appear — instead of it being an annoyance, I find it does a good job of showing me information in context. But you can also pin widgets that will show up every time you open the stack.
For example, I use 10-minute timers for a range of things. Instead of opening the Timers app (via the App list or a complication), I added a single 10-minute timer to the Smart Stack. Here’s how:
- View the Smart Stack by turning the Digital Crown or swiping from the bottom of the screen.
- Tap the Edit button at the bottom of the stack. (In WatchOS 11, touch and hold the screen to enter the edit mode.)
- Tap the + button and scroll to the app you want to include (Timers, in this example).
- Tap a tile to add it to the stack; for Timers, there’s a Set Timer 10 minutes option.
- If you want it to appear higher or lower in the stack order, drag it up or down.
- Tap the checkmark button to accept the change.
The widget appears in the stack but it may get pushed down in favor of other widgets the watch thinks should have priority. In that case, you can pin it to the top of the list: While editing, tap the yellow Pin button. That moves it up but Live Activities can still take precedence.
Use the watch as a flashlight
You’ve probably used the flashlight feature of your phone dozens of times, but did you know the Apple Watch can also be a flashlight? Instead of a dedicated LED (which phones also use as a camera flash), the watch’s full screen becomes the light emitter. It’s not as bright as the iPhone’s, nor can you adjust the beam width, but it’s perfectly adequate for moving around in the dark when you don’t want to disturb someone sleeping.
To activate the flashlight, press the side button to view Control Center and then tap the Flashlight button. That makes the entire screen white — turn the Digital Crown to adjust the brightness. It even starts dimmed for a couple of seconds to give you a chance to direct the light away so it doesn’t fry your eyes.
The flashlight also has two other modes: Swipe left to make the white screen flash on a regular cadence or swipe again to make the screen bright red. The flashing version can be especially helpful when you’re walking or running at night to make yourself more visible to vehicles.
Press the Digital Crown to turn off the Flashlight and return to the clock face.
Pause your Exercise rings if you’re traveling or ill
Closing your Move, Exercise and Stand rings can be great motivation for being more active. Sometimes, though, your body has other plans. Until WatchOS 11, if you became ill or needed to take a long-haul trip, any streak of closing those rings that you’d built up would be dashed.
Now, the watch is more forgiving (and practical), letting you pause your rings without disrupting the streak. Open the Activity app and tap the Weekly Summary button in the top-left corner. Scroll all the way to the bottom (take a moment to admire your progress) and tap the Pause Rings button. Or, if you don’t need that extra validation, tap the middle of the rings and then tap Pause Rings. You can choose to pause them for today, until next week or month, or set a custom number of days.
When you’re ready to get back into your activities, go to the same location and tap Resume Rings.
Bypass the countdown to start a workout
Many workouts start with a three-second countdown to prep you to be ready to go. That’s fine and all, but usually when I’m doing an Outdoor Walk workout, for example, my feet are already on the move.
Instead of losing those steps, tap the countdown once to bypass it and get right to the calorie burn.
How to force-quit an app (and why you’d want to)
Don’t forget, the Apple Watch is a small computer on your wrist and every computer will have glitches. Every once in a while, for instance, an app may freeze or behave erratically.
On a Mac or iPhone, it’s easy to force a recalcitrant app to quit and restart, but it’s not as apparent on the Apple Watch. Here’s how:
- Double-press the Digital Crown to bring up the list of recent apps.
- Scroll to the one you want to quit by turning the crown or dragging with your finger.
- Swipe left on the app until you see a large red X button.
- Tap the X button to force-quit the app.
Keep in mind this is only for times when an app has actually crashed — as on the iPhone, there’s no benefit to manually quitting apps.
These are some of my favorite Apple Watch tips, but there’s a lot more to the popular smartwatch. Be sure to also check out why the Apple Watch SE 3 could be the sleeper hit of this year’s lineup, and Vanessa Hand Orellana’s visit to the labs where Apple tests how the watches communicate.