
Technologies

I Took the iPhone 15 Pro Max and 13 Pro Max to Yosemite for a Camera Test

Do the latest Apple phone's cameras capture the epic majesty of Yosemite National Park better than a two-year-old iPhone? We find out.

This past week, I took Apple’s new iPhone 15 Pro Max on an epic adventure to California’s Yosemite National Park.

As a professional photographer, I take tens of thousands of photos every year. Much of my work is done inside my San Francisco photo studio, but I also spend a considerable amount of time shooting on location. I still use a DSLR, but my iPhone 13 Pro is never far from me.

Like most people nowadays, I don't upgrade my phone every year, or even every two. Phones have reached a point where they handle daily tasks well for three or four years. And most phone cameras are good enough for capturing everyday special moments to post on social media or share with friends.

Taft Point at sunset, shot on iPhone 15 Pro Max

But maybe, like me, you’re in the mood for something shiny and new like the iPhone 15 Pro Max. I wanted to find out how my 2-year-old iPhone 13 Pro and its 3x optical zoom would do against the 15 Pro Max and its new 5x optical zoom. And what better place to take them than on an epic adventure to Yosemite, one of the crown jewels of America’s National Park System and an iconic destination for outdoor lovers.

Yosemite is absolutely, massively impressive.

el-cap-2x.jpg

The main camera is still the best camera

The iPhone 15 Pro Max’s main camera with its wide angle lens is the most important camera on the phone. It has a new larger 48-megapixel sensor that had no problem being my daily workhorse for a week.

Sunrise at Tunnel View in Yosemite National Park

The larger sensor means the camera can now capture more light and render colors more accurately. And the improvements are visible: photos look richer not only in bright light but also in low-light scenarios.

In the images below, taken at sunrise at Tunnel View in Yosemite National Park, notice how the 15 Pro Max’s photo has better fidelity, color and contrast in the foreground leaves. Compare that against the pronounced edge sharpening of the mountaintops in the 13 Pro image.

The 15 Pro Max’s camera captures excellent detail in bright light, including more texture, like in rocky landscapes, more detail in the trees and more fine-grained color.

Sunrise at Tunnel View in Yosemite National Park

A new 15 Pro Max feature aimed at satisfying a camera nerd's creative itch uses the larger main sensor combined with the A17 Pro chip to turn the 24mm-equivalent wide angle lens into essentially four lenses. You can switch the main camera between 1x, 1.2x, 1.5x and 2x, the equivalent of 24mm, 28mm, 35mm and 50mm prime lenses, four of the most popular prime lens lengths. In reality, the 15 Pro Max takes crops of the sensor and uses some clever processing to correct lens distortion.
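The arithmetic behind those presets is easy to sanity-check. As a rough sketch (the helper below is hypothetical, not anything from Apple's pipeline), the true crops land near 29mm, 36mm and 48mm, which get rounded to the familiar prime labels; equivalent focal length scales linearly with the crop, while remaining sensor data falls off with its square:

```python
# Hypothetical helper, not Apple's code: equivalent focal length scales
# linearly with the crop factor, while the sensor data left over falls
# off with the square of the crop.

def cropped_view(base_focal_mm: float, base_megapixels: float, crop: float):
    """Return (equivalent focal length, megapixels remaining) for a center crop."""
    return base_focal_mm * crop, base_megapixels / crop ** 2

for crop in (1.0, 1.2, 1.5, 2.0):
    focal, mp = cropped_view(24, 48, crop)
    print(f"{crop}x -> ~{focal:.0f}mm equivalent, ~{mp:.0f} MP of sensor data")
```

The squared falloff is the catch: at 2x, only 12 of the 48 megapixels remain, which is why a sensor this large is what makes these crops viable in the first place.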

In use, it’s nice to have these crop options, but for most people they will likely be of little interest.

Climbers gather around the famous Midnight Lightning boulder

I find the 15 Pro Max’s native 1x view a little wide and enjoy being able to change it to default to 1.5x magnification. I went into Settings, tapped on Camera, then on Main Camera and changed the default lens to a 35mm look. Now, every time I open the camera, it’s at 1.5x and I can just focus on framing and taking the photo instead of zooming in.

Another nifty change that I highly recommend is to customize the Action button so that it opens the camera when you long-press it. The Action button replaces the mute switch that has been on every iPhone since the original. You can program the Action button to trigger a handful of features or shortcuts by going into the Settings app and tapping Action button. Once you open the camera, the Action button can double as a physical camera shutter button.

Hibiki managed to climb the incredibly difficult Midnight Lightning boulder, one of the world's most famous bouldering problems

The dynamic range and detail are noticeably better in photos I took with the 15 Pro Max main camera in just about every lighting condition.

There are fewer blown-out highlights and nicer, blacker blacks with less noise. In particular, there is more tonal range and detail in the whites. I noticed this especially in how the 15 Pro Max captured direct sunlight on climbers, and in the shadow detail of the rock formations.

Read more: iPhone 15 Pro Max Camera vs. Galaxy S23 Ultra: Smartphone Shootout

Overall, the 15 Pro Max's main camera is simply far better and more consistent at exposure than the 13 Pro's.


The iPhone 15 Pro Max 5x telephoto camera

Climbers at Swan Slab in the Yosemite Valley. Rich but natural colors and finely rendered textures in the rock.

The iPhone 15 Pro Max has a 5x telephoto camera with an f/2.8 aperture and an equivalent focal length of 120mm.

The 13 Pro’s 3x camera, introduced in 2021, was a huge step up from previous models and still gives zoomed-in images a cinematic feel from the lens’ depth compression. The 15 Pro Max’s longer telephoto lens, combined with a larger sensor, accentuates those cinematic qualities even further, resulting in images with a rich array of color and a wider tonal range.

All this translates to a huge improvement in light capture and a noticeable step up in image quality for the iPhone’s zoom lens.

You can see the improved detail and range evident in the highlights of the water with the iPhone 15 Pro Max, as well as a warmer, more realistic color rendering.

I found that the 15 Pro Max’s telephoto camera yields better photos of subjects farther away like mountains, wildlife and the stage at a live concert.

Shot on iPhone 13 Pro Max at 136mm, left, iPhone 15 Pro Max at 120mm, right, you can see the exposure, range, and natural color rendering improvements on the iPhone 15 Pro Max.

A combination of optical stabilization and 3D sensor-shift makes the 15 Pro Max's longer telephoto easier to use by steadying image capture. A longer lens typically means a greater chance of blurred images from hand shake, because such a long focal length magnifies every little movement of the camera.

I found that the 3D sensor-shift optical image stabilization system does wonders for shooting distant subjects and minimizing that camera shake.

The image below was shot with the 5x zoom on the iPhone 15 Pro Max looking up the Yosemite Valley from Tunnel View. It is an incredibly crisp telephoto image.

5x zoom on the iPhone 15 Pro Max looking up the Yosemite Valley from Tunnel View.

For reference, the image below was shot on the 15 Pro Max from the same location using the Ultra Wide lens. I am about five miles away from that V-shaped dip at the end of the valley.

A view of the Yosemite Valley from the Tunnel View observation point, shot on the iPhone 15 Pro Max using the Ultra Wide lens.

The iPhone still suffers from lens flare

Lens flares, along with the green dot that seems to be in all iPhone images taken into direct sunlight, continue to be an issue on the iPhone 15 Pro Max despite the new lens coatings.

Apple says the main camera lens has been treated with an anti-glare coating, but I didn't notice any improvements. In some cases, images show even more pronounced lens flare than photos from previous iPhone models.

Notice the repeated halo effect surrounding the sun on the images below shot at Lower Yosemite Falls.

As the sun pokes over the top of Dewey Point, we see some lens flare and the 'green dot' appear.
The signature iPhone lens flare dot on the iPhone 15 Pro Max
Lens flare on iPhone 13 Pro Max vs. iPhone 15 Pro Max

The 15 Pro Max and Smart HDR 5

Lower Yosemite Falls, shot on iPhone 15 Pro Max Main camera

The 15 Pro Max's new A17 Pro chip brings greater computational power, which Apple puts to work in Smart HDR 5. The result is more natural-looking images compared with the 13 Pro, especially in very bright and very dark scenes. Color handling is noticeably better and more subtle, with a less heavy-handed balance between brightening the shadows and darkening the highlights.

You can clearly see the warmer, more natural-looking light in the 15 Pro Max photo below, pushing back against the blue cast that is common in over-processed HDR images. At the same time, Apple's implementation hasn't swung too far in the opposite direction: it refrains from oversaturating the orange tones that frequently trouble digital corrections on phones.

bridalveil-2x.jpg

Coming from an iPhone 13 Pro Max, I noticed that the 15 Pro Max's computational processing tends to produce more discreet, balanced images. Apple appears to have dialed back the 13 Pro's habit of pushing computational photography right in our faces and fine-tuned the 15 Pro Max's image pipeline toward a more realistic rendering of your subject.

It’s a welcome change.

The 15 Pro Max shines in night mode 

Self portrait shot on iPhone 15 Pro Max mounted on a tripod on top of Sentinel Dome in Yosemite National Park.

Night mode shots from the 15 Pro Max look similar to the ones from my 13 Pro Max, but there are minor improvements in the exposure that result in images with a better tonal range. The 15 Pro Max’s larger main camera sensor captures photos with less noise in the blacks and a better overall exposure compared to the 13 Pro Max.

Colors in 15 Pro Max night mode images appear more accurate and realistic, with a wider dynamic range. Notice the detail in the photo below of El Capitan and The Dawn Wall. The 15 Pro Max even captures detail in the car lights snaking along the valley floor road.

Looking down into the Yosemite Valley from the top of Sentinel Dome at night.

Overall, night mode images continue to look soft and over-processed. Night mode gives snaps a dream-like vibe, and that isn't necessarily a bad thing. These photos are brighter and have less image noise than those shot on my iPhone 13 Pro Max.

Half Dome seen from atop Sentinel Dome at night, shot on iPhone 15 Pro Max Main camera lens, more than an hour and a half after sunset.

15 Pro Max vs. 13 Pro Max: the bottom line

By this point, it should be no surprise that the iPhone 15 Pro Max’s cameras are a significant improvement over the ones on the 13 Pro Max. If photography is a priority for you, I recommend upgrading to it from the 13 Pro Max or earlier.

If you're coming from an iPhone 14 Pro, the improvements are less dramatic, and it's likely not worth the upgrade. I'm incredibly excited to continue carrying the iPhone 15 Pro Max in my pocket, whether to Yosemite or just around my home.


Silksong, Long-Awaited Hollow Knight Spinoff, Gets Release Date: Sept. 4

Announced in 2019, Team Cherry’s follow-up is coming sooner than expected, and it’s on Game Pass on Day 1.

Hollow Knight: Silksong is the follow-up, announced back in 2019, to one of the most beloved indie games of the last decade. In a special announcement video on Thursday, Australian developer Team Cherry revealed that the wait is almost over. 

Silksong will be released on Sept. 4, according to the new trailer. The almost two-minute video reveals some of the new enemies and bosses in the upcoming spinoff and ends with the surprise release date. 

Originally, Silksong was going to be DLC for Hollow Knight. However, it suffered numerous delays and was pushed back again and again. Glimpses of the game showed up here and there over the years, but it was this year that it received the most attention, from Nintendo as part of its Switch 2 lineup and from Microsoft, which confirmed it would be available on Xbox Game Pass.

Hollow Knight: Silksong will be available on PC, Switch, Switch 2, Xbox One, Xbox Series X and Series S, PS4 and PS5. It will be available on Day 1 for Xbox Game Pass subscribers. 



PS5 Prices Go Up Today. Here’s How Much and Why

You can expect to pay more for a new PlayStation, thanks to "a challenging economic environment."

Sony will increase the prices of its PlayStation 5 consoles in the US, starting today. This follows the trend of console manufacturers such as Microsoft and Nintendo raising prices for their hardware in response to tariffs. 

The PlayStation-maker posted about the price change Wednesday. Each model will cost $50 more than its current price.

The new prices are:

"Similar to many global businesses, we continue to navigate a challenging economic environment," Sony said in a post about the price increase.

As of Thursday morning, retailers and Sony's online store have yet to update the console prices. The price jump will also likely affect recently released PS5 bundles such as the Astro Bot bundle and the Fortnite Cobalt bundle.

Sony says accessories have not been affected by the change, and the cost hike applies only to the US.

Microsoft increased the price of the Xbox Series consoles in May, and Nintendo hiked the price of the original Switch console and Switch 2 accessories this month.

While the companies didn’t point to the tariffs instituted by President Donald Trump as the reason for the hardware price jump, it would explain the trend in recent months. 



Google Thinks AI Can Make You a Better Photographer: I Dive Into the Pixel 10 Cameras

The camera specs for the Pixel 10 series reveal only a small part of what’s new for mobile photographers. I spoke with the head of the Pixel camera team to learn more.

If a company releases new phone models but doesn’t change the cameras, would anyone pay attention? Fortunately that’s not the case with Google’s new Pixel 10, Pixel 10 Pro and Pixel 10 Pro Fold phones, which make a few advancements in the hardware — hello, telephoto camera on the base-level Pixel for the first time — and also in the software that runs it all, with generative AI playing an even bigger role than it has before.

«This is the first year where not only are we able to achieve some image quality superlatives,» Isaac Reynolds, group product manager for the Pixel cameras, told CNET, «but we’re actually able to make you a better photographer, because generative AI and large models can do things and understand levels of context that no technology before could achieve.»

Modern smartphone cameras must be more than glass and sensors, because they have to compensate for the physical limitations of that glass and those sensors. You can't expect a tiny phone camera to perform as well as a large glass lens on a traditional camera, and yet the photos coming out of the Pixel 10 models surpass their optical abilities. In a call that covered a lot of photographic ground, Reynolds shared with me details about new features as well as issues of how we can trust images when AI — in Google's own tools, even — is so prevalent.

Pro Res Zoom adds generative AI to reach 100x

The new Pro Res Zoom feature is likely to get the most attention because it strives for something exceptionally difficult in smartphones: long-range zoom that isn’t a fuzzy mess of pixels.

You see this all the time: Someone on their phone spreads two fingers against the screen to make a distant object larger in the frame. Photographers die a little each time that happens because, by not sticking to the main zoom levels — 1x, 2x, 5x and so on — the person is relying on digital zoom; the camera app is making pixels larger and then using software to try to clean up the result. Digital zoom is certainly better than it once was, but each time it’s used, the person sacrifices image quality for more zoom in the moment.
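As a toy illustration of that trade-off (pure Python on a grayscale frame of nested lists, nothing like a real camera pipeline), basic digital zoom amounts to cropping the center of the frame and scaling it back up. The output keeps the original dimensions but only the crop's detail, which is why heavy pinch-zooming looks soft:

```python
# Toy sketch of bare-bones digital zoom: crop the center of a grayscale
# frame, then enlarge the crop back to full size with nearest-neighbor
# sampling. No new detail is created; pixels are just repeated.

def digital_zoom(frame, zoom):
    h, w = len(frame), len(frame[0])
    ch, cw = int(h / zoom), int(w / zoom)            # size of the center crop
    top, left = (h - ch) // 2, (w - cw) // 2
    crop = [row[left:left + cw] for row in frame[top:top + ch]]
    # Nearest-neighbor upscale back to the original frame size.
    return [[crop[int(y * ch / h)][int(x * cw / w)] for x in range(w)]
            for y in range(h)]

frame = [[x + y for x in range(8)] for y in range(8)]
zoomed = digital_zoom(frame, 2.0)   # 8x8 output built from only a 4x4 crop
```

Real camera apps use far better resamplers and multi-frame tricks than this, but the underlying constraint is the same: a 2x digital zoom keeps only a quarter of the original pixels, and a 3x keeps about a ninth.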

Google’s Super Res Zoom feature, introduced with the Pixel 3, interpolates and sharpens the image up to 30x zoom level on the Pixel 10 Pros (and up to 20x zoom on the Pixel 10 and Pixel 10 Pro Fold). The new Pro Res Zoom on the Pixel 10 Pro pushes way beyond that to 100x zoom — with a significant lift from AI.

Past 30x, Pro Res Zoom uses generative AI to refine and rebuild areas of the image based on the underlying pixels captured by the camera sensor. It’s similar to the technology that Magic Editor uses when you move an object to another area in the image, or type a prompt to add things that weren’t there in the first place. Only in this case, the Pixel Camera app creates a generative AI version of what you captured to give the image crisp lines and features. All the processing is performed on-device.

Reynolds explained that one of the factors driving the creation of Pro Res Zoom was the environments where people are taking photos. «They’re taking pictures in the same levels of low light — dinners did not get darker since we launched Night Sight,» he said. «But what is changing is how much people zoom, [and] because the tech is getting so much better, we took this opportunity to reset and refocus the program on incredible zoom quality.»

Pro Res Zoom works best on static scenes such as buildings, skylines, foliage and the like — things that don’t move. It won’t try to reconstruct faces or people, since generative AI can often make them stand out more as being artificially manipulated. The generated image is saved alongside the image captured by the camera sensor so you can choose which one looks best.

What about consistency and accuracy of the AI processing? Generative AI images are built out of pixel noise that is quickly refined based on the input driving them. Visual artifacts have often gone hand-in-six-fingered-hand with generated imagery.

But that’s a different kind of generative AI, says Reynolds. «When I think of Gen AI in this application, I think of something where the team has spent a couple of years getting it really tuned for exactly our use case, which is image enhancement, image to image.»

Initially, people inside Google were worried about artifacts, but the result is that «every image you see should be truly authentic to the real photo,» he said.

Auto Best Take

This new feature seems like a natural evolution — and by «natural,» I mean «processor speeds have improved enough to make it happen.» The Best Take feature was introduced with the Pixel 8, letting you capture several shots of a person or group of people, and have the phone merge them into one photo where everyone’s expressions look good. CNET’s Patrick Holland wrote in his review of the Pixel 8, «It’s the start of a path where our photography can be even more curated and polished, even if the photos we take don’t start out that way.»

That path has led to Auto Best Take, which does it automatically — and not just grabbing a handful of images to work with. Says Reynolds, «[It] can analyze… I think we’re up to 150 individual frames within just a few seconds, and pick the right five or six that are most likely to yield you the perfect photo. And then it runs Best Take.»

From the photographer’s point of view, the phone is doing all the work, though, as with Pro Res Zoom, you can also view the handful of shots that went into the final merged image if you’re not happy with the result. The shots are full-resolution and fully processed as if you’d snapped them individually.

«What’s interesting about this is you might actually find in your testing that Auto Best Take doesn’t trigger very often, and there’s a very particular reason for that,» said Reynolds. «Once the camera gets to look at 150 items, it’s probably going to find one where everybody was looking at the camera, because if there’s even one, it’ll pick it up.»

Improved Portrait mode and Real Tone

Another improvement enabled by the Pixel 10 Pro’s Tensor G5 processor is a new high-resolution Portrait mode. To take advantage of the wide camera’s 50-megapixel resolution, Reynolds said the Pixel team rebuilt the Portrait mode model so it creates a higher quality soft-background depth effect, particularly around a subject’s hair.

Real Tone, the technology for more accurately representing skin tones, is also incrementally better. As Reynolds explained, Real Tone has progressed from establishing color balances for people versus the other areas of a frame to individual color balances for each person in the image.

«That’s not just going to mean better consistency shot to shot, it means better consistency scene to scene,» he said, «because your color, your [skin] tone, won’t depend so strongly on the other things that happened in the image.»

He also mentioned that a core component of Real Tone has been the ability to scale up image quality testing methods and data collection in the process of bringing the feature’s algorithms to market.

«What standards are we setting for diversity and equity, inclusion across the entire feature set?» he said. «Real Tone is primarily a mission and a process.»

Instant View feature in the Pixel 10 Pro Fold

One other significant photo hardware improvement has nothing to do with the cameras. On the Pixel 10 Pro Fold, the Pixel Camera app takes advantage of the large internal screen by showing the previous photo you captured on the left side of the display. Instead of straining to see details in a tiny thumbnail in the corner of the app, Instant View gives a full-size shot, which is especially helpful when you’re taking multiple photos of a person or subject.

Camera Coach

So far, these new Pixel 10 camera features are incorporated into the moment you capture a photo, but Reynolds also wants to use the phones’ cameras to encourage people to become better photographers. Camera Coach is an assistant that you can invoke when you’re stuck or looking for new ideas while photographing a scene.

It can look at the picture you’re trying to take and help you improve it using suggestions such as getting closer to a subject for better framing or moving the camera lower for a more dramatic angle. When you tap a Get Inspired button, the Pixel Camera app looks at the scene and makes suggestions.

«Whether you’re a beginner and you just need step-by-step instructions to learn how to do it,» said Reynolds, «or you’re someone like me who needs a little more push on the creativity when sometimes I’m busy or stressed, it helps me think creatively.»

C2PA content credentials

All of this AI being worked into the photographic process, from Pro Res Zoom to Auto Best Take, invariably brings up the unresolved question of whether the images we’re creating are genuine. And in a world that is now awash in AI-generated images that look real enough, people are naturally guarded about the provenance of digital images.

For Google, one answer is to label everything. Each image captured by the Pixel 10 cameras or touched by Google Photos is tagged with C2PA Content Credentials (from the Coalition for Content Provenance and Authenticity), even if it's untouched by AI. The Pixel 10 is the first smartphone with C2PA built in.

«We really wanted to make a big difference in transparency and credibility and teaching people what to expect from AI,» said Reynolds. «The reason we are so committed to saving this metadata in every Pixel camera picture is so people can start to be suspicious of pictures without any information.»

Marking images that have no AI editing is meant to instill trust in them. «The image with an AI label is less malicious than an image without one,» said Reynolds. «When you send a picture of someone, they can look at the C2PA in that picture. So we’re trying to build this whole network that customers can start to expect to have this information about where a photo came from.»

What’s new in the Pixel 10 camera hardware

Scanning the specs of the Pixel 10 cameras, listed below, you’d rightly notice that they match those found on last year’s Pixel 9 models, but a couple of details stand out.

For one, having a dedicated telephoto camera is no longer a feature that separates the entry-level Pixel from the Pro models. The Pixel 10 now has its own 10.8-megapixel, f/3.1 telephoto camera with optical image stabilization that offers 5x optical zoom and up to 20x Super Res Zoom.

It’s not as good as the 48-megapixel f/2.8 telephoto camera used in the Pixel 10 Pro and Pixel 10 Pro XL (the same one used in the Pixel 9 Pros), but that’s not the point. You don’t need to give up extra zoom just to buy a more affordable phone.
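A back-of-the-envelope sketch (my own illustrative helper, not Google's math) shows why software has to carry the zoom past the lens's optical reach: beyond 5x, the rest is a crop, and real sensor data falls off with the square of the extra magnification:

```python
# Illustrative helper, not Google's figures: once total zoom exceeds the
# optical zoom, the remainder is a digital crop, and the usable sensor
# data shrinks with the square of that extra magnification.

def effective_megapixels(sensor_mp: float, optical_x: float, total_x: float) -> float:
    digital_crop = max(total_x / optical_x, 1.0)
    return sensor_mp / digital_crop ** 2

print(effective_megapixels(10.8, 5, 20))  # Pixel 10 at 20x: 10.8 / 16 = 0.675 MP
print(effective_megapixels(48, 5, 30))    # Pixel 10 Pro at 30x: 48 / 36, about 1.3 MP
```

Multi-frame merging and sharpening, which is what Super Res Zoom does, claws back some of that lost detail, and the Pro models' generative Pro Res Zoom step goes further still.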

Another difference you’ll encounter, particularly when recording video, is improved image stabilization. The optical image stabilization is upgraded in all three phones, but the stabilization in the Pixel 10 Pros is significantly improved. Although the sensor and lens share the same specs as the Pixel 9 Pro, the wide-angle camera in the Pixel 10 Pro models necessitated a new design to accommodate new OIS components inside the module enclosure. Google says it doubled the range of motion so the lens physically moves through a wider arc to compensate for motion. Alongside that, the stabilization software has been tuned to make it smoother.

Camera Specs for the Pixel 10 Lineup

Pixel 10
Wide camera: 48MP Quad PD, f/1.7, 1/2″ image sensor
Ultra-wide camera: 13MP Quad PD, f/2.2, 1/3.1″ image sensor
Telephoto camera: 10.8MP Dual PD with optical image stabilization, f/3.1, 1/3.2″ image sensor, 5x optical zoom
Front camera: 10.5MP Dual PD with autofocus, f/2.2

Pixel 10 Pro and Pixel 10 Pro XL
Wide camera: 50MP Octa PD, f/1.68, 1/1.3″ image sensor
Ultra-wide camera: 48MP Quad PD with autofocus, f/1.7, 1/2.55″ image sensor
Telephoto camera: 48MP Quad PD with optical image stabilization, f/2.8, 1/2.55″ image sensor, 5x optical zoom
Front camera: 42MP Dual PD with autofocus, f/2.2

Pixel 10 Pro Fold
Wide camera: 48MP Quad PD, f/1.7, 1/2″ image sensor
Ultra-wide camera: 10.5MP Dual PD with autofocus, f/2.2, 1/3.4″ image sensor
Telephoto camera: 10.8MP Dual PD with optical image stabilization, f/3.1, 1/3.2″ image sensor, 5x optical zoom
Front camera: 10MP Dual PD, f/2.2
Inner camera: 10MP Dual PD, f/2.2



Copyright © Verum World Media