I Took the iPhone 15 Pro Max and 13 Pro Max to Yosemite for a Camera Test

Does the latest Apple phone's camera capture the epic majesty of Yosemite National Park better than a two-year-old iPhone's? We find out.

This past week, I took Apple’s new iPhone 15 Pro Max on an epic adventure to California’s Yosemite National Park.

As a professional photographer, I take tens of thousands of photos every year. Much of my work is done inside my San Francisco photo studio, but I also spend a considerable amount of time shooting on location. I still use a DSLR, but my iPhone 13 Pro is never far from me.

Like most people nowadays, I don’t upgrade my phone every year or even two. Phones have reached a point where they are good at performing daily tasks for three or four years. And most phone cameras are sufficient for capturing everyday special moments to post on social media or share with friends.

Taft Point at sunset, shot on iPhone 15 Pro Max

But maybe, like me, you're in the mood for something shiny and new like the iPhone 15 Pro Max. I wanted to find out how my two-year-old iPhone 13 Pro and its 3x optical zoom would do against the 15 Pro Max and its new 5x optical zoom. And what better place to take them than on an epic adventure to Yosemite, one of the crown jewels of America's National Park System and an iconic destination for outdoor lovers?

Yosemite is absolutely, massively impressive.

El Capitan, shot at 2x zoom

The main camera is still the best camera

The iPhone 15 Pro Max's main camera, with its wide-angle lens, is the most important camera on the phone. Its new, larger 48-megapixel sensor had no problem serving as my daily workhorse for a week.

Sunrise at Tunnel View in Yosemite National Park

The larger sensor means the camera can now capture more light and render colors more accurately. And the improvements are visible: photos look richer not only in bright light but also in low-light scenarios.

In the images below, taken at sunrise at Tunnel View in Yosemite National Park, notice how the 15 Pro Max’s photo has better fidelity, color and contrast in the foreground leaves. Compare that against the pronounced edge sharpening of the mountaintops in the 13 Pro image.

The 15 Pro Max's camera captures excellent detail in bright light, including more texture in rocky landscapes, more detail in the trees and more fine-grained color.

Sunrise at Tunnel View in Yosemite National Park

A new 15 Pro Max feature aimed at satisfying a camera nerd's creative itch uses the larger main sensor, combined with the A17 Pro chip, to turn the 24mm-equivalent wide-angle lens into essentially four lenses. You can switch the main camera between 1x, 1.2x, 1.5x and 2x, the equivalents of 24mm, 28mm, 35mm and 50mm prime lenses, four of the most popular prime focal lengths. In reality, the 15 Pro Max takes crops of the sensor and uses some clever processing to correct lens distortion.
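
The arithmetic behind those virtual lenses is straightforward: each zoom factor is simply a crop that multiplies the main camera's 24mm-equivalent focal length. Here's a minimal sketch of that math, illustrative only and not Apple's actual processing pipeline:

```swift
// Illustrative arithmetic only, not Apple's pipeline: each "virtual" lens is a
// crop of the 24mm-equivalent main camera, so the zoom factor scales the
// 35mm-equivalent focal length.
let mainLensEquivalent = 24.0 // mm, 35mm-equivalent focal length of the main camera

for zoom in [1.0, 1.2, 1.5, 2.0] {
    let cropEquivalent = Int((mainLensEquivalent * zoom).rounded())
    print("\(zoom)x crop ≈ \(cropEquivalent)mm equivalent")
}
// Prints roughly 24mm, 29mm, 36mm and 48mm, which land close to the classic
// 24mm, 28mm, 35mm and 50mm prime-lens lengths mentioned above.
```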

In use, it’s nice to have these crop options, but for most people they will likely be of little interest.

Climbers gather around the famous Midnight Lightning boulder

I find the 15 Pro Max’s native 1x view a little wide and enjoy being able to change it to default to 1.5x magnification. I went into Settings, tapped on Camera, then on Main Camera and changed the default lens to a 35mm look. Now, every time I open the camera, it’s at 1.5x and I can just focus on framing and taking the photo instead of zooming in.

Another nifty change that I highly recommend is customizing the Action button so that it opens the camera when you long-press it. The Action button replaces the ring/silent switch that has been on every iPhone since the original. You can program the Action button to trigger a handful of features or shortcuts by going into the Settings app and tapping Action Button. Once you open the camera, the Action button can double as a physical camera shutter button.

Hibiki managed to climb the incredibly difficult Midnight Lightning boulder, one of the world's most famous bouldering problems

The dynamic range and detail are noticeably better in photos I took with the 15 Pro Max main camera in just about every lighting condition.

There are fewer blown-out highlights and deeper, blacker blacks with less noise, and there's more tonal range and detail in the whites. I noticed this especially in how the 15 Pro Max captured direct sunlight on climbers and shadow detail in the rock formations.

Overall, the 15 Pro Max's main camera is simply better and more consistent at exposure than the 13 Pro's.

The iPhone 15 Pro Max 5x telephoto camera

Climbers at Swan Slab in the Yosemite Valley. Rich but natural colors and finely rendered textures in the rock.

The iPhone 15 Pro Max has a 5x telephoto camera with an f/2.8 aperture and an equivalent focal length of 120mm.

The 13 Pro’s 3x camera, introduced in 2021, was a huge step up from previous models and still gives zoomed-in images a cinematic feel from the lens’ depth compression. The 15 Pro Max’s longer telephoto lens, combined with a larger sensor, accentuates those cinematic qualities even further, resulting in images with a rich array of color and a wider tonal range.

All this translates to a huge improvement in light capture and a noticeable step up in image quality for the iPhone’s zoom lens.

You can see the improved detail and range evident in the highlights of the water with the iPhone 15 Pro Max, as well as a warmer, more realistic color rendering.

I found that the 15 Pro Max’s telephoto camera yields better photos of subjects farther away like mountains, wildlife and the stage at a live concert.

Shot on iPhone 13 Pro Max at 136mm (left) and iPhone 15 Pro Max at 120mm (right). You can see the exposure, range and natural color rendering improvements on the iPhone 15 Pro Max.

A combination of optical image stabilization and 3D sensor-shift makes the 15 Pro Max's telephoto camera easier to use by steadying the image as you shoot. A longer lens typically means a greater chance of blurred images from hand shake, because such a long focal length magnifies every little movement of the camera.

I found that the 3D sensor-shift optical image stabilization system does wonders for shooting distant subjects and minimizing that camera shake.

The image below was shot with the 5x zoom on the iPhone 15 Pro Max looking up the Yosemite Valley from Tunnel View. It is an incredibly crisp telephoto image.

5x zoom on the iPhone 15 Pro Max looking up the Yosemite Valley from Tunnel View.

For reference, the image below was shot on the 15 Pro Max from the same location using the Ultra Wide lens. I am about five miles away from that V-shaped dip at the end of the valley.

A view of the Yosemite Valley from the Tunnel View observation point, shot on the iPhone 15 Pro Max using the Ultra Wide lens.

The iPhone still suffers from lens flare

Lens flares, along with the green dot that seems to be in all iPhone images taken into direct sunlight, continue to be an issue on the iPhone 15 Pro Max despite the new lens coatings.

Apple says the main camera's lens has been treated with an anti-glare coating, but I didn't notice any improvement. In some cases, images show even more lens flare than photos from previous iPhone models.

Notice the repeated halo effect surrounding the sun in the images below, shot at Lower Yosemite Falls.

As the sun pokes over the top of Dewey Point, we see some lens flare and the 'green dot' appear.
The signature iPhone lens flare dot on the iPhone 15 Pro Max
Lens flare on iPhone 13 Pro Max vs. iPhone 15 Pro Max

The 15 Pro Max and Smart HDR 5

Lower Yosemite Falls, shot on iPhone 15 Pro Max Main camera

The 15 Pro Max's new A17 Pro chip brings greater computational power to an updated processing pipeline Apple calls Smart HDR 5, which delivers more natural-looking images than the 13 Pro's, especially in very bright and very dark scenes. Color handling is noticeably better and more subtle, with a less heavy-handed approach to brightening shadows and darkening highlights.

You can clearly see the warmer, more natural-looking light in the 15 Pro Max photo below, pushing back against the blue cast that's common in over-processed HDR images. At the same time, Apple's implementation hasn't swayed too far in the opposite direction and refrains from the oversaturated oranges that frequently trouble digital color corrections on phones.

Bridalveil Fall, shot at 2x zoom

Coming from an iPhone 13 Pro Max, I noticed that the behind-the-scenes corrections applied during computational processing on the 15 Pro Max tend to result in more discreet, balanced images. Apple appears to have dialed back its bombastic pursuit of pushing computational photography right in our faces, as it did with the 13 Pro, and fine-tuned the 15 Pro Max's image pipeline to lean toward a more realistic reflection of your subject.

It’s a welcome change.

The 15 Pro Max shines in night mode 

Self portrait shot on iPhone 15 Pro Max mounted on a tripod on top of Sentinel Dome in Yosemite National Park.

Night mode shots from the 15 Pro Max look similar to the ones from my 13 Pro Max, but there are minor improvements in the exposure that result in images with a better tonal range. The 15 Pro Max’s larger main camera sensor captures photos with less noise in the blacks and a better overall exposure compared to the 13 Pro Max.

Colors in 15 Pro Max night mode images appear more accurate and realistic, and they have a wider dynamic range. Notice the detail in the photo below of El Capitan and The Dawn Wall. The 15 Pro Max even captures detail in the car lights snaking along the valley floor road.

Looking down into the Yosemite Valley from the top of Sentinel Dome at night.

Overall, night mode images continue to look soft and over-processed. Night mode gives snaps a dream-like vibe and that isn’t necessarily a bad thing. These photos are brighter and have less image noise than those shot on my iPhone 13 Pro Max.

Half Dome seen from atop Sentinel Dome at night, shot on iPhone 15 Pro Max Main camera lens, more than an hour and a half after sunset.

15 Pro Max vs. 13 Pro Max: the bottom line

By this point, it should be no surprise that the iPhone 15 Pro Max’s cameras are a significant improvement over the ones on the 13 Pro Max. If photography is a priority for you, I recommend upgrading to it from the 13 Pro Max or earlier.

If you're coming from an iPhone 14 Pro, the improvements seem less dramatic, and it's likely not worth the upgrade. I'm incredibly excited to continue carrying the iPhone 15 Pro Max in my pocket to Yosemite or just around my home.

Google races to put Gemini at the center of Android before Apple’s AI reboot

Google is using its latest Android rollout to position Gemini as the AI layer across phones, Chrome, laptops and cars.

Google is using its latest Android rollout to make Gemini less of a chatbot and more of an operating layer across the phone, browser, car and laptop, just weeks before Apple is expected to show its own Gemini-powered Apple Intelligence reboot at WWDC.
Ahead of its Google I/O developer conference next week, the company previewed a number of Android updates, including AI-powered app automation, a smarter version of Chrome on Android, new tools for creators, a redesigned Android Auto experience, and a sweeping set of new security features.
Alphabet is counting on Gemini to help Google compete directly with OpenAI and Anthropic in the market for artificial intelligence models and services, while also serving as the AI backbone across its expansive portfolio of products, including Android. Meanwhile, Gemini is powering part of Apple’s new AI strategy, giving Google a role in the iPhone maker’s reset even as it races to prove its own version of personal AI on the phone is further along.
Sameer Samat, who oversees Google’s Android ecosystem, told CNBC that Google is rebuilding parts of Android around Gemini Intelligence to help users complete everyday tasks more easily.
“We’re transitioning from an operating system to an intelligence system,” he said.
As part of Tuesday's announcements, Google said Gemini Intelligence will be able to move across apps, understand what's on the screen and complete tasks that would normally require a user to jump between multiple services. That means Android is moving beyond the traditional assistant model, where users ask a question and get an answer, and toward acting more like an agent.
For instance, Google says Gemini can pull relevant information from Gmail, build shopping carts and book reservations. Samat gave the example of asking Gemini to look at the guest list for a barbecue, build a menu, add ingredients to an Instacart list and return for approval before checkout.
A big concern surrounding agentic AI involves software taking action on a user's behalf without permission. Samat said Gemini will come back to the user before completing a transaction, adding, "the human is always in the loop."
Four months after announcing its Gemini deal with Google, Apple is under pressure to show a more capable version of Apple Intelligence, which has been a relative laggard on the market. Apple has long framed privacy, hardware integration and control of the user experience as its advantages.
Google’s Android push is designed to show it can bring AI deeper into the device experience while still giving users control over what Gemini can see, where it can act and when it needs confirmation.
The app automation features will roll out in waves, starting with the latest Samsung Galaxy and Google Pixel phones this summer, before expanding across more Android devices, including watches, cars, glasses and laptops later this year.
The company is also redesigning Android Auto around Gemini, turning the car into another major surface for its assistant. Android Auto is in more than 250 million cars, and Google says the new release includes its biggest maps update in a decade and Gemini-powered help with tasks like ordering dinner while driving.
Alphabet’s AI strategy has been embraced by Wall Street, which has pushed the company’s stock price up more than 140% in the past year, compared to Apple’s roughly 40% gain. Investors now want to see how Gemini can become more central to the products people use every day.

Waymo recalls 3,800 robotaxis after glitch allowed some vehicles to ‘drive into standing water’

Waymo issued a voluntary recall of about 3,800 of its robotaxis to fix software issues that could allow them to drive into flooded roadways.

Waymo is recalling about 3,800 robotaxis in the U.S. to fix software issues that could allow them to “drive onto a flooded roadway,” according to a letter on the National Highway Traffic Safety Administration’s website.
The voluntary recall is for Waymo vehicles that use the company’s fifth and sixth generation automated driving systems (or ADS), the U.S. auto safety regulator said in the letter posted Tuesday.
Waymo autonomous vehicles in Austin, Texas, were seen on camera driving onto a flooded street and stalling, requiring other drivers to navigate around them. It’s the latest example of a safety-related issue for the Alphabet-owned AV unit that’s rapidly bolstering its fleet of vehicles and entering new U.S. markets.
Waymo has drawn criticism for its vehicles failing to yield to school buses in Austin, and for the performance of its vehicles during widespread power outages in San Francisco in December, when robotaxis halted in traffic, causing gridlock.
The company said in a statement on Tuesday that it’s “identified an area of improvement regarding untraversable flooded lanes specific to higher-speed roadways,” and opted to file a “voluntary software recall” with the NHTSA.
“Waymo provides over half a million trips every week in some of the most challenging driving environments across the U.S., and safety is our primary priority,” the company said.
Waymo added that it’s working on “additional software safeguards” and has put “mitigations” in place, limiting where its robotaxis operate during extreme weather, so that they avoid “areas where flash flooding might occur” in periods of intense rain.

Qualcomm tumbles 13% as semiconductor stocks retreat from historic AI-fueled surge

Semiconductor equities reversed sharply after a broad AI-driven advance, with Qualcomm suffering its worst day since 2020 amid inflation concerns and rising oil prices.

Semiconductor stocks fell sharply on Tuesday, reversing course after an extensive rally that had expanded the artificial intelligence investment theme well past Nvidia and driven the industry to unprecedented levels.

Qualcomm plunged 13% and was on track for its steepest single-day decline since 2020. Intel shed 8%, while On Semiconductor and Skyworks Solutions each lost more than 6%. The iShares Semiconductor ETF, which benchmarks the overall sector, fell 5%.

The sell-off came after a key gauge of consumer prices came in above forecasts, and as conflict in Iran pushed crude oil higher—prompting investors to shift away from riskier assets.

The preceding advance had widened the AI opportunity set beyond longtime industry leader Nvidia, which for much of the past several years had largely carried the market to new peaks on its own.

Explosive appetite for central processing units, along with the graphics processing units that power large language models, has sent chipmakers to all-time highs.

Market participants are wagering that the shift from AI model training to autonomous agents will lift demand for additional AI hardware. Among the beneficiaries are memory chip producers, which are raising prices as supply remains tight.

Micron Technology slid 6%, and Sandisk cratered 8%. Sandisk’s stock has surged more than six times over since January.
