Technologies
I Took the iPhone 15 Pro Max and 13 Pro Max to Yosemite for a Camera Test
Do the latest Apple phone's cameras capture the epic majesty of Yosemite National Park better than a two-year-old iPhone's? We find out.
This past week, I took Apple’s new iPhone 15 Pro Max on an epic adventure to California’s Yosemite National Park.
As a professional photographer, I take tens of thousands of photos every year. Much of my work is done inside my San Francisco photo studio, but I also spend a considerable amount of time shooting on location. I still use a DSLR, but my iPhone 13 Pro is never far from me.
Like most people nowadays, I don't upgrade my phone every year, or even every two. Phones have reached a point where they handle daily tasks well for three or four years. And most phone cameras are sufficient for capturing everyday special moments to post on social media or share with friends.


But maybe, like me, you’re in the mood for something shiny and new like the iPhone 15 Pro Max. I wanted to find out how my 2-year-old iPhone 13 Pro and its 3x optical zoom would do against the 15 Pro Max and its new 5x optical zoom. And what better place to take them than on an epic adventure to Yosemite, one of the crown jewels of America’s National Park System and an iconic destination for outdoor lovers.
Yosemite is absolutely, massively impressive.


The main camera is still the best camera
The iPhone 15 Pro Max's main camera, with its wide-angle lens, is the most important camera on the phone. Its new, larger 48-megapixel sensor had no problem serving as my daily workhorse for a week.


The larger sensor means the camera can now capture more light and render colors more accurately, and the improvements are visible. Photos look richer not only in bright light but also in low-light scenarios.
In the images below, taken at sunrise at Tunnel View in Yosemite National Park, notice how the 15 Pro Max’s photo has better fidelity, color and contrast in the foreground leaves. Compare that against the pronounced edge sharpening of the mountaintops in the 13 Pro image.
The 15 Pro Max’s camera captures excellent detail in bright light, including more texture, like in rocky landscapes, more detail in the trees and more fine-grained color.


A new 15 Pro Max feature aimed at satisfying a camera nerd's creative itch uses the larger main sensor, combined with the A17 Pro chip, to turn the 24mm-equivalent wide-angle lens into essentially four lenses. You can switch the main camera between 1x, 1.2x, 1.5x and 2x, the equivalents of 24mm, 28mm, 35mm and 50mm prime lenses – four of the most popular prime focal lengths. In reality, the 15 Pro Max takes crops from the sensor and uses some clever processing to correct lens distortion.
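The arithmetic behind those four "lenses" is easy to sketch. The snippet below is my own illustration of how sensor crops map to equivalent focal lengths, not Apple's implementation, and the marketed values are rounded to classic prime lengths (1.2x works out to roughly 29mm, which Apple labels 28mm):

```python
# Rough sketch (my own illustration): each "lens" on the 15 Pro Max's
# main camera is a crop of the 24mm-equivalent wide-angle sensor.
# Equivalent focal length scales linearly with the zoom factor, while
# the usable resolution falls off as the square of the crop.

BASE_FOCAL_MM = 24      # main camera's full-frame-equivalent focal length
BASE_MEGAPIXELS = 48    # full sensor resolution

def crop_equivalent(zoom: float) -> tuple[float, float]:
    """Return (equivalent focal length in mm, remaining megapixels)."""
    focal_mm = BASE_FOCAL_MM * zoom
    megapixels = BASE_MEGAPIXELS / zoom ** 2
    return focal_mm, megapixels

for zoom in (1.0, 1.2, 1.5, 2.0):
    focal_mm, mp = crop_equivalent(zoom)
    print(f"{zoom}x -> ~{focal_mm:.0f}mm equivalent, ~{mp:.0f}MP of sensor")
```

By this reckoning, even the tightest 2x crop still draws on roughly 12 megapixels of the 48-megapixel sensor, which helps explain why the cropped views hold up in practice.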
In use, it’s nice to have these crop options, but for most people they will likely be of little interest.


I find the 15 Pro Max’s native 1x view a little wide and enjoy being able to change it to default to 1.5x magnification. I went into Settings, tapped on Camera, then on Main Camera and changed the default lens to a 35mm look. Now, every time I open the camera, it’s at 1.5x and I can just focus on framing and taking the photo instead of zooming in.
Another nifty change that I highly recommend is customizing the Action button so that it opens the camera when you long-press it. The Action button replaces the mute switch that has been on every iPhone since the original. You can program it to trigger a handful of features or shortcuts by going into the Settings app and tapping Action button. Once the camera is open, the Action button can double as a physical shutter button.


The dynamic range and detail are noticeably better in photos I took with the 15 Pro Max main camera in just about every lighting condition.
There are fewer blown-out highlights and nicer, blacker blacks with less noise. In particular, there is more tonal range and detail in the whites. I noticed this especially in how the 15 Pro Max captured direct sunlight on climbers and shadow detail in the rock formations.
Read more: iPhone 15 Pro Max Camera vs. Galaxy S23 Ultra: Smartphone Shootout
Overall, the 15 Pro Max's main camera is simply far better and more consistent at exposure than the 13 Pro's.
The iPhone 15 Pro Max 5x telephoto camera


The iPhone 15 Pro Max has a 5x telephoto camera with an f/2.8 aperture and an equivalent focal length of 120mm.
The 13 Pro’s 3x camera, introduced in 2021, was a huge step up from previous models and still gives zoomed-in images a cinematic feel from the lens’ depth compression. The 15 Pro Max’s longer telephoto lens, combined with a larger sensor, accentuates those cinematic qualities even further, resulting in images with a rich array of color and a wider tonal range.
All this translates to a huge improvement in light capture and a noticeable step up in image quality for the iPhone’s zoom lens.


I found that the 15 Pro Max’s telephoto camera yields better photos of subjects farther away like mountains, wildlife and the stage at a live concert.


A combination of optical image stabilization and 3D sensor-shift makes the 15 Pro Max's longer telephoto easier to use by steadying the image during capture. A longer lens typically means a greater chance of blurred images from hand shake, because such a long focal length magnifies every little movement of the camera.
I found that the 3D sensor-shift optical image stabilization system does wonders for shooting distant subjects and minimizing that camera shake.
The image below was shot with the 5x zoom on the iPhone 15 Pro Max looking up the Yosemite Valley from Tunnel View. It is an incredibly crisp telephoto image.


For reference, the image below was shot on the 15 Pro Max from the same location using the Ultra Wide lens. I am about five miles away from that V-shaped dip at the end of the valley.


The iPhone still suffers from lens flare
Lens flares, along with the green dot that seems to be in all iPhone images taken into direct sunlight, continue to be an issue on the iPhone 15 Pro Max despite the new lens coatings.
Apple says the main camera lens has been treated for anti-glare, but I didn’t notice any improvements. In some cases, images have even greater lens flares than photos from previous iPhone models.
Notice the repeated halo effect surrounding the sun on the images below shot at Lower Yosemite Falls.






The 15 Pro Max and Smart HDR 5


The 15 Pro Max's new A17 Pro chip brings greater computational power, enabling what Apple calls Smart HDR 5, which delivers more natural-looking images than the 13 Pro, especially in very bright and very dark scenes. Color handling is noticeably better and more subtle, with a less heavy-handed balance between brightening the shadows and darkening the highlights.
You can clearly see the warmer, more natural-looking light in the 15 Pro Max photo below, pushing back against the typical blue cast that is common in over-processed HDR images. At the same time, Apple's implementation hasn't swung too far in the opposite direction and refrains from oversaturating oranges, a problem that frequently troubles digital corrections on phones.


Coming from an iPhone 13 Pro Max, I noticed that the computational processing on the 15 Pro Max tends to produce more subtle and balanced images. Apple appears to have dialed back the bombastic, in-your-face computational photography of the 13 Pro and fine-tuned the 15 Pro Max's image pipeline to lean toward a more realistic reflection of your subject.
It’s a welcome change.
The 15 Pro Max shines in night mode


Night mode shots from the 15 Pro Max look similar to the ones from my 13 Pro Max, but there are minor improvements in the exposure that result in images with a better tonal range. The 15 Pro Max’s larger main camera sensor captures photos with less noise in the blacks and a better overall exposure compared to the 13 Pro Max.
Colors in 15 Pro Max night mode images appear more accurate and realistic, with a wider dynamic range. Notice the detail in the photo below of El Capitan and The Dawn Wall. The 15 Pro Max even captures detail in the car lights snaking along the valley floor road.


Still, night mode images continue to look soft and over-processed. Night mode gives snaps a dream-like vibe, and that isn't necessarily a bad thing. These photos are brighter and have less image noise than those shot on my iPhone 13 Pro Max.


15 Pro Max vs. 13 Pro Max: the bottom line
By this point, it should be no surprise that the iPhone 15 Pro Max’s cameras are a significant improvement over the ones on the 13 Pro Max. If photography is a priority for you, I recommend upgrading to it from the 13 Pro Max or earlier.
If you're coming from an iPhone 14 Pro, the improvements are less dramatic, and it's likely not worth the upgrade. I'm incredibly excited to keep carrying the iPhone 15 Pro Max in my pocket, whether to Yosemite or just around my home.
TikTok to Let Apple Music Users Stream Full Songs Without Ever Leaving the App
TikTok and Apple Music come together to introduce two new features to the music listening experience.
If you've ever scrolled TikTok, caught a snippet of a tune, and thought, "I wish I could play this song all the way through," this is for you. TikTok and Apple Music announced on Wednesday that they have partnered on two new features, Play Full Song and Listening Party. The goal is to offer listeners a seamless music listening experience without ever leaving the social media app.
Apple Music subscribers who discover a song on their TikTok For You Page or on the Sound Detail Page will be able to click Play Full Song to open the Apple Music player and listen to the track in its entirety. From there, subscribers to the music streaming service will be able to save the song as a favorite, add it to a playlist on Apple Music and listen to a customized stream of recommended songs.
When a full-length song is played, the stream will pay artists through Apple Music.
"Tapping into the music you love should feel effortless," Ole Obermann, co-head of Apple Music, said in a statement. "With Play Full Song, Apple Music subscribers can move easily from discovering a track on TikTok to listening to it in full instantly, without breaking the flow. This integration not only makes it easier for fans to discover, listen to, and engage with the artists they love, but also creates a powerful new pathway for artists — turning moments of discovery into deeper connection and sustained engagement in one simple, seamless experience."
Listening Party sounds somewhat like Spotify's feature of the same name. Fans join a shared, real-time session where they listen to the same tracks together and interact live, with the songs streamed through Apple Music inside TikTok. Musicians can also join and chat with their fans.
"TikTok is where music discovery and culture move at the speed of the community," Tracy Gardner, global head of music business development at TikTok, said in a statement. "Thanks to Apple Music, Play Full Song gives fans a seamless way to go from discovery to full-length listening, and Listening Party provides a shared place to experience music together in real time. It's all about bringing artists and fans closer, and turning shared moments into lasting connections."
Play Full Song and Listening Party will launch globally on TikTok over the next few weeks.
AI Chatbots Are Making People All Think the Same, Study Says
A new paper argues that humans are losing varied ways of thinking due to the use of chatbots, and that’s concerning.
Part of what makes us human is the unique ways we think and solve problems. But using large language models like ChatGPT might be eroding this uniqueness and leading humans to think and communicate the same way, according to a group of scientists and psychologists who have co-authored a new opinion paper.
"Individuals differ in how they write, reason, and view the world," Zhivar Sourati, a computer scientist at the University of Southern California and first author of the paper, said in a statement.
"When these differences are mediated by the same LLMs, their distinct linguistic style, perspective and reasoning strategies become homogenized, producing standardized expressions and thoughts across users," Sourati continued.
The paper, published Wednesday in the journal Trends in Cognitive Sciences, examines how hundreds of millions of people worldwide use the same handful of chatbots and what that means for our individuality.
Thinking inside the box
Pew Research found that one-third of all Americans used ChatGPT last year, double the 2023 figure. And chatbot use is much more common among teens: Two-thirds say they use chatbots, and almost a third use them daily.
Businesses are also going all in on artificial intelligence. Stanford found that 78% of organizations reported using AI in 2024, up from 55% in 2023.
So we’re using AI a lot. But the danger is that we could lose the diversity in the ways we think. The team points out that LLMs generate writing that varies less than what people come up with on their own.
Part of the reason LLMs may be pushing homogenized thought, according to the paper’s authors, is the data used to train them.
"Because LLMs are trained to capture and reproduce statistical regularities in their training data, which often overrepresent dominant languages and ideologies, their outputs often mirror a narrow and skewed slice of human experience," Sourati says.
Why diverse thinking matters
There’s a good reason why the authors warn against this trend. Homogenized thought reduces pluralism, which is essentially the idea that multiple perspectives are good for society as a whole.
"This value of pluralism is rooted in the long-held principle that sound judgment requires exposure to varied thought," the authors write in the paper. "Unchecked, this homogenization risks flattening the cognitive landscapes that drive collective intelligence and adaptability."
So we use different ways of thinking to figure out more solutions to a problem. If we lose the ability to think and communicate differently, it could affect how we adapt to new situations.
"The concern is not just that LLMs shape how people write or speak, but that they subtly redefine what counts as credible speech, correct perspective, or even good reasoning," Sourati says.
The authors also say that this trend even impacts people who don’t use chatbots.
"If a lot of people around me are thinking and speaking in a certain way, and I do things differently, I would feel a pressure to align with them, because it would seem like a more credible or socially acceptable way of expressing my ideas," Sourati says.