I Took the iPhone 15 Pro Max and 13 Pro Max to Yosemite for a Camera Test
Do the latest Apple phone’s cameras capture the epic majesty of Yosemite National Park better than a two-year-old iPhone’s? We find out.
This past week, I took Apple’s new iPhone 15 Pro Max on an epic adventure to California’s Yosemite National Park.
As a professional photographer, I take tens of thousands of photos every year. Much of my work is done inside my San Francisco photo studio, but I also spend a considerable amount of time shooting on location. I still use a DSLR, but my iPhone 13 Pro is never far from me.
Like most people nowadays, I don’t upgrade my phone every year, or even every two. Phones have reached a point where they can handle daily tasks well for three or four years. And most phone cameras are sufficient for capturing everyday special moments to post on social media or share with friends.
But maybe, like me, you’re in the mood for something shiny and new like the iPhone 15 Pro Max. I wanted to find out how my 2-year-old iPhone 13 Pro and its 3x optical zoom would fare against the 15 Pro Max and its new 5x optical zoom. And what better place to take them than on an epic adventure to Yosemite, one of the crown jewels of America’s National Park System and an iconic destination for outdoor lovers?
Yosemite is absolutely, massively impressive.
The main camera is still the best camera
The iPhone 15 Pro Max’s main camera with its wide-angle lens is the most important camera on the phone. It has a new, larger 48-megapixel sensor that had no problem serving as my daily workhorse for a week.
The larger sensor means the camera can now capture more light and render colors more accurately. And the improvements are visible: photos look richer not only in bright light but also in low-light scenarios.
In the images below, taken at sunrise at Tunnel View in Yosemite National Park, notice how the 15 Pro Max’s photo has better fidelity, color and contrast in the foreground leaves. Compare that against the pronounced edge sharpening of the mountaintops in the 13 Pro image.
The 15 Pro Max’s camera captures excellent detail in bright light, including more texture in rocky landscapes, more detail in the trees and more fine-grained color.
A new 15 Pro Max feature aimed at satisfying a camera nerd’s creative itch uses the larger main sensor combined with the A17 Pro chip to turn the 24mm-equivalent wide-angle lens into essentially four lenses. You can switch the main camera between 1x, 1.2x, 1.5x and 2x, the equivalent of 24mm, 28mm, 35mm and 50mm prime lenses – four of the most popular focal lengths. In reality, the 15 Pro Max takes crops of the sensor and uses some clever processing to correct lens distortion.
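To make the crop arithmetic concrete, here’s a minimal sketch, my own illustration rather than Apple’s actual pipeline, of what center-cropping a 48-megapixel sensor leaves you with at each preset. The 8,064x6,048 full-resolution figure is an assumption based on the camera’s nominal 48MP output.

```swift
import Foundation

// Illustrative only, not Apple's image pipeline: center-cropping a
// sensor by zoom factor z keeps 1/z of each dimension, so 1/z^2 of
// the pixels. The 8,064 x 6,048 full resolution is an assumption
// based on the sensor's nominal 48MP output.
let fullWidth = 8_064.0
let fullHeight = 6_048.0

let presets: [(zoom: Double, nominalMM: Int)] = [
    (1.0, 24), (1.2, 28), (1.5, 35), (2.0, 50)
]

for preset in presets {
    let width = fullWidth / preset.zoom
    let height = fullHeight / preset.zoom
    let megapixels = width * height / 1_000_000
    print("\(preset.zoom)x (~\(preset.nominalMM)mm): " +
          "\(Int(width)) x \(Int(height)) px, " +
          String(format: "%.1f MP", megapixels))
}
```

Run that and you’ll see the 2x “lens” keeps only around 12 of the 48 megapixels, which is why these presets amount to crops plus processing rather than true optical zoom.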
In use, it’s nice to have these crop options, but for most people they will likely be of little interest.
I find the 15 Pro Max’s native 1x view a little wide and enjoy being able to change it to default to 1.5x magnification. I went into Settings, tapped on Camera, then on Main Camera and changed the default lens to a 35mm look. Now, every time I open the camera, it’s at 1.5x and I can just focus on framing and taking the photo instead of zooming in.
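For developers curious what that 1.5x default amounts to under the hood, here’s a minimal sketch of the equivalent in a third-party camera app using AVFoundation. This is my own illustration of setting a zoom factor, not how Apple’s Settings toggle is implemented:

```swift
import AVFoundation

// Sets a capture device's zoom to 1.5x, roughly a 35mm-equivalent
// field of view on the 24mm main camera. Illustrative sketch only.
func setDefaultZoom(on device: AVCaptureDevice, factor: CGFloat = 1.5) {
    do {
        try device.lockForConfiguration()
        // Clamp to the maximum the current format supports.
        device.videoZoomFactor = min(factor, device.activeFormat.videoMaxZoomFactor)
        device.unlockForConfiguration()
    } catch {
        print("Could not lock device for configuration: \(error)")
    }
}
```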
Another nifty change that I highly recommend is customizing the Action button so that it opens the camera when you long-press it. The Action button replaces the mute switch that has been on every iPhone since the original. You can program it to trigger a handful of features or shortcuts by going into the Settings app and tapping Action Button. Once the camera is open, the Action button can double as a physical shutter button.
The dynamic range and detail are noticeably better in photos I took with the 15 Pro Max main camera in just about every lighting condition.
There are fewer blown-out highlights and nicer, blacker blacks with less noise. In particular, there is more tonal range and detail in the whites. I noticed this particularly in how the 15 Pro Max captured direct sunlight on climbers, and in the shadow detail in the rock formations.
Read more: iPhone 15 Pro Max Camera vs. Galaxy S23 Ultra: Smartphone Shootout
Overall, the 15 Pro Max’s main camera is simply better and more consistent at exposure than the 13 Pro’s.
The iPhone 15 Pro Max 5x telephoto camera
The iPhone 15 Pro Max has a 5x telephoto camera with an f/2.8 aperture and an equivalent focal length of 120mm.
The 13 Pro’s 3x camera, introduced in 2021, was a huge step up from previous models and still gives zoomed-in images a cinematic feel from the lens’ depth compression. The 15 Pro Max’s longer telephoto lens, combined with a larger sensor, accentuates those cinematic qualities even further, resulting in images with a rich array of color and a wider tonal range.
All this translates to a huge improvement in light capture and a noticeable step up in image quality for the iPhone’s zoom lens.
I found that the 15 Pro Max’s telephoto camera yields better photos of subjects farther away like mountains, wildlife and the stage at a live concert.
A combination of optical image stabilization and 3D sensor-shift makes the 15 Pro Max’s upgraded telephoto easier to use by steadying image capture. A longer lens typically means a greater chance of blurred images from hand shake, because such a long focal length magnifies every little movement of the camera.
I found that the 3D sensor-shift optical image stabilization system does wonders for shooting distant subjects and minimizing that camera shake.
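To put rough numbers on that, here’s a toy calculation using the classic reciprocal rule, which says handheld shots need a shutter speed of about 1/(equivalent focal length) seconds or faster. The 4-stop stabilization figure is an assumption for illustration, not an Apple spec:

```swift
import Foundation

// Reciprocal rule of thumb: the slowest "safe" handheld shutter is
// roughly 1/(equivalent focal length) seconds. Each stop of
// stabilization doubles that usable duration.
func slowestSafeShutter(focalLengthMM: Double, stabilizationStops: Double) -> Double {
    (1.0 / focalLengthMM) * pow(2.0, stabilizationStops)
}

let wide = slowestSafeShutter(focalLengthMM: 24, stabilizationStops: 0)
let tele = slowestSafeShutter(focalLengthMM: 120, stabilizationStops: 0)
let teleStabilized = slowestSafeShutter(focalLengthMM: 120, stabilizationStops: 4) // assumed rating

print(String(format: "24mm, no stabilization: 1/%.0f s", 1 / wide))           // 1/24 s
print(String(format: "120mm, no stabilization: 1/%.0f s", 1 / tele))          // 1/120 s
print(String(format: "120mm, 4 stops of OIS: 1/%.0f s", 1 / teleStabilized))  // ~1/8 s
```

In other words, the 120mm-equivalent lens needs a shutter roughly five times faster than the wide camera before stabilization steps in, which is exactly the gap sensor-shift is there to close.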
The image below was shot with the 5x zoom on the iPhone 15 Pro Max looking up the Yosemite Valley from Tunnel View. It is an incredibly crisp telephoto image.
For reference, the image below was shot on the 15 Pro Max from the same location using the ultrawide lens. I am about five miles away from that V-shaped dip at the end of the valley.
The iPhone still suffers from lens flare
Lens flares, along with the green dot that seems to be in all iPhone images taken into direct sunlight, continue to be an issue on the iPhone 15 Pro Max despite the new lens coatings.
Apple says the main camera’s lens has an anti-glare coating, but I didn’t notice any improvements. In some cases, images have even greater lens flares than photos from previous iPhone models.
Notice the repeated halo effect surrounding the sun on the images below shot at Lower Yosemite Falls.
The 15 Pro Max and Smart HDR 5
The 15 Pro Max’s new A17 Pro chip brings greater computational power to the image pipeline Apple calls Smart HDR 5, which delivers more natural-looking images compared with the 13 Pro, especially in very bright and very dark scenes. There is a noticeably subtler handling of color, with a less heavy-handed balance between brightening the shadows and darkening the highlights.
You can clearly see the warmer, more natural-looking light in the 15 Pro Max photo below, pushing back against the blue cast that is common in over-processed HDR images. At the same time, Apple’s implementation hasn’t swung too far in the opposite direction and refrains from the oversaturated oranges that frequently trouble digital corrections on phones.
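As a toy stand-in for whatever Smart HDR 5 actually does internally (which Apple doesn’t document), here’s a sketch of the balancing act being described: lift the shadows, roll off the highlights, and expose a strength knob that mimics a lighter or heavier processing hand:

```swift
import Foundation

// Toy global tone curve, NOT Apple's Smart HDR pipeline. Operates on
// normalized luminance in 0...1. strength = 0 leaves pixels alone;
// strength = 1 applies the full shadow lift and highlight rolloff.
func toneMap(_ luminance: Double, strength: Double) -> Double {
    let lifted = pow(luminance, 0.85)      // gamma < 1 brightens shadows
    let knee = 0.75
    let compressed = lifted <= knee
        ? lifted
        : knee + (lifted - knee) * 0.6     // soft knee tames highlights
    return luminance + (compressed - luminance) * strength
}

for input in [0.05, 0.5, 0.95] {
    print(String(format: "in %.2f -> subtle %.3f, heavy %.3f",
                 input,
                 toneMap(input, strength: 0.4),
                 toneMap(input, strength: 1.0)))
}
```

A heavy hand (strength 1.0) visibly flattens the scene; the subtler setting preserves more of the original contrast, which matches the more restrained look described above.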
Coming from an iPhone 13 Pro Max, I noticed that the background corrections applied during computational processing on the 15 Pro Max tend to result in more discreet, balanced images. Apple appears to have dialed back its habit of pushing computational photography right in our faces, as it did with the 13 Pro, and fine-tuned the 15 Pro Max’s image pipeline to lean toward a more realistic rendering of your subject.
It’s a welcome change.
The 15 Pro Max shines in night mode
Night mode shots from the 15 Pro Max look similar to the ones from my 13 Pro Max, but there are minor improvements in the exposure that result in images with a better tonal range. The 15 Pro Max’s larger main camera sensor captures photos with less noise in the blacks and a better overall exposure compared to the 13 Pro Max.
Colors in 15 Pro Max night mode images appear more accurate and realistic, with a wider dynamic range. Notice the detail in the photo below of El Capitan and The Dawn Wall. The 15 Pro Max even captures detail in the car lights snaking along the valley floor road.
Overall, night mode images continue to look soft and over-processed. Night mode gives snaps a dream-like vibe, and that isn’t necessarily a bad thing. These photos are brighter and have less image noise than those shot on my iPhone 13 Pro Max.
15 Pro Max vs. 13 Pro Max: the bottom line
By this point, it should be no surprise that the iPhone 15 Pro Max’s cameras are a significant improvement over the ones on the 13 Pro Max. If photography is a priority for you, I recommend upgrading to it from the 13 Pro Max or earlier.
If you’re coming from an iPhone 14 Pro, the improvements seem less dramatic, and it’s likely not worth the upgrade. I’m incredibly excited to keep carrying the iPhone 15 Pro Max in my pocket, whether to Yosemite or just around my home.
Watch a Robot Stuff Cash Into a Wallet Just Like You Do
Generalist AI’s Gen-1 model is all about “teaching robots physical common sense.”
In 2026, we’re seeing robots progress by leaps and bounds, with the markedly improved dexterity long needed in the quest for truly useful household helpers. Now a new AI model has arrived to power robots through activities including folding laundry, constructing boxes, fixing other robots and even filling wallets with flimsy paper money.
Earlier this month, California-based company Generalist AI released Gen-1, a new physical AI model that makes robots capable of performing all of these tasks (and more) successfully. It’s a big step forward for robots designed for the real world, based on intelligence born from the real world, Pete Florence, co-founder and CEO of Generalist AI, told me.
In most of the example videos published by the company, Gen-1 is seen running on a pair of robotic arms, but that’s not all it’s built for. “Gen-1 is designed to be the brain of any robot, meaning the same model can run on a humanoid, an industrial arm or other robotic systems,” said Florence.
Already, this has proved to be a breakthrough year for general-purpose humanoid robots, with companies including Boston Dynamics and Honor unveiling cutting-edge bots capable of uncannily humanlike movements. The market for robots is expected to explode, with one estimate from Morgan Stanley predicting growth to a $5 trillion market by 2050. Predictions see robots coming for industry, retail, hospitality and care environments before eventually landing in our homes. To get us there, we need to see further advances in AI.
Training robots to live alongside humans
Over the past few years we’ve seen large language models, such as ChatGPT, Gemini and Claude, evolve at lightning speed. The same hasn’t been true of the physical AI models required to power robots, in large part because of a lack of data to train those models on. Robots — and especially humanoid robots — must learn to navigate a world built for humans just as a human would.
Often this data is collected from robots performing tasks while being teleoperated by humans, but not Gen-1. Instead, the dataset used to train Generalist AI’s models has been assembled by humans completing millions of different tasks using wearable technology.
“We built our own lightweight ‘data hands’ and distributed them globally to learn how people actually interact with objects, with all the subtle force feedback, tactile feel, slips, corrections and recoveries that define human dexterity in the real world,” said Florence. “That kind of data is critical for teaching robots physical common sense, the intuitive understanding and ability to adapt in real time rather than execute rigid instructions.”
Generalist AI has released a series of videos showing the model running on robots repetitively performing a range of different tasks, with the most compelling, perhaps, being a robot drawing cash out of a wallet before reinserting it into the same pocket. This is a fiddly task that many humans fumble over. It’s clearly not easy for the robot, either, given the flimsiness of the paper money and the fabric of the wallet — and yet it completes the task.
Another video shows a robot sorting socks by color, folding them into neat piles and counting the number of pairs using a touchscreen. Other tricky tasks the model can complete include unzipping and filling a pencil case with pens, stacking oranges in a neat pyramid and plugging in an Ethernet cable.
These videos show the breadth of Gen-1’s capabilities, but more impressive is the success rate with which it can complete certain tasks. Generalist AI measured the model’s hit rate against the previous version and found Gen-1 could successfully service a robot vacuum cleaner in 99% of cases (up from 50% for Gen-0), fold boxes in 99% of cases (up from 81% for Gen-0) and package up phones in 99% of cases (up from 62% for Gen-0).
Robots do improv
Most robots are programmed to complete a task in a specific and orderly way. But what happens when a curveball gets thrown? “The smallest changes in the environment can cause failures,” said Florence.
An important skill robots need, which humans innately possess, is the ability to think on their feet. This is why Gen-1 has been designed with improvisation in mind, so it can come up with strategies to complete tasks. Florence gave me an example of a robot using two hands to reposition an awkwardly placed part for an automotive task, even though it had only been trained to use one.
“This kind of creativity has been largely absent from robotics until now,” he said.
Significant work still needs to be done when it comes to beefing up robots’ improv chops, but early progress shows glimpses of a positive impact on both reliability and speed, says Florence. “We’re beginning to see real progress and are excited to push the boundaries of embodied intelligence.”
After all, there may come a day when you need a robot in your house that can fix all your other smaller robots.
iPhone 17 Pro Camera Battles the Galaxy S26 Ultra: Let the Fun Begin
They’re both top-end flagship phones, but which one takes better photos? I wanted to find out.
Both Apple’s iPhone 17 Pro and Samsung’s Galaxy S26 Ultra earned coveted CNET Editors’ Choice awards in their full reviews. And they damned well earned them, too, thanks to their stellar overall performance and the wealth of top-end tech on board. They also garnered praise for their camera quality, with both able to take great-looking photos in a variety of conditions. But which does it better?
As a professional photographer myself, I was keen to find out, so I took them on a series of photo walks around Scotland to put them to the test in the same conditions.
Before we dive in, a few notes from me. First, all images were captured in JPEG format using the standard camera app on each phone. On some images on the iPhone, Apple’s Gold Photographic Style was activated; on others, it was set to Standard, and I’ll be highlighting which is which. The images have been imported into Adobe Lightroom for comparison purposes and exported at smaller file sizes to better suit online viewing. No edits to the images themselves were made, and no sharpening was applied on the export.
Read more: These Are the Best Phone Cameras That We’ve Tested
Crucially, though, it’s important to keep in mind that the analysis here is my opinion. Photography is largely subjective, and what might look good to one person might not to another. For me, I love a more natural-looking image with accurate tones that I could then edit further later if I want to. You may like a punchy, vibrant tone straight out of the camera, and that’s fine. You’ll just need to take my results here with a slight pinch of salt.
All that said, let’s dive in.
This was an image I took with the Gold filter accidentally enabled on the iPhone. So its warmer color tones are to be expected to an extent, but what I liked more here is the depth of shadow that the iPhone has maintained. The S26 Ultra has done a fair bit of processing here to lift those shadows and create a more balanced exposure overall, but I think it’s killed some of the evening drama as a result. I see this in a lot of Android phones, to be fair.
Taken earlier in the day, there’s much less difference to be seen here. The iPhone’s colors are a bit warmer, thanks to the Gold filter, but they actually look more natural as a result. The shot doesn’t look warm in its white balance; it just has a richness to it, while the S26 Ultra’s shot looks quite cold.
I switched the iPhone to Standard Photographic Style here, and as a result, the shot it took looks pretty similar to that taken by the Galaxy S26 Ultra. The exposures are pretty much the same, and while the green plants on the steps definitely look more vivid in the Galaxy’s shot, the colors elsewhere are broadly on par.
If I’m nitpicking — which I really have to when the phones cost this much money — the S26 Ultra appears to have done a neater job rendering the details on the front of the VW Camper’s spare wheel. I also noticed more detail in some of the small twigs on the tree, especially where they’re visible against the sky. Is that a difference you’d ever notice without a side-by-side comparison? Definitely not. But this whole article is basically an exercise in pedantry, so I will continue to pick away at even the tiniest of things in these photos.
I’m back on the Gold Photographic Style with the iPhone here, so again, those warmer tones are to be expected, but I will say again that I much prefer the deeper shadows seen on the house in the Apple phone’s image. It looks much more natural, while the S26 Ultra’s shot looks a bit too HDR and oversaturated for my tastes. But that’s not the most important thing here…
What took me more by surprise was what happened when I put each phone into the ultrawide camera mode. The iPhone’s color tones stay almost exactly the same, but the Galaxy’s image has shifted quite dramatically between the main and ultrawide lenses.
The blue sky has shifted its hue into a much more teal-toned color, and I’m surprised by just how different it looks from the main camera. I usually expect to see these sorts of color shifts on cheaper phones, where there’s less effort put into ensuring consistent colors across the lenses. So I’m a bit disappointed to see Samsung’s phones producing such a noticeable shift here.
The iPhone 17 Pro also displays a color shift, but it’s far less pronounced than the S26 Ultra’s.
I tried the zoom lenses on both phones. With its 10x optical zoom, the S26 Ultra has a longer reach than the 8x on the iPhone 17 Pro, but in terms of detail within those images, there’s honestly nothing to choose between them. Again, the iPhone had the Gold style applied, so it looks warmer, and again, the S26 Ultra has gone further in lightening those shadows. I can’t really say either one is better than the other in this example.
But there’s a much bigger difference in this example. The colors are much richer in the iPhone’s shot, even though the Photographic Style is set to Standard. The S26 Ultra’s shot looks like the phone’s white balance has been tricked by the warm orange tones of the brickwork, and produced a colder-looking image as a result.
But I also don’t like what the S26 Ultra has done with the details here. It’s oversharpened the scene, giving a weird, crunchy look to the subject that looks extremely unnatural. The iPhone, despite not having the same zoom range on paper, has delivered a much better-looking image, even when viewed at the same scale.
But here the opposite seems to have happened. The iPhone has looked at this warm, sun-drenched scene and automatically set its white balance to cool it, while the S26 Ultra has maintained those warmer tones. Sure, the greens of the leaves in the S26’s image look almost neon, but the image overall is the nicer of the two in my view.
The iPhone has done a much better job here of capturing the warmer tones that I loved so much when I took these images. I do think the S26 Ultra has gone too far in its hyper-saturation of the green leaves. Sure, it’s a punchy look, but if I wanted that much saturation, I could add it back in at the editing stage. I’d much rather have a more natural image as a starting point, so the iPhone takes the win here for me.
There’s so little to pick out between the images here. The greens are a little more vibrant in the S26 Ultra’s shot, but the tones overall in the iPhone’s are a bit more natural. Neither one is a spectacular photo, and honestly, you may as well toss a coin to decide which one is better.
Switching to the ultrawide lenses on both phones, the S26 Ultra has again gone quite hard on the saturation, delivering a much more vibrant blue sky than it did in its image from the main camera. As before, I’m not a fan of this sort of high-contrast, high-saturation photo. As a result, the iPhone 17 Pro is my preferred shot here.
I think the S26 Ultra’s tendency towards vibrancy has helped here, however, with this shot of spring blossom looking more joyful than the almost drab-looking image from the iPhone.
And sure, the colors are a little overbaked from the S26 Ultra’s ultrawide image, but it still screams «spring» more than the iPhone’s shot, which again looks pretty dull and lifeless by comparison.
I was thrilled to find these fishermen hanging out in Edinburgh, and I think the iPhone has done the better job of capturing the moment. The Gold Photographic Style hasn’t produced an overly warm image here. It’s more like it applied just the correct white balance, with the S26 Ultra’s shot looking quite cold. It’s especially the case on the pink paintwork on the base of the building, which looks richer and much more true-to-life on the iPhone’s image.
At night, both phones have done a good job of capturing this complex image. The bright moon has been kept under control, and there’s plenty of detail still visible in some of the more shadowy areas. The exposures are also broadly similar (the iPhone’s is a touch brighter), and even when peering up close, there’s not much to choose from in terms of detail.
It’s a slightly different story here, though. The iPhone’s shot is much brighter, but that results in some detail being lost in the highlights inside the phone booth. The S26 Ultra has retained that highlight detail, though its overall shot is darker. Personally, I prefer the darker version, especially as it’s much more in line with the moody nighttime aesthetic I was going for.
What I don’t love is how much the S26 Ultra has oversharpened its image. Like the earlier image of the figure sitting on the wall, this image has been digitally sharpened to the point that the details look crunchy, high-contrast and ultimately quite unnatural. Which image would I choose — properly exposed but oversharpened, or natural details with blown-out highlights? Ideally, I’d simply take the photo again on the iPhone and lower the exposure a tad. But between the two images above, I’d probably go for the one shot on the Samsung phone.
iPhone 17 Pro vs. Galaxy S26 Ultra: Which has the better camera?
I always complain that these photo comparison stories are really close and therefore difficult to turn into compelling articles, but this one felt especially close. In some shots, the iPhone’s more natural shadow rendering and lighter reliance on oversharpening and other digital processing make its images look better to my eye. But in other examples — especially the image with the tree trunks surrounded by ivy — the S26 Ultra has done a much better job with its color balancing.
Overall, Samsung’s phone leans harder into contrast and saturation, which is the same thing we’ve said about Samsung’s phones since the company first started putting cameras in them. Buying a Samsung camera phone has always meant getting more vibrant, punchy images, and that’s exactly the case here. If you want quick images of your friends and family that look good enough to share straight to the family WhatsApp group, the S26 Ultra will serve you well.
The iPhone 17 Pro tends to be more neutral in its color and contrast adjustments, which typically gives a more natural base for you to then add any extra edits of your own. It’s why Apple’s phones have typically always been the device of choice for more enthusiast or pro photographers and video creators. I count myself among that crowd, and it’s why the iPhone 17 Pro remains my preferred model of the two. But really, these are both excellent phones with superb cameras, and you can’t go far wrong with either.
Verum Messenger Expands Its Capabilities: Verum Finance Card Can Now Be Topped Up via Apple Pay
In its latest update, Verum Messenger takes a major step toward integrating communication and financial services. Users can now enjoy a long-awaited feature — topping up their Verum Finance card directly through Apple Pay.
A New Level of Convenience
The integration with Apple Pay significantly simplifies the top-up process. Users no longer need to go through complex transfer steps or rely on third-party services. Just a few taps — and the funds are instantly credited to the card.
This is especially valuable for those who use Verum Messenger not only for communication but also for managing their finances within the ecosystem.
Finance and Messaging in One App
This update reinforces Verum’s strategy to combine in a single product:
- secure communication
- cryptocurrency operations
- everyday financial tools
Verum Messenger is no longer just a messaging app — it is evolving into a full-fledged fintech platform.
Security and Speed
Apple Pay is known for its high level of security thanks to:
- biometric authentication
- payment tokenization
- no sharing of card details
By integrating these technologies, Verum Messenger ensures that financial operations are not only convenient but also as secure as possible.
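For developers curious what an Apple Pay integration involves on the client side, here’s a minimal generic sketch using PassKit. The merchant identifier, networks and amount are hypothetical placeholders; this is not Verum’s actual implementation:

```swift
import PassKit

// Generic Apple Pay sketch -- not Verum's actual code. The merchant
// identifier below is a hypothetical placeholder.
func makeTopUpRequest(amount: NSDecimalNumber) -> PKPaymentRequest {
    let request = PKPaymentRequest()
    request.merchantIdentifier = "merchant.com.example.topup"
    request.supportedNetworks = [.visa, .masterCard]
    request.merchantCapabilities = .capability3DS   // tokenized card payments
    request.countryCode = "US"
    request.currencyCode = "USD"
    request.paymentSummaryItems = [
        PKPaymentSummaryItem(label: "Card top-up", amount: amount)
    ]
    return request
}

// Presenting the sheet triggers Face ID or Touch ID, and the app
// receives a one-time payment token instead of raw card details
// (the delegate, omitted here, handles the authorized payment).
let controller = PKPaymentAuthorizationController(
    paymentRequest: makeTopUpRequest(amount: NSDecimalNumber(string: "25.00"))
)
controller.present(completion: nil)
```

The token-based flow is exactly why the security properties listed above hold: the merchant never sees the underlying card number.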
What This Means for Users
The update brings several key benefits:
- instant card top-ups
- simplified user experience
- reduced reliance on third-party payment services
- deeper integration of finance into everyday communication
Looking Ahead
The addition of Apple Pay is just one step in the evolution of the Verum ecosystem. It’s clear the team is moving toward creating a unified digital environment where users can handle most of their needs — from communication to capital management — within a single app.