Technologies
I Took the iPhone 15 Pro Max and 13 Pro Max to Yosemite for a Camera Test
Does the latest Apple phone's camera capture the epic majesty of Yosemite National Park better than a two-year-old iPhone? We find out.
This past week, I took Apple’s new iPhone 15 Pro Max on an epic adventure to California’s Yosemite National Park.
As a professional photographer, I take tens of thousands of photos every year. Much of my work is done inside my San Francisco photo studio, but I also spend a considerable amount of time shooting on location. I still use a DSLR, but my iPhone 13 Pro Max is never far from me.
Like most people nowadays, I don’t upgrade my phone every year or even two. Phones have reached a point where they are good at performing daily tasks for three or four years. And most phone cameras are sufficient for capturing everyday special moments to post on social media or share with friends.


But maybe, like me, you're in the mood for something shiny and new like the iPhone 15 Pro Max. I wanted to find out how my 2-year-old iPhone 13 Pro Max and its 3x optical zoom would do against the 15 Pro Max and its new 5x optical zoom. And what better place to take them than Yosemite, one of the crown jewels of America's National Park System and an iconic destination for outdoor lovers.
Yosemite is absolutely, massively impressive.


The main camera is still the best camera
The iPhone 15 Pro Max’s main camera with its wide angle lens is the most important camera on the phone. It has a new larger 48-megapixel sensor that had no problem being my daily workhorse for a week.


The larger sensor means the camera can now capture more light and render colors more accurately. And the improvements are visible. Photos look richer not only in bright light but also in low-light scenarios.
In the images below, taken at sunrise at Tunnel View in Yosemite National Park, notice how the 15 Pro Max's photo has better fidelity, color and contrast in the foreground leaves. Compare that against the pronounced edge sharpening of the mountaintops in the 13 Pro Max image.
The 15 Pro Max’s camera captures excellent detail in bright light, including more texture, like in rocky landscapes, more detail in the trees and more fine-grained color.


A new 15 Pro Max feature aimed at satisfying a camera nerd's creative itch uses the larger main sensor, combined with the A17 Pro chip, to turn the 24mm-equivalent wide angle lens into essentially four lenses. You can switch the main camera between 1x, 1.2x, 1.5x and 2x, the equivalents of 24mm, 28mm, 35mm and 50mm prime lenses – four of the most popular prime lens lengths. In reality, the 15 Pro Max takes crops of the sensor and uses some clever processing to correct lens distortion.
In use, it’s nice to have these crop options, but for most people they will likely be of little interest.
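The arithmetic behind those four "lenses" is straightforward, and it explains why the crops hold up so well. Here's an illustrative Python sketch (the function and constant names are my own; Apple rounds the marketed equivalents down to the classic 28mm and 35mm prime lengths):

```python
# Illustrative sketch of the crop arithmetic behind the four "lenses".
BASE_FOCAL_MM = 24    # the main camera's full-frame-equivalent focal length
BASE_MEGAPIXELS = 48  # the full sensor resolution

def crop_equivalent(zoom: float) -> tuple[float, float]:
    """Return (equivalent focal length in mm, megapixels left in the crop)
    for a centered sensor crop at the given zoom factor."""
    focal_mm = BASE_FOCAL_MM * zoom
    # A crop at factor z keeps 1/z of the width and 1/z of the height,
    # so only 1/z^2 of the pixels remain.
    megapixels = BASE_MEGAPIXELS / zoom ** 2
    return focal_mm, megapixels

for zoom in (1.0, 1.2, 1.5, 2.0):
    focal, mp = crop_equivalent(zoom)
    print(f"{zoom}x -> ~{focal:.0f}mm equivalent from ~{mp:.0f} MP of sensor data")
```

Even the tightest 2x crop still draws on roughly 12 megapixels of the 48-megapixel sensor, which is why these in-sensor crops look far better than old-fashioned digital zoom.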


I find the 15 Pro Max’s native 1x view a little wide and enjoy being able to change it to default to 1.5x magnification. I went into Settings, tapped on Camera, then on Main Camera and changed the default lens to a 35mm look. Now, every time I open the camera, it’s at 1.5x and I can just focus on framing and taking the photo instead of zooming in.
Another nifty change that I highly recommend is to customize the Action button so that it opens the camera when you long press it. The Action button replaces the switch to mute/silence your phone that has been on every iPhone since the original. You can program the Action button to trigger a handful of features or shortcuts by going into the Settings app and tapping Action button. Once you open the camera, the Action button can double as a physical camera shutter button.


The dynamic range and detail are noticeably better in photos I took with the 15 Pro Max main camera in just about every lighting condition.
There are fewer blown-out highlights and nicer, blacker blacks with less noise. In particular, there is more tonal range and detail in the whites. I noticed this particularly in how the 15 Pro Max captured direct sunlight on climbers and shadow detail in the rock formations.
Read more: iPhone 15 Pro Max Camera vs. Galaxy S23 Ultra: Smartphone Shootout
Overall, the 15 Pro Max's main camera is simply far better and more consistent at exposure than the 13 Pro Max's.
The iPhone 15 Pro Max 5x telephoto camera


The iPhone 15 Pro Max has a 5x telephoto camera with an f/2.8 aperture and an equivalent focal length of 120mm.
The 13 Pro Max's 3x camera, introduced in 2021, was a huge step up from previous models and still gives zoomed-in images a cinematic feel from the lens' depth compression. The 15 Pro Max's longer telephoto lens, combined with a larger sensor, accentuates those cinematic qualities even further, resulting in images with a rich array of color and a wider tonal range.
All this translates to a huge improvement in light capture and a noticeable step up in image quality for the iPhone’s zoom lens.


I found that the 15 Pro Max’s telephoto camera yields better photos of subjects farther away like mountains, wildlife and the stage at a live concert.


A combination of optical image stabilization and 3D sensor-shift makes the 15 Pro Max's telephoto camera easier to use by steadying the image during capture. A longer lens typically means a greater chance of blurred images from hand shake, since such a long focal length magnifies every little movement of the camera.
I found that the 3D sensor-shift optical image stabilization system does wonders for shooting distant subjects and minimizing that camera shake.
The image below was shot with the 5x zoom on the iPhone 15 Pro Max looking up the Yosemite Valley from Tunnel View. It is an incredibly crisp telephoto image.


For reference, the image below was shot on the 15 Pro Max from the same location using the Ultra Wide lens. I am about five miles from that V-shaped dip at the end of the valley.


The iPhone still suffers from lens flare
Lens flare, along with the green dot that seems to appear in all iPhone images shot into direct sunlight, continues to be an issue on the iPhone 15 Pro Max despite the new lens coatings.
Apple says the main camera lens has an anti-glare treatment, but I didn't notice any improvement. In some cases, images show even more pronounced lens flare than photos from previous iPhone models.
Notice the repeated halo effect surrounding the sun on the images below shot at Lower Yosemite Falls.






The 15 Pro Max and Smart HDR 5


The 15 Pro Max's new A17 Pro chip brings greater computational power, driving an updated processing pipeline Apple calls Smart HDR 5, which delivers more natural-looking images compared with the 13 Pro Max, especially in very bright and very dark scenes. There is noticeably better, more subtle handling of color, with a less heavy-handed approach that balances brightening the shadows against darkening the highlights.
You can clearly see the warmer, more natural-looking light in the 15 Pro Max photo below, pushing back against the blue cast that is common in over-processed HDR images. At the same time, Apple's implementation hasn't swung too far in the opposite direction and refrains from oversaturating the orange tones that frequently trouble digital corrections on phones.


Coming from an iPhone 13 Pro Max, I noticed that the background corrections during computational processing on the 15 Pro Max tend to produce more discreet, balanced images. Apple appears to have dialed back the bombastic, in-your-face computational photography of the 13 Pro era and fine-tuned the 15 Pro Max's image pipeline to lean toward a more realistic rendering of your subject.
It’s a welcome change.
The 15 Pro Max shines in night mode


Night mode shots from the 15 Pro Max look similar to the ones from my 13 Pro Max, but there are minor improvements in the exposure that result in images with a better tonal range. The 15 Pro Max’s larger main camera sensor captures photos with less noise in the blacks and a better overall exposure compared to the 13 Pro Max.
Colors in 15 Pro Max night mode images appear more accurate and realistic, with a wider dynamic range. Notice the detail in the photo below of El Capitan and The Dawn Wall. The 15 Pro Max even captures detail in the car lights snaking through the valley floor road.


Overall, night mode images continue to look soft and over-processed. Night mode gives snaps a dream-like vibe and that isn’t necessarily a bad thing. These photos are brighter and have less image noise than those shot on my iPhone 13 Pro Max.


15 Pro Max vs. 13 Pro Max: the bottom line
By this point, it should be no surprise that the iPhone 15 Pro Max’s cameras are a significant improvement over the ones on the 13 Pro Max. If photography is a priority for you, I recommend upgrading to it from the 13 Pro Max or earlier.
If you're coming from an iPhone 14 Pro, the improvements are less dramatic, and it's likely not worth the upgrade. I'm incredibly excited to keep carrying the iPhone 15 Pro Max in my pocket, whether to Yosemite or just around my home.
Concierge Bots, Autonomous Carts and Smart Tags: Welcome to MWC’s Airport of the Future
Your airport could someday be much more tech-infused. Here’s what that might look like.
Picture this: You’re at the airport and a robot is guiding you to your gate. You walk past another bot that’s breakdancing, to the delight (or despair) of passengers waiting for a delayed flight.
Up ahead, someone speeds along in an autonomous single-rider vehicle. Before hopping on your flight, you fill up your water bottle — which also tracks your water intake.
This scene could someday become a reality, at least in part. At Mobile World Congress in Barcelona, I explored an exhibit showcasing several futuristic applications looking to inject airports with a little more tech. The goal is to make the entire passenger journey, from check-in to boarding to the in-flight experience, more efficient and less stressful.
Robotics company AGiBot showed off two of its humanoids. The full-size A2 Series can help you check in for your flight and guide you around the airport. The more compact X2 Series bots are designed for "entertainment." During our demo, that meant busting out some fascinating robotic dance moves. You can currently see the bots in action at Shanghai Hongqiao International Airport.
One of the biggest airport nightmares is dealing with lost luggage. Thankfully, trackers like the AirTag and Tile can help you keep tabs on your bag, but it’s not always easy to share location information with airlines (though that is changing). A digital baggage tag from BagID makes it easier for both passengers and airlines to know exactly where your luggage is.
When you fly with a partner airline, you can add your flight information to the BagID app, and it'll then display the digital tag information on your BagID device. And because BagID is a certified third-party accessory for Apple and Samsung, you can use Apple's Find My and Samsung's SmartThings Find to follow its location.
BagID uses an E Ink display and has a durable plastic casing, which should keep it in one piece as your bag is tossed around. It's powered by a lithium-ion battery rated under 2.7Wh to comply with Federal Aviation Administration regulations. The battery can last around one year with tracking or two years without tracking enabled. BagID costs about $238.
Alba Ride, from autonomous micro-mobility company Alba Robot, can give a lift to anyone needing mobility assistance. The self-driving vehicle seats one passenger and can fit a carry-on bag. It's compact enough that weaving through airport crowds shouldn't be too much of a challenge.
A screen on the front has an avatar that can point people in the right direction, while a larger display on the back shows ads or flight departure times. The electric vehicle’s battery can last up to 8 hours, according to the company. Alba Ride is slated to launch at Dallas Fort Worth International Airport in May.
Water bottle dispensers are a staple at airports, but water supply company Aigues de Barcelona wants to help you track your hydration levels. Using custom bottles with NFC chips embedded in the lid, you can scan the bottle at one of the company's dispensers, then track your water intake in an accompanying app. You'll also see how much plastic you've saved and the reduced carbon footprint. Aigues de Barcelona has installed the dispensers in some venues and sports arenas, but they have yet to arrive in airports.
Once you’re onboard your flight, aircraft manufacturer Airbus wants to make mealtime services more efficient, too. It’s developing an app that can keep better track of how much food has been eaten on a flight by allowing attendants to scan anything — including snacks, meals and drinks — using an AI-enabled camera. That can help reduce waste on future flights by allowing teams to analyze how much food was served and how much was left over. And if a passenger has an allergy, the crew can quickly check the ingredients through the app as well.
Judging by this exhibit, automation and robotics could reshape how we get around both on the ground and in the skies. Hopefully, without too many breakdancing robots.
A Long-Running AI Copyright Question Gets an Answer as Supreme Court Stays Mum
The man behind the AI-generated image in question reflects on what he calls a "philosophical milestone."
A legal battle over AI copyright that has gone on for more than a decade may have reached its end, with the US Supreme Court declining to hear a case involving AI-generated visual art.
The subject of the case is an image created by computer scientist Stephen Thaler in 2012, titled "A Recent Entrance to Paradise," using an AI tool he also created, DABUS. Thaler applied for a copyright for his visual art in 2018, but the application was eventually rejected by the US Copyright Office on the grounds that creative works must have human authorship to be eligible. A district court later upheld the decision.
Thaler’s legal team argued that because he created the system that generated the artwork, he is, in effect, its author.
"Other countries, like China and the United Kingdom, already permit copyright protection for AI-generated works. But the Copyright Office's reliance on its own nonstatutory requirements have led to an improper cabining of United States copyright law in contradiction of this Court's precedent that copyright law should accommodate technological progress," the filing alleges.
«The Copyright Office believes the Supreme Court reached the correct result, confirming that human authorship is required for copyright,» a spokesperson said.
The question of who owns AI-generated artwork, and which AI works violate existing copyrights, is an important one as AI companies develop increasingly sophisticated image generation tools such as Nano Banana 2 from Google and video generation tools such as OpenAI's Sora 2.
While these kinds of tools are making it harder to distinguish between human-generated art and material created by or with AI, they’re also enabling a flood of AI slop across the internet. Tech companies and social media networks have been struggling to find ways to deal with the influx, including using metadata to label AI content and creating better filters to keep unwanted slop away from their users.
A ‘philosophical milestone’ for AI and copyrights
In an email to CNET, Thaler said that although the court declined to hear his appeal, "I see this moment as a philosophical milestone rather than a defeat."
While he’s unsure if legal action will continue, Thaler says he’s still certain that the law on copyright, as written, is intended to exclude nonhuman inventors.
"By bringing DABUS into the legal system, I confronted a question long confined to theory: whether invention and creativity must remain tied to humans or whether autonomous computational processes could genuinely originate ideas," Thaler said.
He previously alleged to the court that the Copyright Office’s decision would cause a negative impact on AI development and its use by creative industries in the formative years of the technology’s development.
He warned that the Copyright Office’s current rules could create a «perfect storm» of low-quality AI-generated content that will continue to flood the internet and a wave of lawsuits from humans claiming ownership over work they didn’t create.
"The law is lagging behind what technology can already do," Thaler said. "The court addressed what the statute currently allows. It did not address what technology has already achieved."