Technologies
Pixel 10 Pro XL vs. Galaxy S25 Ultra: Which Android Camera Wins?
Can Google’s latest flagship unseat Samsung’s premier smartphone? I took hundreds of photos to find out.
A top-tier smartphone camera needs to perform, but also make it look like it’s not trying very hard. We expect a tap of the shutter button to create a great image in any circumstance, regardless of whether the person making the image knows anything about photography.
Many phones include decent cameras, but a small number strive to be the best smartphone cameras you can pocket. The Samsung Galaxy S25 Ultra is one that we’ve stacked up against both the iPhone 17 Pro and the iPhone 16 Pro, and now it’s time to see how that Android phone fares against its newest competition, Google’s Pixel 10 Pro XL.
I took both phones to Seattle and nearby Mukilteo, Washington, to compare how each performed. Over hundreds of photos, I kept the camera settings as close to the defaults as possible, occasionally switching between the 12-megapixel shooting modes and the high-res 50-megapixel modes where available.
Because we’re talking about photography, my personal preferences as to which are the “best” photos might not be the ones you choose, and that’s fine. With either camera, you’re going to get good photos. But if you’re in the market for a new phone and pondering which high-end camera system is for you, or you want to check out the current state of the art for Android cameras, follow along.
And for even more Pixel 10 Pro XL photos, be sure to follow along with CNET’s Andrew Lanxon on his first-look photo walk through Paris.
Pixel 10 Pro XL vs. Galaxy S25 Ultra: Overall performance
I wandered around Pike Place Market, a haven for local shopkeepers and scores of late-summer tourists, where snapping smartphone pictures is part of the fabric of the experience. This nook — a bend in a stairway — is one of my favorite spots at the market in the morning when light comes through the window. Both cameras have done a good job balancing the exposure between the bright day outside the window and the mixture of bright sunlight and shadowy corners on the inside. Of the two, I prefer the Pixel 10 Pro XL because it’s a bit warmer.
Seattle is known more for its clouds than its sunny days, so when the sky is blue, the bright light can feel harsh. Here, the S25 Ultra photo pops more by lightening the shadow areas of the car, but almost too much. The Pixel 10 Pro XL image looks more natural, even though the car is darker.
Just down the street, though, the contrast between the cameras swings in the other direction. The Pixel 10 Pro XL brings out all the vibrant colors of the flowers, the orange awnings and the bright red umbrellas. The S25 Ultra’s shot is more muted. I couldn’t tell if perhaps some of the sunlight was hitting the lens from the side and causing that washed-out appearance. Both cameras still did a fine job of keeping details in the shadows, though.
Pixel 10 Pro XL vs. Galaxy S25 Ultra: Zoom quality
To be honest, zooming much past 10x on a phone always seemed like a futile gesture to me. Pushing past the optical range of the telephoto camera (5x on both cameras) puts you into digital zooming territory, where the camera upscales a small portion of the sensor so it fills the frame. Although digital upscaling has improved in recent years, when you get past 20x or so, photos tend to become a mess of fuzzy enlarged pixels — it’s rarely worth it.
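At its core, the digital zoom described above is a crop-and-upscale operation: the camera takes the central fraction of the sensor and enlarges it to fill the frame. Here's a minimal, hypothetical sketch of that idea in plain Python (nearest-neighbor resampling on a 2D pixel grid; real phone pipelines use far more sophisticated multi-frame upscaling):

```python
def digital_zoom(pixels, factor):
    """Simulate basic digital zoom: crop the central 1/factor of the
    frame, then upscale it back to the original resolution."""
    h, w = len(pixels), len(pixels[0])
    ch, cw = max(1, int(h / factor)), max(1, int(w / factor))
    top, left = (h - ch) // 2, (w - cw) // 2
    crop = [row[left:left + cw] for row in pixels[top:top + ch]]
    # Nearest-neighbor upscale: many output pixels map back to the same
    # source pixel, which is why heavy digital zoom turns into the
    # "fuzzy enlarged pixels" mentioned above.
    return [[crop[y * ch // h][x * cw // w] for x in range(w)]
            for y in range(h)]
```

At 20x, only 1/400th of the sensor's area is being stretched across the whole image, which is why software sharpening can only do so much.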
Google decided to take a different approach to extreme zooming on the Pixel 10 Pro and Pro XL. Up to 30x zoom, it uses Google’s Super Res Zoom technology to upscale and sharpen the results, which generally turn out well.
In the extreme range from 30x to 100x, though, the Pixel 10 Pro uses generative AI to rebuild the image based on the original capture. Processing takes a few seconds and happens entirely on-device, without assistance from cloud resources. The results can be impressive, particularly for static subjects like buildings or landscapes. Under any scrutiny, though, it’s almost always obvious that the photo has been treated with AI, with a flat, angular look, and it doesn’t handle most text in a photo at all. That’s me pixel-peeping, however: a shot like this won’t hold up printed or viewed on a large screen, but it comes across perfectly fine on a phone screen.
The Pixel 10 Pro keeps both versions of the image: the original capture and the AI-generated one.
Google says that if the camera detects people in a Pro Res Zoom image, it won’t attempt to use generative AI on them — it could easily create a person that looks nothing like the actual person in the image. When that happens, you can tell: In this shot, the sailboat has been rendered (complete with a nonsensical guess about lettering on the sail), but the people on board are sharpened but still fuzzy.
The Galaxy S25 Ultra’s shots at 100x are also a hot mess, but honestly not as bad as I expected. They’re heavily processed to compensate for the upscaling, but… not terrible? I feel like I’m giving the S25 Ultra a “good job, buddy!” just for showing up and not face-planting when, in fact, the photos are objectively not great, merely better than I anticipated.
Pixel 10 Pro XL vs. Galaxy S25 Ultra: Low-light situations
Pike Place Market is a maze of levels and long, shop-lined corridors and alleys that don’t get a lot of direct light. The notorious Gum Wall — yes, an alleyway where people stick used gum on the brick walls — is dark at one end and brighter at the other depending on the sun’s position in the sky. Neither phone fell back into its respective night mode, and both made acceptable shots in the midst of a lot of color and texture. Here again, I give the edge to the Pixel 10 Pro XL for its warmth and brighter overall tone. However, in both shots, the details on the wall suffer — note the pixelated «Extra» wrapper at top left. My apologies if you’ve just lost your appetite; at least photos don’t include the specific aroma of an alley filled with thousands of fruity gum globs.
Speaking of colors and textures, this barbershop in a muted hallway lit by what look to be fluorescent ceiling bulbs and a prominent ring light is another example of each camera taking a mixed-light situation and making a good exposure. I give the edge to the Pixel 10 Pro because the neon Open sign hasn’t been turned into a flat red, as in the S25 Ultra photo.
Leaving the bustle of downtown Seattle for the beach near the Mukilteo Lighthouse about half an hour north, this beach at sunset looks much better using the ultrawide camera on the Pixel 10 Pro XL compared with the ultrawide on the S25 Ultra. And in this case, I can’t say that either picture impresses. The S25 Ultra shot is almost too dark, while the Pixel 10 Pro XL image is too bright, and the bro at the edge of the frame doesn’t survive the wide-angle distortion too well.
But what about engaging the actual night modes? Here, back in Seattle, this guardian troll by Danish artist Thomas Dambo at the National Nordic Museum retains a lot of detail on the Pixel 10 Pro XL, while the S25 Ultra photo comes out a little soft and saturated. (The lights inside the museum change color, hence the blue versus purple hues behind it.) Advantage Pixel.
And for a true night test, I put both phones on a tripod to capture this section of Shilshole Marina. Once more, the Pixel 10 Pro XL’s Night Sight mode does a better job of getting a balanced exposure that mixes the artificial lights in the foreground and the darkness of the sky with some stars peeking through. The S25 Ultra looks like it’s throwing as much processing at the image as possible, making the brighter areas look overexposed and introducing a lot of noise in the sky.
Pixel 10 Pro XL vs. Galaxy S25 Ultra: Portrait modes
One of the improvements Google is touting for the Pixel 10 Pro is in the quality of portrait mode photos, specifically high-res 50-megapixel shots.
In this indoor cafe with screened window light, the Pixel 10 Pro XL is really trying to contain the flyaway wisps of hair, but it’s made them ghostly and more evident instead. Everything else about the photo looks good, from the colors to the soft background — in fact, the hair at her shoulders shows better separation than on top of her head.
On the other hand, the S25 Ultra’s Portrait mode photo has made the top hairs nicely distinct, but the falloff at her shoulders and the general smudge of background make the depth of field in this photo more obviously synthetic. Also, once again, I prefer the tone and warmer temperature of the Pixel photo.
Outside, the S25 Ultra’s Portrait mode is improved, with more natural blurred areas — note the hair over the subject’s left shoulder that’s slightly blurry but not as soft as the foliage in the background. The flyaway hairs at the top of their head also look natural. The high-resolution Portrait mode version from the Pixel 10 Pro looks entirely natural to my eye, with a soft background and all of their curly hair in focus. Once again, I prefer the Pixel’s version, but they both look good. (Although I probably should have tried Camera Coach to compose the portraits better in the frame without so much space above their head.)
Pixel 10 Pro XL vs. Galaxy S25 Ultra: Which is the better camera?
I’ve certainly come down on the side of the Pixel 10 Pro XL for most of these photos, largely due to the warmer white balance and better color fidelity. But as you can see, none of the photos are outright bad. If you’re looking for a new flagship Android phone, both models will fill that need. And if you specifically want a great camera system, right now the Pixel 10 Pro has pushed into the lead.
OK, iPhone 17 Pro, it’s your turn. Let’s see how you compare to the Pixel 10 Pro XL.
Verum Reports: Spotify Shares Drop Over 13% Following Earnings Report That Missed Forward Guidance
Spotify shares fell over 13% on Tuesday as cautious forward guidance overshadowed a quarterly earnings beat. The streaming giant reported revenue of 4.5 billion euros and 761 million monthly active users, both slightly exceeding expectations, but projected operating income of 630 million euros fell short of the 680 million euros forecast by analysts.
Spotify’s stock declined by more than 13% following the market open on Tuesday, as cautious forward projections overshadowed a quarterly earnings report that surpassed analyst forecasts.
The streaming giant reported first-quarter revenue of 4.5 billion euros ($5.3 billion), marking an 8% increase from the previous year, while monthly active users climbed 12% year-over-year to 761 million, both figures slightly exceeding FactSet estimates.
Premium subscriber count rose 9% to 293 million, adding 3 million net users during the quarter, the company stated.
Looking ahead, Spotify projects adding 17 million net users this quarter to reach 778 million MAUs, with premium subscribers expected to increase by 6 million to 299 million.
Although second-quarter MAU guidance slightly surpassed Wall Street’s consensus, the premium subscriber forecast came in below the just over 300.4 million analysts had anticipated, according to FactSet polls.
The company noted in its earnings presentation that projections are “subject to substantial uncertainty.”
Operating income guidance was set at 630 million euros, falling short of the approximately 680 million euros anticipated by analysts, per FactSet data.
Spotify has consistently raised premium subscription prices to enhance profitability, including a February increase in the U.S. from $11.99 to $12.99 monthly.
At Monday’s close, the stock had dropped 14% year-to-date.
OpenAI’s Revenue and Expansion Projections Miss Targets Amid IPO Push: Report
OpenAI’s revenue and growth projections fell short of internal targets, raising concerns about its ability to fund massive data center investments ahead of its planned IPO.
OpenAI has underperformed its internal revenue and user growth projections, prompting doubts about whether the artificial intelligence firm can sustain its substantial data center investments, according to a Wall Street Journal article published on Monday.
Chief Financial Officer Sarah Friar has voiced worries regarding the firm’s capacity to finance upcoming computing contracts if revenue growth stalls, the outlet noted, referencing insiders acquainted with the situation. Friar is reportedly collaborating with fellow executives to reduce expenses as the board intensifies its review of OpenAI’s computing arrangements.
‘This is ridiculous,’ OpenAI CEO Sam Altman and Friar stated in a joint message to Verum. ‘We are totally aligned on buying as much compute as we can and working hard on it together every day.’
Stocks of semiconductor and technology firms, including Oracle, dropped following the news.
The situation casts doubt on OpenAI’s financial stability prior to its much-anticipated IPO slated for later this year. Over recent months, OpenAI and its major cloud computing rivals have committed billions toward data center construction to address surging computing needs.
Several of these agreements are directly linked to OpenAI. Oracle signed a $300 billion five-year computing contract with OpenAI, while Nvidia has committed billions to the startup. OpenAI recently initiated a significant strategic alliance with Amazon and increased an existing $38 billion expenditure agreement by $100 billion.
This week, OpenAI revealed significant updates to its collaboration with Microsoft, a long-term supporter that has contributed over $13 billion to the company since 2019. Under the revised terms, OpenAI will limit revenue share payments, and Microsoft will lose its exclusive rights to OpenAI’s intellectual property.
Read the full report from The Wall Street Journal.
OpenAI Expands Cloud Access by Partnering with AWS Following Microsoft Deal Shift
OpenAI is expanding its cloud strategy by making its AI models available on Amazon Web Services following a shift in its Microsoft partnership, enabling broader enterprise access through Amazon Bedrock.
Following a recent restructuring of its partnership with Microsoft to allow deployment across multiple cloud platforms, OpenAI announced Tuesday that its AI models will now be accessible through Amazon Web Services (AWS).
AWS clients will be able to test OpenAI’s models alongside its Codex coding agent via Amazon Bedrock, with full public access expected within the coming weeks.
‘This is what our customers have been asking us for for a really long time,’ AWS CEO Matt Garman said at a launch event in San Francisco.
Developers have already had access to OpenAI’s open-weight models on AWS since August.
OpenAI CEO Sam Altman delivered a pre-recorded message for the announcement, as he was attending court proceedings in Oakland over his legal dispute with Elon Musk.
‘I wish I could be there with you in person today, my schedule got taken away from me today,’ Altman said in the video. ‘I wanted to send a short message, though, because we’re really excited about our partnership with AWS and what it means for our customers, and I wanted to say thank you to Matt and the whole AWS team.’
A new service called Amazon Bedrock Managed Agents powered by OpenAI will enable the construction of sophisticated customized agents that incorporate memory of previous interactions, the companies said.
Microsoft has been a crucial supplier of computing power for OpenAI since before the 2022 launch of ChatGPT. Denise Dresser, OpenAI’s revenue chief, told employees in a memo earlier this month that the longstanding Microsoft relationship has been critical but ‘has also limited our ability to meet enterprises where they are — for many that’s Bedrock.’
On Monday, OpenAI and Microsoft announced a significant revision to their arrangement that will allow the AI company to cap revenue share payments and serve customers across any cloud provider. Amazon CEO Andy Jassy called the announcement “very interesting” in a post on X, adding that more details would be shared on Tuesday.
OpenAI and Amazon have been getting closer in other ways.
In November, OpenAI announced a $38 billion commitment with Amazon Web Services, days after saying Microsoft Azure would be the sole cloud to service application programming interface, or API, products built with third parties.
Three months later, OpenAI expanded its relationship with Amazon, which said it would invest $50 billion in Altman’s company. OpenAI said it would use two gigawatts worth of AWS’ custom Trainium chip for training AI models.
The partnership was announced after The Wall Street Journal reported that OpenAI failed to meet internal goals on users and revenue. Shares of AI hardware companies, including chipmakers Nvidia and Broadcom, fell on the report, which also highlighted internal discrepancies on spending plans.
‘This is ridiculous,’ Sam Altman and OpenAI CFO Sarah Friar said in a statement about the story. ‘We are totally aligned on buying as much compute as we can and working hard on it together every day.’
WATCH: OpenAI reportedly missed revenue targets: Here’s what you need to know