Technologies

Apple and Samsung Are Racing to Create the Ultimate Camera Phone

Commentary: The Galaxy S23 Ultra and iPhone 14 Pro reiterate Apple’s and Samsung’s ambitions to appeal to pro photographers and videographers.

This story is part of Samsung Event, CNET’s collection of news, tips and advice around Samsung’s most popular products.

A phone’s camera bridges our everyday lives with our online identities, whether you’re sharing family photos, posting clips from your vacation on TikTok or dialing into a Zoom meeting. Apple and Samsung clearly understand this, as evidenced by the iPhone 14 Pro and the new Galaxy S23 Ultra, which goes on sale Feb. 17. With both devices, Samsung and Apple are sending a strong message: The camera is what matters most in a new phone. It’s the biggest factor that separates the best phone money can buy from reasonably priced devices.

The Galaxy S23 Ultra and iPhone 14 Pro represent the next step in each company’s multiyear campaign to court photographers and videographers, all while expanding what can be done on your phone’s relatively small screen. They’re the culmination of the latest efforts by Apple and Samsung to outpace one another in an arms race that’s been progressing for more than a decade. Apple and Samsung aren’t the only ones focused on the camera; the same goes for Google and OnePlus. But as the world’s two largest phone brands by market share, Apple and Samsung have an outsized influence over the devices we carry in our pockets.

Annual smartphone updates feel incremental, making it harder for people to justify yearly upgrades, especially when the cost for everyday goods and services remains high. The latest high-end phones from Apple and Samsung serve as statements that customers are willing to pay for the best. And for both companies, being the "best" often means having the best camera.

Samsung and Apple bet people will spend more on better devices

Cameras with a 100x digital zoom magnification and a nearly tablet-sized screen aren't for everyone, especially given their high price. As generational upgrades become less flashy, customers are holding onto their devices longer before upgrading. But there is evidence that premium phones still appeal to shoppers despite inflation, suggesting that Apple and Samsung's camera-first approach may be working.

According to Counterpoint Research, the iPhone’s average selling price increased 7% year over year in the third quarter of 2022, indicating Apple’s more expensive phones may be its most popular. (However, that could also be because the price of Apple’s regular iPhones has increased over the years, while the Pro’s starting price has largely remained the same).

Ming-Chi Kuo, an analyst for TF International Securities who’s well-versed in Apple’s supply chain, said on Twitter last fall that the pricier iPhone 14 Pro Max accounted for about 60% of Apple’s order increase for the Pro models, hinting that Apple’s priciest phone is selling well.

TM Roh, head of Samsung’s mobile experience business, said in an interview with CNET earlier this year that the Galaxy S22 lineup saw double-digit sales growth compared to the Galaxy S21 series. That indicates Samsung’s more expensive phones are indeed top sellers.

Remarks from Roh and Apple CEO Tim Cook also suggest that people are simply willing to pay for better devices.

"When times get hard, then people would be more cautious in the choices that they make," Roh also said to CNET. "In other words, they would be looking for greater value to be gained."

Speaking with analysts during Apple's fiscal first-quarter earnings call earlier this month, Cook said he thinks "people are willing to really stretch to get the best they can afford in that category."

Samsung’s and Apple’s current premium phones could also influence the devices we see in the future as both companies are expected to lean more heavily into high-end devices. Apple is discussing releasing an iPhone Ultra that would be a step up from the $1,099 iPhone 14 Pro Max, according to Bloomberg, likely expanding upon the Pro Max’s features. It may also incorporate more features into next year’s iPhone Pro that further distinguish it from the regular iPhone, the report said. One of those features, Bloomberg reported, could be a periscope lens for better optical zoom on the Pro Max, further underscoring the camera’s significance.

Samsung, meanwhile, used its previous high-end smartphone line, the Galaxy Note, to build the foundation for its current Galaxy Ultra devices. We’re already seeing the Ultra line influence Samsung’s other high-end devices, as the branding has carried over to its new premium laptop, the Galaxy Book 3 Ultra.

What makes an "ultra" or "pro" phone? Mostly the camera

Make no mistake, Apple and Samsung both view the camera as the most significant smartphone upgrade that customers are willing to splurge for. Samsung made that clear at its Unpacked event on Feb. 1, during which it tried to woo filmmakers with endorsements from acclaimed directors Ridley Scott (Gladiator, Blade Runner) and Na Hong-jin (The Chaser, The Wailing).

Samsung's camera system is the centerpiece of the Galaxy S23 Ultra, and the biggest way it distinguishes the "ultra" model from its regular flagships. The company spent a large portion of its Unpacked presentation outlining the various new camera improvements: a higher-resolution 200-megapixel sensor, wider dynamic range, steadier optical image stabilization for video, faster autofocus and clearer shots in low light, among other upgrades. The regular Galaxy S23 and S23 Plus are also gaining improvements to the way photos are processed, but they lack the Ultra's extreme 100x zoom magnification and new 200-megapixel sensor.

If you weren’t already convinced that Samsung is trying to entice camera enthusiasts, the company also makes it easier to access settings for shooting raw files by integrating those options directly into the native camera app. A raw file has uncompressed image data straight from the camera sensor, which allows for more leeway when editing. An Expert Raw file contains data from several images processed together and offers even more clarity and a wider dynamic range. Google and Apple have their own special raw files that are created in a similar way, bridging a traditional raw file with advancements from computational photography.

And to help fit all of those big files on your phone, the S23 Ultra's base storage is now 256GB, up from the S22 Ultra's 128GB. The decision to offer more storage in the entry-level model could also be seen as another effort to attract photographers and videographers, since high-resolution photos, raw files and 8K videos occupy a lot of space. The S23 Plus also starts at 256GB, but Samsung doesn't offer a 1TB storage option for that phone the way it does with the S23 Ultra. It shows how far Samsung has come since launching its original Ultra phone, the Galaxy S20 Ultra, which maxed out at 512GB and started at 128GB, just like the regular Galaxy S23.
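Some back-of-the-envelope math shows why storage matters so much for this audience. The figures below are illustrative assumptions, not official file sizes: roughly 2 bytes of raw sensor data per pixel and roughly 0.6GB per minute of 8K video.

```python
# Rough storage arithmetic for a camera-first phone (all figures are
# ballpark assumptions, not Samsung's official numbers).
MP = 1_000_000

raw_shot_gb = 200 * MP * 2 / 1e9   # one 200MP raw capture at ~2 bytes/pixel
video_min_gb = 0.6                 # assumed 8K recording rate per minute
storage_gb = 256                   # the S23 Ultra's base storage

print(raw_shot_gb)                         # ~0.4GB per raw shot
print(round(storage_gb / raw_shot_gb))     # raw shots that fill the phone
print(round(storage_gb / video_min_gb))    # minutes of 8K video that fit
```

Under those assumptions, a few hundred raw shots or several hours of 8K footage would exhaust the base storage, which is why the jump from 128GB matters.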

Apple also loves to flaunt the iPhone’s photography prowess, and you could even argue that may have influenced some of Samsung’s thinking. That approach was on full display in September when Apple unveiled the iPhone 14 Pro, which has better optical image stabilization and low-light performance. Like Samsung, Apple also made a leap in resolution that brings the iPhone 14 Pro’s camera from 12 to 48 megapixels, although it’s really the device’s larger main sensor that’s made a big difference in the camera’s low-light performance. ProRaw, Apple’s feature for capturing raw photos that still incorporate some of the company’s image-processing algorithms, can now shoot at a 48-megapixel resolution.

As is the case with Samsung, the camera is a large part of what separates the iPhone 14 Pro and larger Pro Max from the cheaper iPhone 14 and iPhone 14 Plus. Those phones, by comparison, are missing the iPhone 14 Pro’s telephoto lens and have a smaller 12-megapixel main camera sensor. Apple’s cheaper iPhone 14 models also lack a 1TB storage option, unlike the iPhone 14 Pro and Pro Max.

The camera is the star, but there’s more to it

While the camera may be the biggest defining characteristic of Apple’s Pro line and Samsung’s Ultra line, there are other common threads between these phones. Both phones have more productivity-oriented features than the cheaper alternatives in their respective lineups. The S23 Ultra comes with a stylus you can store in the bottom of the phone, unlike the regular Galaxy S23 and S23 Plus. The iPhone 14 Pro has the Dynamic Island, a clever software interface built around the selfie camera for showing system alerts and controlling background activities without leaving the app you’re using. That feature is absent from the regular iPhone 14 and iPhone 14 Plus.

Both phones also have more to offer when it comes to the screen. For Samsung, that's quite literal; the Galaxy S23 Ultra's 6.8-inch screen is physically larger than the displays on the Galaxy S23 (6.1 inches) and the S23 Plus (6.6 inches). Apple offers the same two display sizes across the entire iPhone 14 lineup (6.1 inches or 6.7 inches), but has found other ways to make the screens on its Pro iPhones stand out. Only the Pro models have an always-on display, the Dynamic Island and an adaptive refresh rate for smoother scrolling and graphics.

Despite these similarities, Apple and Samsung’s approaches also differ in significant ways — mostly when it comes to which technologies these companies bring to cheaper devices. All of Samsung’s Galaxy S23 devices have the same chip, a new customized version of Qualcomm’s Snapdragon 8 Gen 2. Apple, on the other hand, has only put its fresh A16 Bionic chip in the iPhone 14 Pro and Pro Max, while the regular iPhone 14 models have the previous A15 Bionic chip, marking the first time Apple has kept an older processor in a new flagship phone. Apple also equips its Pro iPhones with a lidar scanner for detecting depth, which helps improve AR apps and certain photography features like autofocus and enables accessibility functions like door and people detection.

For Apple and Samsung, adding more advanced camera and display features to their premium phones isn’t just about boosting sales. Both companies are under pressure to uphold their reputations as innovators while proving there are still plenty of reasons to be excited about the smartphone’s future.

Right now, many of those reasons come down to the camera — the tool we use for everything from video chatting to documenting vacations and, perhaps in the future, fueling augmented reality apps. It will be fascinating to see how Apple, Samsung and others attempt to improve and redefine that experience over the next few years.

Silksong, Long-Awaited Hollow Knight Spinoff, Gets Release Date: Sept. 4

Announced in 2019, Team Cherry’s follow-up is coming sooner than expected, and it’s on Game Pass on Day 1.

Hollow Knight: Silksong is the follow-up, announced back in 2019, to one of the most beloved indie games of the last decade. In a special announcement video on Thursday, Australian developer Team Cherry revealed that the wait is almost over. 

Silksong will be released on Sept. 4, according to the new trailer. The almost two-minute video reveals some of the new enemies and bosses in the upcoming spinoff and ends with the surprise release date. 

Silksong was originally planned as DLC for Hollow Knight, but it grew into a full game and was delayed repeatedly. Glimpses of it surfaced here and there over the years, but it received the most attention this year from Nintendo, as part of its Switch 2 lineup, and from Microsoft, which confirmed it would be available on Xbox Game Pass.

Hollow Knight: Silksong will be available on PC, Switch, Switch 2, Xbox One, Xbox Series X and Series S, PS4 and PS5. It will be available on Day 1 for Xbox Game Pass subscribers. 

PS5 Prices Go Up Today. Here’s How Much and Why

You can expect to pay more for a new PlayStation, thanks to "a challenging economic environment."

Sony will increase the prices of its PlayStation 5 consoles in the US, starting today. This follows the trend of console manufacturers such as Microsoft and Nintendo raising prices for their hardware in response to tariffs. 

The PlayStation-maker announced the price change in a post Wednesday. Each model will cost $50 more than its current price.

"Similar to many global businesses, we continue to navigate a challenging economic environment," Sony said in a post about the price increase.

As of Thursday morning, retailers and Sony's online store had yet to update the console prices. The increase will also likely affect recently released PS5 bundles, such as the Astro Bot bundle and the Fortnite Cobal bundle.

Sony says accessories are not affected by the change, and the price hike applies only to the US.

Microsoft increased the price of its Xbox Series consoles in May, and Nintendo raised prices for the original Switch console and Switch 2 accessories this month.

While the companies haven't cited the tariffs instituted by President Donald Trump as the reason for the hardware price jumps, those tariffs would explain the trend of recent months.

Google Thinks AI Can Make You a Better Photographer: I Dive Into the Pixel 10 Cameras

The camera specs for the Pixel 10 series reveal only a small part of what’s new for mobile photographers. I spoke with the head of the Pixel camera team to learn more.

If a company releases new phone models but doesn’t change the cameras, would anyone pay attention? Fortunately that’s not the case with Google’s new Pixel 10, Pixel 10 Pro and Pixel 10 Pro Fold phones, which make a few advancements in the hardware — hello, telephoto camera on the base-level Pixel for the first time — and also in the software that runs it all, with generative AI playing an even bigger role than it has before.

"This is the first year where not only are we able to achieve some image quality superlatives," Isaac Reynolds, group product manager for the Pixel cameras, told CNET, "but we're actually able to make you a better photographer, because generative AI and large models can do things and understand levels of context that no technology before could achieve."

Modern smartphone cameras must be more than glass and sensors, because they have to compensate for the physical limitations of that glass and those sensors. You can't expect a tiny phone camera to perform as well as a large glass lens on a traditional camera, and yet the photos coming out of the Pixel 10 models surpass their optical abilities. In a call that covered a lot of photographic ground, Reynolds shared with me details about new features as well as issues of how we can trust images when AI — in Google's own tools, even — is so prevalent.

Pro Res Zoom adds generative AI to reach 100x

The new Pro Res Zoom feature is likely to get the most attention because it strives for something exceptionally difficult in smartphones: long-range zoom that isn’t a fuzzy mess of pixels.

You see this all the time: Someone on their phone spreads two fingers against the screen to make a distant object larger in the frame. Photographers die a little each time that happens because, by not sticking to the main zoom levels — 1x, 2x, 5x and so on — the person is relying on digital zoom; the camera app is making pixels larger and then using software to try to clean up the result. Digital zoom is certainly better than it once was, but each time it’s used, the person sacrifices image quality for more zoom in the moment.
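The mechanics of plain digital zoom can be sketched in a few lines: crop the center of the frame and scale it back up, so existing pixels are enlarged but no new detail is captured. This is an illustrative toy using nearest-neighbor interpolation on a 2D list, not how any phone's camera pipeline is actually implemented.

```python
# Toy illustration of digital zoom: crop the center 1/factor of the
# frame, then upscale it back to the original size. No new detail is
# created -- each remaining pixel just covers more of the image.

def digital_zoom(pixels, factor):
    """pixels: 2D list of pixel values; factor: integer zoom level.
    Returns a same-sized image zoomed in by `factor` via
    nearest-neighbor interpolation."""
    h, w = len(pixels), len(pixels[0])
    ch, cw = h // factor, w // factor            # cropped region size
    top, left = (h - ch) // 2, (w - cw) // 2     # center the crop
    crop = [row[left:left + cw] for row in pixels[top:top + ch]]
    # Nearest-neighbor upscale: map each output pixel back to a crop pixel.
    return [[crop[y * ch // h][x * cw // w] for x in range(w)]
            for y in range(h)]
```

Run on a 4x4 test image at 2x, every cropped pixel simply appears four times in the output, which is exactly the blocky softness that post-capture sharpening then tries to hide.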

Google’s Super Res Zoom feature, introduced with the Pixel 3, interpolates and sharpens the image up to 30x zoom level on the Pixel 10 Pros (and up to 20x zoom on the Pixel 10 and Pixel 10 Pro Fold). The new Pro Res Zoom on the Pixel 10 Pro pushes way beyond that to 100x zoom — with a significant lift from AI.

Past 30x, Pro Res Zoom uses generative AI to refine and rebuild areas of the image based on the underlying pixels captured by the camera sensor. It’s similar to the technology that Magic Editor uses when you move an object to another area in the image, or type a prompt to add things that weren’t there in the first place. Only in this case, the Pixel Camera app creates a generative AI version of what you captured to give the image crisp lines and features. All the processing is performed on-device.

Reynolds explained that one of the factors driving the creation of Pro Res Zoom was the environments where people are taking photos. "They're taking pictures in the same levels of low light — dinners did not get darker since we launched Night Sight," he said. "But what is changing is how much people zoom, [and] because the tech is getting so much better, we took this opportunity to reset and refocus the program on incredible zoom quality."

Pro Res Zoom works best on static scenes such as buildings, skylines, foliage and the like — things that don’t move. It won’t try to reconstruct faces or people, since generative AI can often make them stand out more as being artificially manipulated. The generated image is saved alongside the image captured by the camera sensor so you can choose which one looks best.

What about consistency and accuracy of the AI processing? Generative AI images are built out of pixel noise that is quickly refined based on the input driving them. Visual artifacts have often gone hand-in-six-fingered-hand with generated imagery.

But that's a different kind of generative AI, says Reynolds. "When I think of Gen AI in this application, I think of something where the team has spent a couple of years getting it really tuned for exactly our use case, which is image enhancement, image to image."

Initially, people inside Google were worried about artifacts, but the result is that "every image you see should be truly authentic to the real photo," he said.

Auto Best Take

This new feature seems like a natural evolution — and by «natural,» I mean «processor speeds have improved enough to make it happen.» The Best Take feature was introduced with the Pixel 8, letting you capture several shots of a person or group of people, and have the phone merge them into one photo where everyone’s expressions look good. CNET’s Patrick Holland wrote in his review of the Pixel 8, «It’s the start of a path where our photography can be even more curated and polished, even if the photos we take don’t start out that way.»

That path has led to Auto Best Take, which does it automatically — and not just grabbing a handful of images to work with. Says Reynolds, "[It] can analyze… I think we're up to 150 individual frames within just a few seconds, and pick the right five or six that are most likely to yield you the perfect photo. And then it runs Best Take."

From the photographer’s point of view, the phone is doing all the work, though, as with Pro Res Zoom, you can also view the handful of shots that went into the final merged image if you’re not happy with the result. The shots are full-resolution and fully processed as if you’d snapped them individually.

"What's interesting about this is you might actually find in your testing that Auto Best Take doesn't trigger very often, and there's a very particular reason for that," said Reynolds. "Once the camera gets to look at 150 items, it's probably going to find one where everybody was looking at the camera, because if there's even one, it'll pick it up."
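The selection idea Reynolds describes can be sketched as a maximin search: score every face in every burst frame, then prefer the frame whose worst-scoring face is best, since a single frame where everyone looks good beats merging. This is a hypothetical illustration of the concept, not Google's actual algorithm or scoring model.

```python
# Hypothetical sketch of burst-frame selection: each frame maps
# person -> an "expression quality" score in [0, 1] (how such scores
# are computed is assumed, not part of this sketch).

def pick_best_frame(frames):
    """Return the index of the frame whose weakest face scores highest,
    i.e. the frame most likely to need no merging at all."""
    return max(range(len(frames)),
               key=lambda i: min(frames[i].values()))

burst = [
    {"ana": 0.9, "ben": 0.2},   # ben is blinking
    {"ana": 0.7, "ben": 0.8},   # everyone looks decent
    {"ana": 1.0, "ben": 0.5},   # ben is mid-word
]
print(pick_best_frame(burst))   # -> 1
```

With 150 frames to choose from, such a search usually finds one frame where every face clears the bar, which matches Reynolds' point that the merge step often never needs to run.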

Improved Portrait mode and Real Tone

Another improvement enabled by the Pixel 10 Pro’s Tensor G5 processor is a new high-resolution Portrait mode. To take advantage of the wide camera’s 50-megapixel resolution, Reynolds said the Pixel team rebuilt the Portrait mode model so it creates a higher quality soft-background depth effect, particularly around a subject’s hair.

Real Tone, the technology for more accurately representing skin tones, is also incrementally better. As Reynolds explained, Real Tone has progressed from establishing color balances for people versus the other areas of a frame to individual color balances for each person in the image.

"That's not just going to mean better consistency shot to shot, it means better consistency scene to scene," he said, "because your color, your [skin] tone, won't depend so strongly on the other things that happened in the image."

He also mentioned that a core component of Real Tone has been the ability to scale up image quality testing methods and data collection in the process of bringing the feature’s algorithms to market.

"What standards are we setting for diversity and equity, inclusion across the entire feature set?" he said. "Real Tone is primarily a mission and a process."

Instant View feature in the Pixel 10 Pro Fold

One other significant photo hardware improvement has nothing to do with the cameras. On the Pixel 10 Pro Fold, the Pixel Camera app takes advantage of the large internal screen by showing the previous photo you captured on the left side of the display. Instead of straining to see details in a tiny thumbnail in the corner of the app, Instant View gives a full-size shot, which is especially helpful when you’re taking multiple photos of a person or subject.

Camera Coach

So far, these new Pixel 10 camera features are incorporated into the moment you capture a photo, but Reynolds also wants to use the phones’ cameras to encourage people to become better photographers. Camera Coach is an assistant that you can invoke when you’re stuck or looking for new ideas while photographing a scene.

It can analyze the picture you're trying to take and suggest improvements, such as getting closer to a subject for better framing or moving the camera lower for a more dramatic angle. Tap the Get Inspired button and the Pixel Camera app studies the scene and offers ideas.

"Whether you're a beginner and you just need step-by-step instructions to learn how to do it," said Reynolds, "or you're someone like me who needs a little more push on the creativity when sometimes I'm busy or stressed, it helps me think creatively."

C2PA content credentials

All of this AI being worked into the photographic process, from Pro Res Zoom to Auto Best Take, invariably brings up the unresolved question of whether the images we’re creating are genuine. And in a world that is now awash in AI-generated images that look real enough, people are naturally guarded about the provenance of digital images.

For Google, one answer is to label everything. Each image captured by the Pixel 10 cameras or passing through Google Photos is tagged with C2PA Content Credentials (from the Coalition for Content Provenance and Authenticity), even if it's untouched by AI. The Pixel 10 is the first smartphone with C2PA built in.

"We really wanted to make a big difference in transparency and credibility and teaching people what to expect from AI," said Reynolds. "The reason we are so committed to saving this metadata in every Pixel camera picture is so people can start to be suspicious of pictures without any information."

Marking images that have no AI editing is meant to instill trust in them. "The image with an AI label is less malicious than an image without one," said Reynolds. "When you send a picture of someone, they can look at the C2PA in that picture. So we're trying to build this whole network that customers can start to expect to have this information about where a photo came from."

What’s new in the Pixel 10 camera hardware

Scanning the specs of the Pixel 10 cameras, listed below, you'd notice that they mostly match those found on last year's Pixel 9 models, but a couple of details stand out.

For one, having a dedicated telephoto camera is no longer one of the features that separates the entry-level Pixel from the Pro models. The Pixel 10 now has its own 10.8-megapixel, f/3.1 telephoto camera with optical image stabilization, offering 5x optical zoom and up to 20x Super Res Zoom.

It’s not as good as the 48-megapixel f/2.8 telephoto camera used in the Pixel 10 Pro and Pixel 10 Pro XL (the same one used in the Pixel 9 Pros), but that’s not the point. You don’t need to give up extra zoom just to buy a more affordable phone.

Another difference you'll encounter, particularly when recording video, is improved image stabilization. The optical image stabilization is upgraded in all three phones, but the stabilization in the Pixel 10 Pros is significantly improved. Although the sensor and lens share the same specs as the Pixel 9 Pro's, the wide-angle camera in the Pixel 10 Pro models needed a new design to accommodate new OIS components inside the module enclosure. Google says it doubled the range of motion, so the lens physically moves through a wider arc to compensate for motion. Alongside that, the stabilization software has been tuned to make it smoother.

Camera Specs for the Pixel 10 Lineup

| Camera | Pixel 10 | Pixel 10 Pro | Pixel 10 Pro XL | Pixel 10 Pro Fold |
| --- | --- | --- | --- | --- |
| Wide | 48MP Quad PD, f/1.7, 1/2″ image sensor | 50MP Octa PD, f/1.68, 1/1.3″ image sensor | 50MP Octa PD, f/1.68, 1/1.3″ image sensor | 48MP Quad PD, f/1.7, 1/2″ image sensor |
| Ultrawide | 13MP Quad PD, f/2.2, 1/3.1″ image sensor | 48MP Quad PD with autofocus, f/1.7, 1/2.55″ image sensor | 48MP Quad PD with autofocus, f/1.7, 1/2.55″ image sensor | 10.5MP Dual PD with autofocus, f/2.2, 1/3.4″ image sensor |
| Telephoto | 10.8MP Dual PD with OIS, f/3.1, 1/3.2″ image sensor, 5x optical zoom | 48MP Quad PD with OIS, f/2.8, 1/2.55″ image sensor, 5x optical zoom | 48MP Quad PD with OIS, f/2.8, 1/2.55″ image sensor, 5x optical zoom | 10.8MP Dual PD with OIS, f/3.1, 1/3.2″ image sensor, 5x optical zoom |
| Front | 10.5MP Dual PD with autofocus, f/2.2 | 42MP Dual PD with autofocus, f/2.2 | 42MP Dual PD with autofocus, f/2.2 | 10MP Dual PD, f/2.2 |
| Inner | n/a | n/a | n/a | 10MP Dual PD, f/2.2 |
