WWDC 2023 Biggest Reveals: Vision Pro Headset, iOS 17, MacBook Air and More

From its expected AR/VR headset to new Macs to software updates like iOS 17, here’s what Apple unveiled at WWDC.

Apple’s Worldwide Developers Conference kicked off on Monday with a keynote address showing everything coming to the company’s lineup of devices. WWDC is typically where the company gives us a first look at new software for iPhones, iPads, Apple Watches and Macs. But this year, Apple revealed a bevy of new hardware, too.

The big announcement was the debut of the Apple Vision Pro headset, a “new kind of computer,” as Tim Cook put it in the presentation. But with the MacBook Air and other Mac hardware announcements — including new silicon — as well as software upgrades, no corner of Apple’s ecosystem lacked for updates.

For a detailed summary of everything announced as it happened, give our live blog a look. Read on for the highlights of the presentation and links to our stories.

Apple Vision Pro, a new headset

The Apple Vision Pro is the company’s answer to the AR and VR headset race. It’s a personal display worn on your face, with all the interface touches you’d expect from Apple and an operating system that looks like a combination of iOS, MacOS and TVOS. And it’s not going to come cheap: The Apple Vision Pro retails for $3,499 and will start shipping early next year.

The device itself looks like other headsets, though the glass front hides cameras and even a curved OLED outer display (more on why later). The headset is secured to the wearer’s head with a wide rear band (no over-the-top strap), though as rumors suggested, there’s an external battery pack that connects over a cable and sits in your pocket. There’s a large Apple Watch-style digital crown on the right side that lets you dial immersion (the outside world) in and out.

The Vision Pro has three-element lenses that enable 4K resolution, and you can swap in different lenses, presumably to accommodate different vision prescriptions. Audio pods embedded within the band sit over your ears, and “audio ray tracing” maps sound to your position. A suite of lidar and other sensors on the bottom of the headset tracks hand and body motions.

Technically speaking, the Vision Pro is a computer, with the same M2 chip found in Apple’s Macs. A new R1 chip processes input from the headset’s 12 cameras, five sensors and six microphones, reducing lag and getting new images to the displays within 12 milliseconds. The Vision Pro runs the new VisionOS, which uses iOS frameworks, a 3D engine, foveated rendering and other software tricks to make what Apple calls “the first operating system designed from the ground up for spatial computing.”

Interior cameras track your facial movements, which are used to represent you to others on FaceTime and other video chat apps.

Apple Vision Pro can scan your face to create a digital 3D avatar. 

To keep users from being cut off from the outside world, the EyeSight feature uses inward-facing cameras and the headset’s outer display to show your eyes — essentially showing the people around you where your attention is. If you’ve dialed immersion all the way up, your eyes disappear from the outer screen. But you’re not totally cut off: If someone approaches you while you’re wearing the headset, they’ll filter into your view.

The interface is controlled with hand motions, though there are also voice controls. It’s tough to tell how well these controls will work, and we’d expect users to need some time to adapt to life without a mouse and keyboard.

This isn’t just an entertainment device. Apple is pitching its first new product category in eight years as a work-from-home and travel device, essentially letting you open as many windows as you want. It can work in the office as a display for Macs, and it supports Apple’s Magic Keyboard and Magic Trackpad.

The Vision Pro has Apple’s first 3D cameras and can take spatial photos, pairing 3D depth with binaural audio to relive moments with more immersion. Of course, that spatial experience extends to movies in a way that’s “impossible to represent on a 2D screen,” Apple said during its presentation, repeatedly teasing an experience that people won’t fully grasp until they try a Vision Pro for themselves. Disney CEO Bob Iger took the WWDC stage to vouch for the headset, following up with a short video showing interactive 3D experiences that Vision Pro users will soon get on the Disney Plus streaming service.

Now that Apple has all these new cameras and eye tracking, it has introduced a way to secure your data and purchases with Optic ID, which uses your eyes as an optical fingerprint for authentication. Camera data is processed at the system level, so what the headset sees isn’t sent to the cloud.

Read more: Apple’s ‘One More Thing’ retrospective 

New MacBook Air 15

As was rumored, Apple announced a new MacBook Air 15, a larger version of the MacBook Air 13 that launched last year. 

The MacBook Air 15 is powered by an M2 chip and gets up to 18 hours of battery life. Configurations can come with up to 24GB of memory and up to 2TB of storage, retailing for $1,299 to start (or $1,199 with a student discount).  

The 15-inch model is 11.5mm thick and 3.3 pounds, and it has two Thunderbolt ports and a MagSafe connector — along with a 3.5mm headphone jack. It has an above-display 1080p camera in a notch, three microphones and six speakers with force-canceling woofers.

Read more: 15-inch MacBook Air M2 Preorder: Where to Buy Apple’s Latest Laptop

Mac Studio with M2

A new Mac Studio has landed, and it runs Apple’s latest silicon. The new model comes with an M2 Max chip or the new M2 Ultra — essentially two M2 Max chips combined, which enables up to 192GB of memory.

The M2 Ultra stole the spotlight, with a 24-core CPU and the ability to stream 22 videos in 8K ProRes resolution at once. It can also drive up to six Pro Display XDRs simultaneously.

The Mac Studio starts at $1,999 and will be available starting next week.

Mac Pro with M2 Ultra

Apple wasted no time announcing that its new high-end desktop, the Mac Pro, would get the M2 Ultra as well. The new Mac Pro gets all the same M2 Ultra upgrades as the Studio, including support for up to 192GB of RAM.

The Mac Pro has eight Thunderbolt ports, two HDMI ports and dual 10Gb Ethernet ports, along with six open PCIe Gen 4 slots. The new Mac Pro comes in both upright tower and horizontal rack orientations.

The new Mac Pro starts at $6,999 and will be available starting next week.

iOS 17

iOS 17 brings a ton of quality-of-life improvements, and the iOS 17 developer beta is available now to download. Finally, you can use more filters while searching within your Messages. In addition to pressing and holding on messages to reply, you can also simply swipe on specific messages to reply to them, and voice notes will be transcribed.

Say goodbye to gray screens when you get calls — now you can assign full-screen photos or Memoji to your contacts, which appear when they call you. And if someone leaves a voicemail, you can see it transcribed in real time, helping you screen calls when you don’t recognize the caller.

A new safety feature, Check In, sends a note to a trusted contact when you reach a location — like when you make it home safe after late-night travel. If it’s taking you longer to get to a destination, you’ll be prompted to extend the timer rather than alert your contact. It also shares your battery and signal status. Check In is end-to-end encrypted.

Last year, Apple introduced an iOS feature to let you copy photo subjects and paste them as stickers — and now you can do that with video to essentially create GIFs to share with friends or even as responses to Messages. All emoji are now shareable stickers, too.

AirDrop has been a helpful tool for sending files between Apple devices, and now you can share your contact info with NameDrop. You can choose what you want to share, from email addresses to phone numbers and more.

Also, say goodbye to relying on Notes to jot down your thoughts — Journal is a new secure app for personal recollections. Apple is pitching it as a gratitude exercise, and iOS will automatically add activities like songs you’ve listened to and workouts you’ve done to your personal log.

Apple Maps got an update that Android owners have had for years — the ability to use Maps offline, especially helpful when you’re outside network range while outdoors or conserving battery.

A new mode, StandBy, converts an iPhone into an alarm clock-style display when it’s charging and rotated horizontally. It shows a large, easily visible clock face along with calendar and music controls.

Lastly, as was rumored, you won’t have to say “Hey Siri” anymore. Just saying “Siri” will bring up the voice assistant.

Read more: Apple Finally Lets You Type What You Ducking Mean on iOS 17

iPadOS 17

iPadOS 17 brings more controls to widgets, which don’t just show more info at a glance — they have more interactive buttons to let you control your smart home or play music.

iPadOS 17 is bringing more interactive personal data to the Health app, including richer sleep and activity visualization. 

The next iPadOS update brings quality-of-life upgrades like more lock screen customization and multiple timers (helpful when cooking), as well as improvements to the follow-you-during-video-calls Stage Manager feature for iPad selfie cameras.

With all the screen space on an iPad, Apple expanded what you can do with PDFs, which can be autofilled and signed from within iPadOS. iPad owners can collaborate in real time while tweaking PDFs, and the files can now be stored in the Notes app.

MacOS Sonoma

MacOS Sonoma, named after one of California’s most famous wine-producing areas, continues the WWDC theme of adding more widget functionality. 

Sonoma also has some gaming upgrades, like a new game mode that prioritizes CPU and GPU resources to improve frame rates. Apple is paying attention to immersion, too, with lower latency for wireless controllers, speakers and headsets. The company is also courting developers with game dev kits and Metal 3. But the biggest gaming announcement is that legendary game creator Hideo Kojima’s opus Death Stranding is coming to Macs later this year. “We are actively working to bring our future titles to Apple platforms,” Kojima said during the WWDC presentation.

On the business side, Mac has improved videoconferencing with an overlay that shows slide controls while you’re presenting. Apple also introduced new reactions — like ticker-tape falling for a congratulations — that can be triggered with gestures.

Passkeys, the end-to-end encrypted password-replacement tech Apple introduced last year, can now be shared with other contacts, and everyone in the group can edit and update the shared passwords.

Safari has security updates including locking the browser window when in private browsing mode, and profiles to separate accounts, logins and cookies between work and personal use.

AirPods and audio upgrades

Apple has a handful of improvements for its audio products. AirPods will get Adaptive Audio, which blends noise canceling and transparency to drown out annoying background noise while letting through important sounds — like car horns or bike bells. It’ll also let voices through if someone starts a conversation with you in person.

And SharePlay makes it far easier to take control of the music while someone else is driving with CarPlay — a prompt goes out to others in the car asking if they want to control the tunes.

Apps in WatchOS 10 are getting a new look.

WatchOS 10

Yet again, widgets make an appearance with WatchOS 10, the next operating system upgrade for Apple Watches. Widgets are now accessible in a stack right from the watch face — just turn the digital crown to scroll through them.

Apple has focused on cycling this year, improving workouts by showing functional threshold power, an important metric for cyclists. The watch also connects over Bluetooth to sensors on bikes, and a new full-screen view lets you use a paired iPhone as a cycling display mid-ride.

Hikers, rejoice! WatchOS 10 upgrades the compass with cellular connection waypoints, telling you which direction to walk and how far you have to go before you can get carrier reception again. It also marks waypoints where you can place an emergency SOS call, displays elevation in the 3D compass view and adds a neat topographic map view.

Apple is also expanding its Mindfulness app to log how you’re feeling in State of Mind, choosing between color-coded emotional states. You can even access this from your iPhone in case you’re away from your Apple Watch. 

Health focuses for 2023

On top of the WatchOS Mindfulness updates, Apple introduced a neutral survey to self-report mood and mental health, which acts as a sort of non-medical way to indicate whether you may want to get professional help.

Apple also has a new cross-device Vision Health focus in the Health app, and a new Apple Watch feature measures time spent outside in daylight, which can help reduce the risk of myopia in younger wearers. Screen Distance uses the TrueDepth camera on iPads to warn people if they’re too close to the screen.

US Wants Judge to Break Up Google, Force Sale of Chrome: Here’s What to Know

OpenAI, Perplexity AI and Yahoo have expressed interest in buying Chrome, as Google’s legal battle escalates. Here’s what it could mean for the future of the web.

The US Department of Justice and Google are facing off in court over allegations that the company is illegally maintaining its dominance in the search engine market. As a result, the DOJ is advocating for Google to sell off some of its key assets, including its Chrome browser. The hearings began April 22 and are expected to last three weeks.

This proposal has attracted interest from several tech companies, including OpenAI, Perplexity AI and Yahoo, all expressing willingness to purchase Chrome should the court mandate its sale.

The case could change how tech companies do business, as well as how people find answers to their online search queries. Government lawyers made their case in opening statements Monday, saying that Google should be forced to sell Chrome, its web browser, which pushes people to the Google search engine.

The company should also be forced to help rival search engines that it has unfairly kept out of competition, Justice Department lawyer David Dahlquist said.

“This is the time for the court to tell Google and all other monopolists who are out there listening, and they are listening, that there are consequences when you break the antitrust laws,” Dahlquist said, according to The New York Times.

Google counters

Google’s lawyers say that any remedies should only consider the company’s deals with companies such as Apple, Mozilla and Samsung to make it the default search engine for smartphones and other devices.

“Google won its place in the market fair and square,” said company attorney John Schmidtlein, according to NBC News.

Judge Amit P. Mehta, of the US District Court for the District of Columbia, is now hearing arguments, and executives from major tech and artificial intelligence companies have been testifying.

Mehta is the same judge who ruled in August that Google illegally maintained a monopoly in search. That trial, held last year, took 10 weeks and was years in the making.

“After having carefully considered and weighed the witness testimony and evidence, the court reaches the following conclusion: Google is a monopolist, and it has acted as one to maintain its monopoly,” Mehta wrote in the August decision. “It has violated Section 2 of the Sherman Act.”

After Mehta hears arguments, he’s expected to order remedies by the end of summer.

Google is currently the king of online search, with more than 89% global market share, according to GlobalStats, down slightly from 91% last summer.

A representative for Google referred CNET to the company’s online statement from before the hearings began. In it, company vice president Lee-Anne Mulholland says such sweeping remedies would harm America’s economy.

Mulholland calls the action «a backwards-looking case» and says the DOJ proposal would make it harder for users to get to preferred services, would prevent the company from competing fairly and would force Google to share users’ private search queries with other companies.

OpenAI, Perplexity and Yahoo want to buy Chrome

On Tuesday, OpenAI executive Nick Turley testified that his company would be interested in buying the Google Chrome browser if the company is forced to sell it. 

He also said that ChatGPT, OpenAI’s artificial intelligence chatbot, is “years away from its goal of being able to use its own search technology to answer 80% of queries,” according to Reuters. Turley also testified that Google declined an attempt by OpenAI to use Google search technology within ChatGPT.

Two other companies have also expressed interest in buying Chrome — Perplexity AI and Yahoo.

Perplexity’s chief business officer, Dmitry Shevelenko, expressed interest in purchasing Chrome in court. 

Yahoo’s general manager of search, Brian Provost, also testified that the company is interested in acquiring Chrome. Yahoo has been developing its own browser prototype but believes that purchasing Chrome is a faster route to increasing its search market share, according to The Verge.

Potential outcomes

Many things could happen to Google, including a breakup of the company. If such a penalty were instituted, it might involve breaking off the Chrome browser or Android smartphone operating system parts of the company. 

The DOJ wants to prohibit Google from entering into exclusive agreements that make its search engine the default on devices and browsers. The Department of Justice also wants Google to share certain user data with competitors to level the playing field.

This would be the government’s first attempt to dismantle a company for illegal monopolization since its unsuccessful efforts to break up Microsoft two decades ago.

Google could also be forced to make its data available to competitors or abandon the controversial economic deals that made the Google search engine the default on devices such as the iPhone.

Why does this matter?

Google is not the only company facing legal issues. Major tech companies Apple and Amazon are also facing antitrust lawsuits. An antitrust trial against Meta, owner of Facebook, Instagram, Threads and WhatsApp, began April 14.

The trial could also affect the burgeoning artificial intelligence era. The Justice Department has said that if remedies are not imposed on Google, it expects Google to use its AI products to further extend its monopoly.

And since the August trial, presidential administrations have changed. As the Times notes, the hearings signal that the Trump administration intends to keep an eye on the changing tech industry.

Do people switch from default search engines?

The August case focused on Google paying Apple and other companies to make its search engine the default on devices such as Apple’s iPhone. Google has said it didn’t maintain a monopoly through such agreements and that consumers could change their device defaults to use other search engines. 

Microsoft CEO Satya Nadella testified in October that the idea that people shift from one search engine to another is “completely bogus” and added that “defaults is the only thing that matters in changing search behavior.”

According to the Justice Department, the Google search engine is used for nearly 90% of web searches, but the company disputes that number, the Times reports.

The Sherman Antitrust Act, which dates to 1890, prohibits activities restricting interstate commerce and competition in the marketplace, essentially outlawing corporate monopolies. It’s the cornerstone of US antitrust legislation, leading to the federal government’s breakup of late 19th century Gilded Age industrial giants.

CNET’s Imad Khan contributed to this report.

Camera Champions Face Off: iPhone 16 Pro vs. Galaxy S25 Ultra

When photo quality is a top consideration, the best phones from Apple and Samsung are amazing. But which is better? It’s time to find out.

When you’re looking for the best camera to carry in your pocket, you need to consider today’s top-tier phones. The imaging capabilities of the iPhone 16 Pro and Galaxy S25 Ultra are among the best money can buy. And with travel season ramping up, the phone in your pocket may be the most convenient camera to carry. But when it comes to photo quality, how do these two mobile titans compare?

To find out, I shot hundreds of photos with both phones in a variety of conditions to see which one takes the best-looking images. What’s “best” is often down to personal perspective, so while I’ll give my take on each test as a professional photographer and explain why I prefer one image over the other, you may well find that you disagree. Have a look through the range of examples here and see if you come to a different conclusion.

Read more: Best Camera Phone of 2025

All images shown were taken using each phone’s default camera mode with default settings, unless otherwise stated. While images from the Galaxy S25 have been uploaded as taken, the iPhone’s images had to be converted through Adobe Lightroom because our publishing platform doesn’t support Apple’s default HEIF image format. This process doesn’t affect the images in any way.

Ready? Let’s dive in.

Starting out with an easy outdoor scene. Both phones have done a great job capturing an even exposure here and both images are packed with detail. It’s difficult to choose between them, but the iPhone has the edge for me as it’s achieved a slightly warmer image with more natural-looking tones. The S25 Ultra’s image looks too saturated, especially in the blue sky, which I find quite distracting. 

It’s much the same story when we switch to the ultrawide lenses on both phones. I prefer the warmer tones in the iPhone’s shot, which makes the S25 Ultra’s look quite cold by comparison. I also prefer the lighter shadows in the iPhone’s image, making it an easy win for the iPhone here. Notably, both phones do a good job of correcting the ultrawide lens distortion at the edges (a function turned on by default on both phones); the railing remains straight in each shot rather than curving as you’d typically see with a lens this wide.

There’s almost no difference between these two outdoor scenes. The blossom looks crisp on both images, with excellent overall exposure. The iPhone’s image is again slightly warmer in tone but it’s negligible.

The Galaxy S25 takes an easy win with this image of bluebells. The colors are much more vibrant, especially in the greens on the blades of grass, which look quite washed out on the iPhone’s image. It actually looks like the S25’s camera lens is slightly polarized to reduce reflections and increase saturation, but I don’t know if that’s the case. Either way, Samsung takes the win here.

At 5x zoom things get worse for the iPhone. Despite the bluebells being reasonably far away, the phone seemed unable to achieve a sharp focus on the flowers. The S25 Ultra, meanwhile, managed to achieve a sharp image with richer colors. 

I prefer the iPhone’s image here though. It’s brighter and the warmer colors on the bricks on the surrounding buildings look much more true to life. 

The iPhone’s image is again brighter here and I prefer its colors too. The Galaxy S25 Ultra does have the edge in fine detail, though. You really need to zoom in to see it but the tiny lines on the building are slightly sharper on the S25. 

The S25 Ultra does have a physical advantage over the iPhone with its 10x optical zoom lens, which allows it to zoom in even further while still maintaining a pin-sharp image. 

You can still digitally zoom in with the iPhone to 10x, and the results aren’t bad. I prefer the colors of the S25 Ultra’s shot here, but the difference in detail isn’t that noticeable.

Zooming in close to see the fine details, the S25 Ultra’s optical zoom image definitely has a bit more clarity but the digital upscaling on the iPhone’s shot has done a great job here, as the difference isn’t immense.

iPhone 16 Pro vs. Galaxy S25 Ultra: Night modes compared

At first glance, the only real difference between the iPhone’s 5x shot and the S25 Ultra’s 5x shot is the color balance. And honestly, I don’t have a preference between the warmer tone of the iPhone or the more magenta bias of the S25. 

However, when you zoom in close to the details, the iPhone has produced a sharper image here, with an odd sort of digital blurring around the lamp post in the S25 Ultra’s image. So sometimes the S25 Ultra’s zoom is sharper, other times it’s the iPhone’s. I’m glad they’re making this easy for me. 

Again, the only real difference here is in the color balance and I don’t really know which I prefer. The exposure, noise levels and amount of detail are practically identical. 

Things changed when I switched to the ultrawide lenses, though. The S25 Ultra’s shot is definitely brighter, capturing more detail in the cobblestones in the foreground and in the buildings in the distance. The iPhone’s image is much darker overall. 

Just to confuse things further, the iPhone’s nighttime image with its ultrawide lens is noticeably brighter than the S25 Ultra’s in this example that I shot in the Arctic. I actually had to double-check the image metadata to make sure I hadn’t mixed these up, but I haven’t. The iPhone’s image has captured more light information here and produced more detail on the ice door to the right. 

The iPhone’s nighttime image is again slightly brighter here but it’s also kept the bright highlights on the pub sign under control. On the S25 Ultra’s image, those highlights are almost lost to pure white but the lovely green and yellow tones have been retained in the iPhone’s image. The colors overall are noticeably warmer on the iPhone’s shot, however, which may not be to your taste. Here, I think they work well.

But in this example, the iPhone has produced a weirdly warm-looking image that I really don’t like. Those warm colors were not present at the time of capture and it doesn’t work for the scene, especially not with such strong orange tones in the sky. The S25 Ultra’s image is much more balanced overall and it’s a slightly sharper image too. It’s a very easy win for Samsung here.

Things don’t improve for the iPhone when using the ultra-wide lens. Its image is again plagued by overly warm tones, while the S25 Ultra’s shot is both more color-accurate and brighter. 

iPhone 16 Pro vs. Galaxy S25 Ultra: Which takes better selfies?

While the Galaxy S25 Ultra’s selfie is slightly brighter, I don’t like what it’s done with the colors. My face has been made a weird shade of orange and my denim jacket is a much deeper blue than it really is. The skin tones on the iPhone’s shot are much more accurate, and its shot is sharper as well.

Both phones have a wider-angle mode for the selfie camera, although the iPhone’s seems to be a lot wider. That’s definitely worth keeping in mind if you frequently like to cram lots of friends into your group pics. You could probably squeeze at least one or two extra friends in if you used the iPhone, or have to decide who you like least and leave them out of frame if you used the S25 Ultra. Otherwise, the image differences are the same as before. 

iPhone 16 Pro vs. Galaxy S25 Ultra: Which camera is better?

I’ve written many of these comparison pieces on various generations of phones in my 14 years at CNET and I don’t remember having done one that’s felt this close. The problem is that neither phone excels consistently in one area; the iPhone 16 Pro’s ultra-wide shots aren’t as bright as the S25 Ultra’s, except on those occasions when they actually are, confusingly. I’ve taken many more images not included here that both support some of my conclusions and argue against them. Go figure.

But there are some takeaways I can give with confidence. Generally speaking, the iPhone’s colors are more natural than the S25 Ultra’s, which can sometimes look overly saturated. That has been the case with almost every Samsung phone since the company started putting cameras in them, and it’s still the case today. Those looking for a more natural base image to apply their own filters and effects to will be better served by the iPhone 16 Pro.

But that’s less the case at night, when the iPhone more consistently delivers warmer tones that look less natural than the S25 Ultra’s. So, if night photography is important to you, the S25 Ultra may be the better option. Overall, its night mode images from all lenses were brighter and sharper.

Sure, the S25 Ultra has the extended zoom range but you’d really need to know you’ll make the most of a 10x zoom to justify picking one over the other. Personally, I find the 5x zoom level a perfect sweet spot and here the phones are pretty much on par. And on those rare occasions you may want to push things further, the iPhone’s digital zoom can still deliver sharp results. 

There are other things for photographers to consider, too: Apple’s ProRaw is superb, and while the company’s Photographic Styles can be good for adding a creative look to your images, Samsung’s new tool for mimicking the color grade of example photos you feed it works surprisingly well — I actually think I might get more use out of that overall. I haven’t even gone into video quality, which is a whole other article, especially when you consider that both phones shoot Log video, although only the iPhone uses ProRes.

Deciding between the phones based solely on the cameras is nigh on impossible. Which one you should get will instead come down to the bigger question of iOS versus Android; which platform you’re already using and which one will work best with other pieces of tech in your life. But for simple picture quality, you may as well toss a coin.
