Technologies
Can My iPhone 17 Pro Match a 6K Cinema Camera? I Teamed Up With a Pro to Find Out
I put a video shoot together to see just how close an iPhone can get to a pro cinema setup.
The iPhone 17 Pro packs a powerful video setup with a trio of cameras, large image sensors (for a phone), the ProRes RAW codec and Log color profiles for advanced editing. That combination makes it one of the most powerful and dependable video shooters among today's smartphones.
Apple often boasts about famous directors using the iPhone to shoot films and music videos. The company even records its event videos for new products with the iPhone.
But is the iPhone really good enough at shooting video to replace a traditional cinema camera? To see how good the iPhone 17 Pro is for professional use, I gave it a proper test.
I put together a video shoot where I pitted the $1,000 iPhone against a full professional cinema camera rig, worth thousands of dollars, to see just how well Apple’s phone can hold its own. I planned a video production at my favorite coffee roaster in Edinburgh, called Santu, which is based in a stunning building that I knew would look amazing on camera.
To give both cameras the best chance, I worked with Director of Photography Cal Hallows, who has been responsible for production on major shoots around the world, working with brands including Aston Martin, the BBC, IBM and Hilton Hotels.
Here’s what happened.
Our filming equipment
We didn't use any external lenses with the iPhone; instead, we relied on the built-in main, ultrawide and telephoto cameras. I shot my footage using the Blackmagic Camera app, recording to a Crucial X10 external SSD, since Apple's ProRes RAW codec creates large files.
I also had a variable neutral density filter to achieve a consistent shutter speed. For some shots, I used Moment’s SuperCage to help give me a better grip — and therefore smoother footage. But for other shots, I just used the phone by itself to make it easier to get into tight spaces. More on that later.
The iPhone's competition was the $3,300 Blackmagic Pyxis 6K, a professional cinema camera with a full-frame 6K image sensor and raw video capabilities. I paired it with some stunning pro cine lenses, including a set of Arles primes and the Xtract probe lens from DZOFilm, plus a couple of choice cine primes from Sigma. It's a formidable and pricey setup for any cinematographer.
The shoot day
We shot over the course of a single day. I’d already created a rough storyboard of the shots I wanted to get, which helped me plan my angles and lens choices. I wanted to try and replicate some angles directly with both cameras.
This shot of the storeroom being opened (above), for example, was a lovely scene, and I didn't see much difference in quality between the iPhone's video and the Blackmagic's. The same was true of a few of the scenes we replicated. Apple's ProRes RAW codec on the iPhone provided plenty of scope for adjusting the color, letting us create beautiful color grades that looked every bit as striking as footage from the Blackmagic camera.
Sure, you could tell that they were different, but I couldn’t honestly say if one was better than the other.
Other shots were more difficult to replicate. I love this low-angle shot of the roastery owner, Washington, pulling his trolley through the scene. On the iPhone, the main lens wasn't wide enough to capture everything we wanted, but switching to the ultrawide went too far the other way, and we ended up with spare gear and other people in the frame.
This made several shots a challenge to replicate, as the iPhone's fixed focal lengths simply didn't translate to the same fields of view offered by our lenses on the Blackmagic camera. As a result, getting the right framing on the iPhone was trickier than I expected. But focal length wasn't the only reason using "real" lenses was better.
The DZOFilm Arles primes are awesome cinema lenses whose wide apertures allowed us to shoot with gorgeous natural bokeh. We used this to our advantage on several shots where we really wanted the subject isolated against an out-of-focus background.
Secret weapons
That was especially the case when we used our secret weapon: the DZOFilm Xtract probe lens. This bizarre-looking, long, thin lens offers a wide-angle perspective coupled with a close focusing distance.
I loved using the probe lens for this shot in particular, where we focused exactly where Washington was using the bean grinder. I tried to replicate it on the iPhone using the close-focusing ultrawide lens, and the shot looks good, but it lacks the visual sophistication I can get from a big, professional camera. The lack of background blur makes it easier to see distracting items stored under the counter that are otherwise "hidden" in the blur from the cinema camera's lens.
But the iPhone has its own secret weapon, too: its size. Even with a filter and the SSD crudely taped to it, the iPhone is so small that we were able to get shots we simply couldn't have achieved with the big cinema camera.
One shot in particular stands out: I rigged the iPhone to an arm inside the cooling machine so that it traveled around as the beans were churned. I love this shot, along with a top-down view I shot of the arms turning beneath. Both angles give the film an incredible energy, and I think they're my favorite scenes of the whole production. It wasn't easy to see the phone screen in these positions, but SmallRig's wireless iPhone monitor made it much easier to get my angles just right. Trying to rig up a large, heavy camera and lens to get the same shots was simply out of the question.
How well did the iPhone compare?
I’m really impressed with both cameras on this project, but my expert Director of Photography, Cal, had some thoughts, too.
"The thing I really found with the iPhone," Cal explained, "was simply the creative freedom to get shots that I'd have never had time to set up. There's only so long in a day and only so long you have access to filming locations or actors, so the fact that you can just grab your iPhone and get these shots is amazing."
"I have used my iPhone on professional shoots before. One time in particular was when I was driving away from set and I saw this great sunset. If I'd have spent time rigging up my regular camera, I'd have missed the sunset. So I shot it on my phone and the client loved it — it ended up being the final shot of the film. At the end of the day, a good shot is a good shot and it doesn't matter what you shot it with," said Cal.
So was it all good for the iPhone?
"The depth of field and the overall look of the cinema lenses still come out on top — you're just not going to get that on a phone," explained Cal. "When it came to grading the footage, I had to use a lot of little workarounds to get the iPhone to match. The quality quickly started to fall apart in certain challenging scenes that just weren't a problem with the Blackmagic."
So it’s not a total win for the iPhone, but then, I never expected it to be. The iPhone was never going to replace the pro camera on this shoot, but it instead allowed us to augment our video with shots that we would otherwise never have gotten.
I love the creative angles we found using just the phone, and while Cal couldn't balance its colors quite as easily, the footage fits in nicely with the rest of the video and makes it more dynamic and engaging as a result.
And that's not to say the shots we didn't use were bad. I'm actually impressed with how the iPhone handled most of what we threw at it.
So don’t assume that if you want to get into filmmaking, you need to drop tens of thousands on a pro cinema camera and a set of cine primes. Your iPhone has everything you need to get started, and it’ll let you flex your creativity much more easily.
Our days of shooting, editing and grading have proven that the iPhone isn't yet ready to be the only camera you need on a professional set. But pair its small size with your other cameras, and you've got yourself a truly powerful production setup.
Turns Out Perplexity Might Be the Sleeper Feature on Samsung’s Galaxy S26
Having Perplexity’s AI and models on devices from the world’s biggest phone-maker puts the company under a brighter light.
There were plenty of references to AI at today’s Galaxy Unpacked event. But Samsung isn’t alone; nearly every major smartphone launch in recent years has included new AI features or partnerships with AI companies.
Samsung launched its latest iteration of Galaxy AI, debuting it alongside the Galaxy S26 phones. This follows weekend news that the company plans to integrate Perplexity's AI agent, and even support a "Hey Plex" wake word, on its new phones. But the partnership appears to go beyond simply giving Samsung users another AI option.
Since late 2023, phone-makers have been leapfrogging one another to add generative AI features and integrate AI agents. Nearly every new Android phone supports Google's Gemini assistant. Apple's iPhones integrate OpenAI's ChatGPT into the phone's Visual Intelligence feature, and its Siri overhaul will incorporate Google's Gemini AI models.
While Perplexity has partnered with phone-makers such as Motorola to preload its app — and has been integrated into devices for Deutsche Telekom — having its AI and models built directly into phones from the world’s largest manufacturer puts the company on a much bigger stage. It marks a shift toward AI agents being just another tool people choose to use, much like a phone app.
"The first step toward an agentic mobile ecosystem is the user getting to choose whatever agent they want," Dmitry Shevelenko, Perplexity's chief business officer, told CNET. "I think this is where Samsung is taking a big, big leap forward."
Perplexity's Sonar API powers aspects of Samsung's Galaxy AI ecosystem. Shevelenko said that the company's engineers worked closely with Samsung's team to revamp its Bixby assistant at the framework level, getting deep system access. He noted that it's the first time a third-party AI company has achieved parity on a major mobile OS. The Galaxy S26 phones that Samsung announced support the new "Hey Plex" wake word, putting Perplexity shoulder-to-shoulder with Google's Gemini AI assistant, which is integrated into Android on Samsung devices.
"What's unique is the only other company that has it is Google, right?" said Shevelenko. "It's a real paradigm shift for Samsung to be going into a multi-AI direction, where they are giving their users choice. And I think they see this as a strategic differentiator."
Samsung's inclusion of Perplexity touches many of the company's own apps, including Calendar, Clock, Gallery, Notes and Reminders. The benefit of building Perplexity's AI deeply into Samsung's software is that people can have a lighter interaction with their phones. Instead of unlocking the device, navigating the home screen, opening an app and entering a query, people will be able to simply press a button, say "Hey Plex" and start their search within seconds.
But the integration of Perplexity isn’t limited to Bixby. Shevelenko said Samsung’s browser, aptly named Internet, includes agentic browsing using Perplexity’s Comet technology as well.
Such a significant moment for Perplexity naturally draws parallels to Apple and its partnership with OpenAI, the company that has also teamed up with former Apple designer Jony Ive on its own hardware efforts. When I asked Shevelenko about the possibility of Perplexity making its own phone or hardware, he responded emphatically: "No."
"We are laser-focused on working with all the best OEMs," he said. "Our thing we're world-class at is building accurate AI that is easy to use and delightful to use and growing that curiosity."
Now that Samsung has announced its new phones, it'll be interesting to see how Galaxy owners use their phones' AI agents. Soon, people could say "Hey Google" to their Samsung devices to prompt Gemini, or "Hey Plex" to trigger a query with Perplexity. And options are usually a good thing.
Samsung did not immediately respond to a request for comment.
ADT Acquires AI Company for Sensing People and Activity in Your Home
ADT’s acquisition of Origin AI brings presence-sensing technology under the home security company’s umbrella.
ADT on Tuesday announced an interesting new acquisition for anyone looking to the future of home security — and it’s no surprise AI is a part of the story. In a $170 million deal, ADT has purchased Origin AI, which specializes in people detection in spaces like the inside of your home, something the security company is calling AI-sensing technology.
ADT has not disclosed specific plans for the AI technology, but the deal comes at a time when concerns about corporate surveillance by companies like Ring and Flock have reached a fever pitch.
"ADT has been testing and evaluating Origin's technology pre-acquisition," ADT Chief Business Officer Omar Kahn told me. "In 2026, the focus is on integrating the technology into ADT's platform, with commercialization expected to begin in 2027."
Presence sensing doesn't sound like the chatty, summary-creating large language models we consider AI these days, nor like the person and car recognition features that companies like Flock use. It's a system that analyzes disruptions in home Wi-Fi signals. The AI is trained in pattern recognition to identify which disruptions indicate that humans are at home (ignoring pets) and what those humans may be doing.
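As a loose illustration of the idea (not ADT's or Origin's actual method, which is far more sophisticated), Wi-Fi presence sensing boils down to this: a body moving through a room disturbs the radio channel, so signal readings in an occupied room fluctuate much more than in an empty one. The function name, window size and threshold below are invented for the sketch.

```python
from statistics import pstdev

def presence_detected(rssi_window, threshold=2.0):
    """Flag likely human presence when recent Wi-Fi signal-strength
    readings (dBm) fluctuate more than a calm baseline would.

    rssi_window: a list of recent RSSI samples.
    threshold: a made-up tuning value for this sketch.
    Real systems train classifiers on much richer channel data and
    labeled patterns (for example, to ignore pets), but the core
    signal is the same: motion disturbs the radio channel.
    """
    if len(rssi_window) < 2:
        return False
    return pstdev(rssi_window) > threshold

# An empty room: readings barely move.
print(presence_detected([-60.1, -60.0, -60.2, -60.1]))  # False
# Someone walking through: readings swing widely.
print(presence_detected([-60, -55, -68, -52, -64]))     # True
```

The pattern-recognition layer the article describes sits on top of raw disturbance detection like this, classifying *which* kinds of fluctuation correspond to a person walking, sitting or lying down.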
The technology has cropped up in many spots over the past couple of years. I’ve seen it before with aging-in-place technology and Philips Hue’s newest smart bulbs, but most recently with Aqara’s sensor at CES 2026, which can detect when multiple people are congregating, standing, sitting or lying down.
How does presence sensing affect people’s privacy?
It’s not clear how ADT will use Origin’s presence sensing in its home security systems, though the company did mention smart automation, personalization and reducing false alarms. In one example, it could automatically adjust an ADT-supported thermostat when multiple people are detected moving around a house. But that also raises privacy questions.
Presence sensing like Origin's tech has certain privacy benefits. It doesn't use cameras to film anyone or save video recordings of people, and it doesn't create identity profiles based on someone's face or other data. It can't tell who is in a house, only where they are and how and when they are moving around (or not moving).
That allows for capabilities such as notifying a nursing home that a resident hasn’t gotten out of bed when they usually do, without invasive investigation. But the technology also raises privacy concerns: A company could know when people in their own home are in bed, watching TV, or sitting to eat dinner, even if it can’t identify them by name.
ADT calls features like these home awareness, but also mentions municipal compliance and coordination with first responders. That could mean giving firefighters information on how many people are in a burning building. But there are concerns. Recent news reports indicate that some local law enforcement agencies have shared information with US Immigration and Customs Enforcement for use in home and apartment raids, raising the possibility that the technology could be applied in similar contexts.
The technology’s implications may ultimately hinge on how ADT chooses to implement and regulate it. Until those details are clearer, its promise and its risks remain closely intertwined.
New York Times Debuts the Midi Crossword, Its In-Between Puzzle
Is the Mini Crossword too easy, but the original one just too time-consuming? Here’s your new puzzle.
The daily New York Times Mini Crossword can be solved in a minute or so, while the newspaper’s iconic original crossword puzzle might take hours. Now, puzzlers who want an in-between diversion can try a new puzzle from the Times, introduced this week — the Midi Crossword puzzle. (And CNET readers can get daily answers for five Times puzzles — Wordle, Connections, Strands, Connections: Sports Edition and the Mini Crossword.)
New York Times Games subscribers can play the Midi in the New York Times Games app for iOS and Android devices, or on mobile or desktop web. It’s online-only, not in the print newspaper.
"We're really leaning into the digital-first nature of the puzzle," NYT Games Puzzle Editor Ian Livengood said in a Times article about the new puzzle. "About once a week, the puzzle will have a visual effect — an extra flourish when you start or after you solve. This could be a cool animation or colorful shading."
As the name "Midi" suggests, this is a mid-sized crossword. Where the Mini Crossword usually has only five Across and five Down clues, the Midi is usually a 9-by-9 grid, sometimes as large as 11-by-11.
"If you feel like the Mini is not enough but the Daily is too much, this will be the perfect puzzle for you," Livengood said.
Each Midi Crossword has a theme that hints at the topics of the clues and answers. Livengood says that, unlike the Times' other crosswords, the Midi might occasionally include two-letter words and repeated answers.
I tried the Midi Crossword
I tried Wednesday’s Midi Crossword and solved it in just over 3 minutes. That’s much longer than I spend on the Mini Crossword, but much faster than the original New York Times crossword puzzle takes me.
I thought most of the clues were pretty simple, and the few tricky ones filled themselves in once I moved from Across to Down.
If you’re a New York Times Games subscriber, this is a nice addition to your daily puzzle stable. It tests your mind a bit more than the Mini, but you can also solve it while watching TV or waiting for someone to text you back.
