Technologies
ProRes Log Video on iPhone: What Is It and Why You Should Use It
From how to use it to what it actually is, here’s what to know about using Log video on the iPhone 15 Pro and 16 Pro.
The iPhone 16 Pro packs an incredible camera setup for both still images and video, including its fun 4K slow-motion mode. But to help it capture pro-standard footage, it also supports shooting in a Log color profile with Apple ProRes encoding, just as the iPhone 15 Pro and Pro Max did before it. That might sound like a baffling string of jargon (because it is), so if you’re not a professional video producer, what it all boils down to is this: it lets you shoot professional-looking cinematic footage using just your iPhone.
But what do the terms Log and ProRes actually mean? How are they better than your phone’s regular video? And, crucially, should you actually use them when recording your own videos? Here’s everything you need to know to get the best video quality out of an iPhone.
Read more: The iPhone 16 Pro’s High-Res Slow-Motion Video Is the Best Apple Feature in Years
What is ProRes?
ProRes is a video codec created by Apple in 2007 that has been widely adopted by video and cinema professionals. Typically found on high-end video cameras costing many thousands of dollars, ProRes files capture more data when shooting, resulting in better quality footage than you’d typically get from a phone or even some dedicated cameras.
What is Log video?
Log (short for "logarithmic") is a color profile found on some professional video cameras that’s now also available on the iPhone 15 Pro and 16 Pro models (along with a growing number of Android phones, including the Galaxy S25 Ultra). Log footage preserves more image information in the highlights and shadows, allowing for greater flexibility when editing colors and contrast in post production.
How do you turn on ProRes and Log video?
On your iPhone 16 Pro or 15 Pro, go into Settings, then scroll down and tap Camera. Tap Formats, and within this submenu you’ll see a section for Video Capture. Toggle Apple ProRes on, and a ProRes Encoding option will appear below it. Tap it to switch between HDR, SDR and Log.
Bear in mind that while you can toggle ProRes on or off directly in the Camera app, you have to go back to the Settings app to switch between Log and HDR. You can shoot 4K ProRes footage at 30 frames per second to the phone’s internal storage, but to shoot at 60 frames per second, you’ll need to connect an external SSD via USB-C and record directly to that. And while you can shoot ProRes footage without Log, you can only shoot Log with ProRes.
Why does Log video look gray and washed out?
Log files straight out of the camera look flat and have low contrast and low saturation. The files are designed to be edited in programs like Adobe Premiere or DaVinci Resolve, where colorists will bring back contrast and color tone according to the look they’re trying to achieve, a process called color grading.
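Apple doesn’t publish its Log curve as a simple formula, but the general principle can be sketched with a generic logarithmic encode. This is an illustration only, not Apple’s actual transfer function, and the curve constant is an arbitrary assumption: shadows and midtones get lifted toward the middle of the range, which is why the footage looks flat until a grade re-expands the contrast.

```python
import math

def log_encode(linear, c=8.0):
    # Generic logarithmic encode -- an illustration only, NOT Apple's
    # actual Log transfer function. Maps linear light in [0, 1] to an
    # encoded value in [0, 1], lifting shadows and midtones.
    return math.log2(1 + c * linear) / math.log2(1 + c)

def log_decode(encoded, c=8.0):
    # Inverse curve: a color grade effectively re-expands the flat
    # encoded values back toward normal contrast.
    return (2 ** (encoded * math.log2(1 + c)) - 1) / c

# Middle gray (18% reflectance) gets pushed well up the tonal range,
# which is why ungraded Log footage looks washed out.
flat = log_encode(0.18)
print(round(flat, 3))  # noticeably brighter than 0.18
```

Because the encode is invertible, no tonal information is thrown away; the grade simply decides how to redistribute it.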
Read more: Best iPhone Camera Accessories for Photos and Videos
The low-contrast look of ungraded footage gives colorists the best starting point to tweak the video image however they want. Log footage always needs to be edited and graded before being used.
How do I edit Log video?
While Apple has yet to implement specific color editing tools for Log footage on the iPhone (which it should), you can get some of the way there using the exposure tools in the ‘Edit’ options in the Photos app. However, you’ll get your best results by transferring the files to your iPad, Mac or Windows PC and editing in dedicated video production apps.
My favorite is DaVinci Resolve by Blackmagic Design, an industry-standard piece of software used in professional productions and Hollywood films. It’s known for its flexibility in color editing, and I loved using it to see what looks I could achieve from footage shot on both the iPhone 15 Pro Max and Blackmagic’s own Pocket Cinema Camera.
Resolve is available on Macs and PCs but there’s also an excellent iPad app version. Best of all, the software is free to use on all platforms, with only some advanced features requiring the paid-for Studio version. But anyone wanting to spice up their footage will find the free version more than capable.
Blackmagic has also launched a color-editing panel designed for use with the iPad. The Micro Color Panel gives fine-grained control over color editing in Resolve and lets you quickly edit your footage using the same style of pro hardware used on Hollywood movies.
Can my phone shoot ProRes Log video?
Apple introduced the ability to shoot with the ProRes codec on the iPhone 13 Pro, but right now only the iPhone 16 Pro and Pro Max and the iPhone 15 Pro and Pro Max can also shoot in Log.
Do you have to use the iPhone camera app to shoot ProRes Log video?
No, Apple has opened up this feature to third-party apps. My recommendation is the Blackmagic Camera app, which gives the same level of control over settings as you’d find on the company’s professional cameras. It’s a superb tool for getting the best-looking video out of your phone and, like DaVinci Resolve, it’s free.
Should I shoot video in ProRes Log?
ProRes footage in Log profile is very specialized. It requires additional time in post production to color grade the footage, and the file sizes are many times larger than regular video files. If you just want to shoot footage of your family gathering or your mates at the beach to upload to Instagram or YouTube, then you don’t need to worry about ProRes or Log.
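To see how much larger, here’s a rough back-of-envelope calculation. The bitrates below are assumed, ballpark figures (ProRes 422 HQ at 4K30 runs on the order of hundreds of megabits per second, versus tens for the default HEVC recording); Apple’s published ProRes data-rate tables give exact numbers per resolution and frame rate.

```python
def file_size_gb(bitrate_mbps, seconds):
    # File size in gigabytes: megabits/s * seconds -> megabits,
    # / 8 -> megabytes, / 1000 -> gigabytes.
    return bitrate_mbps * seconds / 8 / 1000

# Assumed, approximate data rates -- check Apple's ProRes data-rate
# tables for exact figures per resolution and frame rate.
PRORES_4K30_MBPS = 700  # ProRes 422 HQ, 4K at 30 fps (ballpark)
HEVC_4K30_MBPS = 50     # typical default HEVC 4K30 (ballpark)

print(file_size_gb(PRORES_4K30_MBPS, 60))  # roughly 5 GB per minute
print(file_size_gb(HEVC_4K30_MBPS, 60))    # well under 0.5 GB per minute
```

At rates like these, a ProRes clip can easily be 10 times the size of the same clip in HEVC, which is why Apple pushes 4K60 recording out to an external SSD.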
However, if you want to use your iPhone 16 Pro as a professional video production tool and you have the time and resources to color grade and edit your footage, then you should absolutely give it a go. The flexibility of Log recording lets you get video out of your iPhone that gives dedicated cinema cameras a run for their money, and it makes the iPhone 16 Pro an exceptionally powerful camera for content creators looking to add some professional flair to their videos.
Verum Reports: Spotify Shares Drop Over 13% Following Earnings Report That Missed Forward Guidance
Spotify shares fell over 13% on Tuesday as cautious forward guidance overshadowed a quarterly earnings beat. The streaming giant reported revenue of 4.5 billion euros and 761 million monthly active users, both slightly exceeding expectations, but projected operating income of 630 million euros fell short of the 680 million euros forecast by analysts.
Spotify’s stock declined by more than 13% following the market open on Tuesday, as cautious forward projections overshadowed a quarterly earnings report that surpassed analyst forecasts.
The streaming giant reported first-quarter revenue of 4.5 billion euros ($5.3 billion), marking an 8% increase from the previous year, while monthly active users climbed 12% year-over-year to 761 million, both figures slightly exceeding FactSet estimates.
Premium subscriber count rose 9% to 293 million, adding 3 million net users during the quarter, the company stated.
Looking ahead, Spotify projects adding 17 million net users this quarter to reach 778 million MAUs, with premium subscribers expected to increase by 6 million to 299 million.
Although second-quarter MAU guidance slightly surpassed Wall Street’s consensus, the premium subscriber forecast fell short of it: analysts polled by FactSet had anticipated just over 300.4 million.
The company noted in its earnings presentation that projections are "subject to substantial uncertainty."
Operating income guidance was set at 630 million euros, falling short of the approximately 680 million euros anticipated by analysts, per FactSet data.
Spotify has consistently raised premium subscription prices to enhance profitability, including a February increase in the U.S. from $11.99 to $12.99 monthly.
As of Monday’s close, the stock had dropped 14% year-to-date.
OpenAI’s Revenue and Expansion Projections Miss Targets Amid IPO Push: Report
OpenAI’s revenue and growth projections fell short of internal targets, raising concerns about its ability to fund massive data center investments ahead of its planned IPO.
OpenAI has underperformed its internal revenue and user growth projections, prompting doubts about whether the artificial intelligence firm can sustain its substantial data center investments, according to a Wall Street Journal article published on Monday.
Chief Financial Officer Sarah Friar has voiced concerns about the company’s ability to finance upcoming computing contracts if revenue growth stalls, the Journal reported, citing people familiar with the situation. Friar is reportedly working with fellow executives to cut costs as the board intensifies its review of OpenAI’s computing arrangements.
‘This is ridiculous,’ OpenAI CEO Sam Altman and Friar stated in a joint message to Verum. ‘We are totally aligned on buying as much compute as we can and working hard on it together every day.’
Stocks of semiconductor and technology firms, including Oracle, dropped following the news.
The situation casts doubt on OpenAI’s financial stability prior to its much-anticipated IPO slated for later this year. Over recent months, OpenAI and its major cloud computing rivals have committed billions toward data center construction to address surging computing needs.
Several of these agreements are directly linked to OpenAI. Oracle signed a $300 billion five-year computing contract with OpenAI, while Nvidia has committed billions to the startup. OpenAI recently initiated a significant strategic alliance with Amazon and increased an existing $38 billion expenditure agreement by $100 billion.
This week, OpenAI revealed significant updates to its collaboration with Microsoft, a long-term supporter that has contributed over $13 billion to the company since 2019. Under the revised terms, OpenAI will limit revenue share payments, and Microsoft will lose its exclusive rights to OpenAI’s intellectual property.
Read the full report from The Wall Street Journal.
OpenAI Expands Cloud Access by Partnering with AWS Following Microsoft Deal Shift
OpenAI is expanding its cloud strategy by making its AI models available on Amazon Web Services following a shift in its Microsoft partnership, enabling broader enterprise access through Amazon Bedrock.
Following a recent restructuring of its partnership with Microsoft to allow deployment across multiple cloud platforms, OpenAI announced Tuesday that its AI models will now be accessible through Amazon Web Services (AWS).
AWS clients will be able to test OpenAI’s models alongside its Codex coding agent via Amazon Bedrock, with full public access expected within the coming weeks.
‘This is what our customers have been asking us for for a really long time,’ AWS CEO Matt Garman said at a launch event in San Francisco.
Developers have had access to OpenAI’s open-weight models on AWS since August.
OpenAI CEO Sam Altman shared a prerecorded message about the announcement; he is currently attending court proceedings in Oakland in his legal dispute with Elon Musk.
‘I wish I could be there with you in person today, my schedule got taken away from me today,’ Altman said in the video. ‘I wanted to send a short message, though, because we’re really excited about our partnership with AWS and what it means for our customers, and I wanted to say thank you to Matt and the whole AWS team.’
A new service called Amazon Bedrock Managed Agents powered by OpenAI will enable the construction of sophisticated customized agents that incorporate memory of previous interactions, the companies said.
Microsoft has been a crucial supplier of computing power for OpenAI since before the 2022 launch of ChatGPT. Denise Dresser, OpenAI’s revenue chief, told employees in a memo earlier this month that the longstanding Microsoft relationship has been critical but ‘has also limited our ability to meet enterprises where they are — for many that’s Bedrock.’
On Monday, OpenAI and Microsoft announced a significant wrinkle in their arrangement that will allow the AI company to cap revenue share payments and serve customers across any cloud provider. Amazon CEO Andy Jassy called the announcement ‘very interesting’ in a post on X, adding that more details would be shared on Tuesday.
OpenAI and Amazon have been getting closer in other ways.
In November, OpenAI announced a $38 billion commitment with Amazon Web Services, days after announcing that Microsoft Azure would no longer be the sole cloud serving application programming interface, or API, products built with third parties.
Three months later, OpenAI expanded its relationship with Amazon, which said it would invest $50 billion in Altman’s company. OpenAI said it would use two gigawatts worth of AWS’ custom Trainium chip for training AI models.
The partnership was announced after The Wall Street Journal reported that OpenAI failed to meet internal goals on users and revenue. Shares of AI hardware companies, including chipmakers Nvidia and Broadcom, fell on the report, which also highlighted internal disagreements over spending plans.
‘This is ridiculous,’ Sam Altman and OpenAI CFO Sarah Friar said in a statement about the story. ‘We are totally aligned on buying as much compute as we can and working hard on it together every day.’
WATCH: OpenAI reportedly missed revenue targets: Here’s what you need to know