Technologies
iPhone 17 Cameras Might Get Improved Video Skills. As a Creator, I’m Excited
The iPhone 17 Pro could be great for YouTubers and content creators. Here’s what Apple needs to do.

The iPhone 16 Pro is already an extremely powerful tool for photographers and videographers alike, thanks to its stellar rear cameras and ProRes Log video support. However, Bloomberg’s Mark Gurman — an Apple analyst with a reliable track record — writes that for the iPhone 17 Pro line, Apple "will stress improvements to video recording" in a move to "get the vlogging community away from standalone cameras." As a YouTube creator and professional photographer myself, I’m intrigued.
Sadly, Gurman hasn’t offered any details on what these video improvements might be. And to be fair to Apple, it’s already leading the way with some of its video production capabilities. The combination of ProRes recording and Log color profiles on the last couple of iPhone Pro models has made them not just great video cameras for everyday vloggers, but powerful enough to be the primary cameras for Hollywood films. Samsung clearly took note of Apple’s video dominance in the creative space as it equipped the recent S25 Ultra with Log color, too.
Given the already top-end video skills of the iPhones, it’s difficult to know exactly what Apple might do to make its devices even more appealing to content creators. I produce videos for CNET and I operate a YouTube channel, so I spend a lot of my time shooting video and vlogging on a variety of equipment, from mirrorless cameras like my Canon R5 and Blackmagic Cinema Camera to more mobile options like the DJI Osmo Pocket 3. Yet I rarely use my iPhone 16 Pro as part of my production. So, why don’t I?
In all honesty, there’s no specific reason beyond that I feel I have my bases adequately covered by what’s already available. When I want cinematic production quality, I use my main cameras. When I want a lightweight mobile setup for photowalk vlogging, I use my Osmo. So I’m left wondering what Apple would need to do to make me leave my Osmo at home and head out to shoot my YouTube videos using just my phone. I do have a couple of thoughts.
First, it needs to make the main camera app easier to use with Bluetooth microphones. While the iPhone’s built-in microphones are decent enough in quiet environments, external mics can offer more professional sound quality with better wind resistance. They allow you to stand further away from your camera while capturing crystal clear sound.
While it’s possible to pair the DJI Mic 2 with the iPhone, I’ve only been able to get it to work in the Blackmagic Camera app, not in the iPhone’s default Camera app. It’s possible that Apple will try to push the AirPods Pro 2 as the better option for creators, but I don’t like wearing headphones when vlogging, so this isn’t a workaround I’d be happy with.
I also want to see Apple offer more editing options for its Log footage on the phone. Log video looks gray and low contrast by default because you’re expected to take that footage into editing software like Adobe Premiere or DaVinci Resolve and adjust the colors and contrast to suit — a process called color grading. But that adds a lot of time and effort.
If Apple wants its high-level video skills to appeal to fast-paced YouTubers and social media creators, adding color presets (often called LUTs) to the iPhone’s video editing workflow would be a welcome addition.
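For readers unfamiliar with the term, a LUT is just a lookup table that maps each input pixel value to a graded output value. This minimal Python sketch (with made-up curve values, purely to illustrate the concept rather than any actual Apple or camera-app implementation) shows how a tiny 1D LUT with linear interpolation can restore contrast to flat-looking log footage:

```python
def apply_lut(value, lut):
    """Map a normalized pixel value (0.0-1.0) through a 1D LUT,
    linearly interpolating between neighboring LUT entries."""
    n = len(lut) - 1
    pos = value * n
    i = min(int(pos), n - 1)
    frac = pos - i
    return lut[i] * (1 - frac) + lut[i + 1] * frac

# A toy S-curve LUT: crushes shadows and lifts highlights,
# adding back the contrast that log encoding deliberately removes.
s_curve = [0.0, 0.05, 0.5, 0.95, 1.0]

flat_log_pixels = [0.2, 0.5, 0.8]
graded = [round(apply_lut(p, s_curve), 3) for p in flat_log_pixels]
# graded is [0.04, 0.5, 0.96]: shadows pushed down, highlights up
```

A real LUT has hundreds of entries per channel (or a full 3D cube), but the principle is the same: one table lookup per pixel, which is why cameras and phones can apply them in real time.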
I’m definitely excited to see what Apple has in store for the iPhone 17’s cameras. As someone who spends a lot of time producing videos, I’m keen to see whether its new updates will be enough to tempt me away from my own tried-and-tested setup.
Google Takes Aim at Duolingo With AI Tools to Help You Learn New Languages
The tech giant is the latest company to adopt AI tools to teach foreign languages — but it isn’t the first.
Google is debuting three new AI experiments that are intended to help users learn foreign languages on the go. The tools utilize Google’s Gemini large language model to identify objects and situations in a user’s immediate environment and provide translations that could help users ask for help or spark a conversation.
If you want to give the new experiments a try, you can find them on the Google Labs webpage. Google experiments aren’t applications, which means you don’t have to download anything to get started. You can just click into the experiment you want to try and begin typing in your prompts.
Read more: Best AI Chatbots of 2025
In debuting these new features, Google is going head-to-head with other foreign language-learning services that are also focusing on AI tools. Duolingo’s CEO recently announced that the company "will be going AI-first," and OpenAI’s ChatGPT has the ability to begin new foreign-language conversations at any time upon request.
Tiny Lesson: Describe a situation
Google’s new Tiny Lesson tool lets users describe a situation they’re in to learn vocabulary and grammar for explaining a problem to locals. Based on that context, the tool suggests phrases that help users ask for help even if they haven’t learned ones tailored to their current issue.
Slang Hang: Casual talk
The Slang Hang tool promotes casual conversation over rigid sentence structure and textbook grammar, teaching users how to drop the formalities and adopt a more colloquial way of speaking a foreign language. Slang Hang simulates conversations between native speakers and lets users ask what any word or phrase in the exchange means. The AI model sometimes misidentifies or hallucinates words, so you’ll need to double-check them against another source when using this feature.
Word Cam: Detect items in photographs
The third and final new tool, Word Cam, uses Gemini to detect objects in photographs you take — providing you with translations of your surroundings in the foreign language you’re learning. This feature helps you describe the world around you, but Gemini may not accurately label every single object in a picture you take. It’s still worth double-checking the translations you’re given against another source while using Word Cam.
The language-learning experiments were created as a way to "inspire developers using Gemini for building different use cases and experiences," Google representative Maggie Shiels told CNET.
This particular set of experiments is meant to focus on using the multimodal LLM as a way to promote bite-sized lessons on the go.
Google’s new features aren’t launching for every language — at least, not yet. Tiny Lesson, Slang Hang and Word Cam currently support Arabic, Chinese, English, French, German, Greek, Hebrew, Hindi, Italian, Japanese, Korean, Portuguese, Russian, Spanish and Turkish.
Shiels said that Tiny Lesson, Slang Hang and Word Cam — like other Google Labs experiments — are not products and are not meant to be permanent features.
"This is a limited-time tool that will eventually sunset," she told CNET. "We hope that developers have fun playing around."
With ‘Hey Meta,’ Ray-Ban Wearers Will Unlock All-New AI Abilities — and Privacy Concerns
The Meta smart glasses from Ray-Ban will soon be able to hold conversations about exactly what you’re seeing or hearing.
As Google starts to revive its Google Glass concept, Meta is already a step ahead with new artificial intelligence functions coming to glasses this summer. The Ray-Ban smart glasses, in partnership with Meta, are getting several powerful AI updates for US and Canadian users.
Using the Meta View app on a connected smartphone, owners of Ray-Ban smart glasses will be able to say "Hey Meta, start live AI" to give Meta AI a live view of whatever they’re seeing through their glasses.
Similar to Google’s Gemini demo, users will be able to ask Meta AI conversational questions about what it sees and how it might solve problems. Meta provided the example of Meta AI giving possible substitutes for butter based on what it sees when you look in the pantry.
Even without live AI, you’ll be able to ask specific questions about objects that you’re looking at.
In addition to new seasonal looks, Ray-Ban’s smart glasses also will be able to use the "Hey Meta, start live translation" command to automatically translate speech in languages including English, French, Italian and Spanish. The glasses’ speakers will translate as other people talk, and you can hold up your phone so the other party can see a translated transcript too.
Meta AI and concerns about being filmed
When I asked Inna Tokarev Sela, CEO and founder of AI data company illumex, about privacy issues with smart glasses like these, she mentioned that in her own experience with Ray-Ban smart glasses, people usually reacted when they noticed the recording indicator light, which signals that the glasses are watching. That can make some people uneasy, whether they’re concerned about being filmed by a stranger or about what Meta may be doing with all the visual data it’s collecting.
"In the new models you can control the notification light, which could pose a privacy risk," Sela said. "But everyone films everyone all the time anyway at touristy landmarks, public events, etc. What I expect is that Meta will not divulge any information on anyone, unless they register and explicitly give their consent."
This could lead to other consent headaches too, depending on whether users are recording for other purposes. "For example, users should be able to opt in and choose the type of information to expose when they’re in someone’s frame — similar to LinkedIn," Sela said. "Of course, any recording resulting from the glasses should not be admissible in a court of law, as with any other kind of recording, without explicit permission."
Additional updates and rollout schedules
Along with the AI upgrades, Ray-Ban’s smart glasses will be able to post automatically on Instagram or send a message on Messenger with the right voice commands. New compatibility with music streaming services also will allow you to play songs through Amazon Music, Apple Music and Spotify on your glasses in lieu of earbuds.
Meta reports that the rollout of these new features will happen this spring and summer, along with object recognition updates for EU users arriving in late April and early May.
Meta and Ray-Ban didn’t immediately respond to a request for further comment.
Amazon Prime Day Is Coming Back in July, With Tariffs Looming Large
Amazon’s big annual sales event for Prime members could be dampened by price hikes on imported goods. Here’s what to look for.
Amazon Prime Day will return in July. Amazon on Tuesday announced the 2025 edition of the summer shopping event, which typically brings some of its best Amazon deals of the year.
The mega retailer isn’t yet announcing specific dates, according to Amazon spokesperson Alicia Hopkins, who responded via email to questions about the timing. The two-day sales event, which is exclusively for Prime members, took place July 16-17 last year.
Looming price hikes due to tariffs could impact how much savings shoppers can expect.
According to some reports, Amazon will start displaying the original prices of products alongside how much the Trump administration’s tariffs add to a product’s price, although the company denied this to Reuters.
The White House immediately denounced any such plan by Amazon.
"This is a hostile and political act by Amazon," Press Secretary Karoline Leavitt said during a White House press briefing on Tuesday. Amazon did not immediately respond to a follow-up request.
So what could tariffs mean for your Prime Day shopping?
How could tariffs affect Prime Day deals?
Prices on everything, including electronics, are expected to rise as a result of Trump’s sweeping tariffs, which he originally announced on April 2. He quickly followed with a 90-day pause for most of the tariffs, but left triple-digit tariffs in place for China and a 10% baseline tariff for goods imported from other countries.
The administration has since said that it’s in the process of making deals with many countries to ease tariffs, but no official announcements have been released yet. Some companies, including Apple, have taken steps to reduce the impact of tariffs on their products, including reportedly moving some manufacturing operations to India.
If the 90-day tariff pause expires before agreements can be reached, the paused tariffs would take effect in July — the same month as Amazon’s Prime Day event.
If retailers pass along the full cost of the tariffs, it could mean we’ll be paying double (or more) for products manufactured in other countries. Shoppers on bargain sites Temu and Shein have already seen prices skyrocket by as much as 377% ahead of the tariffs.
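To put rough numbers on "double (or more)," here is a back-of-the-envelope sketch with purely illustrative figures; the 145% rate below simply stands in for the triple-digit China tariffs described above, not an official Prime Day price:

```python
def price_with_tariff(import_cost, tariff_rate):
    """Shelf price if a retailer passes a tariff through in full:
    import cost plus the tariff, rounded to cents."""
    return round(import_cost * (1 + tariff_rate), 2)

# A hypothetical $100 imported gadget under the 10% baseline tariff
baseline = price_with_tariff(100, 0.10)   # 110.0
# The same gadget under an illustrative 145% triple-digit tariff:
# the price more than doubles
china = price_with_tariff(100, 1.45)      # 245.0
```

The point of the arithmetic is simple: once a tariff rate crosses 100%, full pass-through alone more than doubles a product’s pre-tariff price, before any retail markup.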
Since launching the original Prime Day in 2015, the retailer has expanded its roster of sales events, adding a Big Spring Sale in March and Prime Big Deal Days in October.