
Technologies

A Decade Later, Your Phone Still Does Not Replace a Pro Camera

Commentary: Phone cameras are getting better and better, but they still aren’t much closer to replacing dSLRs and professional mirrorless cameras.

On a chilly Saturday afternoon in San Francisco, I was under a patio heater with a group of friends when someone said we should get a group photo. What happened next was surprising. Instead of using his phone to take a commemorative photo, my friend pulled out a point-and-shoot camera. I thought to myself, “Wait. The phone killed the point-and-shoot camera years ago. Why didn’t he just use his iPhone?” Granted, it was the high-end Sony RX100 VII, which is an excellent compact camera and one of the few point-and-shoots still made today.

Phones from Apple, Samsung and Google include some of the best phone cameras you can buy, like the iPhone 14 Pro, Google Pixel 7 Pro and Samsung Galaxy S22 Ultra. But for professional photographers and filmmakers, that’s not always enough. The holy grail is being able to have a truly large image sensor like the one you’d find in a high-end mirrorless camera and a lens mount that could attach to your phone. Sounds simple enough, right? Wrong.

Everyone from Samsung to Panasonic, Sony and Motorola has tried to make this dream a reality in some way. Now Xiaomi, the world’s third largest phone-maker (behind Samsung and Apple), is the latest to rekindle the quest for the phone camera holy grail. The company has a new prototype phone that lets you mount a Leica M lens on it.

But this is just a concept. If you’re wondering whether phones will ever make dedicated pro cameras obsolete the way they did with point-and-shoots, the answer is a resounding no. The past decade has shown us why.

Why phone cameras are limited

First, it’s important to understand how your phone’s camera works. Behind the lens is a tiny image sensor, smaller than a single Lego brick. Sometimes there are headlines that Sony, Sharp or, years ago, Panasonic put a 1-inch sensor in a phone. Sadly, that name doesn’t refer to the sensor’s actual dimensions: a 1-inch image sensor is really about 0.6 inch diagonally or, for the sake of approximation, two Lego bricks. Like the hoverboard, the 1-inch sensor doesn’t live up to its name, but it’s still one of the largest to be put into a phone.

Dedicated cameras have sensors that are closer to 12 Lego bricks (positioned side by side in a four-by-three rectangle) and most come with a lens mount that lets you change lenses. The “holy grail” is to put one of these larger sensors into a phone.

But bigger sensors are more expensive than the little ones used in your iPhone and there are space considerations. A lens for a phone camera sensor is relatively small. But lenses for a full-frame sensor are larger and require more space between the back of the lens and the sensor. Phones simply lack this room without becoming significantly thicker.

Every year we see Apple, Samsung and the like take small steps toward improving phone photography. But phone camera hardware has largely hit a ceiling. Instead of radical camera improvements, we get modest upgrades. This could be a sign that companies have homed in on what consumers want. But it could also be a consequence of the space and size limitations of tiny sensors.

Instead, smartphone-makers use computational photography to overcome a tiny sensor’s limitations: reduced dynamic range and light sensitivity. Google, Apple and Samsung all use machine learning algorithms and artificial intelligence to improve the photos you take with your phone.

But hardware is also important. Earlier this month, Tim Cook, Apple’s CEO, shared a photo on Twitter of a visit to Sony in Japan. While it’s been widely assumed that Apple uses Sony’s image sensors in the iPhone, this is the first time Cook formally acknowledged it. And as CNET readers already know, Sony phones like the Xperia 1 IV have some of the best camera hardware found on any phone sold today.

The Xperia 1 IV won a CNET Innovation award for its telephoto camera, which has miniature lens elements that actually move back and forth, like a real telephoto lens. The result is that you can use the lens to zoom without cropping digitally, which degrades the image. Can you imagine an iPhone 15 Pro with this lens?

The Xiaomi 12S Ultra Leica lens prototype is so 2013

That brings us to Xiaomi, which is the latest company attempting to merge pro-level cameras with your phone. In November, Xiaomi released a video of a phone camera concept that shows a Leica lens mounted on a 12S Ultra phone. This prototype is like a concept car: No matter how cool it is, you’ll never get to drive it.

The Chinese company took the 12S Ultra and added a removable ring around its circular camera bump. The ring covers a thread around the outside edge of the camera bump onto which you can attach an adapter that lets you mount Leica M lenses. The adapter’s thickness is the same distance that a Leica M lens needs to be positioned away from the sensor in order to focus.

A few caveats: The Xiaomi 12S Ultra concept uses an exposed 1-inch sensor, which, as I mentioned earlier, isn’t actually 1 inch. Next, this is purely a concept. If something like this actually went on sale, it would cost thousands of dollars. By comparison, a nice dedicated camera like the Fujifilm X100V, which has a much bigger sensor, costs $1,399.

Xiaomi isn’t the first phone-maker to try this. In 2013, Sony took an image sensor and put it on the back of a lens that has a grip to attach to the back of a phone. The idea is to use your phone’s screen as the viewfinder for the camera system, which you can control through an app. Essentially you bypass your phone’s cameras.

Sony made several different versions of this “lens with a grip” and used sensors that were just a bit bigger than those found in phone cameras. Sony also made the QX-1 camera, which had an APS-C-sized sensor that in our Lego approximation is about six bricks positioned side by side in a three-by-two rectangle. That’s not as large as a full-frame sensor, but it’s vastly bigger than your phone’s image sensors.

The Sony QX-1 has a Sony E-mount, meaning you can use various E-mount lenses or adapters for Canon or Nikon lenses. Because the QX-1 is controlled wirelessly from your phone, you could either attach it to your phone or put it in different places to take photos remotely.

The QX-1 came out in 2014 and cost $350. Imagine having something like this today. I would definitely buy a 2022 version if Sony made it, but sadly the QX-1 was discontinued a few years after it went on sale. That’s around the time that Red, the company that makes cinema cameras used to film shows and movies like The Hobbit, The Witcher, Midsommar and The Boys, made a phone called the Red Hydrogen One.

Despite being a phone made by one of the best camera companies in the world, the $1,300 Red Hydrogen One’s cameras were on par with those from a $700 Android phone. The back of the phone had pogo pins designed to attach different modules (like Moto Mods), including a “cinema camera module” that housed a large image sensor and a lens mount, according to patent drawings. The idea is that you would use a Hydrogen One and the cinema mod to turn the phone into a mini-Red cinema camera.

Well, that never happened.

The Red Hydrogen One was discontinued and now shows up as a phone prop in films like F9, on the dashboard of Dominic Toretto’s car, or in the hands of Leonardo DiCaprio in Don’t Look Up.

2023 will show that pro cameras won’t be killed off by our phones

There aren’t any rumors that Apple is making an iPhone with a camera lens mount, nor are there murmurs of a Google mirrorless camera. But if Xiaomi made a prototype of a phone with a professional lens mount, you have to imagine that somewhere in the basement of Apple Park sits an old concept camera that runs an iOS-like interface, is powered by the iPhone’s A-series chip and able to use some of the same computational photography processing. Or at least that’s what I’d like to believe.

How amazing would photos look from a pro-level dedicated camera that uses the same processing tricks that Apple or Google implement on their phones? And how nice would it be to have a phone-like OS to share those photos and videos to Instagram or TikTok?

Turns out, Samsung tried bringing an Android phone’s interface to a camera in 2012. Noticing a theme here? Most of these holy grail phone camera concepts were tried 10 years ago. A few of these, like the Sony QX-1, were truly ahead of their time.

I don’t think Apple will ever release a standalone iOS-powered camera or make an iPhone with a Leica lens mount. The truth is that over the past decade, cameras have gotten smaller. The bulky dSLRs that signified professional cameras for years are quickly heading into the sunset. Mirrorless cameras have risen in popularity. They tend to be smaller, since they don’t need the space for a dSLR mirror box.

If there is a takeaway from all of this, it’s just a reminder of how good the cameras on our phones have gotten in that time. Even if it feels like they’ve plateaued, they’re dependable for most everyday tasks. But they won’t be replacing professional cameras anytime soon.

If you want to step up to a professional camera, find one like the Fujifilm X100V or Sony A7C, which pack a large image sensor and a sharp lens into a body that can fit in a coat pocket. And next time I’m at a dinner party with friends, I won’t act so shocked when someone wants to take a picture with a camera instead of a phone.

Read more: Pixel 7 Pro Actually Challenges My $10,000 DSLR Camera Setup


Today’s NYT Mini Crossword Answers for Wednesday, Aug. 20

Here are the answers for The New York Times Mini Crossword for Aug. 20.

Looking for the most recent Mini Crossword answer? Click here for today’s Mini Crossword hints, as well as our daily answers and hints for The New York Times Wordle, Strands, Connections and Connections: Sports Edition puzzles.


Today’s NYT Mini Crossword has a few challenging clues (4-Down threw me off), but it’s mostly OK. Need some help with today’s Mini Crossword? Read on. And if you could use some hints and guidance for daily solving, check out our Mini Crossword tips.

If you’re looking for today’s Wordle, Connections, Connections: Sports Edition and Strands answers, you can visit CNET’s NYT puzzle hints page.

Read more: Tips and Tricks for Solving The New York Times Mini Crossword

Let’s get to those Mini Crossword clues and answers.

Mini across clues and answers

1A clue: Something worn by an infant or marathon runner
Answer: BIB

4A clue: Diversion on a long flight
Answer: MOVIE

6A clue: Phobos and Deimos, for Mars
Answer: MOONS

7A clue: Join highway traffic
Answer: MERGE

8A clue: Coloring for a camp shirt
Answer: DYE

Mini down clues and answers

1D clue: Loudly voiced one’s disapproval
Answer: BOOED

2D clue: Material in walrus tusks
Answer: IVORY

3D clue: Experience four seasons in one day, say?
Answer: BINGE

4D clue: «Delicious!»
Answer: MMM

5D clue: Opposite of WNW
Answer: ESE



See Six Planets Line Up in the Upcoming Planet Parade Tonight

Mark your calendar so you can catch Mercury, Venus, Jupiter, Saturn, Neptune and Uranus in the sky at the same time.

Fresh off the excitement of the Perseids meteor shower is a chance to see six planets lined up in the sky at once. These events, colloquially known as planet parades, only occur about once or twice a year, with the most recent one in February showing off all seven planets in our solar system at once. The next one will feature six of our closest celestial neighbors, and the event starts on Tuesday. 

The six planets sharing the sky will be Mercury, Venus, Jupiter, Saturn, Neptune and Uranus. Mars will technically be there at the beginning of the night, but it dips below the horizon right after sunset, so it won’t be visible when all of the others are. Of those, Mercury, Venus and Jupiter will be visible to the naked eye, while the others will require high-powered binoculars or, preferably, a telescope. 

Even though they’re spread out across the eastern and southern skies, the planets cluster into pairs and groups, making many of them pretty easy to find if you know what to look for. From east to west, here’s where each one will be.

  • Mercury — Eastern sky near the Cancer constellation. It’ll pop over the horizon just before sunrise, so you’ll have limited time to view it before the sun comes up and obscures it.
  • Venus — At the lower tip of the Gemini constellation in the eastern sky, a couple of hours before sunrise. 
  • Jupiter — Will be near Venus, also in the Gemini constellation. It rises about an hour before Venus does. 
  • Uranus — Will be near the upper tip of Taurus, rising after midnight. This one will require some magnification. If you see the Pleiades, a cluster of stars at the upper tip of Taurus, you’ve gone too far upward.
  • Saturn and Neptune — These two are right next to each other and will be sitting between the Pisces and Cetus constellations in the southern skies. Neptune will be closer to Pisces while Saturn will be closer to Cetus. 

Since it takes a long time for planets to move through the night sky, Aug. 20 is the starting point, and it’ll run through the rest of the month. Once September hits, Mercury will be too close to the sun, which will obscure it. From that point, there will be a five-planet parade for a while until Venus sinks below the horizon in early October. So, in all, you’ll have a chance to see at least five planets for over a month. 

Will the planet parade be visible from my region?

Yes. We double-checked Stellarium’s sky map from a variety of locations across the country, and everything above will be applicable everywhere in the continental US. Per Starwalk, the parade will also be visible in other parts of the world after the following dates for about the same amount of time (one to two weeks).

  • Abu Dhabi — Aug. 9
  • Athens, Beijing, Berlin, Tokyo and London — Aug. 10
  • Mumbai and Hong Kong — Aug. 11
  • Reykjavik, São Paulo and Sydney — Aug. 12

The planets will move based on the date, though. The sky positions described above are where the planets will be around Aug. 20; if you’re looking a week or so later, they’ll be in the same general area but will shift to a slightly different part of the sky.

Will I need any special equipment?

Yes. Neptune and Uranus, especially, will require some sort of magnification to see. We recommend a telescope, but high-powered binoculars may work if the sky is dark enough. Saturn is also difficult to see without magnification, so you’ll want it for that too. Jupiter, Venus and Mercury should be visible with the naked eye.

We also recommend taking a trip out to the country, as light pollution from suburbs and cities can make it even more difficult to see Neptune and Uranus. The moon will be out as well, which may make Venus, Jupiter and Mercury harder to see. Other factors like weather may also make it more difficult to see all of them. If you’re lucky, you may see a few shooting stars at the tail end of the Perseids as well.



Grammarly Pushes Beyond Proofreading With AI-Powered Writing Guidance

Grammarly dropped agents to spot plagiarism, cite sources and maybe even boost your GPA.

Grammarly is expanding beyond its grammar-checking roots. The company has announced the launch of several specialized AI “agents” and a new writing tool called Grammarly Docs, designed to help students and professionals with everything from drafting essays to polishing workplace emails.

It’s another example of generative AI expanding beyond general-purpose chatbots like ChatGPT and Gemini into more specialized domains. Other examples of gen AI in educational circles include Google’s NotebookLM and OpenAI’s new study mode for ChatGPT.

AI agents are digital helpers that go beyond traditional chatbots to understand context and assist in reaching your goals. Grammarly’s AI agents assist by offering feedback, predicting reactions, finding sources and more to increase efficiency in workflows. 

Read also: Grammarly AI: This Free AI Tool Will Easily Fix Your Grammar

What’s available now for Grammarly AI

The update introduces nine agents that move Grammarly into a more collaborative role. Instead of just correcting grammar or suggesting phrasing, the agents are intended to actively work alongside users. One predicts how a professor or manager might respond to a draft. Another offers an estimated grade based on an uploaded rubric. Others handle citation generation, proofreading, paraphrasing, plagiarism checks and AI detection. The tools are built directly into Docs, a “distraction-free” writing environment where all the agents can be summoned in context, according to the company.

As students head back to classrooms and colleges, Grammarly is looking to position itself as a study companion and writing coach rather than merely a browser extension. The company cites research showing that while only a small share of students feel confident using AI in professional settings (18%), most employers expect AI literacy from job candidates. By emphasizing skill-building and responsible use, Grammarly says it wants to bridge that gap rather than simply automate assignments.

“The launch of our new agents and AI writing surface marks a turning point in how we build products that anticipate user needs,” Luke Behnke, Grammarly’s vice president of product management, said in the company’s press release. “We’re moving beyond simple suggestions to intelligent agents that understand context and actively help users achieve their communication goals.”

For professionals, Grammarly is marketing the tools as a way to tailor communication for different audiences. The Reader Reactions agent, for example, can highlight whether an email comes across as too vague or too blunt. And the Expert Review tool provides industry-specific feedback without requiring specialized prompts.

The launch also marks the debut of Docs as a standalone writing hub. Until now, Grammarly has functioned mostly as a browser extension layered on top of other apps, like Chrome or Google Docs. Grammarly Docs signals a push to keep users inside the platform’s own environment, though the company says it will expand agent functionality to the more than half a million apps and sites where its tools already appear.

The new features are rolling out immediately for free and premium subscribers, though plagiarism and AI detection remain locked behind the paid plan. Enterprise and education customers will also gain access later this year.

Early reactions to Grammarly’s AI agents 

Early reactions suggest strong interest from students and educators alike as the company shifts from a grammar checker to a productivity platform. Educators have noted the potential benefits and risks of tools like the AI Grader. Some users on social media welcomed the update as a way to cut through the anxiety of essay writing, while others questioned whether it might make students too dependent on machine feedback.

The launch comes just months after Grammarly raised $1 billion to fuel its AI pivot and acquired the email startup Superhuman. Together, those moves point to an ambitious strategy for the company: one that seeks to transform Grammarly from a background utility into a full-fledged productivity suite powered by AI. 



Copyright © Verum World Media