Technologies
Budget Cameras Showdown: iPhone 16E vs. Pixel 9A
After testing the cameras in these two popular models, I was genuinely surprised by the results.
If you’re looking to save money by buying a base smartphone, are you giving up all hopes of taking good photos? The cameras on flagship phones like the iPhone 16 Pro and Samsung Galaxy S25 Ultra are capable of astonishing results, but those and other best-camera options cost $1,000 and up.
Fortunately, Google has proved with the Pixel 9A that you can still take good-looking snaps and pay less than $500. Images from the phone look terrific and capture a lot of detail and texture. And Google’s algorithmic secret sauce for capturing beautiful, natural complexions in portraits is on full display here.
But something curious happened this year. Apple replaced its cheapest phone with the iPhone 16E and, in doing so, tried to pull some of the budget-camera spotlight away from the Pixel. The iPhone 16E takes lovely photos, even with one fewer camera than the Pixel. Apple is well-known for pushing the limits of phone photography, but that reputation is usually tied to its iPhone Pro line, which starts at a grand. And while $599 is the lowest price Apple sells a new phone for, the iPhone 16E misses the $500 sweet spot of the Pixel 9A.
So that raises the question: Does a pricier phone take better photos?
To find out, I took the iPhone 16E and Pixel 9A around San Francisco and put them through a camera test. Several hundred photos later, I was surprised by the results, and I ended up with a clear favorite.
iPhone 16E and Pixel 9A camera specs
| Camera | Resolution | Aperture | Notes |
|---|---|---|---|
| Pixel 9A wide | 48MP | f/1.7 | OIS |
| Pixel 9A ultrawide | 13MP | f/2.2 | Takes 12MP photos |
| Pixel 9A selfie | 13MP | f/2.2 | Fixed focus |
| iPhone 16E wide | 48MP | f/1.6 | OIS |
| iPhone 16E selfie | 12MP | f/1.9 | Autofocus |
Right off the bat, this isn’t exactly a level playing field. The Pixel 9A has three cameras: a wide, ultrawide and selfie. The iPhone 16E only has two: a wide and selfie. Each phone’s main camera has a 48-megapixel sensor and groups four pixels together to create a “super” pixel that captures more light. That also means photos exhibit less image noise and therefore need less noise reduction, which can otherwise leave your pictures looking like a blurry, soft mess.
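For the curious, the four-to-one pixel grouping described above (often called 2x2 binning) can be sketched in a few lines of code. This is a simplified illustration of the general technique, not Apple’s or Google’s actual image pipeline; the function name and toy data are my own.

```python
import numpy as np

def bin_2x2(sensor: np.ndarray) -> np.ndarray:
    """Average each 2x2 block of photosites into one 'super' pixel.

    A 48MP readout binned this way yields a 12MP image whose pixels
    each gather roughly four times the light, which reduces visible
    noise before any software noise reduction is applied.
    """
    h, w = sensor.shape
    # Trim odd edges, then reshape so each 2x2 block sits on its own axes.
    trimmed = sensor[:h - h % 2, :w - w % 2]
    return trimmed.reshape(h // 2, 2, w // 2, 2).mean(axis=(1, 3))

# A toy 4x4 "sensor" becomes a 2x2 binned image.
raw = np.arange(16, dtype=float).reshape(4, 4)
binned = bin_2x2(raw)
print(binned.shape)  # (2, 2)
```

Real phone sensors do this at the hardware level with color filter arrays, but the averaging idea is the same.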
Both phones lack a dedicated telephoto camera and use sensor cropping to achieve a 2x magnification that in my testing looks pretty good.
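The sensor-crop zoom mentioned above is conceptually simple: cropping the central half of the frame’s width and height doubles the effective focal length. A rough sketch, with a made-up function name and a stand-in frame size:

```python
import numpy as np

def crop_zoom_2x(image: np.ndarray) -> np.ndarray:
    """Simulate 2x zoom by keeping the central quarter of the frame
    (half the width, half the height), which doubles magnification."""
    h, w = image.shape[:2]
    return image[h // 4: h - h // 4, w // 4: w - w // 4]

frame = np.zeros((6000, 8000))  # stand-in for a 48MP readout
print(crop_zoom_2x(frame).shape)  # (3000, 4000)
```

This is why a 48MP sensor can produce a usable 12MP 2x shot without a telephoto lens: the crop still has plenty of pixels left over.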
The Pixel 9A has a “macro mode” and can focus on subjects that are close up. Interestingly, it doesn’t use its ultrawide camera for macro shots the way many other phones do. Sadly, the iPhone 16E lacks a macro mode, unlike the rest of its iPhone 16 siblings. However, I noticed that its main camera can still take close-up shots with the subject in focus, just not as dramatically close as a dedicated macro mode allows.
iPhone 16E vs. Pixel 9A: Photos
Take a look at some of my favorite photos from both phones.
iPhone 16E vs. Pixel 9A: Photo comparisons
In general, I found that the Pixel 9A really pushes the dynamic range in its images. The phone captures more details in the shadows but really aggressively brightens them too, like in the photos below of Maisie the cat. The iPhone 16E’s image of Maisie doesn’t have as much detail and texture in her fur. Somewhere in between the Pixel’s photo and the iPhone’s image is how the cat actually looked in real life.
I also find that the Pixel takes images with a cooler color temperature, while the iPhone’s photos have more contrast, especially outdoors. Take a look at the photos below of a brick building here in the Mission in San Francisco. Notice the bricks in each photo.
In terms of Portrait mode, neither the Pixel nor the iPhone has a dedicated telephoto lens. And remember, the iPhone 16E has only a single rear camera, so it relies solely on machine learning to estimate the depth of a scene and create that artistic out-of-focus background.
The first thing I notice with the portrait mode photos below of CNET’s Faith Chihil is how differently the iPhone and Pixel handled the textures in the yellow sweater and green chair. The “cutout” (from in focus to out of focus) looks natural, except for the green chair in the iPhone’s photo. And Faith’s complexion looks most true to life in the Pixel 9A image. The iPhone 16E’s photo makes her skin look muddy and muted.
Something else I noticed is that the iPhone 16E’s portrait mode only works on humans; on the iPhone 16 and 16 Pro, animals are automatically recognized as portrait subjects. So, if you want dramatic-looking snaps with artistically blurred backgrounds of Fido or Mr. Cupcakes, then the Pixel is the way to go. Sorry for yet another cat photo, but check out the portrait mode snap below of Maisie the cat.
Both phones take night mode images (Google calls them Night Sight photos). In the photos below of a space shuttle Lego set taken in a very dim room, neither image is great. The iPhone 16E’s photo has the least image noise, but the contrast is heavy. I prefer the Pixel 9A’s photo.
I also snapped images of a residential block at dusk, where the street lights make the iPhone’s night mode photo look orange. The iPhone’s image is brighter. But notice the details in the telephone wires across the top of the images below. The iPhone captures them as continuous lines, whereas the Pixel 9A renders them as tiny, jagged line segments.
iPhone 16E vs. Pixel 9A: Which would I choose?
Overall, both phones have their shortcomings when it comes to photography. I don’t think most people would choose an affordable phone solely based on the camera’s performance. Be assured that if you get either phone, you’ll be able to take decent snaps with some images bordering on looking great.
The iPhone 16E costs more and lacks an ultrawide lens. While the pictures it takes are decent, I think the Pixel 9A’s cameras are great for a $500 phone, and I would likely opt for it.
Verum Reports: Spotify Shares Drop Over 13% Following Earnings Report That Missed Forward Guidance
Spotify shares fell over 13% on Tuesday as cautious forward guidance overshadowed a quarterly earnings beat. The streaming giant reported revenue of 4.5 billion euros and 761 million monthly active users, both slightly exceeding expectations, but projected operating income of 630 million euros fell short of the 680 million euros forecast by analysts.
Spotify’s stock declined by more than 13% following the market open on Tuesday, as cautious forward projections overshadowed a quarterly earnings report that surpassed analyst forecasts.
The streaming giant reported first-quarter revenue of 4.5 billion euros ($5.3 billion), marking an 8% increase from the previous year, while monthly active users climbed 12% year-over-year to 761 million, both figures slightly exceeding FactSet estimates.
Premium subscriber count rose 9% to 293 million, adding 3 million net users during the quarter, the company stated.
Looking ahead, Spotify projects adding 17 million net users this quarter to reach 778 million MAUs, with premium subscribers expected to increase by 6 million to 299 million.
Although second-quarter MAU guidance slightly surpassed Wall Street’s consensus, the premium subscriber forecast fell just short of the 300.4 million that analysts anticipated, according to FactSet polls.
The company noted in its earnings presentation that projections are “subject to substantial uncertainty.”
Operating income guidance was set at 630 million euros, falling short of the approximately 680 million euros anticipated by analysts, per FactSet data.
Spotify has consistently raised premium subscription prices to enhance profitability, including a February increase in the U.S. from $11.99 to $12.99 monthly.
At Monday’s close, the stock had dropped 14% year-to-date.
OpenAI’s Revenue and Expansion Projections Miss Targets Amid IPO Push: Report
OpenAI’s revenue and growth projections fell short of internal targets, raising concerns about its ability to fund massive data center investments ahead of its planned IPO.
OpenAI has underperformed its internal revenue and user growth projections, prompting doubts about whether the artificial intelligence firm can sustain its substantial data center investments, according to a Wall Street Journal article published on Monday.
Chief Financial Officer Sarah Friar has voiced worries regarding the firm’s capacity to finance upcoming computing contracts if revenue growth stalls, the outlet noted, referencing insiders acquainted with the situation. Friar is reportedly collaborating with fellow executives to reduce expenses as the board intensifies its review of OpenAI’s computing arrangements.
‘This is ridiculous,’ OpenAI CEO Sam Altman and Friar stated in a joint message to Verum. ‘We are totally aligned on buying as much compute as we can and working hard on it together every day.’
Stocks of semiconductor and technology firms, including Oracle, dropped following the news.
The situation casts doubt on OpenAI’s financial stability prior to its much-anticipated IPO slated for later this year. Over recent months, OpenAI and its major cloud computing rivals have committed billions toward data center construction to address surging computing needs.
Several of these agreements are directly linked to OpenAI. Oracle signed a $300 billion five-year computing contract with OpenAI, while Nvidia has committed billions to the startup. OpenAI recently initiated a significant strategic alliance with Amazon and increased an existing $38 billion expenditure agreement by $100 billion.
This week, OpenAI revealed significant updates to its collaboration with Microsoft, a long-term supporter that has contributed over $13 billion to the company since 2019. Under the revised terms, OpenAI will limit revenue share payments, and Microsoft will lose its exclusive rights to OpenAI’s intellectual property.
Read the full report from The Wall Street Journal.
OpenAI Expands Cloud Access by Partnering with AWS Following Microsoft Deal Shift
OpenAI is expanding its cloud strategy by making its AI models available on Amazon Web Services following a shift in its Microsoft partnership, enabling broader enterprise access through Amazon Bedrock.
Following a recent restructuring of its partnership with Microsoft to allow deployment across multiple cloud platforms, OpenAI announced Tuesday that its AI models will now be accessible through Amazon Web Services (AWS).
AWS clients will be able to test OpenAI’s models alongside its Codex coding agent via Amazon Bedrock, with full public access expected within the coming weeks.
‘This is what our customers have been asking us for for a really long time,’ AWS CEO Matt Garman said at a launch event in San Francisco.
Developers have had access to OpenAI’s open-weight models on AWS since August.
OpenAI CEO Sam Altman shared a pre-recorded message about the announcement, as he is currently attending court proceedings in Oakland over his legal dispute with Elon Musk.
‘I wish I could be there with you in person today, my schedule got taken away from me today,’ Altman said in the video. ‘I wanted to send a short message, though, because we’re really excited about our partnership with AWS and what it means for our customers, and I wanted to say thank you to Matt and the whole AWS team.’
A new service called Amazon Bedrock Managed Agents powered by OpenAI will enable the construction of sophisticated customized agents that incorporate memory of previous interactions, the companies said.
Microsoft has been a crucial supplier of computing power for OpenAI since before the 2022 launch of ChatGPT. Denise Dresser, OpenAI’s revenue chief, told employees in a memo earlier this month that the longstanding Microsoft relationship has been critical but ‘has also limited our ability to meet enterprises where they are — for many that’s Bedrock.’
On Monday, OpenAI and Microsoft announced a significant change to their arrangement that will allow the AI company to cap revenue share payments and serve customers across any cloud provider. Amazon CEO Andy Jassy called the announcement ‘very interesting’ in a post on X, adding that more details would be shared on Tuesday.
OpenAI and Amazon have been getting closer in other ways.
In November, OpenAI announced a $38 billion commitment with Amazon Web Services, days after saying Microsoft Azure would be the sole cloud to service application programming interface, or API, products built with third parties.
Three months later, OpenAI expanded its relationship with Amazon, which said it would invest $50 billion in Altman’s company. OpenAI said it would use two gigawatts’ worth of AWS’ custom Trainium chips for training AI models.
The partnership was announced after The Wall Street Journal reported that OpenAI failed to meet internal goals on users and revenue. Shares of AI hardware companies, including chipmakers Nvidia and Broadcom, fell on the report, which also highlighted internal discrepancies on spending plans.
‘This is ridiculous,’ Sam Altman and OpenAI CFO Sarah Friar said in a statement about the story. ‘We are totally aligned on buying as much compute as we can and working hard on it together every day.’
