

No, the Viral iPhone Fold Video Isn’t Real. How We Know It’s Fake

Whether it’s AI or a clever 3D-printed mock-up, that’s not a real foldable iPhone going around the internet.

I know we’re all excited for the upcoming iPhone Fold, but be wary of fake leaks — like the supposed unboxing video that’s been making the rounds online.

Upcoming phones will always be the subject of rumors and leaks, and no device is more hyped than the foldable that Apple has purportedly been working on for years. Lots of that early info points toward a release later this year during the usual September iPhone release window, which makes the lead-up fertile territory for falsified leaks like the aforementioned video. 

Unfortunately, with the advance of generative AI tools that fabricate videos based on text prompts and other inputs, it’s easier than ever to fake your way to internet fame. Nowadays, videos churned out by gen AI tools have the correct number of fingers on hands, better lighting and far fewer indicators that they’re inauthentic. 

But there are still some tells that you’re not seeing the real deal — both in the video and when it’s released.

First, let’s dissect the video. A person in a gray long-sleeved shirt or sweatshirt rotates a box labeled “iPhone Fold” and pulls it open. On the first watch, a lot of signature Apple elements are present. The product is tucked inside snug packaging and presented screen-side-out to the opener, and there’s both a charging cord and supplementary materials tucked underneath. It all looks authentic enough — at least believably not generated by AI.

But AI or not, a few details offer strong evidence that this isn’t an actual Apple device. When the package is opened, there’s a peel-off protector on the inner screen but not the outer one. The multicolored insert claims the device is IP68 dust- and water-resistant, which is rare for foldables. Only a handful of foldables, including the Google Pixel 10 Pro Fold and Honor Magic V6, carry water-resistance ratings.

The device itself is suspect, and if it isn’t AI-generated, it’s likely 3D-printed. The cream-colored back makes an odd sound when scratched (not the sound glass or ceramic makes), and the device’s halves don’t fold neatly against each other — another thing the design-obsessed Apple likely wouldn’t allow. What’s more, when it’s fully unfolded, the back of the supposed foldable has a big gap between the two halves over the hinge, a problem other phone makers have already solved in their flexible-screen devices.

There’s skepticism around its design, too. Yes, Apple’s patents point toward a wider style of foldable similar to the first Google Pixel Fold, but the supposed iPhone Fold in the video is so squat that its internal screen would have bizarre proportions, not tall enough to match the aspect ratio of, say, an iPad.

iPhone Fold may or may not be the device’s final name; rumors have disagreed for years on its product designation, with the most recent suggesting it could be called the iPhone Ultra.

Since we don’t see it turn on, there’s no indication of how its software is laid out — which form of iOS or even iPadOS it might use. That makes this short, squat design even more suspect.

And then there are the factors outside the video. Apple leaks happen, but we’ve only had a few pre-release leaks, such as CAD files, official renders or cases, that agree on a design — and yet this is supposedly the iPhone Fold’s final form, which looks somewhat, but not completely, like a recent CAD render.

To the video’s credit, the fact that it takes this many words to pick apart its authenticity speaks to its plausibility. There’s a lot of commitment to Apple staples, from the product packaging to the theorized final design of the foldable itself. If nothing else, it’s a functional guess at what the supposed iPhone Fold might look like, and how it might look coming out of the box.

We’ll know in September at the earliest if Apple chooses to release its foldable in that window — and I’m sure we’ll see plenty of other leaks and rumors on the device before then.


Watch a Robot Stuff Cash Into a Wallet Just Like You Do

Generalist AI’s Gen-1 model is all about “teaching robots physical common sense.”

In 2026, we’re seeing robots progress by leaps and bounds, with the kind of markedly improved dexterity long needed in the quest for truly useful household helpers. Now a new AI model has arrived to power robots through activities including folding laundry, constructing boxes, fixing other robots and even filling wallets with flimsy paper money.

Earlier this month, California-based company Generalist AI released Gen-1, a new physical AI model that makes robots capable of performing all of these tasks (and more) successfully. It’s a big step forward for robots designed for the real world and built on intelligence drawn from the real world, Pete Florence, co-founder and CEO of Generalist AI, told me.

In most of the example videos published by the company, Gen-1 is seen running on a pair of robotic arms, but that’s not all it’s built for. “Gen-1 is designed to be the brain of any robot, meaning the same model can run on a humanoid, an industrial arm or other robotic systems,” said Florence.

Already, this has proved to be a breakthrough year for general-purpose humanoid robots, with companies including Boston Dynamics and Honor unveiling cutting-edge bots capable of uncannily humanlike movements. The market for robots is expected to explode, with one estimate from Morgan Stanley predicting growth to a $5 trillion market by 2050. Predictions see robots coming for industry, retail, hospitality and care environments before eventually landing in our homes. To get us there, we need to see further advances in AI.

Training robots to live alongside humans

Over the past few years we’ve seen large language models, such as ChatGPT, Gemini and Claude, evolve at lightning speed. The same hasn’t been true of the physical AI models required to power robots, in large part because of a lack of data to train those models on. Robots — and especially humanoid robots — must learn to navigate a world built for humans just as a human would.

Often this data is collected from robots performing tasks while being teleoperated by humans, but that’s not how Gen-1 was trained. Instead, the dataset used to train Generalist AI’s models has been assembled by humans completing millions of different tasks using wearable technology.

“We built our own lightweight ‘data hands’ and distributed them globally to learn how people actually interact with objects, with all the subtle force feedback, tactile feel, slips, corrections and recoveries that define human dexterity in the real world,” said Florence. “That kind of data is critical for teaching robots physical common sense, the intuitive understanding and ability to adapt in real time rather than execute rigid instructions.”

Generalist AI has released a series of videos showing the model running on robots repetitively performing a range of different tasks, with the most compelling, perhaps, being a robot drawing cash out of a wallet before reinserting it into the same pocket. This is a fiddly task that many humans fumble over. It’s clearly not easy for the robot, either, given the flimsiness of the paper money and the fabric of the wallet — and yet it completes the task.

Another video shows a robot sorting socks by color, folding them into neat piles and counting the number of pairs using a touchscreen. Other tricky tasks the model can complete include unzipping and filling a pencil case with pens, stacking oranges in a neat pyramid and plugging in an Ethernet cable.

These videos show the breadth of Gen-1’s capabilities, but more impressive is the success rate with which it can complete certain tasks. Generalist AI measured the model’s hit rate against the previous version and found Gen-1 could successfully service a robot vacuum cleaner in 99% of cases (up from 50% for Gen-0), fold boxes in 99% of cases (up from 81% for Gen-0) and package up phones in 99% of cases (up from 62% for Gen-0).

Robots do improv

Most robots are programmed to complete a task in a specific and orderly way. But what happens when a curveball gets thrown? “The smallest changes in the environment can cause failures,” said Florence.

An important skill robots need, and one humans innately possess, is the ability to think on their feet. This is why Gen-1 has been designed with improvisation in mind, so it can come up with its own strategies to complete tasks. Florence gave me an example of a robot using two hands to reposition an awkwardly placed part for an automotive task, even though it had only been trained to use one.

“This kind of creativity has been largely absent from robotics until now,” he said.

Significant work still needs to be done when it comes to beefing up robots’ improv chops, but early progress shows glimpses of a positive impact on both reliability and speed, Florence said. “We’re beginning to see real progress and are excited to push the boundaries of embodied intelligence.”

After all, there may come a day when you need a robot in your house that can fix all your other smaller robots.  



iPhone 17 Pro Camera Battles the Galaxy S26 Ultra: Let the Fun Begin

They’re both top-end flagship phones, but which one takes better photos? I wanted to find out.

Both Apple’s iPhone 17 Pro and Samsung’s Galaxy S26 Ultra earned coveted CNET Editors’ Choice awards in their full reviews. And they damned well earned them, too, thanks to their stellar overall performance and the wealth of top-end tech on board. They also garnered praise for their camera quality, with both able to take great-looking photos in a variety of conditions. But which does it better?

As a professional photographer myself, I was keen to find out, so I took them on a series of photo walks around Scotland to put them to the test in the same conditions. 

Before we dive in, a few notes from me. First, all images were captured in JPEG format using the standard camera app on each phone. On some images on the iPhone, Apple’s Gold Photographic Style was activated; on others, it was set to Standard, and I’ll be highlighting which is which. The images have been imported into Adobe Lightroom for comparison purposes and exported at smaller file sizes to better suit online viewing. No edits to the images themselves were made, and no sharpening was applied on the export. 

Read more: These Are the Best Phone Cameras That We’ve Tested

Crucially, though, it’s important to keep in mind that the analysis here is my opinion. Photography is largely subjective, and what might look good to one person might not to another. For me, I love a more natural-looking image with accurate tones that I can then edit further if I want to. You may like a punchy, vibrant tone straight out of the camera, and that’s fine. You’ll just need to take my results here with a slight pinch of salt.

All that said, let’s dive in.

This was an image I took with the Gold filter accidentally enabled on the iPhone. So its warmer color tones are to be expected to an extent, but what I liked more here is the depth of shadow that the iPhone has maintained. The S26 Ultra has done a fair bit of processing here to lift those shadows and create a more balanced exposure overall, but I think it’s killed some of the evening drama as a result. I see this in a lot of Android phones, to be fair. 

Taken earlier in the day, there’s much less difference to be seen here. The iPhone’s colors are a bit warmer, thanks to the Gold filter, but they actually look more natural as a result. The shot doesn’t look warm in its white balance; it just has a richness to it, while the S26 Ultra’s shot looks quite cold. 

I switched the iPhone to Standard Photographic Style here, and as a result, the shot it took looks pretty similar to that taken by the Galaxy S26 Ultra. The exposures are pretty much the same, and while the green plants on the steps definitely look more vivid in the Galaxy’s shot, the colors elsewhere are broadly on par. 

If I’m nitpicking — which I really have to when the phones cost this much money — the S26 Ultra appears to have done a neater job rendering the details on the front of the VW Camper’s spare wheel. I also noticed more detail in some of the small twigs on the tree, especially where they’re visible against the sky. Is that a difference you’d ever notice without a side-by-side comparison? Definitely not. But this whole article is basically an exercise in pedantry, so I will continue to pick away at even the tiniest of things in these photos.

I’m back on the Gold Photographic Style with the iPhone here, so again, those warmer tones are to be expected, but I will say again that I much prefer the deeper shadows seen on the house in the Apple phone’s image. It looks much more natural, while the S26 Ultra’s shot looks a bit too HDR and oversaturated for my tastes. But that’s not the most important thing here…

What took me more by surprise was what happened when I put each phone into the ultrawide camera mode. The iPhone’s color tones stay almost exactly the same, but the Galaxy’s image has shifted quite dramatically between the main and ultrawide lenses.

The blue sky has shifted its hue into a much more teal-toned color, and I’m surprised by just how different it looks from the main camera. I usually expect to see these sorts of color shifts on cheaper phones, where there’s less effort put into ensuring consistent colors across the lenses. So I’m a bit disappointed to see Samsung’s phones producing such a noticeable shift here. 

The iPhone 17 Pro also displays a color shift, but it’s far less pronounced than the S26 Ultra’s.

Next, I tried the zoom on both phones. With its 10x optical zoom, the S26 Ultra has a longer reach than the 8x on the iPhone 17 Pro, but in terms of details within those images, there’s honestly nothing to choose between them. Again, the iPhone had the Gold style applied, so it looks warmer, and again, the S26 Ultra has gone further in lightening those shadows. I can’t really say either one is better than the other in this example.

But there’s a much bigger difference in this example. The colors are much richer in the iPhone’s shot, even though the Photographic Style is set to Standard. The S26 Ultra’s shot looks like the phone’s white balance has been tricked by the warm orange tones of the brickwork, and produced a colder-looking image as a result. 

But I also don’t like what the S26 Ultra has done with the details here. It’s oversharpened the scene, giving a weird, crunchy look to the subject that looks extremely unnatural. The iPhone, despite not having the same zoom range on paper, has delivered a much better-looking image, even when viewed at the same scale. 

But here the opposite seems to have happened. The iPhone has looked at this warm, sun-drenched scene and automatically set its white balance to cool it, while the S26 Ultra has maintained those warmer tones. Sure, the greens of the leaves in the S26’s image look almost neon, but the image overall is the nicer of the two in my view. 

The iPhone has done a much better job here of capturing the warmer tones that I loved so much when I took these images. I do think the S26 Ultra has gone too far in its hyper-saturation of the green leaves. Sure, it’s a punchy look, but if I wanted that much saturation, I could always add it back in at the editing stage. I’d much rather have a more natural image as a starting point, so the iPhone takes the win here for me.

There’s so little to pick out between the images here. The greens are a little more vibrant in the S26 Ultra’s shot, but the tones overall in the iPhone’s are a bit more natural. Neither one is a spectacular photo, and honestly, you may as well toss a coin to decide which one is better. 

Switching to the ultrawide lenses on both phones, the S26 Ultra has again gone quite hard on the saturation, delivering a much more vibrant blue sky than it did in its image from the main camera. As before, I’m not a fan of this sort of high-contrast, high-saturation photo. As a result, the iPhone 17 Pro is my preferred shot here.

I think the S26 Ultra’s tendency towards vibrancy has helped here, however, with this shot of spring blossom looking more joyful than the almost drab-looking image from the iPhone. 

And sure, the colors are a little overbaked in the S26 Ultra’s ultrawide image, but it still screams “spring” more than the iPhone’s shot, which again looks pretty dull and lifeless by comparison.

I was thrilled to find these fishermen hanging out in Edinburgh, and I think the iPhone has done the better job of capturing the moment. The Gold Photographic Style hasn’t produced an overly warm image here. It’s more like it applied just the correct white balance, with the S26 Ultra’s shot looking quite cold. That’s especially the case with the pink paintwork on the base of the building, which looks richer and much more true to life in the iPhone’s image.

At night, both phones have done a good job of capturing this complex image. The bright moon has been kept under control, and there’s plenty of detail still visible in some of the more shadowy areas. The exposures are also broadly similar (the iPhone’s is a touch brighter), and even when peering up close, there’s not much to choose from in terms of detail. 

It’s a slightly different story here, though. The iPhone’s shot is much brighter, but that results in some detail being lost in the highlights inside the phone booth. The S26 Ultra has retained that highlight detail, though its overall shot is darker. Personally, I prefer the darker version, especially as it’s much more in line with the moody nighttime aesthetic I was going for. 

What I don’t love is how much the S26 Ultra has oversharpened its image. Like the earlier image of the figure sitting on the wall, this image has been digitally sharpened to the point that the details look crunchy, high-contrast and ultimately quite unnatural. Which image would I choose — properly exposed but oversharpened, or natural details with blown-out highlights? Ideally, I’d simply take the photo again on the iPhone and lower the exposure a tad. But between the two images above, I’d probably go for the one shot on the Samsung phone.

iPhone 17 Pro vs. Galaxy S26 Ultra: Which has the better camera?

I always complain that these camera comparison stories are really close and therefore difficult to turn into compelling articles, but this one felt especially close. In some shots, the iPhone’s more natural shadow rendering and lighter reliance on oversharpening and other digital processing make its images look better to my eye. But in other examples — especially the image with the tree trunks surrounded by ivy — the S26 Ultra has done a much better job with its color balancing.

Overall, Samsung’s phone leans harder into contrast and saturation, which is the same thing we’ve said about Samsung’s phones since the company first started putting cameras in them. Buying a Samsung camera phone has always meant getting more vibrant, punchy images, and that’s exactly the case here. If you want quick images of your friends and family that look good enough to share straight to your family WhatsApp group, the S26 Ultra will serve you well.

The iPhone 17 Pro tends to be more neutral in its color and contrast adjustments, which typically gives a more natural base for you to then add any edits of your own. It’s why Apple’s phones have long been the device of choice for enthusiast and pro photographers and video creators. I count myself among that crowd, and it’s why the iPhone 17 Pro remains my preferred model of the two. But really, these are both excellent phones with superb cameras, and you can’t go far wrong with either.



Verum Messenger Expands Its Capabilities: Verum Finance Card Can Now Be Topped Up via Apple Pay

In its latest update, Verum Messenger takes a major step toward integrating communication and financial services. Users can now enjoy a long-awaited feature — topping up their Verum Finance card directly through Apple Pay.

A New Level of Convenience

The integration with Apple Pay significantly simplifies the top-up process. Users no longer need to go through complex transfer steps or rely on third-party services. Just a few taps — and the funds are instantly credited to the card.

This is especially valuable for those who use Verum Messenger not only for communication but also for managing their finances within the ecosystem.

Finance and Messaging in One App

This update reinforces Verum’s strategy of combining the following in a single product:

  • secure communication
  • cryptocurrency operations
  • everyday financial tools

Verum Messenger is no longer just a messaging app — it is evolving into a full-fledged fintech platform.

Security and Speed

Apple Pay is known for its high level of security thanks to:

  • biometric authentication
  • payment tokenization
  • no sharing of card details

By integrating these technologies, Verum Messenger ensures that financial operations are not only convenient but also as secure as possible.
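
For readers curious about what that looks like under the hood, here is a minimal sketch of how an iOS app can request an Apple Pay payment through Apple's PassKit framework and receive only a tokenized payload rather than raw card details. The merchant identifier and the VerumFinanceAPI helper below are hypothetical stand-ins, not part of Verum's actual SDK; only the PassKit calls are real framework API.

import PassKit

// Hypothetical stand-in for Verum's backend; not a real SDK.
enum VerumFinanceAPI {
    static func topUp(with tokenData: Data, completion: (Bool) -> Void) {
        completion(true)
    }
}

final class TopUpHandler: NSObject, PKPaymentAuthorizationControllerDelegate {

    func startTopUp(amount: NSDecimalNumber) {
        // Check whether the device can make Apple Pay payments at all.
        guard PKPaymentAuthorizationController.canMakePayments() else { return }

        let request = PKPaymentRequest()
        request.merchantIdentifier = "merchant.example.verum"  // hypothetical ID
        request.supportedNetworks = [.visa, .masterCard]
        request.merchantCapabilities = .capability3DS
        request.countryCode = "US"
        request.currencyCode = "USD"
        request.paymentSummaryItems = [
            PKPaymentSummaryItem(label: "Verum Finance top-up", amount: amount)
        ]

        let controller = PKPaymentAuthorizationController(paymentRequest: request)
        controller.delegate = self
        controller.present()
    }

    // Apple Pay hands back a device-specific, tokenized payload;
    // the app never sees the underlying card number.
    func paymentAuthorizationController(_ controller: PKPaymentAuthorizationController,
                                        didAuthorizePayment payment: PKPayment,
                                        handler: @escaping (PKPaymentAuthorizationResult) -> Void) {
        let tokenData = payment.token.paymentData
        VerumFinanceAPI.topUp(with: tokenData) { success in
            handler(PKPaymentAuthorizationResult(status: success ? .success : .failure,
                                                 errors: nil))
        }
    }

    func paymentAuthorizationControllerDidFinish(_ controller: PKPaymentAuthorizationController) {
        controller.dismiss()
    }
}

When the user confirms the payment with Face ID or Touch ID, that biometric check and the tokenization happen inside the system's Apple Pay sheet, which is why the card number never has to be shared with the app itself.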

What This Means for Users

The update brings several key benefits:

  • instant card top-ups
  • simplified user experience
  • reduced reliance on third-party payment services
  • deeper integration of finance into everyday communication

Looking Ahead

The addition of Apple Pay is just one step in the evolution of the Verum ecosystem. It’s clear the team is moving toward creating a unified digital environment where users can handle most of their needs — from communication to capital management — within a single app.

Continue Reading

