Technologies
Meta Says Its New AI Model Can Understand the Physical World
The new model could allow robots to grasp concepts like gravity and object permanence while relying less on large amounts of video or training data.
Meta says a new generative AI model it released Wednesday could change how machines understand the physical world, opening up opportunities for smarter robots and more.
The new open-source model, called Video Joint Embedding Predictive Architecture 2, or V-JEPA 2, is designed to help artificial intelligence understand things like gravity and object permanence, Meta said.
Current approaches that let AI interact with the physical world rely on large amounts of labeled data or video to mimic reality. This model instead emphasizes the logic of the physical world, including how objects move and interact, which could allow AI to grasp concepts such as the fact that a ball rolling off a table will fall.
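Meta's published research spells out the architecture in full; as a loose, hypothetical illustration of the joint-embedding predictive idea behind V-JEPA (predicting what happens next in an abstract embedding space rather than in raw pixels), here is a toy sketch. Every name, shape and number below is invented for illustration and is not Meta's code.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy "encoder": a fixed random projection from pixel patches to embeddings.
# (In the real model this is a large learned video encoder.)
W_enc = rng.normal(size=(16, 4))

def encode(patches):
    """Map flattened 16-pixel patches to 4-d embeddings."""
    return patches @ W_enc

# A "video clip" as 8 flattened patches; the last 2 are masked (the future).
clip = rng.normal(size=(8, 16))
context, target = clip[:6], clip[6:]

# Toy "predictor": pool the context embeddings, then apply a learned map.
W_pred = rng.normal(size=(4, 4))

def predict(context_embeddings):
    return context_embeddings.mean(axis=0) @ W_pred

pred = predict(encode(context))       # predicted embedding of the masked region
actual = encode(target).mean(axis=0)  # embedding of what actually happened

# JEPA-style objective: distance in embedding space, not pixel space,
# so the model only has to get the gist of the physics right.
loss = float(np.mean((pred - actual) ** 2))
```

The key design choice this sketch gestures at is that the prediction target is a compact representation, not every pixel, which is part of why such models can need less raw training video.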
Meta said the model could be useful for devices like autonomous vehicles and robots by ensuring they don’t need to be trained on every possible situation. The company called it a step toward AI that can adapt like humans can.
One struggle in the space of physical AI has been the need for significant amounts of training data, which takes time, money and resources. At SXSW earlier this year, experts said synthetic data — training data created by AI — could help prepare a more traditional learning model for unexpected situations. (In Austin, the example used was the emergence of bats from the city’s famed Congress Avenue Bridge.)
Meta said its new model simplifies the process and makes it more efficient for real-world applications because it doesn’t rely on all of that training data.
Technologies
Tim Cook Stepping Down? Apple CEO’s 65th Birthday Today Sparks Succession Talk
Apple is no doubt considering who it will choose to fill the chief executive role once Tim Cook decides to retire. Here are a few potential candidates reportedly being considered.
With Tim Cook turning 65 on Saturday, Nov. 1, speculation has grown over who could succeed him as Apple CEO, should he choose to retire. Cook has made no announcement that he'll be stepping down, but according to Bloomberg's Mark Gurman, the tech giant is working behind the scenes to ensure a seamless transition when the time comes.
Cook replaced Steve Jobs in 2011 and, after a period of uncertainty, ushered Apple into its most profitable era. Stock-watching website Stocktwits reports that the company's stock has risen around 1,800% since Cook took over.
Don’t miss any of our unbiased tech content and lab-based reviews. Add CNET as a preferred Google source.
Jobs may have introduced devices like the iPhone into everyday use that changed how we interact with technology, but Cook expanded on the Apple experience. Under his guidance, the company built upon Apple’s smartphone by introducing subscription services and more mobile products, including earbuds and wearables.
He introduced Apple Pay, Beats headphones became part of the company’s ecosystem, the Apple Watch launched 10 years ago, and Apple even entered the entertainment business, producing original Oscar-winning movies and Emmy-winning TV shows through Apple TV Plus.
We should reiterate that the notion of Cook stepping down is pure speculation at this point. We don’t know what Apple’s CEO is currently planning or what his thoughts about retirement may be. That said, there are a handful of contenders who have reportedly been part of the succession conversation.
Potential Apple CEO contenders
Apple likely has “a solid bench of successors” that the company’s board has been developing, says Bryan Ma, VP of Devices Research at IDC.
“But the anxiety gets amplified when there isn’t clear visibility for such a valuable and iconic company,” Ma says. “Compounding the challenge is the fact that the bar has been set by big rock stars like Steve Jobs and Tim Cook. The next generation of leaders have very big shoes to fill.”
John Ternus, Apple’s senior vice president of hardware engineering, tops Gurman’s list. Ternus has been with the tech giant for more than two decades, giving him the knowledge and experience for the chief executive role, and there would be value in having an engineer behind the wheel.
Ternus appeared during the September Apple event to introduce the iPhone Air. At 50, he’s the same age Cook was when he took over as Apple CEO.
Other potential contenders are also being considered, including Craig Federighi, Apple’s senior vice president of software engineering; Greg Joswiak, Apple’s senior vice president of worldwide marketing; and Jeff Williams, the company’s former chief operating officer, according to a report by Apple Insider. On Oct. 10, Bloomberg reported that Federighi also will soon be overseeing the Apple Watch operating system watchOS, while Ternus will be overseeing Apple Watch hardware engineering once Williams departs at the end of the year.
Federighi has been with Apple for a long time and has the public speaking experience, honed through frequent appearances at Apple events, that would be vital if he replaced Cook as CEO. Given his current role, Joswiak brings a marketing perspective and a broader overview of the company, though he may not be as hands-on with the technology as Ternus or Federighi. And according to Gurman, Williams was viewed as a shoo-in to replace Cook until Apple announced that his time as COO was ending. (He’s now Apple’s senior vice president of design, watch and health.) Cook himself held the chief operating officer position before he replaced Jobs as CEO in 2011, and Sabih Khan is stepping into that COO role, which also puts his name in the running.
When Cook steps down, Apple will undoubtedly have a pool of qualified talent ready to take up the leadership mantle. Exactly who gets it remains to be seen.
Apple didn’t immediately respond to a request for comment.
Technologies
I Went Hands-On With the OnePlus 15’s Camera and You Need to See the Results
What better first test run than taking it on a neighborhood photo safari?
The OnePlus 15 is the next premium handset from the Chinese phone-maker, and I just got my hands on it. To give its cameras a whirl, I took it out for a quick spin through a hip corner of Los Angeles.
The OnePlus 15’s big advantage is that it’s one of the first to run the Snapdragon 8 Elite Gen 5, Qualcomm’s next-generation chip for high-end phones, which was launched in September. The system-on-a-chip has a big influence on how photos come out, processing every image captured through the rear cameras.
The OnePlus 15 has three 50-megapixel rear cameras, along with a selfie shooter on the front, and I took photos of my neighborhood flora and fauna using them all. While there’s a certain level of polish expected of premium phone cameras, this phone has something new: it’s the first major OnePlus handset released since the company’s partnership with Hasselblad ended. For years, OnePlus incorporated the iconic Swedish camera maker’s color science and image calibration in its cameras.
With Hasselblad gone, the OnePlus 15 debuts the DetailMax Engine, a loftily titled computational processing system that aims to “present scenes as they truly are, without over-beautification or distortion,” as the company’s official blog post put it.
That marks a new chapter for shooting photos on a OnePlus phone, which made me want to see what the OnePlus 15 is capable of. Join me on a casual tour of a vibrant Los Angeles neighborhood, taking the kinds of snapshots that make up the majority of everybody’s camera roll. I’ll need to spend a lot more time with the device to give it a comprehensive review.
Our first shot is of the outside of The Silver Lake House, a neighborhood Thai restaurant. While I clearly can’t resist a slight Dutch angle here, the blend of colors looks distinct and not oversaturated — a win for true-to-life processing. I like the way the OnePlus 15 captured the light and shadows filtering through the trees, and the camera handled the lens flare well without overexposing that area. Also, notice the reflection on the chrome of the heat lamp.
Here’s a close-up of knick-knack plant vases on a windowsill overlooking the restaurant’s indoor tables. The light is really balanced, bright on the foreground outside the eatery and dimmer within — but colors and details are still visible inside. You can also pick out some detail in the reflections on the window of the street behind me.
I couldn’t resist this 1960s Ford Thunderbird sitting idly on the street, a cruising car from yesteryear resting in a hipper corner of LA. Note the texture of the dirt streaks over the paint contrasted against the shiny chromed metal surrounding the taillights. More importantly, despite the camera’s focus on the foreground, the OnePlus 15 still manages to capture the blue sky in the background, complete with details in the clouds.
I took this photo of a nearby dog park with the ultrawide lens, which preserves humdrum details in the brown dirt amid sprouted grass along the bottom.
Here’s an image of the same dog park that I took while zoomed in at 7x magnification. It has a lot of detail and color. But we can go further!
Here’s the dog park photographed at 120x magnification, the farthest this phone can zoom in. The image looked grainy as heck on the phone’s screen when I shot it, but that DetailMax Engine’s post-processing has done relative wonders, making this semi-recognizable despite a lot of smudging at the edges caused by noise reduction — look between the chain links. To be sure, this is not a great image — it’s nearly painterly — but the fact that it can zoom in this far and still serve up a photo with something recognizable is amazing.
Here’s a selfie featuring yours truly. I think this photo has good detail and shadow, but what most impresses me are the mountains in the distance, which can be seen to some degree through the classic Los Angeles haze (marine layer, not smog) occluding the air, not the OnePlus selfie camera.
For comparison, here’s a selfie I took at night. The color is fine, with decent details in the foreground, though they start to blur behind me — notice the bricks on the bottom right, the posters on the light pole on the mid-left, and especially the building over my shoulder.
Here’s the obligatory night shot of a Los Angeles street. While the city will never be dark enough to test the phone’s ability to capture constellations of stars in the night sky, this does show the contrast between warm streetlights and the bright neon. The details of the stucco pockmarking the walls of the bowling alley are clear, even from across the street. Look closely at the texture of the street’s pavement. It’s a granular mix of grays flecked with white spots. All the grime of the city, preserved by the OnePlus 15’s shiny new cameras.
That’s it for this first look at the OnePlus 15’s camera capabilities. Happy Halloween! And keep an eye out for my full OnePlus 15 review.
Technologies
A $20K Humanoid Robot to Help Around the House? The Price Isn’t the Only Caveat
The new Neo robot from 1X is designed to do chores. It’ll have to learn a lot from you — and about you.
It stands 5 feet, 6 inches tall, weighs about as much as a golden retriever and costs nearly as much as a brand-new budget car.
This is Neo, the humanoid robot. It’s billed as a personal assistant you can talk to and eventually rely on to take care of everyday tasks, such as loading the dishwasher and folding laundry.
Neo doesn’t work cheap. It’ll cost you $20,000. And even then, you’ll still have to train this new home bot.
If that sounds enticing, preorders are now open (for a mere $200 down). You’ll be signing up as an early adopter for what Neo’s maker, a California-based company called 1X, is calling a «consumer-ready humanoid.» That’s opposed to other humanoids under development from the likes of Tesla and Figure, which are, for the moment at least, more focused on factory environments.
Neo is in a different league from robot vacuums like those from Roomba, Eufy and Ecovacs, and it embodies a long-running sci-fi fantasy of robot maids and butlers doing chores and picking up after us. If this is the future, read on for more of what’s in store.
What the Neo robot can do around the house
The pitch from 1X is that Neo can do all manner of household chores: fold laundry, run a vacuum, tidy shelves, bring in the groceries. It can open doors, climb stairs and even act as a home entertainment system.
Neo appears to move smoothly, with a soft, almost human-like gait, thanks to 1X’s tendon-driven motor system that gives it gentle motion and impressive strength. The company says it can lift up to 154 pounds and carry 55, yet it runs quieter than a refrigerator. It’s covered in soft materials and neutral colors, making it look less intimidating than metallic prototypes from other companies.
The company says Neo has a 4-hour runtime. Its hands are IP68-rated, meaning they’re submersible in water. It can connect via Wi-Fi, Bluetooth and 5G. For conversation, it has a built-in LLM, the same sort of AI technology that powers ChatGPT and Gemini.
The primary way to control the Neo robot will be by speaking to it, just as if it were a person in your home.
Still, Neo’s usefulness today depends heavily on how you define useful. The Wall Street Journal’s Joanna Stern got an up-close look at Neo at 1X’s headquarters and found that, at least for now, it’s largely teleoperated, meaning a human often operates it remotely using a virtual-reality headset and controllers.
«I didn’t see Neo do anything autonomously, although the company did share a video of Neo opening a door on its own,» Stern wrote.
1X CEO Bernt Børnich told her that Neo will do most things autonomously in 2026, though he also acknowledged that the quality «may lag at first.»
What you need to know about Neo and privacy
Part of what early adopters are signing up for is to let Neo learn from their environment so that future versions can operate more independently.
That learning process raises privacy and trust questions. The robot uses a mix of visual, audio and contextual intelligence — meaning it can see, hear and remember interactions with users throughout their homes.
“If you buy this product, it is because you’re OK with that social contract,” Børnich told the Journal. “It’s less about Neo instantly doing your chores and more about you helping Neo learn to do them safely and effectively.”
1X says it’s taking steps to protect your privacy: Neo listens only when it recognizes it’s being addressed, and its cameras will blur out humans. You can restrict Neo from entering or viewing specific areas of your home, and the robot will never be teleoperated without owner approval, the company says.
But inviting an AI-equipped humanoid to observe your home life isn’t a small step.
The first units will ship to customers in the US in 2026. There is a $499 monthly subscription alternative to the $20,000 full-purchase price, though that will be available at an unspecified later date. A broader international rollout is promised for 2027.
Neo’s got a long road ahead of it to live up to the expectations set by Rosie the Robot in The Jetsons way back when. But this is no Hanna-Barbera cartoon. What we’re seeing now is a much more tangible harbinger of change.
