Technologies
I Saw the AI Future of Video Games: It Starts With a Character Hopping Over a Box
At the 2025 Game Developers Conference, graphics-chip maker Nvidia showed off its latest tools that use generative AI to augment future games.

At its own GTC AI show in San Jose, California, earlier this month, graphics-chip maker Nvidia unveiled a plethora of partnerships and announcements for its generative AI products and platforms. At the same time, in San Francisco, Nvidia held behind-closed-doors showcases alongside the Game Developers Conference to show game-makers and media how its generative AI technology could augment the video games of the future.
Last year, Nvidia's GDC 2024 showcase had hands-on demonstrations where I was able to speak with AI-powered nonplayer characters, or NPCs, in pseudo-conversations. They replied to things I typed out, with reasonably contextual responses (though not quite as natural as scripted ones). AI also radically modernized old games for a contemporary graphics look.
This year, at GDC 2025, Nvidia once again invited industry members and press into a hotel room near the Moscone Center, where the convention was held. In a large room ringed with computer rigs packed with its latest GeForce RTX 5070, 5080 and 5090 GPUs, the company showed off more ways gamers could see generative AI remastering old games, offering new options for animators and evolving NPC interactions.
Nvidia also demonstrated how DLSS 4, its latest AI graphics rendering tech for its GPU line, improves image quality, path-traced lighting and frame rates in modern games, features that affect gamers every day, though these efforts are more conventional than its other experiments. While some of these advancements rely on studios to implement new tech into their games, others are available right now for gamers to try.
Making animations from text prompts
Nvidia detailed a new tool that generates character model animations based on text prompts, sort of like using ChatGPT in iMovie to make your game's characters move around in scripted action. The goal? Save developers time. Using the tool could turn what would be several hours of animation programming into a several-minute task.
Body Motion, as the tool is called, can be plugged into many digital content creation platforms; Nvidia Senior Product Manager John Malaska, who ran my demo, used Autodesk Maya. To start the demonstration, Malaska set up a sample situation in which he wanted one character to hop over a box, land and move forward. On the timeline for the scene, he selected the moment for each of those three actions and wrote text prompts to have the software generate the animation. Then it was time to tinker.
To refine his animation, he used Body Motion to generate four different variations of the character hopping and chose the one he wanted. (All animations are generated from licensed motion capture data, Malaska said.) Then he specified where exactly he wanted the character to land, and then selected where he wanted them to end up. Body Motion simulated all the frames in between those carefully selected motion pivot points, and boom: animation segment achieved.
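To make that workflow concrete, here's a hypothetical sketch of what prompt-driven animation like this might look like in code. Every name in it (Keyframe, generate_variations, fill_inbetweens) is invented for illustration; Nvidia hasn't published Body Motion's actual API.

```python
# Hypothetical sketch of a prompt-driven animation workflow like the one
# demonstrated. These names are illustrative, not Nvidia's actual API.
from dataclasses import dataclass

@dataclass
class Keyframe:
    frame: int   # position on the scene timeline
    prompt: str  # text description of the desired action

# The animator pins prompts to moments on the timeline...
keyframes = [
    Keyframe(frame=0,  prompt="character hops over the box"),
    Keyframe(frame=45, prompt="character lands"),
    Keyframe(frame=60, prompt="character moves forward"),
]

def generate_variations(keyframe, count=4):
    """Pretend call into the animation model: returns `count` candidate
    motion clips synthesized from licensed motion capture data."""
    return [f"clip_{keyframe.frame}_{i}" for i in range(count)]

# ...reviews a handful of candidates per action and picks one...
chosen = [generate_variations(kf)[0] for kf in keyframes]

# ...and the tool simulates every frame between the chosen pivot points.
def fill_inbetweens(clips):
    return f"animation segment built from {len(clips)} pivots"

print(fill_inbetweens(chosen))
```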
In the next section of the demo, Malaska had the same character walking through a fountain to get to a set of stairs. He could edit with text prompts and timeline markers to have the character sneak around and circumvent the courtyard fixtures.
"We're excited about this," Malaska said. "It's really going to help people speed up and accelerate workflows."
He pointed to situations where a developer gets an animation back but wants it to run slightly differently and has to send it to the animators for edits. The scenario is far more time-consuming if the animations were based on actual motion capture: if the game requires that fidelity, getting mocap actors back in to record could take days, weeks or months. Tweaking animations with Body Motion, based on a library of motion capture data, can sidestep all that.
I'd be remiss not to worry about motion capture artists and whether Body Motion could be used to replace their work in part or in whole. Generously, this tool could be put to good use making animatics and virtually storyboarding sequences before bringing in professional artists to motion capture finalized scenes. But like any tool, it all depends on who's using it.
Body Motion is scheduled to be released later in 2025 under the Nvidia Enterprise License.
Another stab at remastering Half-Life 2 using RTX Remix
At last year’s GDC, I’d seen some remastering of Half-Life 2 with Nvidia’s platform for modders, RTX Remix, which is meant to breathe new life into old games. Nvidia’s latest stab at reviving Valve’s classic game was released to the public as a free demo, which gamers can download on Steam to check out for themselves. What I saw of it in Nvidia’s press room was ultimately a tech demo (and not the full game), but it still shows off what RTX Remix can do to update old games to meet modern graphics expectations.
Last year's RTX Remix Half-Life 2 demonstration was about seeing how old, flat wall textures could be updated with depth effects to, say, make them look like grouted cobblestone, and that's present here too. When looking at a wall, "the bricks seem to jut out because they use parallax occlusion mapping," said Nyle Usmani, senior product manager of RTX Remix, who led the demo. But this year's demo was more about lighting interaction, even to the point of simulating the shadow passing through the glass covering the dial of a gas meter.
Usmani walked me through all the lighting and fire effects, which modernized some of the most hauntingly iconic parts of Half-Life 2's fallen Ravenholm area. But the most striking application was in an area where the series' headcrab enemies attack, when Usmani paused and pointed out how backlight was filtering through the fleshy parts of the grotesque pseudo-zombies, making them glow a translucent red, much like what happens when you put a finger in front of a flashlight. Coinciding with GDC, Nvidia released this effect, called subsurface scattering, in a software development kit so game developers can start using it.
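Nvidia hasn't detailed the internals of its subsurface scattering SDK, but the finger-over-a-flashlight look is often approximated in real-time rendering with a cheap translucency term along these lines. This is a generic sketch of that common technique, not Nvidia's implementation.

```python
# Generic sketch of a real-time translucency approximation (back-lighting
# "glow" through thin surfaces), not Nvidia's SDK. Directions are unit
# 3-tuples: light_dir points from the surface toward the light, view_dir
# from the surface toward the camera.
import math

def translucency(light_dir, view_dir, normal,
                 distortion=0.3, power=4.0, scale=1.0):
    # Shift the light direction along the surface normal to fake scattering.
    half = [l + n * distortion for l, n in zip(light_dir, normal)]
    length = math.sqrt(sum(h * h for h in half)) or 1.0
    half = [h / length for h in half]
    # How directly is the viewer looking into the transmitted light?
    v_dot_h = -sum(v * h for v, h in zip(view_dir, half))
    return scale * max(v_dot_h, 0.0) ** power

# A light directly behind the surface, viewed head-on, glows brightest.
print(translucency(light_dir=(0, 0, -1), view_dir=(0, 0, 1), normal=(0, 0, 1)))
```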
RTX Remix has other tricks that Usmani pointed out, like a new neural shader in the latest version of the platform, the version used in the Half-Life 2 demo. Essentially, he explained, a bunch of neural networks train live on the game data as you play and tailor the indirect lighting to what the player sees, lighting areas more like they'd be lit in real life. In an example, he swapped between old and new RTX Remix versions, showing, in the new version, light properly filtering through the broken rafters of a garage. Better still, it bumped the frame rate to 100fps, up from 87.
"Traditionally, we would trace a ray and bounce it many times to illuminate a room," Usmani said. "Now we trace a ray and bounce it only two to three times and then we terminate it, and the AI infers a multitude of bounces after. Over enough frames, it's almost like it's calculating an infinite amount of bounces, so we're able to get more accuracy because it's tracing less rays [and getting] more performance."
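As a rough mental model of what Usmani describes (and only that; the actual neural shader is Nvidia's own), a path tracer that terminates early and defers to a learned cache might look like this, with radiance_cache standing in for the networks that train live on game data.

```python
# Conceptual sketch: trace only a few real bounces, then let a learned model
# estimate the remaining indirect light. `radiance_cache` is a stand-in for
# Nvidia's neural shader, which isn't public.
import random

def trace_bounce(ray):
    """Pretend scene intersection: returns (hit_point, direct_light, next_ray)."""
    return ray, random.random() * 0.1, ray

def radiance_cache(hit_point):
    """Stand-in for a small network trained live on game data; it infers the
    light that many further bounces would eventually contribute."""
    return 0.5

def shade(ray, max_bounces=3):
    color, weight = 0.0, 1.0
    for _ in range(max_bounces):   # two to three real bounces...
        hit_point, direct, ray = trace_bounce(ray)
        color += weight * direct
        weight *= 0.8              # energy lost at each bounce
    # ...then terminate and ask the model for the rest, instead of tracing
    # the many extra bounces a conventional path tracer would need.
    return color + weight * radiance_cache(hit_point)

print(shade(ray=(0.0, 0.0, 1.0)))
```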
Still, I was seeing the demo on an RTX 5070 GPU, which retails for $550, and the demo requires at least an RTX 3060 Ti, so owners of graphics cards older than that are out of luck. "That's purely because path tracing is very expensive; I mean, it's the future, basically the cutting edge, and it's the most advanced path tracing," Usmani said.
Nvidia ACE uses AI to help NPCs think
Last year's NPC AI station demonstrated how nonplayer characters can uniquely respond to the player, but this year's Nvidia ACE tech showed how players can suggest new thoughts for NPCs that'll change their behavior and the lives of those around them.
The GPU maker demonstrated the tech plugged into InZoi, a Sims-like game where players care for NPCs, each with their own behaviors. With an upcoming update, players can toggle on Smart Zoi, which uses Nvidia ACE to insert thoughts directly into the minds of the Zois (characters) they oversee… and then watch them react accordingly. These thoughts can't go against the characters' own traits, explained Nvidia GeForce tech marketing analyst Wynne Riawan, so they'll send the Zoi in directions that make sense.
"So, by encouraging them, for example, 'I want to make people's day feel better,' it'll encourage them to talk to more Zois around them," Riawan said. "Try is the key word: They do still fail. They're just like humans."
Riawan inserted a thought into the Zoi's head: "What if I'm just an AI in a simulation?" The poor Zoi freaked out but still ran to the public bathroom to brush her teeth, which fit her traits of, apparently, being really into dental hygiene.
Those NPC actions following up on player-inserted thoughts are powered by a small language model with half a billion parameters (large language models range from 1 billion to over 30 billion parameters, with higher counts giving more opportunity for nuanced responses). The one used in-game is based on the 8-billion-parameter Mistral NeMo Minitron model, shrunk down so it can run on older and less powerful GPUs.
"We do purposely squish down the model to a smaller model so that it's accessible to more people," Riawan said.
The Nvidia ACE tech runs on-device using the computer's GPU; Krafton, the publisher behind InZoi, recommends a minimum GPU spec of an Nvidia RTX 3060 with 8GB of video memory to use this feature, Riawan said. Krafton gave Nvidia a "budget" of 1GB of VRAM to ensure the graphics card has enough resources left to render, well, the graphics. Hence the need to minimize the parameters.
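That budget lines up with some quick back-of-the-envelope math, assuming the weights are stored at 16 bits per parameter (the precision is my assumption; shipping models are often quantized further, and activations need headroom too):

```python
# Back-of-the-envelope check of Krafton's 1GB VRAM budget. The 16-bit
# precision is an assumption, not a confirmed deployment detail.
params = 0.5e9           # roughly half a billion parameters
bytes_per_param = 2      # fp16/bf16 weights
print(f"{params * bytes_per_param / 1024**3:.2f} GB")  # ~0.93 GB, inside 1GB
```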
Nvidia is still internally discussing how or whether to unlock the ability to use larger-parameter language models if players have more powerful GPUs. Players may be able to see the difference, as the NPCs "do react more dynamically as they react better to your surroundings with a bigger model," Riawan said. "Right now, with this, the emphasis is mostly on their thoughts and feelings."
An early access version of the Smart Zoi feature will go out to all users for free, starting March 28. Nvidia sees it and the Nvidia ACE technology as a stepping stone that could one day lead to truly dynamic NPCs.
"If you have MMORPGs with Nvidia ACE in it, NPCs will not be stagnant and just keep repeating the same dialogue; they can just be more dynamic and generate their own responses based on your reputation or something. Like, 'Hey, you're a bad person, I don't want to sell my goods to you,'" Riawan said.
Technologies
Microsoft Is Eliminating Passwords in August: Here’s What You Need to Do to Prepare
Microsoft Authenticator has already stopped autofilling passwords, but the biggest change comes next month.

In June, Microsoft Authenticator stopped letting users create new passwords. In July, it turned off the autofill password function. And in August, the login app will stop supporting passwords entirely, moving to more secure passkeys, which verify you with a PIN, fingerprint or facial recognition.
Attila Tomaschek, CNET's senior software writer and digital security expert, says that passkeys are a safer alternative to the risky password habits practiced by 49% of US adults, according to a recent CNET survey.
"Passwords can be cracked, whereas passkeys need both the public and the locally stored private key to authenticate users, which can help mitigate risks like falling victim to phishing and brute-force or credential-stuffing attacks," Tomaschek said.
Using the same password for several accounts or adding personal hints can be a convenient way to remember your login. But that puts you at serious risk of scams, identity theft and fraud. Here's more on Microsoft's plan for eliminating passwords and how to make the switch to passkeys before August.
When will Microsoft Authenticator stop supporting passwords?
Microsoft Authenticator houses your passwords and lets you sign into all your Microsoft accounts using a PIN, facial recognition like Windows Hello, or other biometric data like a fingerprint. Authenticator can be used in other ways, such as verifying you're logging in if you forgot your password, or providing two-factor authentication as an extra layer of security for your accounts. In June, the company stopped letting users add passwords to Authenticator. Here's a timeline of the other changes you can expect from Microsoft.
- July 2025: You won’t be able to use the autofill password function.
- August 2025: You’ll no longer be able to use saved passwords.
If you still want to use passwords instead of passkeys, you can store them in Microsoft Edge. However, CNET experts recommend adopting passkeys during this transition. "Passkeys use public key cryptography to authenticate users, rather than relying on users themselves creating their own (often weak or reused) passwords to access their online accounts," Tomaschek said.
Why are passkeys a better alternative to passwords?
So what exactly is a passkey? It's a credential based on standards from the FIDO (Fast Identity Online) Alliance that uses biometric data or a PIN to verify your identity and access your account. Think about using your fingerprint or Face ID to log into your account. That's generally safer than using a password that's easy to guess or susceptible to a phishing attack.
Unlike passwords, passkeys aren't stored on servers; the private key that does the authenticating stays only on your personal device. More conveniently, this removes the guesswork of remembering passwords, and the need for a password manager along with it.
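Under the hood, passkeys boil down to public-key challenge-response. Here's a minimal sketch of that idea using Ed25519 from Python's cryptography package; real passkeys use the richer WebAuthn/FIDO2 protocol, so treat this as a simplification of the concept, not Microsoft's implementation.

```python
# Minimal sketch of the challenge-response idea behind passkeys. Real
# passkeys follow the WebAuthn/FIDO2 protocol; this only shows why the
# server never holds a shared secret that can be phished or stuffed.
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey
import os

# Enrollment: the private key is generated on, and never leaves, the device.
private_key = Ed25519PrivateKey.generate()
public_key = private_key.public_key()  # only this is sent to the server

# Sign-in: the server sends a fresh random challenge...
challenge = os.urandom(32)

# ...the device signs it after a local PIN or biometric check...
signature = private_key.sign(challenge)

# ...and the server verifies with the stored public key. A leaked public
# key is useless for logging in, and there's no password to reuse elsewhere.
public_key.verify(signature, challenge)  # raises InvalidSignature on failure
print("authenticated")
```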
How to set up a passkey in Microsoft Authenticator
Microsoft said in a May 1 blog post that it will automatically detect the best passkey to set up and make that your default sign-in option. "If you have a password and 'one-time code' set up on your account, we'll prompt you to sign in with your one-time code instead of your password. After you're signed in, you'll be prompted to enroll a passkey. Then the next time you sign in, you'll be prompted to sign in with your passkey," according to the blog post.
To set up a new passkey, open the Authenticator app on your phone. Tap on your account and select "Set up a passkey." You'll be prompted to log in with your existing credentials. After you're logged in, you can set up the passkey.
Technologies
The AI Chatbots We Use Most, and How We Use Them
91% of AI users have a default artificial intelligence assistant they turn to for their AI tasks, a Menlo Ventures report has found.

If you have a particular artificial intelligence tool that you tend to try first every time you’re in need of an AI assist, you’re not alone. According to a new survey, 91% of people who use AI have a favorite chatbot they try first, whether it’s ChatGPT, Gemini, Alexa or something else.
A Menlo Ventures survey of 5,000 adults found that this "default tool dynamic" means most people using AI have chosen a general AI tool they'll try first for every task, even if it's not necessarily the best tool for the job.
In the report, ChatGPT is the AI assistant that tops default tools, with 28% of respondents choosing it first. It's followed by Google's Gemini at 23%, Meta AI and Amazon's Alexa, both at 18%, and Apple's Siri at 16%. Other tools, including Claude, Grok and Perplexity, collectively make up another 33%.
Some of the most common ways people are using these AI tools include composing emails and other writing support, researching topics of interest and managing to-do lists, according to Menlo Ventures.
Some of that, Menlo Ventures says, is "first-mover advantage," with tools like ChatGPT having built up a following by being the first to offer some chatbot and image-generation features. But, the company warns, "that position is not guaranteed," with challengers moving fast.
"The consumer market for [large language models] is still nascent and far from saturated," the report says, "leaving ample room for product innovation to shift market share over time."
Overall, 61% of Americans have used AI in the last six months, and nearly 1 in 5 (19%) rely on it daily, the report says.
Technologies
This Early Prime Day Deal Lets You Grab AirPods 4 At Their Lowest Price Yet
Apple’s AirPods 4 have dropped to their lowest price of 2025 — but this early Prime Day deal won’t last long.

Prime Day is still a few sunsets away, but Amazon isn’t waiting around. The retailer has already started slashing prices across tons of popular products, including the much-loved AirPods 4 (ANC).
For a limited time, you can snag these for 2025’s best price of just $149. This saves you $30, and you don’t even have to enter any codes or clip any coupons. The catch? We can’t promise that this deal will stick around for much longer.
The AirPods 4 are the latest generation of Apple's earbuds and have the same H2 chip found in the AirPods Pro 2, so you can expect great sound quality. They also rock a more compact design and offer excellent voice-calling performance. Plus, Spatial Audio support adds a touch of personalization.
Note that this deal is for the ANC (active noise cancellation) model of these earbuds. It comes with a wireless charging case that has a built-in speaker to help you find your earbuds using Apple's Find My tracking service.
"With a slightly smaller design, improved sound and Apple's powerful H2 chip features, the Apple AirPods 4 are a worthy upgrade," said CNET audio expert David Carnoy in his AirPods 4 review. "But what really makes them special is the noise canceling in the step-up ANC model."
What’s the competition like? You can find out by reading through the best early Prime Day deals on headphones and speakers, where we’ve rounded up all the latest and greatest prices from the likes of Apple, Sony, Beats and many more big names. Just be sure to get your orders in before the deals expire.
Why this deal matters
Apple products rarely go on sale, and when they do, stock tends to run out before the deal window closes. Combine that with the AirPods 4 being Apple's latest AirPods and this discount being the lowest price of the year, and you have a nice offer on your hands.