Technologies

How Razer Is Bringing Vibration ‘Soundtracks’ to Tomorrow’s Games and Movies

New software lets developers automatically add vibrations synced to the action on screen.

At GDC 2023, I sat down in gaming accessory company Razer’s office and felt something I’d never experienced before: playing a video game and having my controller and headphones vibrate at different intensities that I could adjust to my liking. Then I watched a blockbuster superhero film with headphone vibration tuned to the action — all powered by the same software.

The software development kit, or SDK, created by tech studio Interhaptics, which was acquired by Razer last year, lets companies easily add vibration to their games, films and other media. Interhaptics founder Eric Vezzoli, now Razer’s general manager of Interhaptics, walked me through a demonstration of what the software can do. 

He noted that the software takes just a day to implement in a game, after which vibration is automatically added for any feedback device, be it a controller, smartphone, headphones, haptic vest or other peripheral. Even if a developer is adding peripherals with different vibration frequency ranges, the software can tailor haptic feedback to each device. That simplifies the process when, say, trying to set vibration levels to feel similar on iPhones and Android phones, which have very different vibration ranges.

“We take the designer’s intention and we translate it to machine capability,” Vezzoli said.
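The article doesn’t describe how the SDK does this internally, but the idea of translating one designed vibration clip to many devices’ capabilities can be sketched roughly like so. This is hypothetical illustration code, not the actual Interhaptics API; the device numbers and function names are invented for the example:

```python
# Hypothetical sketch (NOT the Interhaptics API): translating one designed
# vibration envelope into commands each device can actually play.

from dataclasses import dataclass

@dataclass
class Device:
    name: str
    min_hz: float   # lowest frequency the actuator renders well
    max_hz: float   # highest frequency the actuator renders well
    levels: int     # discrete intensity steps the device supports

def render(keyframes, device):
    """Map designer keyframes [(seconds, intensity 0..1, freq_hz), ...]
    onto the device's supported frequency range and intensity steps."""
    out = []
    for t, intensity, freq in keyframes:
        # Clamp the designed frequency into the device's usable range.
        freq = min(max(freq, device.min_hz), device.max_hz)
        # Quantize intensity to the device's supported steps.
        level = round(intensity * (device.levels - 1))
        out.append((t, level, freq))
    return out

clip = [(0.0, 0.9, 60.0), (0.2, 0.4, 250.0)]   # one design, many devices
phone = Device("phone", 150.0, 230.0, 4)
controller = Device("controller", 40.0, 400.0, 256)

print(render(clip, phone))       # frequencies clamped to 150-230 Hz
print(render(clip, controller))  # wider range kept, finer intensity steps
```

The same design data gets clamped and quantized differently per device, which is the kind of “designer intention to machine capability” translation Vezzoli describes.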

The haptic composer software, as it’s formally called, also puts vibration control in gamers’ hands. In the game demo I played, I could toggle whether vibrations were triggered by my character, enemies or the environment, as well as tone them down if they were too intense.

The software SDK launched with support for PS4, PS5, Meta Quest 2 and X-input controllers, as well as iOS and Android phones. Developers can set up custom vibrations for potentially any number of different peripherals with haptics, allowing them to pulse or vibrate at different intensities to convey whatever emotion or action fits the game or movie scene.

That list of peripherals includes the Razer Kraken V3 HyperSense headphones, which have haptic motors spread around both earcups and are the headphones I wore for the demo. While I was playing the simple dungeon-crawling game that Vezzoli and his team built to show off the SDK, every sword swing by my character pulsed vibration around my ears, while enemies hitting my character buzzed my ears in a noticeably different way. 

Then I watched scenes from films with headphone vibration coinciding with exciting moments — buzzing along while a superhero used their powers, or, during a suspenseful silence, pulsing at a low frequency that subtly alternated between ears, like a heartbeat. 

If I’m being honest, it felt weird to have headphones buzzing around my ears with dynamic patterns — the pitter-patter of heartbeats or triumphant vibrating bursts of superheroes clashing, which I’m used to hearing via sound effects, not feeling on my skin. 

But I could see how, if I were to get used to dynamic vibrations around my ears — or with future devices, elsewhere on my body — they could make entertainment more immersive. I remember discovering how much listening to footsteps made me better at finding enemies in first-person shooters, and dynamic vibrations tied to explosions or nearby activity could similarly point me in the right direction. Movies and shows, which rely on visuals and soundscapes to convey tone and mood, could add a new layer with haptics — and the technology seems ideally suited for VR developers looking to add texture to their immersive worlds.

Razer and Interhaptics’ software is admittedly a bit future-facing, since controllers and smartphones are far more common than vibrating headphones or other peripherals. But the company is sending out developer kits with the Razer Kraken V3 HyperSense headphones for developers to try adding the SDK to their games.

“It’s a different type of experience, and we believe we can generate enormous value from a user experience playing these games,” said Vezzoli.

Google’s New AI Features Are Trying to Make Data Entry a Thing of the Past

More Gemini AI features will come to Google Docs, Sheets and Slides.

The latest batch of Google updates to its workspace tools highlights AI’s promise to automate mundanity in the workplace. Google Docs, Slides, Sheets and Drive all have new AI-powered features, the company announced Tuesday. The one thing all these updates have in common? Gemini is using your files, emails and chats to give you relevant information, not random answers gleaned from the web.

These updates come as AI is playing a bigger role in our work lives, for better or worse. Agentic tools like Claude Cowork and coding assistants like Anthropic’s Claude Code and OpenAI’s Codex are more capable than chatbots and able to handle tasks independently. AI tools are also becoming more customized, with Google’s personalized intelligence rolling out across its platforms to help refine AI outputs to things that are relevant and useful for you. Google continues that trend with this new batch of Workspace updates.

New Gemini AI features in Google Workspace apps will cite their sources after each query. For example, if you ask Gemini in Google Docs to fill out an itinerary template, it will pull the information from your email, chats and files. The “sources” tab in the Gemini side panel will show you where it found the information it used, like your flight confirmation email and chats discussing dinner plans. Seeing where Gemini pulled its answers from is also how you’ll double-check its work.

The most impressive new features are in Sheets, where AI can fill in the holes in your spreadsheets. You can describe what you want the AI to do with a simple prompt and avoid writing an exact formula. You can click on an empty cell, select the pop-up that says “Drag to fill with Gemini,” then highlight the cells you want Gemini to fill in. That deploys an AI agent to search the web to fill each cell with the necessary information.

For example, if you have a spreadsheet of contact info for local companies, you can have Gemini search the web to fill in the location, CEO and other publicly available information for each company. The tool aims to dramatically reduce the time needed for manual data entry. Gemini can also summarize, categorize and create charts with prompts alone.

You can also chat with Gemini in Sheets and have it scour your raw data to make custom reports and charts. No need for pivot tables if they confound you as much as they baffle me. One of the biggest uses of AI at work is helping create presentations.

In Google Slides, you can now tell Gemini in natural language what you want to appear on a slide, and it will create it, matching the style of your existing slides. You can also ask Gemini to edit your slides if you don’t want to waste time painstakingly moving design elements around the slide. The AI should fill the slides with relevant information based on your instructions and the work files it has access to, so you shouldn’t need to replace a bunch of filler text.

If you use Docs, Sheets and Slides through your company’s Workspace account, you won’t be able to turn off AI features individually; the managing organization controls AI access for its users. Personal users can tweak their settings to limit Gemini. The new features are rolling out in beta now, in English only, to Google AI Ultra and Pro subscribers in the US, as well as some Google Workspace customers who are part of the Gemini Alpha testing program.

For more, check out the new cowork feature in Copilot and how to use Perplexity AI for deep research.

Nintendo Switches Lanes, Sues US Over Tariffs

Mario wants his money back.

Tariffs implemented by President Donald Trump were struck down by the Supreme Court last month. Companies that were subjected to those fees, such as FedEx and Dollar General, have since sued the federal government, and Nintendo wants a piece of the action. 

Nintendo filed a lawsuit against the federal government in the US Court of International Trade on Friday, as first spotted by Aftermath. The complaint seeks refunds of tariffs Nintendo paid, plus interest, and asks the court to declare the tariffs unlawful and stop the government from collecting them going forward. 

“Since February 1, 2025, President Trump has executed the unlawful Executive Orders, imposing tariffs on imports from a vast swath of countries,” Nintendo said in the complaint.

When reached for comment, Nintendo of America confirmed the lawsuit. 

“We can confirm that we filed a request. We have nothing else to share on this topic,” Nintendo of America said in an emailed statement on Friday, March 6.

It’s unclear how much Nintendo paid in tariffs, and it did not state an amount in the lawsuit. While the Switch 2 was priced at $450 when it launched last year, and has stayed at that amount, Nintendo did increase the price of the original Switch and accessories for both consoles. Microsoft and Sony also increased the prices of their hardware and accessories last year due to tariffs. 

The White House didn’t immediately respond to a request for comment. 

On Feb. 20, the Supreme Court ruled by a vote of 6 to 3 that the sweeping tariffs Trump instituted last year exceeded his executive powers. Following the ruling, on the same day, Trump announced a new set of tariffs of 10% on imported goods that would last for 150 days, starting Feb. 24. 

The decision on what to do with the collected tariffs, a reported $166 billion, has been left to the US Court of International Trade. Judge Richard Eaton on Wednesday, March 4, ordered US Customs and Border Protection to refund the more than 330,000 importers that were forced to pay tariffs. On Friday, the CBP said it couldn’t easily issue tariff refunds because its system requires duties to be recalculated and refunds processed entry by entry, a process that would involve tens of millions of transactions. The agency said it’s updating its systems and could start providing refunds by late April.

Sony WF-1000XM6 vs. Samsung Galaxy Buds 4 Pro Earbuds: A Photo Finish

Copyright © Verum World Media