
Technologies

How Razer Is Bringing Vibration ‘Soundtracks’ to Tomorrow’s Games and Movies

New software lets developers automatically add vibrations synced to the action on screen.

At GDC 2023, I sat down in gaming accessory company Razer’s office and felt something I’d never experienced before: playing a video game and having my controller and headphones vibrate at different intensities that I could adjust to my liking. Then I watched a blockbuster superhero film with headphone vibration tuned to the action — all powered by the same software.

The software development kit, or SDK, created by tech studio Interhaptics, which was acquired by Razer last year, lets companies easily add vibration to their games, films and other media. Interhaptics founder Eric Vezzoli, now Razer’s general manager of Interhaptics, walked me through a demonstration of what the software can do. 

He noted that the software takes just a day to implement in a game, after which vibration is automatically added for any feedback device, whether a controller, smartphone, headphones or haptic vest. Even if a developer supports peripherals with different vibration frequency ranges, the software tailors haptic feedback to each device. That simplifies the process of, say, setting similar vibration levels on iPhones and Android phones, which have very different vibration ranges.

“We take the designer’s intention and we translate it to machine capability,” Vezzoli said.
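That translation step can be pictured as rescaling one designed vibration envelope to each device's capabilities. The following is a hypothetical Python sketch of the idea, not the actual Interhaptics SDK; every name in it is invented for illustration.

```python
# Hypothetical sketch of "designer intention to machine capability":
# one designed haptic envelope is clamped to each device's frequency
# band and scaled by its intensity ceiling. All names are invented;
# this is not the Interhaptics API.
from dataclasses import dataclass

@dataclass
class Device:
    name: str
    min_freq_hz: float    # lowest frequency the motor can render
    max_freq_hz: float    # highest frequency the motor can render
    max_intensity: float  # normalized output ceiling, 0.0 to 1.0

def render_for_device(envelope, device):
    """Map (frequency_hz, intensity) keyframes onto one device.

    Frequencies outside the device's band are clamped; intensities are
    scaled by the device's ceiling so the relative design is preserved.
    """
    rendered = []
    for freq, intensity in envelope:
        clamped = min(max(freq, device.min_freq_hz), device.max_freq_hz)
        rendered.append((clamped, intensity * device.max_intensity))
    return rendered

# One designed effect, rendered for two very different motors.
design = [(60.0, 0.2), (170.0, 0.9), (230.0, 0.5)]
phone = Device("phone LRA", 150.0, 230.0, 1.0)
gamepad = Device("gamepad rumble", 40.0, 180.0, 0.8)

print(render_for_device(design, phone))    # low notes clamped up to 150 Hz
print(render_for_device(design, gamepad))  # high notes clamped down to 180 Hz
```

The design is authored once; only the last step knows about the motor, which is what lets one effect drive an iPhone, a gamepad and a haptic vest without per-device rework.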

The software, properly called a haptic composer, also puts vibration control in gamers’ hands. In the game demo I played, I could toggle whether vibrations were triggered by my character, enemies or the environment, as well as tone them down if they were too intense.
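Those player-facing toggles amount to a thin settings layer between the designed effect and the motors. A minimal sketch, using invented names rather than anything from Razer's actual software:

```python
# Hypothetical per-player haptics settings, mirroring the demo's toggles:
# vibration can be enabled or disabled by trigger source and scaled
# globally. Names are illustrative, not from Razer's software.
SOURCES = ("player", "enemy", "environment")

class HapticSettings:
    def __init__(self):
        self.enabled = {source: True for source in SOURCES}
        self.strength = 1.0  # global multiplier, 0.0 to 1.0

    def effective_intensity(self, source, designed):
        """Return the intensity to send to the motors for one event."""
        if not self.enabled.get(source, False):
            return 0.0
        return max(0.0, min(1.0, designed * self.strength))

settings = HapticSettings()
settings.enabled["environment"] = False  # mute ambient rumble
settings.strength = 0.5                  # tone everything down

print(settings.effective_intensity("enemy", 0.8))        # 0.4
print(settings.effective_intensity("environment", 0.8))  # 0.0
```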

The SDK launched with support for PS4, PS5, Meta Quest 2 and XInput controllers, as well as iOS and Android phones. Developers can set up custom vibrations for any number of haptic-capable peripherals, letting them pulse or vibrate at different intensities to convey whatever emotion or action fits the game or movie scene.

That list of peripherals includes the Razer Kraken V3 HyperSense headphones, which have haptic motors spread around both earcups and are the headphones I wore for the demo. While I was playing the simple dungeon-crawling game that Vezzoli and his team built to show off the SDK, every sword swing by my character pulsed vibration around my ears, while enemies hitting my character buzzed my ears in a noticeably different way. 

Then I watched scenes from films with headphone vibration coinciding with exciting moments — buzzing along while a superhero used their powers, or, during a suspenseful silence, pulsing at a low frequency that subtly alternated between ears, like a heartbeat. 

If I’m being honest, it felt weird to have headphones buzzing around my ears with dynamic patterns — the pitter-patter of heartbeats or triumphant vibrating bursts of superheroes clashing, which I’m used to hearing via sound effects, not feeling on my skin. 

But I could see how, if I were to get used to dynamic vibrations around my ears — or, with future devices, elsewhere on my body — they could make entertainment more immersive. I remember discovering how much listening for footsteps made me better at finding enemies in first-person shooters, and dynamic vibrations tied to explosions or nearby activity could similarly point me in the right direction. Movies and shows, which rely on visuals and soundscapes to convey tone and mood, could add a new layer with haptics — and the technology seems ideally suited for VR developers looking to add texture to their immersive worlds.

Razer and Interhaptics’ software is admittedly a bit future-facing, since controllers and smartphones are far more common than vibrating headphones or other haptic peripherals. But the company is sending developers kits that include the Razer Kraken V3 HyperSense headphones so they can try adding the SDK to their games.

“It’s a different type of experience, and we believe we can generate enormous value from a user experience playing these games,” said Vezzoli.



Apple’s New Smart Home Display Delayed Until Fall Over Siri Issues

It has been nearly a year and a half since the company announced the AI-powered product.

Your home could get smarter with Apple’s Siri, but it will have to wait a few more months. Bloomberg reported the iPad-shaped AI home hub won’t be ready until September, several months after the company was hoping to launch it this spring. Apple engineers first need to complete work on a new and improved Siri assistant for the home device, code-named J490, according to Bloomberg.

Apple was hoping to release J490 this month, along with a slew of other new devices, including the iPhone 17e, MacBook Neo, MacBook Air M5, new Pro models and the iPad Air M4. Apple first teased the smart home display in November 2024.

A representative for Apple did not immediately respond to a request for comment.

Siri is Apple’s virtual assistant, which uses voice recognition and AI to handle a variety of tasks and commands. You might use Siri to find your iPhone (“Hey Siri, where are you?”) or to hear the weather forecast (“Siri, what will the weather be today?”). Siri is available on iPhones, MacBooks and iPads, and launched in 2011 as a feature of the iPhone 4S.

As CNET reported last month, Apple engineers have struggled to push the upgraded Siri assistant out the door. It isn’t fast enough, gets confused by complex commands and doesn’t interact well with other Apple AI models. The company is also wrestling with how much personal data to access to inform the AI, and the new Siri is not yet able to complete in-app tasks, such as finding a photo and posting it to socials, all with one command.

It has been nearly two years since Apple announced that it would give Siri a major upgrade. In the meantime, competitors like Alexa Plus and Gemini for Home have entered the marketplace.

Tech tester Jon Rettinger, whose YouTube channel has 1.66 million subscribers, says the repeated delays in upgrading Siri can “erode” confidence in Apple’s ability to keep up in the AI race.

“Apple as a whole is still one of the strongest companies on the planet. But their AI play is clearly the weakest link in an otherwise very strong chain,” Rettinger told CNET.

Rettinger said he has had issues getting Siri to complete basic commands, such as setting two alarms at the same time, and that it’s a bit of “a mess” right now.

“Having said that, the iPhone has such massive market penetration that I’m not sure it will actually matter in the end. Which is kind of wild when you think about it,” Rettinger said.

Facial recognition for residents

The hardware for the forthcoming smart home display is already finished. It resembles an iPad and can either be attached to a wall or rest on a half-dome-shaped base, the Bloomberg report said.

The device will be equipped with facial recognition, so when residents walk up to it, they will be shown personalized data such as music preferences, news headlines, appointments, reminders, tasks and so on.

The screen interface will include a bunch of circular app icons, similar to the display on an Apple Watch. The Bloomberg report said the smart home display will be the first of several home devices by Apple. Future products include a tabletop robotic limb with a 9-inch screen, a smart security camera and a Face ID-enabled smart doorbell.


Copyright © Verum World Media