Technologies
Razer Kiyo Pro Ultra Review: It Comes So Close to Greatness
The 4K streaming-optimized webcam can deliver excellent quality compared to current competitors, but it can also be just a little too glitchy.
My initial reaction to the Razer Kiyo Pro Ultra’s video was “Wow! Finally, a webcam as good as a compact vlogging camera.” My reaction after trying to adjust the settings, especially when using it with a third-party application, was “I’m going to reach through my screen and punch you now.” Seriously: Razer’s Synapse software is the only thing preventing this $300 (£300, AU$500) 4K webcam for streamers and power videoconferencers from getting an Editors’ Choice award.
Synapse is the only way to control most of the settings, so it can make or break your experience. If you don’t need to change settings besides zoom, focus or white balance that often (they’re available via the Windows driver), then you’ll probably be OK. Synapse 3 doesn’t work on the Mac, either, so the webcam’s not well suited to that platform.
Like
- Excellent quality and performance
- Nice built-in lens cover design
- Large set of adjustable settings that compensates for issues other cameras have
Don’t Like
- Synapse control of camera is glitchy and the camera occasionally hangs when changing settings
- You can only change settings when Synapse has exclusive control of the camera
The Kiyo Pro Ultra’s closest competitor would have been the Elgato Facecam Pro, which no longer seems to be available anywhere despite shipping in November 2022. (It used a previous generation of the Sony Starvis sensor, and it’s always possible that it’s being reworked with the newer sensor.)
That camera supported 4K at 60fps, compared with the Razer’s 30fps (at 1080p and lower it can do 60fps). Otherwise the Razer shares many of the same strengths, including manual exposure controls, user presets and other settings that can help you tweak the quality of your output, such as MJPEG compression quality (for streaming at 1440p or 4K), the ability to meter off your face in autoexposure mode (important if you’re off center) and lens distortion compensation.
While it looks similar to the rest of its Kiyo siblings, the Pro and the X (on our list of the best webcams we’ve tested), it has something I’ve wanted for a while: a built-in lens cover. Razer cleverly incorporated it as an iris that closes when you rotate the outer ring.
Top marks for quality and performance
When it’s good, the Kiyo Pro Ultra is great. It incorporates a 1/1.2-inch Sony Starvis 2 sensor, which is just a bit smaller than the 1-inch sensor in compact vlogging cameras like the Sony ZV-1 but loads bigger than the sensors in other webcams, with a good-size f1.7 aperture.
The larger sensor and aperture mean it shows perceptible depth-of-field blur. It doesn’t have as wide a field of view as many webcams, only up to 82 degrees (72 degrees with distortion correction on) rather than 90 or more, which could affect its suitability for your needs.


The Ultra displays excellent tonal range for what it is, though it falls short in handling bright areas. It needs some software tweaking for that, I think. It has the typical HDR option, but in a backlit shot with a properly exposed foreground (as well as without), it didn’t rein in the blown-out highlights in the back. There are toggles for both dark and light rooms, but neither seemed to make a perceptible difference. I’ve had other cameras handle it better.
It meters properly, for the most part. Center metering works best if you’re in the center — face metering overexposes oddly without tweaking the exposure compensation, otherwise. But if you lean to the side, face metering keeps it from spiking when it sees your black chair instead of your face. White balance is very good as long as you’re not in too dark an environment. Even then it’s not bad. Nor does it lose a lot of color saturation.
You can toggle a couple of noise reduction settings and they do make a significant difference in low light. The distortion compensation makes a visible difference as well.
Standard autofocus is meh, just like all the other webcams. But there are several settings to mitigate the frequent hunting, which other webcams don’t have. Face autofocus does a good job of keeping it from hunting when you move your head, and there’s a “stylized lighting” setting which helps the AF system lock when the lighting might otherwise confuse it.
The camera handles some of the image processing that might otherwise be sent to the PC, notably the MJPEG compression of the stream you’re sending, and you can set how aggressively it compresses either automatically or on a performance-to-quality continuum.
Still needs some baking
Unfortunately, it’s still just a little too glitchy, and the software imposes unintended limits. You can’t access any of the settings in Synapse — most notably resolution/frame rate and manual exposure (ISO and shutter speed) — unless camera preview is enabled. And Windows only allows one application to access a camera at a time.
So, for example, if you’ve accidentally left the resolution at 4K but you need it to be 1080p in OBS, to change it you have to first deactivate the camera in OBS — thankfully, OBS has that option, but Nvidia Broadcast doesn’t. Then jump over to Synapse, turn on preview, change the resolution, turn off preview, jump back to OBS and reactivate the camera. And resolution, among other settings, doesn’t seem to be saved as part of the profiles you can create.
Doing it once isn’t that much of a problem. After the 10th time in an hour it gets old.
It’s also complicated by the occasional failure of settings to kick in, which sometimes forces you to loop back through that activate-deactivate cycle: Why does my adjusted exposure not look adjusted? Do I have to kick it to get autofocus to kick in? The preview in Synapse isn’t always accurate, though that’s not unique to Synapse, but it means you can’t assume your adjustments there will be correct. Synapse also froze several times while I was trying to swap between profiles.
Almost every other reasonable webcam utility allows you to change settings while viewing within the application you need them for. Yes, sometimes a few are disabled (because Windows), but at least they’re not all unavailable. It’s possible that all these issues can be ameliorated with firmware and software patches, but I have learned never to assume that just because they can be that they will be.
The Razer Kiyo Pro Ultra is a capable webcam that just needs some software and firmware polish before I’m comfortable considering it a reliable, consistent performer.
I’ve Seen It With My Own Eyes: The Robots Are Here and Walking Among Us
The «physical AI» boom has created a world of opportunity for robot makers, and they’re not holding back.
It’s been 24 years since CNET first published an article with the headline The robots are coming. It’s a phrase I’ve repeated in my own writing over the years — mostly in jest. But now in 2026, for the first time, I feel confident in declaring that the robots have finally arrived.
I kicked off this year, as I often do, wandering the halls of the Las Vegas Convention Center and its hotel-based outposts on the lookout for the technology set to define the next 12 months. CES has always been a hotbed of activity for robots, but more often than not, a robot that makes a flashy Vegas debut doesn’t go on to have a rich, meaningful career in the wider world.
In fact, as cute as they often are and as fun as they can be to interact with on the show floor, most robots I’ve seen at CES over the years amount to little more than gimmicks. They either come back year after year with no notable improvements or are never seen or heard from again.
In more than a decade of covering the show, I’ve been waiting for a shift to occur. In 2026, I finally witnessed it. From Hyundai unveiling the final product version of the Boston Dynamics Atlas humanoid robot in its press conference to Nvidia CEO Jensen Huang’s focus on “physical AI” during his keynote, a sea change was evident this year in how people were talking about robots.
“We’ve had this dream of having robots everywhere for decades and decades,” Rev Lebaredian, Nvidia’s vice president of Omniverse and simulation, told me on the sidelines of the chipmaker’s vast exhibition at the glamorous Fontainebleau Hotel. “It’s been in sci-fi as long as we can remember.”
Throughout the show, I felt like I was watching that sci-fi vision come to life. Everywhere I went, I was stumbling upon robot demos (some of which will be entering the market this year) drawing crowds, like the people lining up outside Hyundai’s booth to see the new Atlas in action.
So what’s changed? Until now, “we didn’t have the technology to create the brain of a robot,” Lebaredian said.
AI has unlocked our ability to apply algorithms to language, and it’s being applied to the physical world, changing everything for robots and those who make them.
The physical AI revolution
What truly makes a robot a robot? Rewind to CES 2017: I spent my time at the show asking every robotics expert that question, sparked by the proliferation of autonomous vehicles, drones and intelligent smart home devices.
This exercise predated the emergence of generative AI and models such as OpenAI’s ChatGPT, but already I could see that by integrating voice assistants into their products, companies were beginning to blur the boundaries of what could be considered robotics.
Not only has the tech evolved since that time, but so has the language we use to talk about it. At CES 2026, the main topic of conversation seemed to be “physical AI.” It’s an umbrella term that can encompass everything from self-driving cars to robots.
“If you have any physical embodiments, where AI is not only used to perceive the environment, but actually to take decisions and actions that interact with the environment around it … then it’s physical AI,” Ahmed Sadek, head of physical AI and vice president of engineering at chipmaker Qualcomm, told me.
Autonomous vehicles have been the easiest expression of physical AI to build so far, according to Lebaredian, simply because their main challenge is to dodge objects rather than interact with them. “Avoiding touching things is a lot easier than manipulating things,” he said.
Still, the development of self-driving vehicles has done much of the heavy lifting on the hardware, setting the stage for robot development to accelerate at a rapid pace now that the software required to build a brain is catching up.
For Nvidia, which worked on the new Atlas robot with Boston Dynamics, and Qualcomm, which announced its latest robotics platform at CES, these developments present a huge opportunity.
But that opportunity also extends to start-ups. Featured prominently at the CES 2026 booth of German automotive company Schaeffler was the year-and-a-half-old British company Humanoid, demonstrating the capabilities of its robot HMND 01.
The wheeled robot was built in just seven months, Artem Sokolov, Humanoid’s CEO, told me as we watched it sort car parts with its pincerlike hands. “We built our bipedal one for service and household much faster — in five months,” Sokolov added.
Humanoid’s speed can be accounted for by the AI boom plus an influx of talent recruited from top robotics companies, said Sokolov. The company has already signed around 25,000 preorders for HMND 01 and completed pilots with six Fortune 500 companies, he said.
This momentum extends to the next generation of Humanoid’s robots, where Sokolov doesn’t foresee any real bottlenecks. The main factors dictating the pace will be improvements in AI models and making the hardware more reliable and cost effective.
Humanoid hype hits its peak
Humanoid the company might have the rights to the name, but the concept of humanoids is a wider domain.
By the end of last year, the commercialization of humanoid robots had entered an «explosive phase of growth,» with a 508% year-on-year increase in global market revenue to $440 million, according to a report released by IDC this month.
At CES, Qualcomm’s robot demonstration showed how its latest platform could be adapted across different forms, including a robotic arm that could assemble a sandwich. But it was the humanoids at its booth that caused everyone to pull out their phones and start filming.
“Our vision is that if you have any embodiment, any mechatronic system, our platform should be able to transform it to a continuously learning intelligent robot,” said Qualcomm’s Sadek. But, he added, the major benefit of the humanoid form is its “flexibility.”
Some in the robotics world have criticized the focus on humanoids, due to their replication of our own limitations. It’s a notion that Lebaredian disagrees with, pointing out that we’ve designed our world around us and that robots need to be able to operate within it.
“There are many tasks that are dull, dangerous and dirty — they call it the three Ds — that are being done by humans today, that we have labor shortages for and that this technology can potentially go help us with,” he said.
We already have many specialist robots working in factories around the world, Lebaredian added. With their combination of arms, legs and mobility, humanoids are «largely a superset of all of the other kinds of robots» and, as such, are perfect for the more general-purpose work we need help with.
The hype around robots — and humanoids in particular — at CES this year felt intense. Even Boston Dynamics CEO Robert Playter acknowledged this in a Q&A with reporters moments after he unveiled the new Atlas on stage.
But it’s not just hype, Playter insisted, because Boston Dynamics is already demonstrating that it can put thousands of robots in the market. “That is not an indication of a hype cycle, but actually an indication of an emerging industry,” he said.
A huge amount of money is being poured into a rapidly growing number of robotics start-ups. The rate of this investment is a signal that the tech is ready to go, according to Nvidia’s Lebaredian.
“It’s because, fundamentally, the experts, people who understand this stuff, now believe, technically, it’s all possible,” he said. “We’ve switched from a scientific problem of discovery to an engineering problem.”
Robot evolution: From industry to home
From what I observed at the show, this engineering “problem” is one that many companies have already solved. Robots such as Atlas and HMND 01 have crossed the threshold from prototype to factory ready. The question for many of us is when these robots will be ready for our homes.
Playter has openly talked about Boston Dynamics’ ambitions in this regard. He sees Atlas evolving into a home robot — but not yet. Some newer entrants to the robotics market — 1X, Sunday Robotics and Humanoid among them — are keen to get their robots into people’s homes in the next couple of years. Playter cautions against this approach.
“Companies are advertising that they want to go right to the home,” he said. “We think that’s the wrong strategy.”
The reasons he listed are twofold: pricing and safety. Playter echoed a sentiment I’ve heard elsewhere: that the first real use for home humanoid robots will be to carry out care duties for disabled and elderly populations. Perhaps in 20 years, you will have a robot carry you in and out of bed, but relying on one to do so when you’re in a vulnerable state poses a “critical safety issue,” he said.
Putting robots in factories first allows people to work closely with them while keeping a safe distance, allowing those safety kinks to be ironed out. The deployment of robots at scale in industrial settings will also lead to mass manufacturing of components that will, at some point, make robots affordable for the rest of us, said Playter (unlike 1X’s $20,000 Neo robot, for example).
Still, he imagines the business model will be “robots as a service,” even when they do first enter our homes. Elder care itself is a big industry with real money being spent, which could present Boston Dynamics with a market opportunity as Atlas takes its first steps beyond the factory floor.
“I spent a lot of money … with my mom in specialty care the last few years,” he said. “Having robots that can preserve autonomy and dignity at home, I think people will actually spend money — maybe $20K a year.”
The first “care” robots are more likely to be companion robots. This year at CES, Tombot announced that its robotic Labrador, Jennie, who first charmed me back at the show in 2020, is finally ready to go on sale. It served as yet another signal to me that the robots are ready to lead lives beyond the convention center walls.
Unlike in previous years, I left Vegas confident that I’ll be seeing more of this year’s cohort of CES robots in the future. Maybe not in my home just yet, but it’s time to prepare for a world in which robots will increasingly walk among us.
Today’s Wordle Hints, Answer and Help for Jan. 29, #1685
Here are hints and the answer for today’s Wordle for Jan. 29, No. 1,685.
Looking for the most recent Wordle answer? Click here for today’s Wordle hints, as well as our daily answers and hints for The New York Times Mini Crossword, Connections, Connections: Sports Edition and Strands puzzles.
Today’s Wordle puzzle was a tough one for me. I never seem to guess three of the letters in this word. If you need a new starter word, check out our list of which letters show up the most in English words. If you need hints and the answer, read on.
Read more: New Study Reveals Wordle’s Top 10 Toughest Words of 2025
Today’s Wordle hints
Before we show you today’s Wordle answer, we’ll give you some hints. If you don’t want a spoiler, look away now.
Wordle hint No. 1: Repeats
Today’s Wordle answer has no repeated letters.
Wordle hint No. 2: Vowels
Today’s Wordle answer has one vowel and one sometimes vowel.
Wordle hint No. 3: First letter
Today’s Wordle answer begins with F.
Wordle hint No. 4: Last letter
Today’s Wordle answer ends with Y.
Wordle hint No. 5: Meaning
Today’s Wordle answer can refer to a pastry that breaks apart easily.
TODAY’S WORDLE ANSWER
Today’s Wordle answer is FLAKY.
Yesterday’s Wordle answer
Yesterday’s Wordle answer, Jan. 28, No. 1,684, was CRUEL.
Recent Wordle answers
Jan. 24, No. 1680: CLIFF
Jan. 25, No. 1681: STRUT
Jan. 26, No. 1682: FREAK
Jan. 27, No. 1683: DUSKY
Don’t miss any of our unbiased tech content and lab-based reviews. Add CNET as a preferred Google source.
Today’s NYT Strands Hints, Answers and Help for Jan. 29 #697
Here are hints and answers for the NYT Strands puzzle for Jan. 29, No. 697.
Looking for the most recent Strands answer? Click here for our daily Strands hints, as well as our daily answers and hints for The New York Times Mini Crossword, Wordle, Connections and Connections: Sports Edition puzzles.
Today’s NYT Strands puzzle is a bit of a puzzler until you realize the theme. Some of the answers are difficult to unscramble, so if you need hints and answers, read on.
I go into depth about the rules for Strands in this story.
If you’re looking for today’s Wordle, Connections and Mini Crossword answers, you can visit CNET’s NYT puzzle hints page.
Read more: NYT Connections Turns 1: These Are the 5 Toughest Puzzles So Far
Hint for today’s Strands puzzle
Today’s Strands theme is: Talk of the town.
If that doesn’t help you, here’s a clue: What a legend.
Clue words to unlock in-game hints
Your goal is to find hidden words that fit the puzzle’s theme. If you’re stuck, find any words you can. Every time you find three words of four letters or more, Strands will reveal one of the theme words. These are the words I used to get those hints, but any words of four or more letters that you find will work:
- ROIL, CLAIM, RARE, HELP, PEAR, PEARS, MORE, COIN, SPEAR, SPEARS
Answers for today’s Strands puzzle
These are the answers that tie into the theme. The goal of the puzzle is to find them all, including the spangram, a theme word that reaches from one side of the puzzle to the other. When you have all of them (I originally thought there were always eight but learned that the number can vary), every letter on the board will be used. Here are the nonspangram answers:
- HERO, ICON, CELEBRITY, SUPERSTAR, PERSONALITY
Today’s Strands spangram
Today’s Strands spangram is CLAIMTOFAME. To find it, start with the C that’s four letters to the right on the very top row, and wind down.
Toughest Strands puzzles
Here are some of the Strands topics I’ve found to be the toughest.
#1: Dated slang. Maybe you didn’t even use this lingo when it was cool. Toughest word: PHAT.
#2: Thar she blows! I guess marine biologists might ace this one. Toughest word: BALEEN or RIGHT.
#3: Off the hook. Again, it helps to know a lot about sea creatures. Sorry, Charlie. Toughest word: BIGEYE or SKIPJACK.

