
Technologies

Gemini Live Now Has Eyes. We Put the New Feature to the Test

The new feature gives Gemini Live eyes to “see.” I put it through a series of tests. Here are the results.

There I was, walking around my apartment, taking a video with my phone and talking to Google’s Gemini Live. I was giving the AI a tour – and a quiz, asking it to name specific objects it saw. After it identified the flowers in a vase in my living room (chamomile and dianthus, by the way), I tried a curveball: I asked it to tell me where I’d left a pair of scissors. “I just spotted your scissors on the table, right next to the green package of pistachios. Do you see them?”

It was right, and I was wowed. 

Gemini Live will recognize a whole lot more than household odds and ends. Google says it’ll help you navigate a crowded train station or figure out the filling of a pastry. It can give you deeper information about artwork, like where an object originated and whether it was a limited edition.

It’s more than just a souped-up Google Lens. You talk with it and it talks to you. I didn’t need to speak to Gemini in any particular way – it was as casual as any conversation. Way better than talking with the old Google Assistant that the company is quickly phasing out.

Google and Samsung are just now starting to formally roll out the feature to all Pixel 9 phones (including the new Pixel 9a) and Galaxy S25 phones. It’s available for free on those devices, and other Pixel phones can access it via a Google AI Premium subscription. Google also released a new YouTube video for the April 2025 Pixel Drop showcasing the feature, and there’s now a dedicated page on the Google Store for it.

All you have to do to get started is go live with Gemini, enable the camera and start talking.

Gemini Live follows on from Google’s Project Astra, first revealed last year as possibly the company’s biggest “we’re in the future” feature, an experimental next step for generative AI capabilities, beyond your simply typing or even speaking prompts into a chatbot like ChatGPT, Claude or Gemini. It comes as AI companies continue to dramatically increase the skills of AI tools, from video generation to raw processing power. Somewhat similar to Gemini Live, there’s Apple’s Visual Intelligence, which the iPhone maker released in a beta form late last year. 

My big takeaway is that a feature like Gemini Live has the potential to change how we interact with the world around us, melding our digital and physical worlds together simply by pointing a camera at almost anything.

I put Gemini Live to a real test

Somehow Gemini Live showed up on my Pixel 9 Pro XL a few days early, so I’ve already had a chance to play around with it. 

The first time I tried it, Gemini was shockingly accurate when I placed a very specific gaming collectible of a stuffed rabbit in my camera’s view. The second time, I showed it to a friend when we were in an art gallery. It not only identified the tortoise on a cross (don’t ask me), but it also immediately identified and translated the kanji right next to the tortoise, giving both of us chills and leaving us more than a little creeped out. In a good way, I think.

In the tour of my apartment, I was following the lead of the demo that Google did last summer when it first showed off these Live video AI capabilities. I tried random objects in my apartment (fruit, books, Chapstick), many of which it easily identified. 

Then I got thinking about how I could stress-test the feature. I tried to screen-record it in action, but it consistently fell apart at that task. And what if I went off the beaten path with it? I’m a huge fan of the horror genre — movies, TV shows, video games — and have countless collectibles, trinkets and what have you. How well would it do with more obscure stuff — like my horror-themed collectibles?

First, let me say that Gemini can be both absolutely incredible and ridiculously frustrating in the same round of questions. I had roughly 11 objects that I was asking Gemini to identify, and it would sometimes get worse the longer the live session ran, so I had to limit sessions to only one or two objects. My guess is that Gemini attempted to use contextual information from previously identified objects to guess new objects put in front of it, which sort of makes sense, but ultimately neither I nor it benefited from this.

Sometimes, Gemini was just on point, easily landing the correct answers with no fuss or confusion, but this tended to happen with more recent or popular objects. For example, I was pretty surprised when it immediately guessed one of my test objects was not only from Destiny 2, but was a limited edition from a seasonal event from last year. 

At other times, Gemini would be way off the mark, and I would need to give it more hints to get into the ballpark of the right answer. And sometimes, it seemed as though Gemini was taking context from my previous live sessions to come up with answers, identifying multiple objects as coming from Silent Hill when they were not. I have a display case dedicated to the game series, so I could see why it would want to dip into that territory quickly.

Gemini can get full-on bugged out at times. On more than one occasion, Gemini misidentified one of the items as a made-up character from the unreleased Silent Hill: f game, clearly merging pieces of different titles into something that never was. The other consistent bug I experienced was when Gemini would produce an incorrect answer, and I would correct it and hint closer at the answer — or straight up give it the answer, only to have it repeat the incorrect answer as if it was a new guess. When that happened, I would close the session and start a new one, which wasn’t always helpful.

One trick I found: some conversations did better than others. If I scrolled through my Gemini conversation list, tapped an old chat that had gotten a specific item correct, and then went live again from that chat, it would identify the items without issue. That’s not necessarily surprising, but it was interesting that results varied between conversations even when I used the same language. 

Google didn’t respond to my requests for more information on how Gemini Live works.

I wanted Gemini to successfully answer my sometimes highly specific questions, so I provided plenty of hints to get there. The nudges were often helpful, but not always. Below are a series of objects I tried to get Gemini to identify and provide information about. 


Blue Origin Rocket Grounded After ‘Mishap’ Destroys Customer Satellite

After failing to deliver its first customer satellite into the correct orbit, the FAA grounds Blue Origin’s New Glenn rocket pending an investigation.

Blue Origin’s New Glenn Mission 3 (NG-3) was supposed to mark another step forward for the company’s long-awaited entry into the commercial space launch market. Instead, the heavy-lift rocket’s third flight ended in a partial failure and, for now, a full stop. Following a “mishap” during Sunday’s launch from Cape Canaveral Space Force Station in Florida, the Federal Aviation Administration has grounded the New Glenn vehicle from future missions until an investigation into the incident can be completed. 

The mission wasn’t a total loss. New Glenn’s reusable first-stage booster performed as expected and landed successfully. However, the upper stage failed at the job that mattered most for the mission: delivering its payload into the correct orbit. 

That payload (the BlueBird 7 communications satellite for AST SpaceMobile, Blue Origin’s first commercial launch payload for a customer) was supposed to be deployed into a roughly 285-mile orbit. Instead, it reached only about 95 miles — far too low for the satellite’s boosters to keep it in orbit. BlueBird 7 will now be deorbited and destroyed during reentry.

The issue appears to trace back to the rocket’s upper stage. In a statement Monday, Blue Origin CEO Dave Limp said “one of the BE-3U engines didn’t produce sufficient thrust” during its second burn, a critical phase that’s needed to raise and circularize the orbit. Without it, the rocket didn’t have the energy to get the satellite where it needed to go. 

The consequences of that shortfall begin with the FAA classifying the event as a “mishap,” which sounds innocuous, but automatically triggers a mandatory grounding of the New Glenn vehicle while a full safety review is conducted. Blue Origin will lead the investigation under FAA oversight, working to pinpoint the root cause and outline corrective actions. 

Until the agency determines the issue poses no risk to public safety, New Glenn isn’t flying again. How long that process takes is uncertain and can vary wildly. The last time New Glenn was grounded, following a landing failure on its debut mission, it was unable to fly again for months. 

The longer the rocket is grounded, the more it will disrupt Blue Origin’s 2026 and 2027 plans. In the short term, the ripple effects may delay the deployment of Amazon’s already-delayed satellite broadband network, which relies in part on New Glenn. Further out, the timeline for the company’s Blue Moon MK1 lander mission may also depend on how long New Glenn remains sidelined. 

Then there’s the reputational hit. This was New Glenn’s first mission carrying a commercial customer payload, which would have been a key milestone for the heavy-lift rocket program. While AST SpaceMobile expects the cost of the satellite to be “recovered under the company’s insurance policy,” this is certainly egg on Blue Origin’s face and an opportunity for competitors like SpaceX to exploit. 

AST SpaceMobile said in a statement issued Sunday evening that it expects to continue its plans to expand its satellite network with “an orbital launch every one to two months on average during 2026,” supported by agreements with multiple launch providers.

Blue Origin didn’t immediately respond to a request for comment.



Today’s NYT Connections: Sports Edition Hints and Answers for April 21, #575

Here are hints and the answers for the NYT Connections: Sports Edition puzzle for April 21, No. 575.

Looking for the most recent regular Connections answers? Click here for today’s Connections hints, as well as our daily answers and hints for The New York Times Mini Crossword, Wordle and Strands puzzles.


Today’s Connections: Sports Edition is a tough one. If you’re struggling with it but still want to solve it, read on for hints and the answers.

Connections: Sports Edition is published by The Athletic, the subscription-based sports journalism site owned by The Times. It doesn’t appear in the NYT Games app, but it does in The Athletic’s own app. Or you can play it for free online.

Read more: NYT Connections: Sports Edition Puzzle Comes Out of Beta

Hints for today’s Connections: Sports Edition groups

Here are four hints for the groupings in today’s Connections: Sports Edition puzzle, ranked from the easiest yellow group to the tough (and sometimes bizarre) purple group.

Yellow group hint: Choosing your team’s future.

Green group hint: Olympic sport.

Blue group hint: Play ball!

Purple group hint: Initials.

Answers for today’s Connections: Sports Edition groups

Yellow group: People involved in making a draft pick.

Green group: Pole vault equipment.

Blue group: First words of baseball positions.

Purple group: T.J. ____

Read more: Wordle Cheat Sheet: Here Are the Most Popular Letters Used in English Words

What are today’s Connections: Sports Edition answers?

The yellow words in today’s Connections

The theme is people involved in making a draft pick. The four answers are coach, GM, owner and scout.

The green words in today’s Connections

The theme is pole vault equipment. The four answers are crossbar, mat, pole and spikes.

The blue words in today’s Connections

The theme is first words of baseball positions. The four answers are center, designated, first and third.

The purple words in today’s Connections

The theme is T.J. ____. The four answers are Ford, Hockenson, Houshmandzadeh and Watt.



Pixel 11 May Revive the Old-School Notification LED With ‘Pixel Glow’

What’s old is new again.

The next Pixel phone may get a feature reminiscent of Nothing’s LED glyphs and old-school Android phones: a notification LED — only more interesting. 

What looks to be a new feature called Pixel Glow was reported earlier Monday by 9to5Google. The name was discovered in the latest Android 17 beta 4, which was released on April 16. Pixel Glow is described as using “subtle light and color on the back of your device to inform you of important activity when it’s face down.” In essence, it’s a fancy notification LED. 

Google didn’t immediately respond to a request for comment.

It appears that Pixel Glow will work in certain situations, like when a favorite contact calls. Unsurprisingly, it seems like the feature might also work when interacting with Gemini hands-free.

While 9to5Google says the feature was referenced in previous Android beta and Canary builds under code names, the latest Android 17 beta gave us an official name for the feature. The progression makes it seem that the feature will debut on the upcoming Pixel 11, which we expect to be announced later this year, a few months after Google I/O in May.

The exact location where the LED array might be placed is anyone’s guess at this point. The first CAD renders we saw for the standard Pixel 11 showed a design very similar to the current model, suggesting the LEDs could live in the now all-black camera bar or in the “G” logo on the back. Or perhaps the feature will be reserved for the Pixel 11 Pro models only. 

The Pixel Glow feature will apparently also work on laptops. That cohesion isn’t surprising either, as we already know that Android and ChromeOS will eventually merge into a single operating system, bringing a more robust desktop and laptop experience. 



Copyright © Verum World Media