Best True Wireless Sports Earbuds With Ear Hooks

Need some more security with your earbuds? Here are our favorite wireless earbuds with ear hooks.

Although earbuds with ear hooks may not be for everyone, you can’t deny that they add an extra level of security. While your buds may fall out of your ears, the hooks keep them attached to your head, preventing you from losing them or having them drop to the pavement, where they could get damaged. That’s an important feature, particularly if you wear earbuds while running or biking.

Here’s a look at the best earbuds with ear hooks, all of which we’ve tested. Most are affordable, costing less than $100. We’ll update this list as new sports earbuds hit the market.

Read more: Best Workout Headphones for 2023

The Asteroid That Was Predicted to Hit Earth: See It in a New Image

A NASA telescope captured images of the asteroid that, three months ago, had a chance of hitting Earth.

A few months ago, an asteroid created quite a buzz. At one point, asteroid 2024 YR4 had a 3.1% chance of hitting Earth, creating plenty of headlines about its potential impact. The threat is all but gone, but we now have pictures of the once-worrisome asteroid.

NASA’s James Webb Space Telescope captured images of the asteroid with two instruments: the Near-Infrared Camera, which measures reflected light, and the Mid-Infrared Instrument, which shows thermal energy. The European Space Agency posted the images last week. The pictures demonstrate a couple of fun facts about the asteroid: It is the smallest object yet targeted by JWST’s instruments, and it’s one of the smallest objects ever directly measured.

Initial estimates put the asteroid at around 40 to 90 meters across. The actual size turned out to be about 60 meters, or roughly 200 feet. “These measurements indicate that this asteroid does not share properties observed in larger asteroids,” the European Space Agency said in its post. “This is likely a combination of its fast spin and lack of fine-grained sand on its surface. Further research is needed, however, this is considered consistent with a surface dominated by rocks that are roughly fist-sized or larger.”
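
That initial 40-to-90-meter spread mostly reflects uncertainty about the asteroid’s reflectivity: astronomers convert an object’s brightness (its absolute magnitude, H) into a diameter by assuming an albedo. Here’s a minimal sketch of that standard conversion; the H value below is an approximate figure for 2024 YR4, used purely for illustration.

```python
import math

def diameter_m(h_mag: float, albedo: float) -> float:
    # Standard asteroid size estimate: D (km) = 1329 / sqrt(albedo) * 10^(-H/5),
    # converted here to meters. A dimmer (higher H) or shinier (higher albedo)
    # object implies a smaller diameter.
    return 1329 / math.sqrt(albedo) * 10 ** (-h_mag / 5) * 1000

H = 23.9  # approximate absolute magnitude for 2024 YR4 (assumption, not official)
for albedo in (0.05, 0.15, 0.25):  # dark, moderate and bright surfaces
    print(f"albedo {albedo:.2f}: ~{diameter_m(H, albedo):.0f} m")
```

Under these assumptions, a dark surface implies a roughly 99-meter rock while a bright one implies about 44 meters, which is why a direct thermal measurement like MIRI’s is so useful for pinning down the true size.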

According to NASA, the asteroid will be visible from Earth for only a little longer. Its orbit is now taking it away from Earth, and the agency estimates that it will disappear from even the strongest telescope instruments by late April or early May. NASA said the asteroid will not be visible again until 2028, when its orbit brings it back toward Earth.

The asteroid may hit the moon

The 2024 YR4 asteroid caused quite a stir when astronomers first reported it via the Minor Planet Center in December 2024. Based on the data collected on its trajectory at that point, the asteroid had a 1.3% chance of hitting Earth.

The percentage fluctuated over the next few months, reaching as high as 3.1%. After further research, the odds dropped dramatically to 0.28%. According to NASA’s Sentry tool, which monitors asteroids that may impact Earth, the threat now sits at 0.00078%.
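
If you want to check those numbers yourself, NASA’s Jet Propulsion Laboratory publishes Sentry data through a public API. Below is a rough Python sketch; the response fields (data, date, ip) are assumptions based on the API’s documented JSON, and an object that has been removed from the risk list may return no impact table at all.

```python
import json
import urllib.request

# Query JPL's Sentry API for a single object; "des" is its designation.
URL = "https://ssd-api.jpl.nasa.gov/sentry.api?des=2024%20YR4"

with urllib.request.urlopen(URL) as resp:
    payload = json.load(resp)

# Each entry in "data" should be a potential impact event with its date and
# probability ("ip"). Field names are assumptions; guard against missing data.
for event in payload.get("data") or []:
    print(event.get("date"), "impact probability:", event.get("ip"))
```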

The moon may not be so lucky. Per NASA, the odds of the asteroid hitting the moon are around 3.8%, higher than the asteroid’s odds of hitting Earth ever were. Scientists are gathering data before the asteroid disappears from view, but it’s likely we won’t know more until it comes back into view in 2028.

Want New iPhone Controls? Here’s the Latest From iOS 18.4

One control in particular brings Apple’s Visual Intelligence to more iPhones.

Apple released iOS 18.4 on March 31, and the update brought bug fixes, new emoji and a new recipes section in Apple News to all iPhones. The update also brought a handful of new controls to your iPhone’s Control Center, including a control that brings Visual Intelligence to the iPhone 15 Pro and Pro Max.

When Apple released iOS 18 in September, the update remodeled the Control Center and gave you more, uh… control over how it functions. With iOS 18, you can resize controls, assign some controls to their own dedicated page and adjust their placement to your liking. Apple has also introduced more controls over time, making Control Center a central hub for your most-used iPhone features.

With iOS 18.4, Apple continues to expand the number of controls you can add to the Control Center. If you have the update on your iPhone, you can add ambient music controls, and Apple Intelligence-enabled iPhones get a few AI controls in the menu, too.

Read more: Everything You Need to Know About iOS 18

Here’s what to know about the new controls and how to add them to your Control Center.

Ambient Music controls

Apple gave everyone four new controls in the Control Center library under the Ambient Music category: Sleep, Chill, Productivity and Wellbeing. Each control activates a playlist of music that corresponds to its name — Sleep, for example, plays ambient music to help lull you to sleep.

Some studies suggest white noise could help adults learn words and improve learning in environments full of distractions. According to the mental health company Calm, certain kinds of music can help you fall asleep faster and improve the quality of your sleep. So these new controls can help you learn, fall asleep and more.

Here’s how to find these controls. 

1. Swipe down from the top-right corner of your Home Screen to open your Control Center. 
2. Tap the plus (+) sign in the top-left corner of your screen. 
3. Tap Add a Control. 

You should see a section of controls called Ambient Music. You can also search for “Ambient Music” in the search bar at the top of the control library.

Under Ambient Music, you’ll see all four controls. Tap one (or all) of them to add them to your Control Center. Then go back to your Control Center and tap a control to start playing music.

You can also change the playlist for each control. Here’s how.

1. Swipe down from the top-right corner of your Home Screen to open your Control Center. 
2. Tap the plus (+) sign in the top-left corner of your screen. 
3. Tap the Ambient Music control you want to edit.
4. Tap the playlist name to the right of Playlist.

A dropdown menu will appear with additional playlists for each control. So if you’re in the Sleep control, you’ll see other playlists like Restful Notes and Lo-Fi Snooze. If you have playlists in your Music app, you’ll also see the option From Library, which pulls music from your library. Tap whichever playlist you want and it will be assigned to that control. 

Apple already lets you transform your iPhone into a white noise machine with Background Sounds, like ocean and rain. But Ambient Music is actual music, as opposed to that feature’s more static soundscapes.

Both of these features feel like a way for Apple to position itself as the first option when you want background music to help you fall asleep or stay productive. Other services, like Spotify and YouTube, already offer ambient playlists like these, so this could be Apple’s way of winning over some of those services’ audiences.

Apple Intelligence controls

Only people with an iPhone 15 Pro, Pro Max or the iPhone 16 lineup can access Apple Intelligence features for now, and those people got three new dedicated Apple Intelligence controls with iOS 18.4. Those controls are Talk to Siri, Type to Siri and Visual Intelligence. 

Here’s how to find these controls. 

1. Swipe down from the top-right corner of your Home Screen to open your Control Center. 
2. Tap the plus (+) sign in the top-left corner of your screen. 
3. Tap Add a Control. 

Then you can use the search bar near the top of the screen to search for “Apple Intelligence” or you can scroll through the menu to find the Apple Intelligence & Siri section. Tap any (or all) of these controls to add them to your Control Center.

While Talk to Siri and Type to Siri controls can be helpful if you have trouble accessing the digital assistant, the Visual Intelligence control is important because it brings the Apple Intelligence feature to the iPhone 15 Pro and Pro Max.

Originally, Visual Intelligence was accessible only on the iPhone 16 lineup, because it could be launched only through those devices’ Camera Control button.

With iOS 18.4, Visual Intelligence is now accessible to more people and devices thanks to its namesake control in Control Center. But remember, Visual Intelligence is like any other AI tool, so it won’t always be accurate. You should double-check results and any important information it shows you.

For more on iOS 18, here are all the new emoji you can use now and what to know about the recipes section in Apple News. You can also check out everything included in iOS 18.4 and our iOS 18 cheat sheet.

Gemini Live Now Has Eyes. We Put the New Feature to the Test

The new feature gives Gemini Live eyes to “see.” I put it through a series of tests. Here are the results.

There I was, walking around my apartment, taking a video with my phone and talking to Google’s Gemini Live. I was giving the AI a tour – and a quiz, asking it to name specific objects it saw. After it identified the flowers in a vase in my living room (chamomile and dianthus, by the way), I tried a curveball: I asked it to tell me where I’d left a pair of scissors. “I just spotted your scissors on the table, right next to the green package of pistachios. Do you see them?”

It was right, and I was wowed. 

Gemini Live will recognize a whole lot more than household odds and ends. Google says it’ll help you navigate a crowded train station or figure out the filling of a pastry. It can give you deeper information about artwork, like where an object originated and whether it was a limited edition.

It’s more than just a souped-up Google Lens. You talk with it and it talks to you. I didn’t need to speak to Gemini in any particular way – it was as casual as any conversation. Way better than talking with the old Google Assistant that the company is quickly phasing out.

Google and Samsung are just now starting to formally roll out the feature to all Pixel 9 phones (including the new Pixel 9a) and Galaxy S25 phones. It’s available for free on those devices, and other Pixel phones can access it via a Google AI Premium subscription. Google also released a new YouTube video for the April 2025 Pixel Drop showcasing the feature, and there’s now a dedicated page on the Google Store for it.

All you have to do to get started is go live with Gemini, enable the camera and start talking.
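
Gemini Live’s two-way video mode doesn’t have a one-click public equivalent, but developers can approximate a single frame of this camera Q&A with the Gemini API. Here’s a hedged sketch using the google-genai Python SDK; the model name, file name and API key are placeholders, not an exact recipe for the Live feature.

```python
from google import genai
from PIL import Image

# Assumes the google-genai SDK and an API key from Google AI Studio.
client = genai.Client(api_key="YOUR_API_KEY")

# A single camera frame standing in for live video.
frame = Image.open("living_room.jpg")

response = client.models.generate_content(
    model="gemini-2.0-flash",  # placeholder; any vision-capable Gemini model
    contents=[frame, "What flowers are in the vase, and do you see any scissors?"],
)
print(response.text)
```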

Gemini Live follows on from Google’s Project Astra, first revealed last year as possibly the company’s biggest “we’re in the future” feature: an experimental next step for generative AI, beyond simply typing or even speaking prompts into a chatbot like ChatGPT, Claude or Gemini. It comes as AI companies continue to dramatically expand what their tools can do, from video generation to raw processing power. Somewhat similar to Gemini Live is Apple’s Visual Intelligence, which the iPhone maker released in beta form late last year.

My big takeaway is that a feature like Gemini Live has the potential to change how we interact with the world around us, melding the digital and physical worlds just by pointing your camera at almost anything.

I put Gemini Live to a real test

Somehow Gemini Live showed up on my Pixel 9 Pro XL a few days early, so I’ve already had a chance to play around with it. 

The first time I tried it, Gemini was shockingly accurate when I placed a very specific gaming collectible of a stuffed rabbit in my camera’s view. The second time, I showed it to a friend when we were in an art gallery. It not only identified the tortoise on a cross (don’t ask me), but it also immediately identified and translated the kanji right next to the tortoise, giving both of us chills and leaving us more than a little creeped out. In a good way, I think.

In the tour of my apartment, I was following the lead of the demo that Google did last summer when it first showed off these Live video AI capabilities. I tried random objects in my apartment (fruit, books, Chapstick), many of which it easily identified. 

Then I got to thinking about how I could stress-test the feature. I tried to screen-record it in action, but it consistently fell apart at that task. And what if I went off the beaten path with it? I’m a huge fan of the horror genre — movies, TV shows, video games — and have countless collectibles, trinkets and what have you. How well would it do with more obscure stuff — like my horror-themed collectibles?

First, let me say that Gemini can be both absolutely incredible and ridiculously frustrating in the same round of questions. I had roughly 11 objects that I was asking Gemini to identify, and it would sometimes get worse the longer the live session ran, so I had to limit sessions to only one or two objects. My guess is that Gemini attempted to use contextual information from previously identified objects to guess new objects put in front of it, which sort of makes sense, but ultimately neither I nor it benefited from this.

Sometimes, Gemini was just on point, easily landing the correct answers with no fuss or confusion, but this tended to happen with more recent or popular objects. For example, I was pretty surprised when it immediately guessed one of my test objects was not only from Destiny 2, but was a limited edition from a seasonal event from last year. 

At other times, Gemini would be way off the mark, and I would need to give it more hints to get into the ballpark of the right answer. And sometimes, it seemed as though Gemini was taking context from my previous live sessions to come up with answers, identifying multiple objects as coming from Silent Hill when they were not. I have a display case dedicated to the game series, so I could see why it would want to dip into that territory quickly.

Gemini can get full-on bugged out at times. On more than one occasion, Gemini misidentified one of the items as a made-up character from the unreleased Silent Hill: f game, clearly merging pieces of different titles into something that never was. The other consistent bug I experienced was when Gemini would produce an incorrect answer, and I would correct it and hint closer at the answer — or straight up give it the answer, only to have it repeat the incorrect answer as if it was a new guess. When that happened, I would close the session and start a new one, which wasn’t always helpful.

One trick I found was that some conversations did better than others. If I scrolled through my Gemini conversation list, tapped an old chat that had gotten a specific item correct, and then went live again from that chat, it would identify the items without issue. That’s not necessarily surprising, but it was interesting that certain conversations kept performing better, even when I used the same wording.

Google didn’t respond to my requests for more information on how Gemini Live works.

I wanted Gemini to successfully answer my sometimes highly specific questions, so I provided plenty of hints to get there. The nudges were often helpful, but not always. Below are a series of objects I tried to get Gemini to identify and provide information about. 
