
Technologies

Apple’s AR/VR Headset: What Could Be Coming in 2023

The company’s next big product should arrive next year. Here’s what we expect.


Apple has been integrating augmented reality into its devices for years, but the company looks like it will leap right into the territory of Meta, Microsoft and Magic Leap with a long-expected mixed-reality headset in 2023.

The target date of this AR/VR headset keeps sliding, with the latest report in early December from noted analyst Ming-Chi Kuo suggesting an arrival in the second half of 2023. With an announcement event that could happen as soon as January, we’re at the point where every Apple event seems to feel like the one where it could pull the covers off this device at last. Bloomberg’s Mark Gurman reported in early January that he’s heard the company is aiming to unveil the headset in the spring, ahead of the annual Worldwide Developers Conference in June.

2023 looks like a year full of virtual reality headsets that we originally expected in 2022, including the PlayStation VR 2 and Meta Quest 3. Apple has already laid down plenty of AR clues hinting at what its mixed-reality future could hold, and it has been active in AR on its own iPhones and iPads for years.

As far as what its device could be like, odds are strong that the headset could work from a playbook similar to Meta’s recent high-end headset, the Quest Pro, with a focus on work, mixed reality and eye tracking onboard.

Here’s what we’re expecting.

Is its name Reality Pro? Is the software called xrOS?

The latest report from noted Apple reporter Mark Gurman at Bloomberg suggests the operating system for this headset could be called “xrOS,” but that may not indicate the name of the headset itself. Recent trademark filings reported by Bloomberg showed the name “Reality” showing up a lot: Reality One, Reality Pro and Reality Processor. Apple’s existing AR software framework for iOS is named RealityKit, and previous reports suggested that “Reality OS” could be the name for the new headset’s ecosystem.

No one really expected the Apple Watch’s name (remember iWatch?), so to some degree, names don’t matter at this point. But it does indicate that Apple’s moving forward on a product and software, for sure.

One of several headsets?

The headset has been cooking for a long while. Reports have been going around for several years, including a story broken by former CNET Managing Editor Shara Tibken in 2018. Apple’s been building more advanced AR tools into its iPhones and iPads for years, setting the stage for something more.

Whatever the headset might become, it’s looking a lot more real lately. A detailed report from The Information earlier this year discussed likely specs, including what Bloomberg’s Mark Gurman says is Apple’s latest M2 chip. According to another report from Bloomberg earlier this year, Apple’s board of directors has already seen a demonstration of the mixed-reality headset.

The expected arrival of this headset has kept sliding for years. Kuo previously predicted that Apple’s VR-AR headset would arrive in the fourth quarter of 2022 with Wi-Fi 6 and 6E support. But this VR-type headset could be the start of several lines of products, similar again to how Meta has been targeting future AR glasses. Kuo has previously predicted that Apple smart glasses may arrive in 2025.

Apple could take a dual headset approach, leading the way with a high-end AR-VR headset that may be more like what Meta has done with the Quest Pro, according to Bloomberg’s Gurman. Gurman also suggests a focus on gaming, media and communication on this first-wave headset. In terms of communication, Gurman believes FaceTime using the rumored headset could rely on Memoji and SharePlay: Instead of seeing the person you’re talking to, you’d see a 3D version of their personalized Memoji avatar.

Eventually, Apple’s plans for this headset could become larger. The company’s “goal is to replace the iPhone with AR in 10 years,” Kuo explained in a note to investors, seen by MacRumors. The device could be relatively lightweight, about 300 to 400 grams (roughly 10.5 to 14 ounces), according to Kuo. That’s lighter than Meta’s Oculus Quest 2. However, it’s larger than a normal pair of glasses, with early renders of its possible design looking a lot more like futuristic ski goggles.

Read more: The Metaverse is Just Getting Started: Here’s What You Need to Know

The headset could be expensive, maybe as much as $2,000 or more, with 8K displays, eye tracking and cameras that can scan the world and blend AR and VR together, according to a report from The Information last year. That’s to be expected, considering the Quest Pro costs $1,500 and AR headsets like the Magic Leap 2 and HoloLens 2 are around $3,000.

It’s expected to feature advanced processors, likely based on Apple’s recent M2 chips, and work as a stand-alone device. But it could also connect with Apple’s other devices. That’s not a surprising move. In fact, most of the reports on Apple’s headset seem to line right up with how VR is evolving: lighter-weight, with added mixed-reality features via more advanced pass-through cameras. Much like the Quest Pro, this will likely be a bridge to future AR glasses efforts.

Previous reports on Apple’s AR/VR roadmap suggested internal disagreements, or a split strategy that could mean a VR headset first, and more normal-looking augmented reality smart glasses later. But recent reports seem to be settling down to tell the story of a particular type of advanced VR product leading the way. What’s increasingly clear is that the rest of the AR and VR landscape is facing a slower-than-expected road to AR glasses, too.

VR, however, is a more easily reachable goal in the short term.

Apple has been waiting in the wings all this time without any headset at all, although the company’s aspirations in AR have been clear and well-telegraphed on iPhones and iPads for years. Each year, Apple has made significant strides on iOS with its AR tools. What’s been debated is how soon this hardware will emerge (this year, the year after or even further down the road) and whether Apple will proceed with just glasses, or with a mixed-reality VR and AR headset, too.

I’ve worn more AR and VR headsets than I can even recall, and have been tracking the whole landscape for years. In a lot of ways, a future Apple AR headset’s logical flight path should be clear from just studying the pieces already laid out. Apple acquired VR media-streaming company NextVR in 2020 and it bought AR headset lens-maker Akonia Holographics in 2018.

I’ve had my own thoughts on what the long-rumored headset might be, and so far, the reports feel well-aligned to be just that. Much like the Apple Watch, which emerged among many other smartwatches and had a lot of features I’d seen in other forms before, Apple’s glasses probably won’t be a massive surprise if you’ve been paying attention to the AR and VR landscape lately.

Remember Google Glass? How about Snapchat’s Spectacles? Or the HoloLens or Magic Leap? Meta is working on AR glasses too, as are Snap and Niantic. The landscape got crowded fast.

Here’s where Apple is likely to go based on what’s been reported, and how the company could avoid the pitfalls of those earlier platforms.

Apple declined to comment on this story.

Launch date: Looks likely for 2023

New Apple products tend to be announced months before they arrive, maybe even earlier. The iPhone, Apple Watch, HomePod and iPad all followed this path.

The latest reports from Kuo point to possible delays for the release of the headset to the second half of 2023, but an event announcing the headset could happen as soon as January. That timeframe would make a lot of sense, giving time for developers to understand the concept well ahead of the hardware’s release, and even possibly allowing for Apple’s WWDC developer conference (usually in June) to go over specifics of the software.

Either way, developers would need a long head start to get used to developing for Apple’s headset, and making apps work and flow with whatever Apple’s design guidance will be. That’s going to require Apple giving a heads-up on its hardware well in advance of its actual arrival.

An Apple headset could be a lot like the Meta Quest, but higher end

There’s already one well-polished success story in VR, and the Quest 2 looks to be as good a model as any for where future headsets could aim. Gurman’s report makes a potential Apple VR headset sound a lot like Facebook’s stand-alone device, with controller-free hand tracking and spatial room awareness that could be achieved with Apple’s lidar sensor technology, introduced on the iPad Pro and iPhone 12 Pro.

Apple’s headset could end up serving a more limited professional or creative crowd. But it could also go for a mainstream focus on gaming or fitness. My experiences with the Oculus Quest’s fitness tools feel like a natural direction for Apple to head in, now that the Apple Watch is extending to subscription fitness training, pairing with TVs and other devices.

The Oculus Quest 2 (now officially the Meta Quest 2) can see through to the real world and extend some level of overlap of virtual objects like room boundaries, but Apple’s headset could explore passthrough augmented reality to a greater degree. I’ve seen impressive examples of this in headsets from companies such as Varjo. It could be a stepping stone for Apple to develop 3D augmented reality tech on smaller glasses designs down the road.

Right now, there aren’t any smart glasses manufacturers able to develop normal-looking glasses that can achieve advanced, spatially aware 3D overlays of holographic objects. Some devices like the nReal Light have tried, with mixed success. Meta’s first smart glasses, Ray-Ban Stories, weren’t AR at all. Meta is working on ways to achieve that tech later on. Apple might take a similar approach with glasses, too.

The VR headset could be a ‘Pro’ device

Most existing reports suggest Apple’s VR headset would likely be so expensive — and powerful — that it will have to aim for a limited crowd rather than the mainstream. If so, it could target the same business and creative professionals that more advanced VR headsets like the Varjo XR-3 and Meta Quest Pro are already aiming for.

I tried Varjo’s hardware. My experience with it could hint at what Apple’s headset might also be focusing on. It has a much higher-resolution display (which Apple is apparently going to try to achieve), can blend AR and VR into mixed reality using its passthrough cameras, and is designed for pro-level creative tools. Apple could integrate something similar to its lidar sensors. The Quest Pro does something similar, but in a standalone device without as high-end a display.

Varjo’s headset, and most professional VR headsets, are tethered to PCs with a number of cables. Apple’s headset could work as a standalone device, like the Quest 2 and Quest Pro, and also work when connected to a Mac or iPad, much like the Quest 2 already does with Windows gaming PCs. Apple’s advantage could be making a pro headset that is a lot more lightweight and seamlessly standalone than any other current PC-ready gear. But what remains unknown is how many apps and tools Apple will be able to introduce to make its headset feel like a tool that’s truly useful for creators.

Controls: Hand tracking or a small wearable device?

The Information’s previous reports on Apple’s headset suggest a more pared-down control system than the elaborate and large game controller-like peripherals used by many VR headsets right now. Apple’s headset should work using hand tracking, much like many VR and AR headsets already enable. But Apple would likely need some sort of controller-type accessory for inputs, too. Cracking the control and input challenge seems to be one of the bigger hurdles Apple could face.

Recent patent filings point to a possible smart ring-type device that could work for air gestures and motion, and maybe even work with accessories. It’s also possible that Apple might lean on some of its own existing hardware to act as inputs, too.

Could that controller be an Apple Watch? Possibly, but the Apple Watch’s motion-control capabilities and touchscreen may not be enough for the deeper interactions an Apple headset would need. Maybe iPhones could pair and be used as controllers, too. That’s how Qualcomm is envisioning its next wave of phone-connected glasses.

Future AR smart glasses may also be in the works

Getting people to put on an AR headset is hard. I’ve found it a struggle to remember to pack smart glasses, and find room to carry them. Most of them don’t support my prescription, either. Developer-focused AR glasses made by Snap that I tried at home show what everyday AR glasses could look like someday, but they’re still a work in progress.

Qualcomm’s plans for AR glasses show a wave of devices arriving between 2023 and 2025, but at this point no one has been able to crack making a perfect pair. Software, battery life and even common cross-platform interfaces remain a big challenge.

Kuo’s prediction of AR glasses coming a few years after a VR-AR goggle-type headset would line up with what other companies are promising. The challenges with AR glasses are a lot greater than VR. No one’s figured out how wearing them all the time would work, or how you’d interact with virtual objects: Hand tracking? A watch or a ring? Voice? Neural inputs?

Apple always touted the Apple Watch, first and foremost, as a “great watch.” I would expect the same from its glasses. If Apple makes prescription glasses and makes them available, Warby Parker-style, in seasonal frames from its Apple Stores, that might be enough for people if the frames look good. Apple’s VR headset, according to Gurman, will also offer prescription lenses. That could be a stepping stone to developing glasses later on.

Google acquired smart glasses manufacturer North in 2020, which made a prescription, almost normal set of eyewear. North’s concept for glasses might be too similar to Google Glass for Apple’s tastes, but the idea of AR glasses doubling as functional glasses sounds extremely Apple-like. More recently, Vuzix’s planned smart glasses for 2021 show how far the tech has shrunk, but even those planned glasses won’t have the ability to spatially scan the world and overlay augmented reality: They’ll be more like advanced glasses with heads-up displays and 3D audio.

A report from The Information in 2020 said new AR lenses were entering a trial production phase for Apple’s AR hardware (9to5Mac also broke the report down). These lenses sound much closer to normal glasses than current AR headsets allow, but when would those be ready?

Could Apple make its first smart glasses something more basic, letting Apple slowly add more AR features over time and let newcomers settle into the experience? Or would Apple try to crack the AR challenge with its first pair of glasses? Augmented reality is a weird concept for eyewear, and potentially off-putting. Maybe Apple will aim for subtlety. The original Apple Watch was designed to be glanced at for just 5 seconds at a time.

A recent patent filing also showed Apple looking to solve vision conditions with adaptive lenses. If true, this could be the biggest killer app of Apple’s intelligent eyewear.

Are the AirPods Max a sign of how expensive a headset could be?

The business-focused HoloLens and Magic Leap cost thousands of dollars. Current VR headsets have trended towards $500 or more.

The latest price reports suggest something between $2,000 and $3,000, which is in the territory of business-focused AR headsets like the HoloLens 2, or business-creative VR headsets like those from Varjo. An analysis from TrendForce published in February also estimates that an Apple headset’s hardware would cost in the thousands, and it predicts that Apple would employ a “monthly subscription-based software solution.”

Apple’s headphones, the AirPods Max, indicate that the pricing could climb high. At $549, they cost more than a PlayStation 5. And those are just headphones. A pair of smart glasses, or an advanced VR headset, would be a lot more advanced.

iPhone-connected, too?

Qualcomm’s AR and VR plans telegraph the next wave of headsets: Many of them will be driven by phones. Phone-powered glasses can be lighter, carrying only key onboard cameras and sensors to measure movement and capture information, while the phone does the heavy lifting and spares the headset’s battery life.

Apple’s star device is the iPhone, and it’s already loaded with advanced chipsets that can do tons of AR and computer vision computation. It could already power an AR headset right now; imagine what could happen in another year or two.

Apple could also have its own high-end dedicated chip in its first wave of VR and AR headsets, as reports suggest, but they’ll also undoubtedly dovetail with more advanced processors in Apple’s phones, tablets and Macs. Over time, this could mean smaller glasses that lean on connecting to other Apple devices, or the cloud.

How Apple could blend the real world with AR and VR

Apple already dabbles in AR overlays tied to real-world locations: QR code and NFC-enabled App Clips can launch experiences with a tap or scan. These micro apps are made to work with AR, too: With glasses or an AR headset, they could eventually launch interactions at a glance.

Maybe QR codes can help accelerate AR working in the «dumb» world. Apple’s iPhones also have a U1 chip that can be used to improve accuracy in AR object placement, and also to more quickly locate other Apple devices that have the U1 chip, too.

Apple’s AirTags arrived in 2021 with features similar to Samsung’s SmartTags Plus that use similar ultrawideband technology. These tags could be seen via an iPhone app using AR, which could possibly extend into Apple’s future VR or AR headsets. If all Apple’s objects recognize each other, they could act as beacons in a home. The U1 chips could also be indoor navigation tools for added precision.

Microsoft’s collaborative mixed-reality platform, Mesh, shows how meetings with people in virtual spaces could happen instantly and in work-like environments. Apple already enables multiperson AR in real places, but a necessary next step would be to allow a platform for collaboration in AR and VR like Microsoft is developing.

Apple’s depth-sensing hardware is already here

Apple is already deeply invested in camera arrays that can sense the world from short and long distances. The front-facing TrueDepth camera, which Apple has used on every Face ID iPhone since the X, is like a shrunken-down Microsoft Kinect and can scan a few feet out, sensing 3D information with high enough accuracy to be used for a secure face scan. Apple’s lidar technology on its recent iPhones and iPads can scan out much further, several meters away. That’s the range that glasses would need.

Apple’s existing lidar technology, combined with cameras, is already good enough to scan environments and 3D objects. Add to this the wider-scale lidar scanning Apple is doing in Maps to enable overlays of real-world locations with virtual objects via a technology called Location Anchors, and suddenly it seems like the depth-scanning Apple is introducing could expand to worldwide ambitions.

Apple’s new Mac chips already point toward VR-AR compatibility

Apple’s M1 Macs and their successors are far more capable of delivering the power needed to run AR and VR, and they handle graphics similarly to iPhones and iPads. Developing a common groundwork across devices could allow a headset to feasibly run on an iPhone, iPad or Mac, making it a universal Apple device accessory.

That would be essential if Apple intends on its VR or AR headsets to have any role in creative workflows, or be used for games or apps. It’s one of the limitations of existing VR headsets, which need to run off particular Windows gaming PCs and still don’t play that well with iOS or Android phones.

Look to AirPods for ease of use — and audio augmented reality

I’ve thought about how the AirPods’ comfort and weird design were an early experiment in wearing Apple’s hardware directly on our faces, and it was a success. It proved that doing so could be accepted and become normal. AirPods are expensive compared to in-box wired buds, but they’re also utilitarian. They’re relaxed. If Apple’s working on AR or VR headsets, they’ll need to feel the same way.

The AirPods Pro’s spatial audio, which the AirPods Max and AirPods 3 also have, points to where future ideas could head. Immersive audio is casual, and we do it all the time. Immersive video is hard and not always needed. I could see AR working as an audio-first approach, like a ping. Apple glasses could potentially do the world-scanning spatial awareness that would allow the spatial audio to work. In the meantime, Apple’s already developing the spatial audio tech that its VR headset would need.

Apple Watch and AirPods could be great companions

Apple’s already got a collection of wearable devices that connect with the iPhone, and both the AirPods and the Apple Watch make sense with glasses. The AirPods can pair for audio (although maybe the glasses would have their own Bose Frames-like audio, too), while the Watch could be a helpful remote control. The Apple Watch already acts as a remote at times, for the Apple TV or for linking up with the iPhone camera. Apple’s future headsets could also look to the Watch and expand its display virtually, offering enhanced extras that show up discreetly, like a halo. Or they could use the Watch as some sort of controller.

The Apple Watch could also provide something that it’ll be hard to get from hand gestures or touch-sensitive frames on a pair of glasses: haptics. The rumbling feedback on the Watch could lend some tactile response to virtual things, possibly.

There’s already a low-cost pair of phone goggles, the HoloKit X, that explores these ideas. It uses an iPhone for the headset’s display and cameras and can channel spatial audio to AirPods, and use an Apple Watch for gesture controls. Apple could do the same.

Could Qualcomm and Apple’s reconciliation also be about XR?

Qualcomm and Apple are working together again on future iPhones, and I don’t think it’s just about modems. 5G is a key feature for phones, no doubt. But it’s also a killer element for next-gen AR and VR. Qualcomm has already been exploring how remote rendering could allow 5G-enabled phones and connected glasses to link up to streaming content and cloud-connected location data. Glasses could eventually stand on their own and use 5G to do advanced computing, in a way like the Apple Watch eventually working over cellular.

Qualcomm’s chipsets are in almost every self-contained AR and VR headset I can think of (Meta Quest, HoloLens 2, a wave of new smart glasses, the latest version of Google Glass, Vive Focus). Will Apple’s tech dovetail at all with Qualcomm’s cross-device platforms?


The Future’s Here: Testing Out Gemini’s Live Camera Mode

Gemini Live’s new camera mode feels like the future when it works. I put it through a stress test with my offbeat collectibles.

«I just spotted your scissors on the table, right next to the green package of pistachios. Do you see them?»

Gemini Live’s chatty new camera feature was right. My scissors were exactly where it said they were, and all I did was pass my camera in front of them at some point during a 15-minute live session of me giving the AI chatbot a tour of my apartment. Google’s been rolling out the new camera mode to all Android phones using the Gemini app for free after a two-week exclusive to Pixel 9 (including the new Pixel 9A) and Galaxy S25 smartphones. So, what exactly is this camera mode and how does it work?

When you start a live session with Gemini, you now have the option to enable a live camera view, where you can talk to the chatbot and ask it about anything the camera sees. Not only can it identify objects, but you can also ask questions about them — and it works pretty well for the most part. In addition, you can share your screen with Gemini so it can identify things you surface on your phone’s display.

When the new camera feature popped up on my phone, I didn’t hesitate to try it out. In one of my longer tests, I turned it on and started walking through my apartment, asking Gemini what it saw. It identified some fruit, ChapStick and a few other everyday items with no problem. I was wowed when it found my scissors. 

That’s because I hadn’t mentioned the scissors at all. Gemini had silently identified them somewhere along the way and then recalled the location with precision. It felt so much like the future, I had to do further testing.

My experiment with Gemini Live’s camera feature followed the lead of the demo Google did last summer, when it first showed off these live video AI capabilities. Gemini reminded the person giving the demo where they’d left their glasses, and it seemed too good to be true. But as I discovered, it was very true indeed.

Gemini Live will recognize a whole lot more than household odds and ends. Google says it’ll help you navigate a crowded train station or figure out the filling of a pastry. It can give you deeper information about artwork, like where an object originated and whether it was a limited edition piece.

It’s more than just a souped-up Google Lens. You talk with it, and it talks to you. I didn’t need to speak to Gemini in any particular way — it was as casual as any conversation. Way better than talking with the old Google Assistant that the company is quickly phasing out.

Google also released a new YouTube video for the April 2025 Pixel Drop showcasing the feature, and there’s now a dedicated page on the Google Store for it.

To get started, you can go live with Gemini, enable the camera and start talking. That’s it.

Gemini Live follows on from Google’s Project Astra, first revealed last year as possibly the company’s biggest “we’re in the future” feature: an experimental next step for generative AI capabilities, beyond simply typing or even speaking prompts into a chatbot like ChatGPT, Claude or Gemini. It comes as AI companies continue to dramatically increase the skills of AI tools, from video generation to raw processing power. Similar to Gemini Live, there’s Apple’s Visual Intelligence, which the iPhone maker released in beta form late last year.

My big takeaway is that a feature like Gemini Live has the potential to change how we interact with the world around us, melding our digital and physical worlds together just by holding a camera in front of almost anything.

I put Gemini Live to a real test

The first time I tried it, Gemini was shockingly accurate when I placed a very specific gaming collectible of a stuffed rabbit in my camera’s view. The second time, I showed it to a friend in an art gallery. It identified the tortoise on a cross (don’t ask me) and immediately identified and translated the kanji right next to the tortoise, giving both of us chills and leaving us more than a little creeped out. In a good way, I think.

I got to thinking about how I could stress-test the feature. I tried to screen-record it in action, but it consistently fell apart at that task. And what if I went off the beaten path with it? I’m a huge fan of the horror genre — movies, TV shows, video games — and have countless collectibles, trinkets and what have you. How well would it do with more obscure stuff — like my horror-themed collectibles?

First, let me say that Gemini can be both absolutely incredible and ridiculously frustrating in the same round of questions. I had roughly 11 objects that I was asking Gemini to identify, and it would sometimes get worse the longer the live session ran, so I had to limit sessions to only one or two objects. My guess is that Gemini attempted to use contextual information from previously identified objects to guess new objects put in front of it, which sort of makes sense, but ultimately, neither I nor it benefited from this.

Sometimes, Gemini was just on point, easily landing the correct answers with no fuss or confusion, but this tended to happen with more recent or popular objects. For example, I was surprised when it immediately guessed one of my test objects was not only from Destiny 2, but was a limited edition from a seasonal event from last year. 

At other times, Gemini would be way off the mark, and I would need to give it more hints to get into the ballpark of the right answer. And sometimes, it seemed as though Gemini was taking context from my previous live sessions to come up with answers, identifying multiple objects as coming from Silent Hill when they were not. I have a display case dedicated to the game series, so I could see why it would want to dip into that territory quickly.

Gemini can get full-on bugged out at times. On more than one occasion, Gemini misidentified one of the items as a made-up character from the unreleased Silent Hill f game, clearly merging pieces of different titles into something that never was. The other consistent bug I experienced was when Gemini would produce an incorrect answer, and I would correct it and hint closer at the answer, or straight up give it the answer, only to have it repeat the incorrect answer as if it were a new guess. When that happened, I would close the session and start a new one, which wasn’t always helpful.

One trick I found was that some conversations did better than others. If I scrolled through my Gemini conversation list, tapped an old chat that had gotten a specific item correct, and then went live again from that chat, it would be able to identify the items without issue. While that’s not necessarily surprising, it was interesting to see that some conversations worked better than others, even if you used the same language. 

Google didn’t respond to my requests for more information on how Gemini Live works.

I wanted Gemini to successfully answer my sometimes highly specific questions, so I provided plenty of hints to get there. The nudges were often helpful, but not always. Below are a series of objects I tried to get Gemini to identify and provide information about. 



Today’s Wordle Hints, Answer and Help for April 26, #1407

Here are hints and the answer for today’s Wordle No. 1,407 for April 26. Hint: Fans of a certain musical group will rock out with this puzzle.

Looking for the most recent Wordle answer? Click here for today’s Wordle hints, as well as our daily answers and hints for The New York Times Mini Crossword, Connections, Connections: Sports Edition and Strands puzzles.


Today’s Wordle puzzle isn’t too tough. The letters are fairly common, and fans of a certain rock band might get a kick out of the answer. If you need a new starter word, check out our list of which letters show up the most in English words. If you need hints and the answer, read on.

Today’s Wordle hints

Before we show you today’s Wordle answer, we’ll give you some hints. If you don’t want a spoiler, look away now.

Wordle hint No. 1: Repeats

Today’s Wordle answer has no repeated letters.

Wordle hint No. 2: Vowels

There is one vowel in today’s Wordle answer.

Wordle hint No. 3: Start letter

Today’s Wordle answer begins with the letter C.

Wordle hint No. 4: Rock out

Today’s Wordle answer is the name of a legendary English rock band.

Wordle hint No. 5: Meaning

Today’s Wordle answer can refer to a violent confrontation.

TODAY’S WORDLE ANSWER

Today’s Wordle answer is CLASH.

Yesterday’s Wordle answer

Yesterday’s Wordle answer, April 25, No. 1406, was KNOWN.

Recent Wordle answers

April 21, No. 1402: SPATE

April 22, No. 1403: ARTSY

April 23, No. 1404: OZONE

April 24, No. 1405: GENIE

What’s the best Wordle starting word?

Don’t be afraid to use our tip sheet ranking all the letters in the alphabet by frequency of use. In short, you want starter words that lean heavily on E, A and R, and don’t contain Z, J and Q.

Some solid starter words to try:

ADIEU

TRAIN

CLOSE

STARE

NOISE
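The letter-frequency idea behind these starter words can be sketched in a few lines of code. This toy scorer ranks candidate words by how many common letters they cover; the frequency weights are rough illustrative values, not our actual tip-sheet rankings:

```python
# Toy Wordle starter-word scorer: rank candidates by coverage of
# common letters. Weights are rough illustrative approximations.
FREQ = {
    'e': 13, 'a': 9, 'r': 9, 'o': 8, 't': 7, 'i': 7, 'n': 7, 's': 6,
    'l': 5, 'c': 4, 'u': 4, 'd': 4, 'p': 3, 'm': 3, 'h': 3, 'g': 2,
    'b': 2, 'f': 2, 'y': 2, 'w': 1, 'k': 1, 'v': 1, 'x': 0, 'z': 0,
    'j': 0, 'q': 0,
}

def score(word: str) -> int:
    # Count each distinct letter once; repeated letters add no new clue.
    return sum(FREQ[ch] for ch in set(word.lower()))

candidates = ["ADIEU", "TRAIN", "CLOSE", "STARE", "NOISE", "JAZZY"]
ranked = sorted(candidates, key=score, reverse=True)
```

Under these example weights, a word like STARE scores far higher than a rare-letter word like JAZZY, which is exactly why it makes a strong opener.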


Technologies

T-Mobile Adds New Top 5G Plans, T-Satellite and New 5-Year Price Locks

The new top unlimited plans, Experience More and Experience Beyond, shave some costs and add data and satellite options.

Just two years after expanding its lineup of cellular plans, T-Mobile this week announced two new plans that replace its Go5G Plus and Go5G Next offerings, refreshed its prepaid Metro line and wrapped them all in a promised five-year pricing guarantee. 

To convert more subscribers, the carrier is also offering up to $800 to help customers pay off phone balances when switching from another carrier.

In a briefing with CNET, Jon Friar, president of T-Mobile’s consumer group, explained why the company is revamping and simplifying its array of mobile plans. “The pain point that’s out there over the last couple of years is rising costs all around consumers,” Friar said. “For us to be able to bring more value and even lower prices on [plans like] Experience More versus our former Go5G Plus is a huge win for consumers.”

The new plans went into effect April 23.

With these changes, CNET is already hard at work updating our picks for Best T-Mobile Plans, so check back soon for our recommendations.

More Experiences to define the T-Mobile experience

The top of the new T-Mobile postpaid lineup is two new plans: Experience More and Experience Beyond.

Experience More, the next generation of the Go5G Plus plan, includes unlimited 5G and 4G LTE access and unlimited Premium Data (download speeds up to 418Mbps and upload speeds up to 31Mbps). High-speed hotspot data is bumped up to 60GB per month from 50GB. The monthly price is now $5 lower per line than Go5G Plus.

The Experience More plan also gets free T-Satellite with Starlink service (the new name for T-Mobile’s satellite feature that uses Starlink’s constellation of satellites) through the end of 2025. Although T-Satellite is officially in beta until July, customers can get free access to the beta now. Starting in the new year, the service will cost $10 per month, a $5 drop from T-Mobile’s originally announced pricing. T-Satellite will also be open to customers of other carriers at the same price beginning in July.

The new top-tier plan, Experience Beyond, also comes in $5 per line cheaper than its predecessor, Go5G Next. It has 250GB of high-speed hotspot data per month, up from 50GB, and more data when you’re traveling outside the US: 30GB in Canada and Mexico (versus 15GB) and 15GB in 215 countries (up from 5GB). T-Satellite service is included in the Experience Beyond plan.

However, one small change to the Experience plans affects that pricing: Taxes and fees, previously included in the Go5G Plus and Go5G Next prices, are now broken out separately. T-Mobile recently announced that one such fee, the Regulatory Programs and Telco Recovery Fee, would increase up to 50 cents per month.

According to T-Mobile, the Experience Beyond rates and features will be “rolling out soon” for customers currently on the Go5G Next plan.

The Essentials plan is staying in the lineup at the same $60 per month for a single line, with the same 50GB of Premium Data and unlimited 5G and 4G LTE data. High-speed hotspot data is an optional $10-per-month add-on, as is T-Satellite access for $15 per month.

Also still in the mix is the Essentials Saver plan, an affordable option that has ranked high in CNET’s Best Cellphone Plans recommendations.

Corresponding T-Mobile plans, such as those for military members, first responders and people age 55 and older, are also getting refreshed with the new lineup.

T-Mobile’s plan shakeup is being driven in part by the current economic climate. Explaining the rationale behind the price reductions and the streamlined number of plans, Mike Katz, president of marketing, innovation and experience at T-Mobile, told CNET, “We’re in a weird time right now where prices everywhere are going up and they’ve happened over the last several years. We felt like there was an opportunity to compete with some simplicity, but more importantly, some peace of mind for customers.”

Existing customers who want to switch to one of the new plans can do so at the same rates offered to new customers. Or, if a current plan still works for them, they can continue without changes (although keep in mind that T-Mobile earlier this year increased prices for some legacy plans).

Five years of price stability

It’s nearly impossible to think about prices these days without warily eyeing how tariffs and US economic policy will affect what we pay for things. So it’s not surprising to see carriers build some cost stability into their plans. For instance, Verizon recently locked prices on its plans for three years.

Now, T-Mobile is adding a five-year price guarantee to its T-Mobile and Metro plans. The guarantee covers talk, text and data pricing, but not necessarily taxes and other fees, which can fluctuate.

Given the uncertain outlook, it seems counterintuitive to lock in a longer rate. When asked about this, Katz said, “We feel like our job is to solve pain points for customers and we feel like this helps with this exact sentiment. It shifts the risk from customers to us. We’ll take the risk so they don’t have to.”

The price hold applies to new customers signing up for the plans as well as current customers switching to one; T-Mobile is offering the same deals and pricing to both. Also, the five-year guarantee applies to pricing; it’s not a five-year plan commitment.

More money and options to encourage switchers

The promise of a five-year price guarantee is also intended to lure people from other carriers, particularly AT&T and Verizon. As further incentive, T-Mobile is offering up to $800 per line (distributed via a virtual prepaid Mastercard) to help pay off other carriers’ device contracts. This is a limited-time offer. There are also options to trade in old devices, including locked phones, to get up to four new flagship phones.

Or, if getting out of a contract isn’t an issue, T-Mobile can offer $200 in credit (up to $800 for four lines) to bring an existing number to the network.

Four new Metro prepaid plans

On the prepaid side, T-Mobile is rolling out four new Metro plans, which are also covered by the new five-year price guarantee:

• Metro Starter costs $25 per line per month for a family of four and there is no need to bring an existing number. (The cost is $105 the first month.)

• Metro Starter Plus runs $40 per month for a new phone, unlimited talk, text and 5G data when bringing an existing number. For $65 per month, new customers can get two lines and two new Samsung A15 phones. No autopay is required.

• Metro Flex Unlimited is $30 per line per month with autopay for four lines ($125 the first month) with unlimited talk, text and 5G data.

• Metro Flex Unlimited Plus costs $60 per month for the first line, $35 each for lines two and three, and $10 for the fourth line as more family members are added. Adding a tablet or smartwatch to an existing line costs $5. And streaming video, such as from the included Amazon Prime membership, comes through at HD quality.
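The tiered Metro Flex Unlimited Plus pricing above works out like this (a hypothetical helper for illustration, not a T-Mobile tool; it assumes the per-line tiers as described: $60 for line one, $35 each for lines two and three, $10 for line four):

```python
# Illustrative monthly-cost calculator for the Metro Flex Unlimited Plus
# tiers described above. Prices are per month, before taxes and fees.
LINE_PRICES = [60, 35, 35, 10]

def monthly_total(lines: int) -> int:
    # The published tiers cover one to four lines.
    if not 1 <= lines <= len(LINE_PRICES):
        raise ValueError("plan tiers cover 1-4 lines")
    return sum(LINE_PRICES[:lines])
```

Under those tiers, a family of four would pay $60 + $35 + $35 + $10, or $140 per month, which averages out to $35 per line.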

See more: If you’re looking for phone plans, you may also be looking for a new cell phone. Here are CNET’s picks.

