
Apple’s AR/VR Headset: What Could Be Coming in 2023

The company’s next big product should arrive next year. Here’s what we expect.

This story is part of Focal Point iPhone 2022, CNET’s collection of news, tips and advice around Apple’s most popular product.

Apple has been integrating augmented reality into its devices for years, but the company looks like it will leap right into the territory of Meta, Microsoft and Magic Leap with a long-expected mixed-reality headset in 2023.

The target date of this AR/VR headset keeps sliding, with the latest report in early December from noted analyst Ming-Chi Kuo suggesting an arrival in the second half of 2023. With an announcement event that could happen as soon as January, we’re at the point where every Apple event feels like the one where the company could finally pull the covers off this device. Bloomberg’s Mark Gurman reported in early January that he’s heard the company is aiming to unveil the headset in the spring, ahead of the annual Worldwide Developers Conference in June.

2023 looks like a year full of virtual reality headsets that we originally expected in 2022, including the PlayStation VR 2 and Meta Quest 3. Apple has already laid down plenty of AR clues, hinting at what its mixed-reality future could hold and has been active in AR on its own iPhones and iPads for years.

As far as what its device could be like, odds are strong that the headset will work from a playbook similar to Meta’s recent high-end headset, the Quest Pro, with a focus on work, mixed reality and onboard eye tracking.

Here’s what we’re expecting.

Is its name Reality Pro? Is the software called xrOS?

The latest report from noted Apple reporter Mark Gurman at Bloomberg suggests the operating system for this headset could be called “xrOS,” but that may not indicate the name of the headset itself. Recent trademark filings reported by Bloomberg showed the name “Reality” showing up a lot: Reality One, Reality Pro and Reality Processor. Apple’s existing AR software framework for iOS is named RealityKit, and previous reports suggested that “Reality OS” could be the name for the new headset’s ecosystem.

No one really expected the Apple Watch’s name (remember iWatch?), so to some degree, names don’t matter at this point. But the filings and reports do indicate that Apple’s moving forward on a product and its software, for sure.
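For developers, RealityKit isn’t a rumor: It’s the framework Apple already ships for AR on iPhones and iPads. Here’s a minimal sketch of what placing a virtual object with it looks like today; whether the headset’s “xrOS” exposes the same API is anyone’s guess.

```swift
import UIKit
import RealityKit

// A minimal RealityKit sketch: anchor a small virtual box to the first
// horizontal surface the device detects. This runs on today's iPhones and
// iPads; whether the headset's software uses the same framework is unconfirmed.
class ARDemoViewController: UIViewController {
    override func viewDidLoad() {
        super.viewDidLoad()

        let arView = ARView(frame: view.bounds)
        view.addSubview(arView)

        // A 10 cm cube with a simple non-metallic material.
        let box = ModelEntity(
            mesh: .generateBox(size: 0.1),
            materials: [SimpleMaterial(color: .systemBlue, isMetallic: false)]
        )

        // The anchor waits for ARKit to find a horizontal plane, then attaches the box to it.
        let anchor = AnchorEntity(plane: .horizontal)
        anchor.addChild(box)
        arView.scene.addAnchor(anchor)
    }
}
```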

One of several headsets?

The headset has been cooking for a long while. Reports have been going around for several years, including a story broken by former CNET Managing Editor Shara Tibken in 2018. Apple’s been building more advanced AR tools into its iPhones and iPads for years, setting the stage for something more.

Whatever the headset might become, it’s looking a lot more real lately. A detailed report from The Information earlier this year discussed likely specs, which include what Bloomberg’s Mark Gurman says is Apple’s latest M2 chip. According to another Bloomberg report from earlier this year, Apple’s board of directors has already seen a demonstration of the mixed-reality headset.

The expected arrival of this headset has kept sliding for years. Kuo previously predicted that Apple’s VR-AR headset would arrive in the fourth quarter of 2022 with Wi-Fi 6 and 6E support. But this VR-type headset could be the start of several lines of products, similar again to how Meta has been targeting future AR glasses. Kuo has previously predicted that Apple smart glasses may arrive in 2025.

Apple could take a dual headset approach, leading the way with a high-end AR-VR headset that may be more like what Meta has done with the Quest Pro, according to Bloomberg’s Gurman. Gurman also suggests a focus on gaming, media and communication on this first-wave headset. In terms of communication, Gurman believes FaceTime on the rumored headset could rely on Memoji and SharePlay: Instead of seeing the person you’re talking to, you’d see a 3D version of their personalized Memoji avatar.

Eventually, Apple’s plans for this headset could become larger. The company’s “goal is to replace the iPhone with AR in 10 years,” Kuo explained in a note to investors, seen by MacRumors. The device could be relatively lightweight, about 300 to 400 grams (roughly 10.5 to 14 ounces), according to Kuo. That’s lighter than Meta’s Oculus Quest 2. However, it’s larger than a normal pair of glasses, with early renders of its possible design looking a lot more like futuristic ski goggles.

Read more: The Metaverse is Just Getting Started: Here’s What You Need to Know

The headset could be expensive, maybe as much as $2,000 or more, with 8K displays, eye tracking and cameras that can scan the world and blend AR and VR together, according to a report from The Information last year. That’s to be expected, considering the Quest Pro costs $1,500 and AR headsets like the Magic Leap 2 and HoloLens 2 are around $3,000.

It’s expected to feature advanced processors, likely based on Apple’s recent M2 chips, and work as a stand-alone device. But it could also connect with Apple’s other devices. That’s not a surprising move. In fact, most of the reports on Apple’s headset seem to line right up with how VR is evolving: lighter-weight, with added mixed-reality features via more advanced pass-through cameras. Much like the Quest Pro, this will likely be a bridge to future AR glasses efforts.

Previous reports on Apple’s AR/VR roadmap suggested internal disagreements, or a split strategy that could mean a VR headset first, and more normal-looking augmented reality smart glasses later. But recent reports seem to be settling down to tell the story of a particular type of advanced VR product leading the way. What’s increasingly clear is that the rest of the AR and VR landscape is facing a slower-than-expected road to AR glasses, too.

VR, however, is a more easily reachable goal in the short term.

Apple has been waiting in the wings all this time without any headset at all, although the company’s aspirations in AR have been clear and well-telegraphed on iPhones and iPads for years. Each year, Apple’s made significant strides on iOS with its AR tools. It’s been debated how soon this hardware will emerge (this year, the year after or even further down the road), and whether Apple will lead with glasses alone or with a mixed-reality VR and AR headset, too.

I’ve worn more AR and VR headsets than I can even recall, and have been tracking the whole landscape for years. In a lot of ways, a future Apple AR headset’s logical flight path should be clear from studying the pieces already laid out. Apple acquired VR media-streaming company NextVR in 2020 and bought AR headset lens-maker Akonia Holographics in 2018.

I’ve had my own thoughts on what the long-rumored headset might be, and so far, the reports line up with those expectations. Much like the Apple Watch, which emerged among many other smartwatches and had a lot of features I’d seen in other forms before, Apple’s glasses probably won’t be a massive surprise if you’ve been paying attention to the AR and VR landscape lately.

Remember Google Glass? How about Snapchat’s Spectacles? Or the HoloLens or Magic Leap? Meta is working on AR glasses too, as are Snap and Niantic. The landscape got crowded fast.

Here’s where Apple is likely to go based on what’s been reported, and how the company could avoid the pitfalls of those earlier platforms.

Apple declined to comment on this story.

Launch date: Looks likely for 2023

New Apple products tend to be announced months before they arrive, maybe even earlier. The iPhone, Apple Watch, HomePod and iPad all followed this path.

The latest reports from Kuo point to possible delays pushing the headset’s release to the second half of 2023, but an event announcing the headset could happen as soon as January. That timeframe would make a lot of sense, giving developers time to understand the concept well ahead of the hardware’s release, and possibly allowing Apple’s WWDC developer conference (usually in June) to go over specifics of the software.

Either way, developers would need a long head start to get used to developing for Apple’s headset, and making apps work and flow with whatever Apple’s design guidance will be. That’s going to require Apple giving a heads-up on its hardware well in advance of its actual arrival.

An Apple headset could be a lot like the Meta Quest, but higher end

There’s already one well-polished success story in VR, and the Quest 2 looks to be as good a model as any for where future headsets could aim. Gurman’s report makes a potential Apple VR headset sound a lot like Meta’s stand-alone device, with controller-free hand tracking and spatial room awareness that could be achieved with Apple’s lidar sensor technology, introduced on the iPad Pro and iPhone 12 Pro.

Apple’s headset could end up serving a more limited professional or creative crowd. But it could also go for a mainstream focus on gaming or fitness. My experiences with the Oculus Quest’s fitness tools feel like a natural direction for Apple to head in, now that the Apple Watch is extending to subscription fitness training, pairing with TVs and other devices.

The Oculus Quest 2 (now officially the Meta Quest 2) can see through to the real world and overlay some virtual objects, like room boundaries, onto that view, but Apple’s headset could explore passthrough augmented reality to a greater degree. I’ve seen impressive examples of this in headsets from companies such as Varjo. It could be a stepping stone for Apple to develop 3D augmented reality tech on smaller glasses designs down the road.

Right now, no smart glasses manufacturer has managed to make normal-looking glasses that can achieve advanced, spatially aware 3D overlays of holographic objects. Some devices, like the Nreal Light, have tried, with mixed success. Meta’s first smart glasses, Ray-Ban Stories, weren’t AR at all. Meta is working on ways to achieve that tech later on. Apple might take a similar approach with glasses, too.

The VR headset could be a ‘Pro’ device

Most existing reports suggest Apple’s VR headset would likely be so expensive — and powerful — that it will have to aim for a limited crowd rather than the mainstream. If so, it could target the same business and creative professionals that more advanced VR headsets like the Varjo XR-3 and Meta Quest Pro are already aiming for.

I tried Varjo’s hardware, and my experience with it could hint at what Apple’s headset might be focusing on. It has a much higher-resolution display (which Apple is apparently going to try to achieve), can blend AR and VR into mixed reality using its passthrough cameras, and is designed for pro-level creative tools. Apple could achieve something similar with its lidar sensors. The Quest Pro does something similar, but in a standalone device without as high-end a display.

Varjo’s headset, and most professional VR headsets, are tethered to PCs with a number of cables. Apple’s headset could work as a standalone device, like the Quest 2 and Quest Pro, and also work when connected to a Mac or iPad, much like the Quest 2 already does with Windows gaming PCs. Apple’s advantage could be making a pro headset that is a lot more lightweight and seamlessly standalone than any other current PC-ready gear. But what remains unknown is how many apps and tools Apple will be able to introduce to make its headset feel like a tool that’s truly useful for creators.

Controls: Hand tracking or a small wearable device?

The Information’s previous reports on Apple’s headset suggest a more pared-down control system than the elaborate and large game controller-like peripherals used by many VR headsets right now. Apple’s headset should work using hand tracking, much like many VR and AR headsets already enable. But Apple would likely need some sort of controller-type accessory for inputs, too. Cracking the control and input challenge seems to be one of the bigger hurdles Apple could face.

Recent patent filings point to a possible smart ring-type device that could work for air gestures and motion, and maybe even work with accessories. It’s also possible that Apple might lean on some of its own existing hardware to act as inputs, too.

Could that controller be an Apple Watch? Possibly, but the Apple Watch’s motion-control capabilities and touchscreen may not be enough for the deeper interactions an Apple headset would need. Maybe iPhones could pair and be used as controllers, too. That’s how Qualcomm is envisioning its next wave of phone-connected glasses.

Future AR smart glasses may also be in the works

Getting people to put on an AR headset is hard. I’ve found it a struggle to remember to pack smart glasses, and find room to carry them. Most of them don’t support my prescription, either. Developer-focused AR glasses made by Snap that I tried at home show what everyday AR glasses could look like someday, but they’re still a work in progress.

Qualcomm’s plans for AR glasses show a wave of devices arriving between 2023 and 2025, but at this point no one has been able to crack making a perfect pair. Software, battery life and even common cross-platform interfaces remain a big challenge.

Kuo’s prediction of AR glasses coming a few years after a VR-AR goggle-type headset would line up with what other companies are promising. The challenges with AR glasses are a lot greater than with VR. No one’s figured out how wearing them all the time would work, or how you’d interact with virtual objects: Hand tracking? A watch or a ring? Voice? Neural inputs?

Apple always touted the Apple Watch, first and foremost, as a “great watch.” I would expect the same from its glasses. If Apple offers prescription glasses, Warby Parker-style, in seasonal frames at its Apple Stores, that might be enough for people if the frames look good. Apple’s VR headset, according to Gurman, will also offer prescription lenses. That could be a stepping stone to developing glasses later on.

Google acquired smart glasses manufacturer North in 2020, which made a prescription, almost normal set of eyewear. North’s concept for glasses might be too similar to Google Glass for Apple’s tastes, but the idea of AR glasses doubling as functional glasses sounds extremely Apple-like. More recently, Vuzix’s planned smart glasses for 2021 show how far the tech has shrunk, but even those planned glasses won’t have the ability to spatially scan the world and overlay augmented reality: They’ll be more like advanced glasses with heads-up displays and 3D audio.

A report from The Information in 2020 said new AR lenses were entering a trial production phase for Apple’s AR hardware (9to5Mac also broke the report down). These lenses sound much closer to normal glasses than current AR headsets allow, but when would those be ready?

Could Apple make its first smart glasses something more basic, letting it slowly add more AR features over time and letting newcomers settle into the experience? Or would Apple try to crack the AR challenge with its first pair of glasses? Augmented reality is a weird concept for eyewear, and potentially off-putting. Maybe Apple will aim for subtlety. The original Apple Watch was designed to be glanced at for just 5 seconds at a time.

A recent patent filing also showed Apple looking to solve vision conditions with adaptive lenses. If true, this could be the biggest killer app of Apple’s intelligent eyewear.

Are the AirPods Max a sign of how expensive a headset could be?

The business-focused HoloLens and Magic Leap cost thousands of dollars. Current VR headsets have trended towards $500 or more.

The latest price reports suggest something between $2,000 and $3,000, which is in the territory of business-focused AR headsets like the HoloLens 2, or business-creative VR headsets like those from Varjo. An analysis from TrendForce published in February also estimates that an Apple headset’s hardware would cost in the thousands, and it predicts that Apple would employ a “monthly subscription-based software solution.”

Apple’s headphones, the AirPods Max, indicate that the pricing could climb high. At $549, they cost more than a PlayStation 5. And those are just headphones. A pair of smart glasses, or an advanced VR headset, would pack in far more technology.

iPhone-connected, too?

Qualcomm’s AR and VR plans telegraph the next wave of headsets: Many of them will be driven by phones. Phone-powered glasses can be lighter, carrying only the key onboard cameras and sensors needed to measure movement and capture information. Meanwhile, the phone does the heavy lifting, sparing the headset’s battery.

Apple’s star device is the iPhone, and it’s already loaded with advanced chipsets that can do tons of AR and computer vision computation. It could already power an AR headset right now; imagine what could happen in another year or two.

Apple could also have its own high-end dedicated chip in its first wave of VR and AR headsets, as reports suggest, but they’ll also undoubtedly dovetail with more advanced processors in Apple’s phones, tablets and Macs. Over time, this could mean smaller glasses that lean on connecting to other Apple devices, or the cloud.

How Apple could blend the real world with AR and VR

Apple already dabbles in AR overlays at real-world locations: QR code and NFC-enabled App Clips can launch experiences from those locations with a tap or scan. These micro apps are made to work with AR, too: With glasses or an AR headset, they could eventually launch interactions at a glance.

Maybe QR codes can help accelerate AR working in the “dumb” world. Apple’s iPhones also have a U1 chip that can be used to improve accuracy in AR object placement, and to more quickly locate other Apple devices that have the U1 chip.

Apple’s AirTags arrived in 2021, using ultrawideband technology similar to Samsung’s SmartTags Plus. These tags can be located via an iPhone app using AR, a capability that could extend into Apple’s future VR or AR headsets. If all of Apple’s objects recognize each other, they could act as beacons in a home. The U1 chips could also serve as indoor navigation tools for added precision.
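That U1 ranging ability isn’t hypothetical, either: Apple already exposes it to developers through the NearbyInteraction framework on U1-equipped iPhones. Here’s a minimal sketch of measuring the distance and direction to a peer device; whether a headset would tap the same framework is, of course, speculation.

```swift
import NearbyInteraction

// A minimal sketch of U1 ranging with the NearbyInteraction framework, which
// already ships on U1-equipped iPhones. The peer's discovery token has to be
// exchanged over some other channel (Bluetooth, the network, etc.) first.
final class RangingController: NSObject, NISessionDelegate {
    private let session = NISession()

    func startRanging(with peerToken: NIDiscoveryToken) {
        session.delegate = self
        let configuration = NINearbyPeerConfiguration(peerToken: peerToken)
        session.run(configuration)
    }

    // Called repeatedly as the two devices move relative to each other.
    func session(_ session: NISession, didUpdate nearbyObjects: [NINearbyObject]) {
        guard let nearest = nearbyObjects.first else { return }
        let distance = nearest.distance.map { String(format: "%.2f m", $0) } ?? "unknown"
        print("Peer is \(distance) away, direction: \(String(describing: nearest.direction))")
    }
}
```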

Microsoft’s collaborative mixed-reality platform, Mesh, shows how meetings with people in virtual spaces could happen instantly and in work-like environments. Apple already enables multiperson AR in real places, but a necessary next step would be to allow a platform for collaboration in AR and VR like Microsoft is developing.

Apple’s depth-sensing hardware is already here

Apple is already deeply invested in camera arrays that can sense the world from short and long distances. The front-facing TrueDepth camera, which Apple has used on every Face ID iPhone since the X, is like a shrunken-down Microsoft Kinect and can scan a few feet out, sensing 3D information with high enough accuracy to be used for a secure face scan. Apple’s lidar technology on its recent iPhones and iPads can scan out much further, several meters away. That’s the range that glasses would need.

Apple’s existing lidar technology, combined with cameras, is already good enough to scan environments and 3D objects. Add to this the wider-scale lidar scanning Apple is doing in Maps to enable overlays of real-world locations with virtual objects via a technology called Location Anchors, and suddenly it seems like the depth-scanning Apple is introducing could expand to worldwide ambitions.
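Developers can already tap that lidar scanning through ARKit. The sketch below shows the scene reconstruction option available on lidar-equipped iPhones and iPad Pros today (Location Anchors use a separate geo-tracking configuration); whether the headset reuses this exact API is an assumption.

```swift
import ARKit
import RealityKit

// A rough sketch of the lidar-driven scene scanning Apple already ships in ARKit.
// On a lidar-equipped iPhone or iPad Pro, this builds a live 3D mesh of the room.
func startRoomScan(in arView: ARView) {
    guard ARWorldTrackingConfiguration.supportsSceneReconstruction(.mesh) else {
        return // requires lidar hardware (recent Pro iPhones and iPad Pros)
    }

    let configuration = ARWorldTrackingConfiguration()
    configuration.sceneReconstruction = .mesh       // reconstruct surrounding geometry as a mesh
    configuration.environmentTexturing = .automatic // capture lighting for more realistic rendering
    arView.session.run(configuration)
}
```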

Apple’s new Mac chips already point toward VR-AR compatibility

Apple’s M1 Macs and their successors are far more capable of delivering the power needed to run AR and VR, and they handle graphics in much the same way iPhones and iPads do. Developing a common groundwork across devices could allow a headset to feasibly run on an iPhone, iPad or Mac, making it a universal Apple device accessory.

That would be essential if Apple intends for its VR or AR headsets to have any role in creative workflows, or to be used for games or apps. It’s one of the limitations of existing VR headsets, which need to run off particular Windows gaming PCs and still don’t play that well with iOS or Android phones.

Look to AirPods for ease of use — and audio augmented reality

I’ve thought about how the AirPods, with their comfort and their odd design, were an early experiment in wearing Apple’s hardware directly on our faces, and that experiment was a success. It proved that doing so could be accepted and become normal. AirPods are expensive compared to in-box wired buds, but they’re also utilitarian. They’re relaxed. If Apple’s working on AR or VR headsets, they’ll need to feel the same way.

The AirPods Pro’s spatial audio, which the AirPods Max and AirPods 3 also have, points to where future ideas could head. Immersive audio is casual, and we use it all the time. Immersive video is hard and not always needed. I could see AR working as an audio-first approach, like a ping. Apple glasses could potentially do the world-scanning spatial awareness that would allow the spatial audio to work. In the meantime, Apple’s already developing the spatial audio tech that its VR headset would need.
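Some of that spatial audio plumbing is already public, too. Here’s a rough sketch using AVAudioEnvironmentNode, the iOS API that positions mono sound sources around a listener; Apple hasn’t said what its headset would actually use.

```swift
import AVFoundation

// A minimal sketch of the spatial audio building blocks iOS already exposes.
// AVAudioEnvironmentNode positions mono sources in 3D space around a listener;
// what Apple's headset will actually rely on is not public.
let engine = AVAudioEngine()
let environment = AVAudioEnvironmentNode()
let player = AVAudioPlayerNode()

engine.attach(environment)
engine.attach(player)

// The environment node mixes positioned sources down for the listener.
engine.connect(environment, to: engine.mainMixerNode, format: nil)

// Sources must be mono to be spatialized.
let monoFormat = AVAudioFormat(standardFormatWithSampleRate: 44_100, channels: 1)
engine.connect(player, to: environment, format: monoFormat)

player.renderingAlgorithm = .HRTFHQ                  // binaural rendering for headphones
player.position = AVAudio3DPoint(x: 2, y: 0, z: -1)  // place the sound to the right and slightly ahead
environment.listenerPosition = AVAudio3DPoint(x: 0, y: 0, z: 0)

try engine.start()
// Schedule a file or buffer on `player`, then call player.play() to hear the positioned source.
```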

Apple Watch and AirPods could be great companions

Apple’s already got a collection of wearable devices that connect with the iPhone, and two of them, the AirPods and the Apple Watch, make sense with glasses. The AirPods can pair for audio (although maybe the glasses have their own Bose Frames-like audio, too), while the Watch could be a helpful remote control. The Apple Watch already acts as a remote at times, for the Apple TV or for linking up with the iPhone camera. Apple’s future headsets could also look to the Watch and expand its display virtually, offering enhanced extras that show up discreetly, like a halo. Or they could use the Watch as some sort of controller.

The Apple Watch could also provide something that it’ll be hard to get from hand gestures or touch-sensitive frames on a pair of glasses: haptics. The rumbling feedback on the Watch could lend some tactile response to virtual things, possibly.

There’s already a low-cost pair of phone goggles, the HoloKit X, that explores these ideas. It uses an iPhone for the headset’s display and cameras and can channel spatial audio to AirPods, and use an Apple Watch for gesture controls. Apple could do the same.

Could Qualcomm and Apple’s reconciliation also be about XR?

Qualcomm and Apple are working together again on future iPhones, and I don’t think it’s just about modems. 5G is a key feature for phones, no doubt. But it’s also a killer element for next-gen AR and VR. Qualcomm has already been exploring how remote rendering could allow 5G-enabled phones and connected glasses to link up to streaming content and cloud-connected location data. Glasses could eventually stand on their own and use 5G for advanced computing, much as the Apple Watch eventually gained cellular connectivity.

Qualcomm’s chipsets are in almost every self-contained AR and VR headset I can think of (Meta Quest, HoloLens 2, a wave of new smart glasses, the latest version of Google Glass, Vive Focus). Will Apple’s tech dovetail at all with Qualcomm’s cross-device platforms?


Look Up Tonight to Spot November’s Supermoon, the Brightest Moon of 2025

Has the moon been looking brighter and bigger to you for the past few days? Here’s why this month’s supermoon is even more super.

It’s already a great month for skygazers, with a trio of meteor showers and the return of the northern hemisphere winter constellations. On Tuesday night, it also features the second of four supermoons in a row. This month’s supermoon will happen on Nov. 4-5, and November’s beaver moon is special because it’ll be the brightest full moon of 2025. 

In addition to being a supermoon, November’s full moon is known as the beaver moon. There is some debate as to why it was named this way. Some believe that this was the best time of year in the old days to set beaver traps to get pelts for winter clothing. Others believe that it coincides with the busiest part of the year for beavers, who are now stocking their lodges with supplies for the upcoming winter. 

Here’s what time it’ll look its biggest and brightest, and what else you need to know about the November supermoon.


The brightest supermoon: When’s the best time to see it?

The moon will reach peak illumination at 8:19 a.m. ET on Nov. 5, making the evening of Nov. 4 and the morning of Nov. 5 the best times to view the moon.

Since moon phases shift slowly, the moon will appear almost full for nearly a week. If you are unable to view the full moon on its best night due to weather or other reasons, you can still see a mostly full moon at any point from Nov. 3 to Nov. 8. 

For all of those days, the moon will be measurably brighter in the night sky than any other full moon in 2025. The reason is the moon’s elliptical orbit. Since its path isn’t a perfect circle, the moon’s 27.3-day journey around the Earth brings it closer to us at some points; the point of closest approach is known as perigee. If there is a full moon during this time, it’s branded a “perigean full moon,” which you may know better as a supermoon.

Not all supermoons are equal, and November’s will be a little more special than others. According to The Farmer’s Almanac, the beaver moon will be a scant 221,817 miles away from Earth, making it the closest full moon of the year. That means it’ll be the biggest and brightest of the year. 

In practice, the differences are fairly minor and likely won’t be visible to the naked eye when compared side by side to other supermoons. A supermoon is only about 7% larger than a regular full moon. According to NASA, the biggest difference is when comparing a supermoon to a micromoon, where a supermoon will be about 14% larger and 30% brighter. So, if you notice that your backyard patio is lit up more than usual, it’s because of the supermoon. 
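If you want to sanity-check those percentages, the math is simple: apparent size scales with one over the moon’s distance, and brightness with one over the distance squared. Here’s a rough calculation using typical perigee and apogee distances, not this month’s exact figures.

```swift
import Foundation

// A back-of-the-envelope check of the supermoon vs. micromoon comparison,
// using rough, typical perigee and apogee distances (in kilometers).
let perigeeDistance = 357_000.0   // roughly the closest a full moon gets (supermoon)
let apogeeDistance  = 406_000.0   // roughly the farthest (micromoon)

let sizeRatio = apogeeDistance / perigeeDistance   // apparent diameter scales with 1 / distance
let brightnessRatio = sizeRatio * sizeRatio        // brightness scales with 1 / distance squared

print(String(format: "A supermoon looks ~%.0f%% larger than a micromoon", (sizeRatio - 1) * 100))   // ~14%
print(String(format: "...and ~%.0f%% brighter", (brightnessRatio - 1) * 100))                       // ~29%
```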

Due to the moon’s orbit, November will also bring a micro new moon, which means the moon will be as far away from the Earth as it can get, a phenomenon known as apogee. November’s new moon occurs on Nov. 20, but you won’t be able to see it.


Stay Informed About Your Flights This Holiday Season With Your iPhone’s Tracker

Your iPhone is hiding a flight tracker. Here’s how it works.

Thanksgiving is only a few short weeks away, and if you plan on flying during the holiday season, keeping up to date on changes to your flights is crucial. Airports can be hectic during any holiday, but with the government shutdown continuing, flights are liable to change or be canceled more often.

Luckily, it’s never been easier to get up-to-date information about your flight. For starters, your airline probably has an app, and if not, you can check its website. If you’re in a hurry, you can Google the flight number. Or you can just use your iPhone’s built-in flight tracker that’s sneakily tucked away.

That’s right: Your iPhone has a flight tracker that you may have never known about. It’s there for when it’s needed. Below, we’ll show you how to access it in not one, but two places, so you never have to go hunting for your flight info elsewhere again.


For more on the iPhone, check out everything Apple announced at WWDC 2025.

How to track your flight via iMessage

Before we start, there are a few prerequisites you must meet:

  • Make sure iMessage is enabled (it doesn’t work with SMS/MMS).
  • You’ll need your flight number somewhere in your text messages, whether you’ve sent that information to someone (even yourself) or it’s been sent to you.
  • The flight number must be sent in this format: [Airline] [Flight number], for example, American Airlines 9707.

Launch the native Messages app on your iPhone and open the text message thread that contains your flight information. You’ll know the flight tracker feature works when the text with the flight information appears underlined, which means it’s actionable and you can tap on it. 

If your flight is still several months away or it’s already passed, you might see a message that says, «Flight information unavailable.» You might also see another flight that’s not yours because airlines recycle flight numbers.

You can check your flight status from Spotlight Search, too

If getting your flight information from Messages wasn’t easy enough, you can also grab the details right from your iPhone’s home screen by swiping down and typing your flight number into Spotlight Search. Even better, this works with Spotlight Search on your Mac, too.

How to access the hidden flight tracker

Although the airline name/flight number format highlighted above is the best way to go, there are other texting options that will lead you to the same result. Sticking with American Airlines 9707, other options that may bring up the flight tracker include:

  • AmericanAirlines9707 (no spaces)
  • AmericanAirlines 9707 (only one space)
  • AA9707 (airline name is abbreviated and no space)
  • AA 9707 (abbreviated and space)

I would suggest you keep the airline name spelled out completely and add a space between the two pieces of information — like in the previous section — because for some airlines, these alternative options may not work.
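Under the hood, iOS ships a public data detector that can pull the airline and flight number out of plain text, which is presumably related to what Messages is doing when it underlines a flight, though Apple doesn’t document that connection. Here’s a small sketch using the example flight number from above.

```swift
import Foundation

// A sketch of the public data detector that recognizes transit (flight) info in plain text.
// The sample message below is hypothetical and reuses the article's example flight number.
let message = "My flight is American Airlines 9707, landing around 6 p.m."

let detector = try NSDataDetector(
    types: NSTextCheckingResult.CheckingType.transitInformation.rawValue
)
let fullRange = NSRange(message.startIndex..., in: message)

for match in detector.matches(in: message, options: [], range: fullRange) {
    guard let components = match.components else { continue }
    print("Airline:", components[.airline] ?? "?")   // "American Airlines"
    print("Flight:", components[.flight] ?? "?")     // "9707"
}
```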

Real-time flight tracking

Once everything is set, tap on the flight information in your text messages. If the feature works correctly, you should see the following two options appear in a quick-action menu:

  • Preview Flight: Tap this to view more details about the flight.
  • Copy Flight Code: Copy the flight code to your clipboard (in case you want to send your flight details to someone else via text or email).

If you select Preview Flight, at the top of the window, you’ll see the best part of this feature: a real-time flight tracker map. A line will connect the two destinations, and a tiny airplane will move between them, indicating where the flight is at that exact moment.

Underneath the map, you’ll see important flight information:

  • Airline name and flight number
  • Flight status (arriving on time, delayed, canceled, etc.)
  • Terminal and gate numbers (for arrival and departure)
  • Arrival and departure time
  • Flight duration
  • Baggage claim (the number of the baggage carousel)

If you swipe left on the bottom half of the flight tracker, you can switch between flights, but only if there’s a return flight.

For more travel tips, don’t miss our test on whether AI can help you fly more sustainably.
