Technologies

Apple’s AR/VR Headset: What Could Be Coming in 2023

The company’s next big product should arrive next year. Here’s what we expect.

This story is part of Focal Point iPhone 2022, CNET’s collection of news, tips and advice around Apple’s most popular product.

Apple has been integrating augmented reality into its devices for years, but the company looks like it will leap right into the territory of Meta, Microsoft and Magic Leap with a long-expected mixed-reality headset in 2023.

The target date of this AR/VR headset keeps sliding, with the latest report in early December from noted analyst Ming-Chi Kuo suggesting an arrival in the second half of 2023. With an announcement event that could happen as soon as January, we're at the point where every Apple event feels like the one where the company could finally pull the covers off this device. Bloomberg's Mark Gurman reported in early January that he's heard the company is aiming to unveil the headset in the spring, ahead of the annual Worldwide Developers Conference in June.

2023 looks like a year full of virtual reality headsets that we originally expected in 2022, including the PlayStation VR 2 and Meta Quest 3. Apple has already laid down plenty of AR clues, hinting at what its mixed-reality future could hold and has been active in AR on its own iPhones and iPads for years.

As far as what its device could be like, odds are strong that the headset could work from a similar playbook as Meta’s recent high-end headset, the Quest Pro, with a focus on work, mixed reality and eye tracking onboard.

Here’s what we’re expecting.

Is its name Reality Pro? Is the software called xrOS?

The latest report from noted Apple reporter Mark Gurman at Bloomberg suggests the operating system for this headset could be called "xrOS," but that may not indicate the name of the headset itself. Recent trademark filings reported by Bloomberg showed the name "Reality" showing up a lot: Reality One, Reality Pro and Reality Processor. Apple's existing AR software framework for iOS is named RealityKit, and previous reports suggested that "Reality OS" could be the name for the new headset's ecosystem.

No one really expected the Apple Watch’s name (remember iWatch?), so to some degree, names don’t matter at this point. But it does indicate that Apple’s moving forward on a product and software, for sure.

One of several headsets?

The headset has been cooking for a long while. Reports have been going around for several years, including a story broken by former CNET Managing Editor Shara Tibken in 2018. Apple’s been building more advanced AR tools into its iPhones and iPads for years, setting the stage for something more.

Whatever the headset might become, it's looking a lot more real lately. A detailed report from The Information earlier this year discussed likely specs, which include what Bloomberg's Mark Gurman says is Apple's latest M2 chip. According to another report from Bloomberg earlier this year, Apple's board of directors has already seen a demonstration of the mixed-reality headset.

The expected arrival of this headset has kept sliding for years. Kuo previously predicted that Apple’s VR-AR headset would arrive in the fourth quarter of 2022 with Wi-Fi 6 and 6E support. But this VR-type headset could be the start of several lines of products, similar again to how Meta has been targeting future AR glasses. Kuo has previously predicted that Apple smart glasses may arrive in 2025.

Apple could take a dual headset approach, leading the way with a high-end AR-VR headset that may be more like what Meta has done with the Quest Pro, according to Bloomberg's Gurman. Gurman also suggests a focus on gaming, media and communication on this first-wave headset. In terms of communication, Gurman believes FaceTime using the rumored headset could rely on Memoji and SharePlay: Instead of seeing the person you're talking to, you'd see a 3D version of their personalized Memoji avatar.

Eventually, Apple's plans for this headset could become larger. The company's "goal is to replace the iPhone with AR in 10 years," Kuo explained in a note to investors, seen by MacRumors. The device could be relatively lightweight, about 300 to 400 grams (roughly 10.5 to 14 ounces), according to Kuo. That's lighter than Meta's Oculus Quest 2. However, it's larger than a normal pair of glasses, with early renders of its possible design looking a lot more like futuristic ski goggles.

Read more: The Metaverse is Just Getting Started: Here’s What You Need to Know

The headset could be expensive, maybe as much as $2,000 or more, with 8K displays, eye tracking and cameras that can scan the world and blend AR and VR together, according to a report from The Information last year. That's to be expected, considering the Quest Pro costs $1,500 and AR headsets like the Magic Leap 2 and HoloLens 2 are around $3,000.

It’s expected to feature advanced processors, likely based on Apple’s recent M2 chips, and work as a stand-alone device. But it could also connect with Apple’s other devices. That’s not a surprising move. In fact, most of the reports on Apple’s headset seem to line right up with how VR is evolving: lighter-weight, with added mixed-reality features via more advanced pass-through cameras. Much like the Quest Pro, this will likely be a bridge to future AR glasses efforts.

Previous reports on Apple’s AR/VR roadmap suggested internal disagreements, or a split strategy that could mean a VR headset first, and more normal-looking augmented reality smart glasses later. But recent reports seem to be settling down to tell the story of a particular type of advanced VR product leading the way. What’s increasingly clear is that the rest of the AR and VR landscape is facing a slower-than-expected road to AR glasses, too.

VR, however, is a more easily reachable goal in the short term.

Apple has waited in the wings all this time without any headset at all, although the company's aspirations in AR have been clear and well telegraphed on iPhones and iPads for years. Each year, Apple has made significant strides on iOS with its AR tools. How soon this hardware will emerge is still debated: this year, the year after or even further down the road. So is whether Apple proceeds with just glasses, or with a mixed-reality VR and AR headset, too.

I’ve worn more AR and VR headsets than I can even recall, and have been tracking the whole landscape for years. In a lot of ways, a future Apple AR headset’s logical flight path should be clear from just studying the pieces already laid out. Apple acquired VR media-streaming company NextVR in 2020 and it bought AR headset lens-maker Akonia Holographics in 2018.

I've had my own thoughts on what the long-rumored headset might be, and so far the reports align well with them. Much like the Apple Watch, which emerged among many other smartwatches and had a lot of features I'd seen in other forms before, Apple's glasses probably won't be a massive surprise if you've been paying attention to the AR and VR landscape lately.

Remember Google Glass? How about Snapchat's Spectacles? Or the HoloLens or Magic Leap? Meta is working on AR glasses too, as are Snap and Niantic. The landscape got crowded fast.

Here’s where Apple is likely to go based on what’s been reported, and how the company could avoid the pitfalls of those earlier platforms.

Apple declined to comment on this story.

Launch date: Looks likely for 2023

New Apple products tend to be announced months before they arrive, maybe even earlier. The iPhone, Apple Watch, HomePod and iPad all followed this path.

The latest reports from Kuo point to possible delays for the release of the headset to the second half of 2023, but an event announcing the headset could happen as soon as January. That timeframe would make a lot of sense, giving time for developers to understand the concept well ahead of the hardware’s release, and even possibly allowing for Apple’s WWDC developer conference (usually in June) to go over specifics of the software.

Either way, developers would need a long head start to get used to developing for Apple’s headset, and making apps work and flow with whatever Apple’s design guidance will be. That’s going to require Apple giving a heads-up on its hardware well in advance of its actual arrival.

An Apple headset could be a lot like the Meta Quest, but higher end

There’s already one well-polished success story in VR, and the Quest 2 looks to be as good a model as any for where future headsets could aim. Gurman’s report makes a potential Apple VR headset sound a lot like Facebook’s stand-alone device, with controller-free hand tracking and spatial room awareness that could be achieved with Apple’s lidar sensor technology, introduced on the iPad Pro and iPhone 12 Pro.

Apple’s headset could end up serving a more limited professional or creative crowd. But it could also go for a mainstream focus on gaming or fitness. My experiences with the Oculus Quest’s fitness tools feel like a natural direction for Apple to head in, now that the Apple Watch is extending to subscription fitness training, pairing with TVs and other devices.

The Oculus Quest 2 (now officially the Meta Quest 2) can see through to the real world and extend some level of overlap of virtual objects like room boundaries, but Apple’s headset could explore passthrough augmented reality to a greater degree. I’ve seen impressive examples of this in headsets from companies such as Varjo. It could be a stepping stone for Apple to develop 3D augmented reality tech on smaller glasses designs down the road.

Right now, there aren't any smart glasses manufacturers able to develop normal-looking glasses that can achieve advanced, spatially aware 3D overlays of holographic objects. Some devices, like the Nreal Light, have tried, with mixed success. Meta's first smart glasses, Ray-Ban Stories, weren't AR at all. Meta is working on ways to achieve that tech later on. Apple might take a similar approach with glasses, too.

The VR headset could be a ‘Pro’ device

Most existing reports suggest Apple’s VR headset would likely be so expensive — and powerful — that it will have to aim for a limited crowd rather than the mainstream. If so, it could target the same business and creative professionals that more advanced VR headsets like the Varjo XR-3 and Meta Quest Pro are already aiming for.

I tried Varjo’s hardware. My experience with it could hint at what Apple’s headset might also be focusing on. It has a much higher-resolution display (which Apple is apparently going to try to achieve), can blend AR and VR into mixed reality using its passthrough cameras, and is designed for pro-level creative tools. Apple could integrate something similar to its lidar sensors. The Quest Pro does something similar, but in a standalone device without as high-end a display.

Varjo’s headset, and most professional VR headsets, are tethered to PCs with a number of cables. Apple’s headset could work as a standalone device, like the Quest 2 and Quest Pro, and also work when connected to a Mac or iPad, much like the Quest 2 already does with Windows gaming PCs. Apple’s advantage could be making a pro headset that is a lot more lightweight and seamlessly standalone than any other current PC-ready gear. But what remains unknown is how many apps and tools Apple will be able to introduce to make its headset feel like a tool that’s truly useful for creators.

Controls: Hand tracking or a small wearable device?

The Information’s previous reports on Apple’s headset suggest a more pared-down control system than the elaborate and large game controller-like peripherals used by many VR headsets right now. Apple’s headset should work using hand tracking, much like many VR and AR headsets already enable. But Apple would likely need some sort of controller-type accessory for inputs, too. Cracking the control and input challenge seems to be one of the bigger hurdles Apple could face.

Recent patent filings point to a possible smart ring-type device that could work for air gestures and motion, and maybe even work with accessories. It’s also possible that Apple might lean on some of its own existing hardware to act as inputs, too.

Could that controller be an Apple Watch? Possibly, but the Apple Watch’s motion-control capabilities and touchscreen may not be enough for the deeper interactions an Apple headset would need. Maybe iPhones could pair and be used as controllers, too. That’s how Qualcomm is envisioning its next wave of phone-connected glasses.

Future AR smart glasses may also be in the works

Getting people to put on an AR headset is hard. I’ve found it a struggle to remember to pack smart glasses, and find room to carry them. Most of them don’t support my prescription, either. Developer-focused AR glasses made by Snap that I tried at home show what everyday AR glasses could look like someday, but they’re still a work in progress.

Qualcomm’s plans for AR glasses show a wave of devices arriving between 2023 and 2025, but at this point no one has been able to crack making a perfect pair. Software, battery life and even common cross-platform interfaces remain a big challenge.

Kuo’s prediction of AR glasses coming a few years after a VR-AR goggle-type headset would line up with what other companies are promising. The challenges with AR glasses are a lot greater than VR. No one’s figured out how wearing them all the time would work, or how you’d interact with virtual objects: Hand tracking? A watch or a ring? Voice? Neural inputs?

Apple always touted the Apple Watch, first and foremost, as a "great watch." I would expect the same from its glasses. If Apple makes prescription glasses and makes them available, Warby Parker-style, in seasonal frames from its Apple Stores, that might be enough for people if the frames look good. Apple's VR headset, according to Gurman, will also offer prescription lenses. That could be a stepping stone to developing glasses later on.

Google acquired smart glasses manufacturer North in 2020, which made a prescription, almost normal set of eyewear. North's concept for glasses might be too similar to Google Glass for Apple's tastes, but the idea of AR glasses doubling as functional glasses sounds extremely Apple-like. More recently, Vuzix's planned smart glasses for 2021 show how far the tech has shrunk, but even those planned glasses won't have the ability to spatially scan the world and overlay augmented reality: They'll be more like advanced glasses with heads-up displays and 3D audio.

A report from The Information in 2020 said new AR lenses were entering a trial production phase for Apple’s AR hardware (9to5Mac also broke the report down). These lenses sound much closer to normal glasses than current AR headsets allow, but when would those be ready?

Could Apple make its first smart glasses something more basic, slowly adding more AR features over time and letting newcomers settle into the experience? Or would Apple try to crack the AR challenge with its first pair of glasses? Augmented reality is a weird concept for eyewear, and potentially off-putting. Maybe Apple will aim for subtlety. The original Apple Watch was designed to be glanced at for just 5 seconds at a time.

A recent patent filing also showed Apple looking to solve vision conditions with adaptive lenses. If true, this could be the biggest killer app of Apple’s intelligent eyewear.

Are the AirPods Max a sign of how expensive a headset could be?

The business-focused HoloLens and Magic Leap cost thousands of dollars. Current VR headsets have trended towards $500 or more.

The latest price reports suggest something between $2,000 and $3,000, which is in the territory of business-focused AR headsets like the HoloLens 2, or business-creative VR headsets like those from Varjo. An analysis from TrendForce published in February also estimates that an Apple headset's hardware would cost in the thousands, and it predicts that Apple would employ a "monthly subscription-based software solution."

Apple’s headphones, the AirPods Max, indicate that the pricing could climb high. At $549, they cost more than a PlayStation 5. And those are just headphones. A pair of smart glasses, or an advanced VR headset, would be a lot more advanced.

iPhone-connected, too?

Qualcomm’s AR and VR plans telegraph the next wave of headsets: Many of them will be driven by phones. Phone-powered glasses can be lighter and just have key onboard cameras and sensors to measure movement and capture information. Meanwhile the phone does the heavy lifting and doesn’t drain headset battery life.

Apple’s star device is the iPhone, and it’s already loaded with advanced chipsets that can do tons of AR and computer vision computation. It could already power an AR headset right now; imagine what could happen in another year or two.

Apple could also have its own high-end dedicated chip in its first wave of VR and AR headsets, as reports suggest, but they’ll also undoubtedly dovetail with more advanced processors in Apple’s phones, tablets and Macs. Over time, this could mean smaller glasses that lean on connecting to other Apple devices, or the cloud.

How Apple could blend the real world with AR and VR

Apple already dabbles with AR overlays with real world locations: QR code and NFC-enabled App Clips can launch experiences from real-world locations with a tap or scan. These micro apps are made to work with AR, too: With glasses or an AR headset, they could eventually launch interactions at a glance.

Maybe QR codes can help accelerate AR working in the «dumb» world. Apple’s iPhones also have a U1 chip that can be used to improve accuracy in AR object placement, and also to more quickly locate other Apple devices that have the U1 chip, too.

Apple’s AirTags arrived in 2021 with features similar to Samsung’s SmartTags Plus that use similar ultrawideband technology. These tags could be seen via an iPhone app using AR, which could possibly extend into Apple’s future VR or AR headsets. If all Apple’s objects recognize each other, they could act as beacons in a home. The U1 chips could also be indoor navigation tools for added precision.

Microsoft’s collaborative mixed-reality platform, Mesh, shows how meetings with people in virtual spaces could happen instantly and in work-like environments. Apple already enables multiperson AR in real places, but a necessary next step would be to allow a platform for collaboration in AR and VR like Microsoft is developing.

Apple’s depth-sensing hardware is already here

Apple is already deeply invested in camera arrays that can sense the world from short and long distances. The front-facing TrueDepth camera, which Apple has used on every Face ID iPhone since the X, is like a shrunken-down Microsoft Kinect and can scan a few feet out, sensing 3D information with high enough accuracy to be used for a secure face scan. Apple’s lidar technology on its recent iPhones and iPads can scan out much further, several meters away. That’s the range that glasses would need.

Apple’s existing lidar technology, combined with cameras, is already good enough to scan environments and 3D objects. Add to this the wider-scale lidar scanning Apple is doing in Maps to enable overlays of real-world locations with virtual objects via a technology called Location Anchors, and suddenly it seems like the depth-scanning Apple is introducing could expand to worldwide ambitions.

Apple’s new Mac chips already point toward VR-AR compatibility

Apple's M1 Macs and those since have far more of the power needed to run AR and VR, and they handle graphics much the way iPhones and iPads do. Developing a common groundwork across devices could allow a headset to feasibly run on an iPhone, iPad or Mac, making it a universal Apple device accessory.

That would be essential if Apple intends on its VR or AR headsets to have any role in creative workflows, or be used for games or apps. It’s one of the limitations of existing VR headsets, which need to run off particular Windows gaming PCs and still don’t play that well with iOS or Android phones.

Look to AirPods for ease of use — and audio augmented reality

I've thought about how the AirPods' comfort, and their weird design, amounted to an early experiment in wearing Apple's hardware directly on our faces. It was a success: It proved that doing so could be accepted and become normal. AirPods are expensive compared to in-box wired buds, but they're also utilitarian. They're relaxed. If Apple's working on AR or VR headsets, they'll need to feel the same way.

The AirPods Pro's spatial audio, which the AirPods Max and AirPods 3 also have, points to where future ideas could head. Immersive audio is casual, and we do it all the time. Immersive video is hard and not always needed. I could see AR working as an audio-first approach, like a ping. Apple glasses could potentially do the world-scanning spatial awareness that would allow the spatial audio to work. In the meantime, Apple's already developing the spatial audio tech that its VR headset would need.

Apple Watch and AirPods could be great companions

Apple already has a pair of wearable devices that connect with the iPhone, and both make sense with glasses. Its AirPods can pair for audio (although maybe the glasses will have their own Bose Frames-like audio, too), while the Watch could be a helpful remote control. The Apple Watch already acts as a remote at times, for the Apple TV or for linking up with the iPhone camera. Apple's future headsets could also look to the Watch and expand its display virtually, offering enhanced extras that show up discreetly, like a halo. Or they could use the Watch as some sort of controller.

The Apple Watch could also provide something that it’ll be hard to get from hand gestures or touch-sensitive frames on a pair of glasses: haptics. The rumbling feedback on the Watch could lend some tactile response to virtual things, possibly.

There’s already a low-cost pair of phone goggles, the HoloKit X, that explores these ideas. It uses an iPhone for the headset’s display and cameras and can channel spatial audio to AirPods, and use an Apple Watch for gesture controls. Apple could do the same.

Could Qualcomm and Apple’s reconciliation also be about XR?

Qualcomm and Apple are working together again on future iPhones, and I don’t think it’s just about modems. 5G is a key feature for phones, no doubt. But it’s also a killer element for next-gen AR and VR. Qualcomm has already been exploring how remote rendering could allow 5G-enabled phones and connected glasses to link up to streaming content and cloud-connected location data. Glasses could eventually stand on their own and use 5G to do advanced computing, in a way like the Apple Watch eventually working over cellular.

Qualcomm’s chipsets are in almost every self-contained AR and VR headset I can think of (Meta Quest, HoloLens 2, a wave of new smart glasses, the latest version of Google Glass, Vive Focus). Will Apple’s tech dovetail at all with Qualcomm’s cross-device platforms?


iOS 17 Cheat Sheet: Your Questions on the iPhone Update Answered

Here’s what you need to know about new features and upcoming updates for your iPhone.

Apple’s iOS 17 was released in September, shortly after the company held its Wonderlust event, where the tech giant announced the new iPhone 15 lineup, the Apple Watch Series 9 and the Apple Watch Ultra 2. We put together this cheat sheet to help you learn about and use the new features in iOS 17. It’ll also help you keep track of the subsequent iOS 17 updates.

iOS 17 updates

Using iOS 17

Getting started with iOS 17

Make sure to check back periodically for more iOS 17 tips and how to use new features as Apple releases more updates.



Get Ready for a Striking Aurora That Could Also Disrupt Radio Communications

Don’t expect the storm to cause a lingering problem, though.

A geomagnetic storm is threatening radio communications Monday night, but that doesn’t mean you should be concerned. In fact, it may be an opportunity to see a colorful aurora in the night sky.

The National Oceanic and Atmospheric Administration has issued a geomagnetic storm watch after witnessing a coronal mass ejection from the sun on Saturday. The watch, which was issued over the weekend and will expire after Monday, said the onset of the storm passing over Earth on Sunday night represented a "moderate" threat to communications. As the storm continues to pass through, it could deliver a "strong" threat on Monday night that could cause radio communications to be temporarily disrupted during the worst of it.

Even so, NOAA said, "the general public should not be concerned."

A coronal mass ejection occurs when magnetic field and plasma mass are violently expelled from the sun’s corona, or the outermost portion of the sun’s atmosphere. In the vast majority of cases, the ejection occurs with no real threat to Earth. However, in the event the ejection happens in the planet’s direction, a geomagnetic storm occurs, and the Earth’s magnetic field is temporarily affected.

In most cases, geomagnetic storms cause little to no disruption on Earth, with radio communications and satellites affected most often. In extreme cases, a geomagnetic storm can cause significant and potentially life-threatening power outages — a prospect that, luckily, the planet hasn’t faced.

Switching poles

Every 11 years, the sun's magnetic poles switch, with the north pole and south pole swapping positions. During those cycles, the sun's activity ramps up as it gets closer to pole-switching time. The height of its activity is called solar maximum, and scientists believe we may be entering solar maximum or may already be in it.

During periods of heightened solar activity, sunspots increase on the sun and there’s an increase in coronal mass ejections, among other phenomena. According to NOAA, solar maximum could extend into October of this year before the sun’s activity calms and it works towards its less-active phase, solar minimum.

Even when geomagnetic storms hit Earth and disrupt communications, the effects are usually short-lived. Those most affected, including power grid operators and pilots and air traffic controllers communicating over long distances, have fail-safe technologies and backup communications to ensure operational continuity.

But geomagnetic storms aren’t only about radios. In most cases, they also present unique opportunities to see auroras in the night sky. When the storms hit, the plasma they carry creates a jaw-dropping aurora, illuminating the night sky with brilliant colors. Those auroras can be especially pronounced during the most intense phases of the storm, making for nice stargazing.

If you're interested in seeing the aurora, you'll need to be ready. NOAA said the "brunt of the storm has passed," and even if it lingers into Tuesday, there won't be much to see after Monday night.



Last Total Solar Eclipse for 20 Years Is Coming: How to See and Photograph It

It’s your last chance until 2044.

Get your eclipse glasses ready, skygazers: The Great American Eclipse is on its way. On April 8, there'll be a total eclipse over North America, the last one until 2044.

A total solar eclipse happens when the moon passes between the Earth and the sun, blocking the sun and turning an otherwise sunny day to darkness for a short period of time. Depending on the angle at which you’re viewing the eclipse, you may see the sun completely shrouded by the moon (called totality) or some variation of it. The more off-angle you are and the further you are from the path of the eclipse, the less likely you’ll be to see the totality.

The 2024 total solar eclipse will happen on Monday, April 8. The Great American Eclipse will reach the Mexican Pacific coast at 11:07 a.m. PT (2:07 p.m. ET), and then traverse the US in a northeasterly direction from Texas to Maine, and on into easternmost Canada. If you want a good look at it, but don’t live in the path of totality, you shouldn’t wait much longer to book accommodation and travel to a spot on the path.

Or how about booking a seat in the sky? Delta Air Lines made headlines for offering a flight that allows you to see the entire path of totality. Its first eclipse flight, from Austin, Texas, to Detroit, sold out quickly. But as of Monday, Delta has added a second flight, from Dallas to Detroit, which also covers the path of totality. The airline also has five flights that will offer prime eclipse viewing.

Not everyone can get on one of those elusive eclipse-viewing flights. Here’s a look at other options to nab a chance to see this rare sight and what to know about it.

Total solar eclipse path

The eclipse will cross over the Pacific coast of Mexico and head northeast over mainland Mexico. The eclipse will then make its way over San Antonio at approximately 2:30 p.m. ET on April 8 and move through Texas, over the southeastern part of Oklahoma and northern Arkansas by 2:50 p.m. ET.

By 3 p.m. ET, the eclipse will be over southern Illinois, and just 5 minutes later, will be traveling over Indianapolis. Folks in northwestern Ohio will be treated to the eclipse by 3:15 p.m. ET, and it will then travel over Lake Erie and Buffalo, New York, by 3:20 p.m. ET. Over the next 10 minutes, the eclipse will be seen over northern New York state, then over Vermont. By 3:35 p.m. ET, the eclipse will work its way into Canada and off the Eastern coast of North America.

Best places to watch the Great American Eclipse

When evaluating the best places to watch this year’s total eclipse, you’ll first want to determine where you’ll have the best angle to see the totality. The farther off-angle you are — in other words, the farther north or south of the eclipse’s path — the less of an impact you can expect.

Therefore, if you want to have the best chance of experiencing the eclipse, you’ll want to be in its path. As of this writing, most of the cities in the eclipse’s path have some hotel availability, but recent reports have suggested that rooms are booking up. And as more rooms are booked, prices are going up.

So if you want to be in the eclipse’s path, and need a hotel to do it, move fast. And Delta’s eclipse-viewing flight from Dallas to Detroit has just four seats left at the time of publication.

Eclipse eye safety and photography

As with any solar eclipse, it’s critical you keep eye safety in mind.

During the eclipse, and especially during the periods before and after totality, don’t look directly at the sun without special eye protection. Also, be sure not to look at the sun through a camera (including the camera on your phone), binoculars, a telescope or any other viewing device. This could cause serious eye injury. Sunglasses aren’t enough to protect your eyes from damage.

If you want to view the eclipse, you'll instead need solar viewing glasses that comply with the ISO 12312-2 safety standard. Anything that doesn't meet that standard won't be dark enough to protect your eyes. Want to get them for free? If you've got a Warby Parker eyeglasses store nearby, the company is giving away free, ISO-certified solar eclipse glasses at all of its stores from April 1 until the eclipse, while supplies last.

If you don’t have eclipse viewing glasses handy, you can instead use indirect methods for viewing the eclipse, like a pinhole projector.

Read more: A Photographer’s Adventure With the Eclipse

In the event you want to take pictures of the eclipse, attach a certified solar filter to your camera. Doing so will protect your eyes and allow you to take photos while you view the eclipse through your lens.

There’s also a new app to help you both protect your eyes and take better photos of the eclipse on your phone. Solar Snap, designed by a former Hubble Space Telescope astronomer, comes with a Solar Snap camera filter that attaches to the back of an iPhone or Android phone, along with solar eclipse glasses for protecting your eyesight during the event. After you attach the filter to your phone, you can use the free Solar Snap Eclipse app to zoom in on the eclipse, adjust exposure and other camera settings, and ultimately take better shots of the eclipse.

2024 eclipse compared to 2017

The last total solar eclipse occurred in 2017, and many Americans had a great view. Although there are plenty of similarities between the 2017 total solar eclipse and the one coming April 8, there are a handful of differences. Mainly, the 2024 eclipse is going to cover more land and last longer.

The 2017 eclipse started over the northwest US and moved southeast. Additionally, that eclipse’s path was up to 71 miles wide, compared with a maximum width of 122 miles for this year’s eclipse. Perhaps most importantly, the moon completely covered the sun for just 2 minutes, 40 seconds in 2017. This year, maximum totality will last for nearly four-and-a-half minutes.

