Pixel 7 Pro Actually Challenges My $10,000 DSLR Camera Gear
My full-frame Canon camera is better, but Google’s flagship phone opens creative options far beyond snapshots.
Google got my attention by bragging about the Pixel 7 Pro’s “pro-level zoom” and asserting that the Android phone’s photography features can challenge traditional cameras. I’m one of those serious photographers who hauls around a heavy camera and a bunch of bulky lenses. But I also love phone photography, so I decided to test Google’s claims.
At its October launch event, Google touted the Pixel 7 Pro’s telephoto zoom for magnifying distant subjects, its Tensor G2-powered AI processing, its faster Night Sight for low-light scenes and a new macro ability for closeup photos. “It cleverly combines state-of-the-art hardware, software and machine learning to create amazing zoom photos across any magnification,” Pixel camera hardware chief Alexander Schiffhauer said. Google wants you to think of this phone as offering a continuous zoom range from ultrawide angle to supertelephoto.
As you might imagine, I got better results from my “real” camera equipment, which would cost $10,000 if purchased new today. Even though my Canon 5D Mark IV is now 6 years old, it’s hard to beat a big image sensor and big lenses when it comes to color, sharpness, detail and a wide dynamic range spanning bright and dark tones.
But the Pixel 7 Pro’s photographic flexibility challenges my camera setup better than any other phone I’ve used, even outperforming my DSLR in some circumstances and earning a “stellar” rating from CNET editor Andrew Lanxon. While my camera and four lenses fill a whole backpack, Google’s smartphone fits in my pocket. And of course that $900 smartphone lets me share a selfie, check my email, pay for the groceries and tackle the daily crossword puzzle.
With the steady annual improvement in smartphone camera hardware and image processing, a smartphone isn’t just a better-than-nothing camera. These little slices of electronics are increasingly able to nail important shots and open up new creative possibilities for those who are discovering the rewards of photography.
I’ll keep hauling my DSLR on hikes and family outings. But I won’t always have it with me, and the Pixel 7 Pro’s zoom and low-light abilities in particular mean I’ll worry less about missing the shot when I don’t.
My Canon 5D Mark IV, which costs $2,700 new these days, most often has the $1,900 Canon EF 24-70mm f/2.8L II USM lens mounted. I also use the $2,400 EF 100-400mm f/4.5-5.6L IS II USM for telephoto shots, the $1,300 ultrawide EF 16-35mm f/4L IS USM zoom, the $1,300 EF 100mm f/2.8L Macro IS USM for closeups, and the $429 Extender EF 1.4X III for more telephoto reach when photographing birds. Here’s how that gear stacks up against the Pixel 7 Pro’s 0.5x ultrawide, 1x main camera and 5x telephoto camera.
Google Pixel 7 Pro vs. Canon 5D Mark IV, main camera
With plenty of light, the Pixel 7 Pro’s 24mm main camera does a good job capturing color and detail in its 12-megapixel images. Check the comparisons here (and note that my DSLR shoots in a more elongated 3:2 aspect ratio than the Pixel 7 Pro’s 4:3).
Pixel peeping shows the phone can’t hold a candle to my 30-megapixel DSLR when it comes to detail. If you’re printing posters or need a lot of detail for photo editing, a modern DSLR or mirrorless camera is worth it. But 12 megapixels is plenty for most purposes. Check the cropped images below to see what’s going on up close.
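A rough rule of thumb puts those megapixel counts in context: divide pixel dimensions by a print resolution of about 300 dots per inch to get the largest print that still looks crisp. The sketch below uses approximate pixel dimensions for each camera’s output, so treat the results as ballpark figures.

```python
def max_print_size(width_px: int, height_px: int, dpi: int = 300):
    """Largest print, in inches, that stays sharp at the given resolution."""
    return width_px / dpi, height_px / dpi

# Approximate pixel dimensions for each camera's output.
print(max_print_size(4080, 3072))  # Pixel 7 Pro, ~12MP: about 13.6 x 10.2 inches
print(max_print_size(6720, 4480))  # Canon 5D Mark IV, ~30MP: about 22.4 x 14.9 inches
```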
Google missed a chance to shoot even higher-resolution photos than my 30-megapixel DSLR, though. The Pixel 7 Pro’s main camera has a 50-megapixel sensor, but it takes 12-megapixel photos using an approach called pixel binning, which combines each 2×2 group of pixels on the sensor into one effectively larger pixel. That means better color and low-light performance when shooting at 24mm. Those 50 megapixels could be put to a different use, though: skip the binning and shoot at the sensor’s full resolution when there’s plenty of light. That’s exactly what Apple does with the iPhone 14 Pro camera, and I wish Google offered the same option.
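For the curious, here’s what 2×2 binning boils down to, sketched in Python with NumPy. It’s a simplification: it ignores the color filter array and the dedicated hardware that does this on the phone, and the simulated frame is shrunk to keep the example light.

```python
import numpy as np

def bin_2x2(raw: np.ndarray) -> np.ndarray:
    """Average each 2x2 block of pixels into one effectively larger pixel.

    A ~50-megapixel readout (around 8160 x 6144) would become a
    ~12.5-megapixel one (4080 x 3072) with better per-pixel light gathering.
    """
    h, w = raw.shape
    assert h % 2 == 0 and w % 2 == 0, "dimensions must be even"
    # Group the frame into 2x2 blocks and average each block.
    return raw.reshape(h // 2, 2, w // 2, 2).mean(axis=(1, 3))

# Simulated (much smaller) sensor readout.
sensor = np.random.randint(0, 1024, size=(1536, 2040)).astype(np.float32)
binned = bin_2x2(sensor)
print(sensor.shape, "->", binned.shape)  # (1536, 2040) -> (768, 1020)
```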
Pixel 7 Pro vs. DSLR, people and pets
The Pixel 7 Pro was capable at portrait photography. I prefer shooting raw and editing the shots myself, because I sometimes find the Pixel 7 Pro makes faces look a little too processed, and its color balance runs a bit cool for my tastes. With the main camera, the Pixel 7 Pro does a pretty good job finding faces, tracking them and staying focused. New for 2022, the Pixel 7 Pro can find individual eyes, the ideal focus point for a portrait and a weak point of my older DSLR.
In this comparison, the DSLR did a better job with skin tones, but the Pixel 7 Pro capably exposed the face in tricky lighting.
Using the Pixel 7 Pro’s portrait mode, which artificially blurs photo backgrounds, I find the processing artifacts distracting, especially with flyaway hair, though that’s not a problem with the example below. The shot is workable for quick sharing and looks fine on smaller screens, but I wouldn’t make a print of it. For the DSLR shot, I used my Sigma 35mm f1.4 lens, shooting wide open at f1.4 for the smoothest possible background blur. It’s much better than the Pixel 7 Pro, though its shallow depth of field blurs the hands and plastic toys.
For pets, the Pixel 7 Pro again did a great job finding and focusing on eyes. Here’s my dog, up close. The main camera at 1x zoom, or 24mm, isn’t ideal for single subjects, though, and the camera’s performance at 2x isn’t as strong, so bear that in mind.
To see how much more detail my DSLR can capture — as long as I get focus right — check the cropped views below. And note that new mirrorless cameras from Sony, Nikon and Canon do a good job with eye tracking for easier focus.
DSLR vs. Pixel 7 Pro, telephoto cameras
Telephoto lenses magnify more distant subjects, and the Pixel 7 Pro has a remarkable range for a smartphone. Its sensors can shoot at 2x, 5x and 10x zoom modes with minimal processing trickery. It’ll shoot at intermediate settings with various combinations of cropping and multi-camera image compositing that I find fairly convincing. Then it reaches up to 30x with Google’s AI-infused upscaling technology, called Super Res Zoom. Here’s the same scene shot across the Pixel 7 Pro’s full range from supertelephoto 30x to ultrawide 0.5x:
The image quality is pretty bad by the time you reach 30x zoom, an equivalent of 720mm. But even my expensive DSLR gear tops out at 560mm, and venturing beyond 10x on the Pixel 7 Pro can be justified in plenty of circumstances. Not every photo has to be good enough to make an 8×10 print.
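Those zoom multipliers map to full-frame-equivalent focal lengths by simple multiplication against the 24mm main camera, which is where figures like 720mm come from. A quick sketch of that arithmetic (the equivalents are nominal; the actual lenses vary slightly):

```python
MAIN_CAMERA_EQUIV_MM = 24  # the Pixel 7 Pro's 1x main camera

def equivalent_focal_length(zoom: float) -> float:
    """Full-frame-equivalent focal length for a given zoom multiplier."""
    return zoom * MAIN_CAMERA_EQUIV_MM

for zoom in (0.5, 1, 2, 5, 10, 20, 30):
    print(f"{zoom:>4}x -> {equivalent_focal_length(zoom):.0f}mm equivalent")
# 0.5x -> 12mm ... 30x -> 720mm, versus the DSLR's 560mm maximum
# (a 400mm lens with a 1.4x extender).
```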
Bigger telephoto photography
Telephoto lenses are big, which is why those pro photographers at NFL games haul around monopods to support their hulking optics. Canon’s RF 400mm f/2.8 L IS USM lens, popular on the sidelines, weighs more than six pounds, measures more than 14 inches long, and costs more than my entire collection of cameras and lenses. My Canon 100-400mm zoom is smaller and cheaper, though it doesn’t let in as much light, and it’s still gargantuan compared with the Pixel 7 Pro. I’m delighted to be able to capture useful telephoto shots on a Pixel phone, an option that previously was available only on rival Android phones from Samsung and others.
Google exploits the Pixel 7 Pro’s 50-megapixel main camera sensor for the first step up the telephoto lens ladder, a 2x zoom level good for portraits. The Pixel 7 Pro uses just the central 12 megapixels to capture a 12-megapixel photo in 2x telephoto mode, an equivalent focal length of 48mm.
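The arithmetic behind that 2x mode is worth spelling out: cropping to the central half of the sensor’s width and height keeps a quarter of the pixels and doubles the equivalent focal length, which is how a 50-megapixel sensor yields a 12-megapixel shot at 48mm. A small sketch, using approximate sensor dimensions:

```python
def crop_zoom(width_px: int, height_px: int, focal_equiv_mm: float, crop: float):
    """Simulate zooming by cropping the central region of the sensor."""
    new_w, new_h = int(width_px / crop), int(height_px / crop)
    megapixels = new_w * new_h / 1e6
    return new_w, new_h, round(megapixels, 1), focal_equiv_mm * crop

# Approximate 50MP main sensor behind a 24mm-equivalent lens, cropped 2x.
print(crop_zoom(8160, 6144, 24, 2))  # -> (4080, 3072, 12.5, 48)
```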
The dedicated telephoto camera kicks in at 5x zoom, an equivalent of 120mm. Instead of a bulky telephoto protuberance, Google uses a prism to bend light 90 degrees so the necessary lens length and 48-megapixel image sensor can be tucked sideways within the Pixel 7 Pro’s thicker “camera bar” section. That camera can also crop to its central 12 megapixels for a 10x mode, or 240mm, an option I think is terrific. This San Francisco architectural sight below is pretty good:
Using AI and software processing to zoom further, the camera can reach 20x and even 30x zoom, which translates to 480mm and 720mm. By comparison, my DSLR reaches 560mm with my 1.4x telephoto extender.
My DSLR would have trounced the Pixel 7 Pro for this scene of Bay Area fog lapping up against the Santa Cruz Mountains south of San Francisco, shot somewhere between 15x and 20x. (I wish Google would write zoom level metadata into photos the way my Canon records lens focal length settings.) But guess what? I was mountain biking and didn’t take my DSLR. The best camera is the one you have, as the saying goes.
Back at 10x zoom, I was pleased with this shot below of my pal Joe mountain biking. I’ve photographed people in this very spot before with smartphones, and this was the first time I wasn’t frustrated with the results.
Google’s optics and image processing methods are clever but not magical. The Pixel 7 Pro produces a 12-megapixel image, but the farther beyond 10x you shoot, the more you’ll cringe at blotchy details that look more like a watercolor painting. That’s the glass-is-half-empty view. I’m actually on the glass-is-half-full side, appreciating what you can do and recognizing that a lot of photos will be viewed on smaller screens. Image quality at 10x is respectable, and that alone is a major achievement.
Here’s a comparison of a rooftop party photographed with the Pixel 7 Pro at 30x, or 720mm equivalent, and my camera at 560mm, but cropped in to match the phone’s framing. The DSLR does better, of course. Even cropped, it’s an 18-megapixel image.
Practical limits on Pixel 7 Pro’s telephoto cameras
To really exercise the phone, I toted it to see the US Navy’s Blue Angels flight display over San Francisco. Buildings and fog blocking my view made photography tough, but I also ran into new limitations of the Pixel 7 Pro itself.
Fiddling with screen controls to hit 10x or more zoom is slow. Framing fast-moving subjects on a smartphone screen is hard, even with the aid of the miniature wider-angle view that Google pops into the scene and its AI-assisted stabilization technology. Focus is also relatively pokey. With my DSLR, I could rapidly find the jets in the sky, lock focus, track them as they flew and shoot a burst of shots.
I didn’t get a single good photo of the Blue Angels with the Pixel 7 Pro. Google’s “pro-level zoom” works much better with stationary subjects.
DSLR vs. Pixel 7 Pro, shooting in the dark
Here’s where the Pixel 7 Pro beats out a vastly more expensive camera. There’s no way you can hold a camera steady for 6 seconds, but Pixel phones in effect can thanks to computational photography techniques that Google pioneered. Google takes a collection of photos, using AI to judge when your hands are most still, then combines these individual frames into one shot. It’s the basis of its Night Sight feature, which I’ve used many times and, at its extreme, powers an astrophotography mode I’ve used to take 4-minute exposures of the night sky.
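The core idea, aligning a burst of short handheld exposures and averaging them to cut noise, can be sketched in a few lines. Google’s real pipeline adds robust tile-based alignment, AI frame selection and tone mapping on top, so the snippet below is only a toy illustration of the averaging step, and it assumes the frames are already aligned.

```python
import numpy as np

def merge_burst(frames: list) -> np.ndarray:
    """Average a burst of aligned frames.

    Averaging N frames cuts random sensor noise by roughly sqrt(N), which is
    why a 6-second 'exposure' built from many short frames can stay sharp
    while a single 6-second handheld exposure cannot.
    """
    return np.stack(frames).astype(np.float32).mean(axis=0)

# Toy example: 15 noisy copies of the same (already aligned) scene.
rng = np.random.default_rng(0)
scene = rng.uniform(0, 1, size=(480, 640))
burst = [scene + rng.normal(0, 0.1, scene.shape) for _ in range(15)]
merged = merge_burst(burst)
print(np.std(burst[0] - scene), np.std(merged - scene))  # noise drops by ~sqrt(15)
```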
Below is a comparison of a nighttime scene with the Pixel 7 Pro at 1x, where it’s best at gathering light, and my DSLR with its 24-70mm f2.8 lens. The DSLR has more detail up close, but the Pixel 7 Pro does well, and its deeper depth of field means the leaves in the foreground aren’t a smeary mess.
Here’s a comparison of a 2x zoom photo with the Pixel 7 Pro and the best I could do handheld with my 24-70mm f2.8 lens. The longer your zoom, the harder it is to hold a camera steady, and even with my elbows propped on a railing, the Pixel 7 Pro shot was vastly easier to capture. I had to crank my DSLR’s sensitivity to ISO 12,800 to get the shutter speed down to 1/8 second, and even then, most of the photos were duds. Image stabilization helps, but this lens doesn’t have it.
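The tradeoff behind that ISO 12,800 figure is plain exposure reciprocity: every stop you shorten the shutter has to be paid back with a stop of sensitivity if the aperture stays fixed. A quick sketch of the arithmetic, using made-up baseline numbers purely for illustration:

```python
import math

def iso_needed(base_iso: float, base_shutter_s: float, target_shutter_s: float) -> float:
    """ISO required to keep the same exposure after shortening the shutter,
    with the aperture held constant."""
    stops = math.log2(base_shutter_s / target_shutter_s)
    return base_iso * 2 ** stops

# Hypothetical: a night scene metered at 1/2 second and ISO 3,200 at f/2.8.
# Shortening the shutter to a more hand-holdable 1/8 second costs two stops
# of light, which the sensor has to make up with two stops of ISO.
print(iso_needed(3200, 1 / 2, 1 / 8))  # -> 12800.0
```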
Just for kicks, I used a tripod to take three exposure-bracketed shots with my DSLR and merged them into a single HDR (high dynamic range) photo in Adobe’s Lightroom software. The longest exposure was 30 seconds. That’s how much effort it took to beat a Night Sight photo I took just standing there holding the phone for 6 seconds. Check the comparison below.
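If you’d like to try something similar without Lightroom, OpenCV ships an exposure-fusion implementation (Mertens fusion) that blends bracketed shots without even needing exposure metadata. It’s a different technique than Lightroom’s raw HDR merge, so consider this a minimal sketch under that assumption, with hypothetical filenames standing in for the three bracketed frames:

```python
import cv2
import numpy as np

# Hypothetical filenames for three bracketed exposures shot from a tripod.
paths = ["bracket_dark.jpg", "bracket_mid.jpg", "bracket_bright.jpg"]
images = [cv2.imread(p) for p in paths]

# Mertens exposure fusion weights each pixel by contrast, saturation and
# well-exposedness, then blends the frames; no exposure times are required.
merge = cv2.createMergeMertens()
fused = merge.process(images)  # float32 result, roughly in the [0, 1] range

cv2.imwrite("fused_hdr.jpg", np.clip(fused * 255, 0, 255).astype("uint8"))
```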
Here’s where my DSLR completely trounced the Pixel 7 Pro, even with Night Sight, though: the nearly full moon. Here’s the Pixel 7 Pro at 30x zoom vs. my DSLR at 560mm, cropped so the framing matches.
DSLR vs. Pixel 7 Pro, dynamic range
One of the best measures of a camera is dynamic range, the span between dark and light it can capture in a single scene. To exercise the Pixel 7 Pro here, I shot in raw format, which allows for more editing flexibility. Then I edited the photos, cranking the exposure up 4 stops to reveal noise problems in shadowed areas and then down 4 stops to see how well it captured detail in bright areas.
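Pushing or pulling exposure by N stops on raw data is essentially a multiplication by 2^N on the linear sensor values before the tone curve is applied, which is why shadow noise becomes so visible when you push. A small sketch of that operation, assuming linear values normalized to the 0-1 range:

```python
import numpy as np

def push_exposure(linear: np.ndarray, stops: float) -> np.ndarray:
    """Brighten (+stops) or darken (-stops) linear raw data."""
    return np.clip(linear * 2.0 ** stops, 0.0, 1.0)

# Toy data: deep shadows with a little sensor noise.
rng = np.random.default_rng(1)
shadows = 0.01 + rng.normal(0, 0.002, size=(4, 4))
pushed = push_exposure(shadows, +4)  # 16x brighter, so the noise is 16x more visible too
print(pushed.round(3))
```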
In short, I’m impressed. Google squeezes a remarkable amount of data out of its relatively small sensor with its processing methods.
Two techniques are relevant. With Google’s HDR+ system, the Pixel 7 Pro combines multiple underexposed frames and one regularly exposed frame to record shadow detail without blowing out highlights in bright areas. And Google includes this data in a “computational raw” format that packages that detail in Adobe’s very flexible DNG format. It’s not truly raw, like the single frame of data pulled from my DSLR’s image sensor is, but it’s an excellent option for smartphone photography.
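Conceptually, the HDR+ approach underexposes to protect highlights, merges a burst to tame the resulting shadow noise, then lifts the shadows back up in tone mapping. The toy sketch below walks through that sequence on synthetic, already-aligned linear frames; it illustrates the idea, not Google’s actual algorithm.

```python
import numpy as np

def hdr_plus_sketch(underexposed_frames: list) -> np.ndarray:
    """Toy HDR+-style pipeline: merge a burst, then lift shadows with a gamma curve."""
    merged = np.stack(underexposed_frames).mean(axis=0)  # averaging tames shadow noise
    return np.clip(merged, 0.0, 1.0) ** (1 / 2.2)        # simple tone map brightens shadows

# Synthetic underexposed burst: highlights stay well below clipping.
rng = np.random.default_rng(2)
scene = rng.uniform(0, 0.25, size=(480, 640))
frames = [scene + rng.normal(0, 0.02, scene.shape) for _ in range(8)]
result = hdr_plus_sketch(frames)
print(scene.max(), result.max())  # shadows lifted without clipping highlights
```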
Below is a cropped photo from the Pixel 7 Pro’s 1x camera, underexposed by 4 stops to see if it was able to record a range of tones even in the very bright pampas grass plumes. It was.
Shooting at 2x, which uses only the central pixels of the 1x camera, poses more of a challenge when going up against my DSLR, which suffers no such degradation in hardware abilities when I zoom in. With the photo pushed 4 stops brighter, you can see a lot more noise and color problems from the Pixel 7 Pro in the comparison below. But overall, it’s got impressive dynamic range on the main camera.
DSLR vs. Pixel 7 Pro, ultrawide
Google gave the Pixel 7 Pro’s ultrawide camera an even wider field of view than last year’s model. What you like is a matter of personal preference, but I appreciate the dramatic perspective that you can capture with a very wide angle. When I don’t need it, the 24mm main camera still qualifies as wide angle.
Here’s a comparison of a scene shot with the Pixel 7 Pro and my DSLR’s 16-35mm ultrawide zoom.
DSLR vs. Pixel 7 Pro, macro
The new ultrawide camera now has autofocus hardware, and that opens up the world of macro photography for close-up subjects. Apple’s iPhone Pro models got this ability in 2021, and I’ve loved macro photos for years as a way to shoot flowers, mushrooms, toys and other small subjects, so I’m delighted to see it on the higher-end Pixel phones.
As with the iPhone, though, the macro is useful as long as the subject fits in the central portion of the frame. Note in this comparison below how blurred the image gets toward the periphery of this butterfly coaster with the Pixel 7 Pro.
No, it’s not as good as my DSLR. But with macro abilities, Night Sight and a zoom range from ultrawide to super telephoto, the Pixel 7 Pro is more than just useful for snapshots. It lets you start exploring a much bigger part of photography’s creative realm.
iOS 17 Cheat Sheet: Your Questions on the iPhone Update Answered
Here’s what you need to know about new features and upcoming updates for your iPhone.
Apple’s iOS 17 was released in September, shortly after the company held its Wonderlust event, where the tech giant announced the new iPhone 15 lineup, the Apple Watch Series 9 and the Apple Watch Ultra 2. We put together this cheat sheet to help you learn about and use the new features in iOS 17. It’ll also help you keep track of the subsequent iOS 17 updates.
iOS 17 updates
- iOS 17.4.1 Fixes These Issues on Your iPhone
- iOS 17.4 Brings These New Features to Your iPhone
- Why You Should Download iOS 17.4 Right Now
- iOS 17.3.1 Fixes This Issue on Your iPhone
- iOS 17.3: All the New Features on Your iPhone
- Why You Should Download iOS 17.3 Right Now
- iOS 17.2.1: What You Should Know About the iPhone Update
- iOS 17.2 Brings These New Features to Your iPhone
- What iOS 17.1.2 Fixes on Your iPhone
- iOS 17.1.1 Patches These iPhone Issues
- What New Features iOS 17.1 Brings to Your iPhone
- What to Know About iOS 17.0.1
- Apple Made an iPhone 15 Mistake, but iOS 17.0.2 Is Here to Fix It
- iOS 17.0.3 Fixes This iPhone 15 Pro Problem
Using iOS 17
- Three iPhone Settings to Change After Downloading iOS 17
- iOS 17’s Best New Features
- The iOS 17 Features We’re Excited About
- iOS 17 Is Filled With Delightful Features, Intuitive Improvements and More
- 17 Hidden iOS 17 Features You Shouldn’t Miss
- iOS 17 Upgrades Your iPhone’s Keyboard
- You Can Tag Your Pets In Your ‘People’ Album With iOS 17
- How to Create Live Stickers in iOS 17
- How to Set Up Contact Posters in iOS 17
- How to Automatically Delete Two-Factor Verification Codes in iOS 17
- What to Know About iOS 17’s Unreleased Journal App
- How Good Are Offline Maps in iOS 17?
- How to Use iOS 17’s Live Voicemail Feature
- You Can Change Your Private Browsing Browser in iOS 17
- Hidden iOS 17 Feature Makes It Easier to Send Photos and Videos
- You Can Clone Your Voice with iOS 17. Here’s How
- Are Audio Message Transcripts in iOS 17 Any Good?
- Sharing AirTags in iOS 17 is Easy. Here’s How
- How to Create Camera Shortcuts in iOS 17
- What You Need to Know About the Improved Autocorrect in iOS 17
- Use This Hidden iOS 17 Feature to Reduce Eye Strain
- How to Enable Sensitive Content Warnings on Your iPhone
- Let Your Loved Ones Know You’re Safe With This iOS 17 Feature
- Simplify Your Grocery List With iOS 17
- How to Turn Off FaceTime Reactions in iOS 17
- What Is iOS 17’s Journal App and How Does It Work?
- You Can Use Albums for Photo Shuffle on Your Lock Screen
- Play Daily Crosswords in Apple News With iOS 17
- How to Turn Off the Most Annoying iOS 17 Features
- iOS 17.2 Brings Better Wireless Charging to These iPhones
- How to Turn Inline Predictive Text Off With iOS 17.2
- How to Enable Contact Key Verification With iOS 17.2
- Don’t Like Your iPhone’s Default Alert Tone? Here’s How to Change It
- The Latest Security Features in iOS 17.3
- How to Secure Your Data With Stolen Device Protection
- Apple Music’s Collaborative Playlists Are Here. This Is How You Use Them
- People in the EU Can Download Other App Stores Soon
- All the New Emoji Your iPhone Just Got
- How to Give Your iPhone’s Stolen Device Protection a Boost
- What to Know About Podcast Transcripts on Your iPhone
- How to Enable Siri to Read Texts in Multiple Languages
- Where to Find your Apple Cash Virtual Card Numbers
Getting started with iOS 17
- iOS 17 Review: StandBy Mode Changed My Relationship With My iPhone
- Whether or Not Your iPhone Supports iOS 17
- Do This Before Downloading iOS 17
- How to Download iOS 17 to Your iPhone
Make sure to check back periodically for more iOS 17 tips and guides on how to use new features as Apple releases more updates.
Get Ready for a Striking Aurora That Could Also Disrupt Radio Communications
Don’t expect the storm to cause a lingering problem, though.
A geomagnetic storm is threatening radio communications Monday night, but that doesn’t mean you should be concerned. In fact, it may be an opportunity to see a colorful aurora in the night sky.
The National Oceanic and Atmospheric Administration has issued a geomagnetic storm watch after witnessing a coronal mass ejection from the sun on Saturday. The watch, which was issued over the weekend and will expire after Monday, said the onset of the storm passing over Earth on Sunday night represented a “moderate” threat to communications. As the storm continues to pass through, it could deliver a “strong” threat on Monday night that could cause radio communications to be temporarily disrupted during the worst of it.
Even so, NOAA said, “the general public should not be concerned.”
A coronal mass ejection occurs when plasma and magnetic field are violently expelled from the sun’s corona, the outermost portion of the sun’s atmosphere. In the vast majority of cases, the ejection poses no real threat to Earth. However, when an ejection happens in the planet’s direction, a geomagnetic storm occurs and the Earth’s magnetic field is temporarily disturbed.
In most cases, geomagnetic storms cause little to no disruption on Earth, with radio communications and satellites affected most often. In extreme cases, a geomagnetic storm can cause significant and potentially life-threatening power outages — a prospect that, luckily, the planet hasn’t faced.
Switching poles
Every 11 years, the sun’s magnetic poles switch, with the north pole and south pole swapping positions. During those cycles, the sun’s activity ramps up as it gets closer to pole-switching time. The height of its activity is called solar maximum, and scientists believe we may be entering the solar maximum or may already be in it.
During periods of heightened solar activity, sunspots and coronal mass ejections become more frequent, among other phenomena. According to NOAA, solar maximum could extend into October of this year before the sun’s activity calms and it works toward its less-active phase, solar minimum.
Even when geomagnetic storms hit Earth and disrupt communications, the effects are usually short-lived. Those most affected, including power grid operators and pilots and air traffic controllers communicating over long distances, have fail-safe technologies and backup communications to ensure operational continuity.
But geomagnetic storms aren’t only about radios. In most cases, they also present unique opportunities to see auroras in the night sky. When the storms hit, the plasma they carry creates a jaw-dropping aurora, illuminating the night sky with brilliant colors. Those auroras can be especially pronounced during the most intense phases of the storm, making for nice stargazing.
If you’re interested in seeing the aurora, you’ll need to be ready. NOAA said the “brunt of the storm has passed,” and even if it lingers into Tuesday, there won’t be much to see after Monday night.
Last Total Solar Eclipse for 20 Years Is Coming: How to See and Photograph It
It’s your last chance until 2044.
Get your eclipse glasses ready, skygazers: the Great American Eclipse is on its way. On April 8, there’ll be a total eclipse over North America, the last one until 2044.
A total solar eclipse happens when the moon passes between the Earth and the sun, blocking the sun and turning an otherwise sunny day into darkness for a short period of time. Depending on the angle at which you’re viewing the eclipse, you may see the sun completely shrouded by the moon (called totality) or some variation of it. The more off-angle you are and the farther you are from the path of the eclipse, the less likely you are to see totality.
The 2024 total solar eclipse will happen on Monday, April 8. The Great American Eclipse will reach the Mexican Pacific coast at 11:07 a.m. PT (2:07 p.m. ET), and then traverse the US in a northeasterly direction from Texas to Maine, and on into easternmost Canada. If you want a good look at it, but don’t live in the path of totality, you shouldn’t wait much longer to book accommodation and travel to a spot on the path.
Or how about booking a seat in the sky? Delta Air Lines made headlines for offering a flight that allows you to see the entire path of totality. Its first eclipse flight, from Austin, Texas, to Detroit, sold out quickly. But as of Monday, Delta has added a second flight from Dallas to Detroit, which also covers the path of totality. The airline also has five flights that will offer prime eclipse viewing.
Not everyone can get on one of those elusive eclipse-viewing flights. Here’s a look at other options to nab a chance to see this rare sight and what to know about it.
Total solar eclipse path
The eclipse will cross over the Pacific coast of Mexico and head northeast over mainland Mexico. The eclipse will then make its way over San Antonio at approximately 2:30 p.m. ET on April 8 and move through Texas, over the southeastern part of Oklahoma and northern Arkansas by 2:50 p.m. ET.
By 3 p.m. ET, the eclipse will be over southern Illinois, and just 5 minutes later, will be traveling over Indianapolis. Folks in northwestern Ohio will be treated to the eclipse by 3:15 p.m. ET, and it will then travel over Lake Erie and Buffalo, New York, by 3:20 p.m. ET. Over the next 10 minutes, the eclipse will be seen over northern New York state, then over Vermont. By 3:35 p.m. ET, the eclipse will work its way into Canada and off the Eastern coast of North America.
Best places to watch the Great American Eclipse
When evaluating the best places to watch this year’s total eclipse, you’ll first want to determine where you’ll have the best angle to see the totality. The farther off-angle you are — in other words, the farther north or south of the eclipse’s path — the less of the sun you’ll see covered.
Therefore, if you want to have the best chance of experiencing the eclipse, you’ll want to be in its path. As of this writing, most of the cities in the eclipse’s path have some hotel availability, but recent reports have suggested that rooms are booking up. And as more rooms are booked, prices are going up.
So if you want to be in the eclipse’s path, and need a hotel to do it, move fast. And Delta’s eclipse-viewing flight from Dallas to Detroit has just four seats left at the time of publication.
Eclipse eye safety and photography
As with any solar eclipse, it’s critical you keep eye safety in mind.
During the eclipse, and especially during the periods before and after totality, don’t look directly at the sun without special eye protection. Also, be sure not to look at the sun through a camera (including the camera on your phone), binoculars, a telescope or any other viewing device. This could cause serious eye injury. Sunglasses aren’t enough to protect your eyes from damage.
If you want to view the eclipse, you’ll instead need solar viewing glasses that comply with the ISO 12312-2 safety standard. Anything that doesn’t meet or exceed that standard won’t be dark enough to protect your eyes. Want to get them for free? If you’ve got a Warby Parker eyeglasses store nearby, the company is giving away free, ISO-certified solar eclipse glasses at all of its stores from April 1 until the eclipse, while supplies last.
If you don’t have eclipse viewing glasses handy, you can instead use indirect methods for viewing the eclipse, like a pinhole projector.
Read more: A Photographer’s Adventure With the Eclipse
In the event you want to take pictures of the eclipse, attach a certified solar filter to your camera. Doing so will protect your eyes and allow you to take photos while you view the eclipse through your lens.
There’s also a new app to help you both protect your eyes and take better photos of the eclipse on your phone. Solar Snap, designed by a former Hubble Space Telescope astronomer, comes with a Solar Snap camera filter that attaches to the back of an iPhone or Android phone, along with solar eclipse glasses for protecting your eyesight during the event. After you attach the filter to your phone, you can use the free Solar Snap Eclipse app to zoom in on the eclipse, adjust exposure and other camera settings, and ultimately take better shots of the eclipse.
2024 eclipse compared to 2017
The last total solar eclipse occurred in 2017, and many Americans had a great view. Although there are plenty of similarities between the 2017 total solar eclipse and the one coming April 8, there are a handful of differences. Mainly, the 2024 eclipse is going to cover more land and last longer.
The 2017 eclipse started over the northwest US and moved southeast. Additionally, that eclipse’s path was up to 71 miles wide, compared with a maximum width of 122 miles for this year’s eclipse. Perhaps most importantly, the moon completely covered the sun for just 2 minutes, 40 seconds in 2017. This year, maximum totality will last for nearly four-and-a-half minutes.