Here’s What I Learned Testing Photoshop’s New Generative AI Tool

Adobe’s Firefly AI feature brings new fun and fakery to photos. It’s a profound change for image editing, though far from perfect.

Adobe has built generative AI abilities into its flagship image-editing software, releasing a Photoshop beta version Tuesday that dramatically expands what artists and photo editors can do. The move promises to release a new torrent of creativity even as it gives us all a new reason to pause and wonder if that sensational, scary or inspirational photo you see on the internet is actually real.

In my tests, detailed below, I found the tool impressive but imperfect. Adding it directly to Photoshop is a big deal, letting creators experiment within the software tool they’re likely already using without excursions to Midjourney, Stability AI’s Stable Diffusion or other outside generative AI tools.

With Adobe’s Firefly family of generative AI technologies arriving in Photoshop, you’ll be able to let the AI fill a selected part of the image with whatever it thinks most fitting – for example, replacing road cracks with smooth pavement. You can also specify the imagery you’d like with a text prompt, such as adding a double yellow line to the road.

Firefly in Photoshop can also expand an image, adding new scenery beyond the frame based on what’s already in the frame or what you suggest with text. Want more sky and mountains in your landscape photo? A bigger crowd at the rock concert? Photoshop will oblige, without today’s difficulties of finding source material and splicing it in.

The feature, called generative fill and scheduled to emerge from beta testing in the second half of 2023, can be powerful. In Adobe’s live demo, the tool was often able to match a photo’s tones, blend in AI-generated imagery seamlessly, infer the geometric details of perspective even in reflections and extrapolate the position of the sun from shadows and sky haze.

Such technologies have been emerging over the last year as Stable Diffusion, Midjourney and OpenAI’s Dall-E captured the imaginations of artists and creative pros. Now it’s built directly into the software they’re most likely to already be using, streamlining what can be a cumbersome editing process.

«It really puts the power and control of generative AI into the hands of the creator,» said Maria Yap, Adobe’s vice president of digital imaging. «You can just really have some fun. You can explore some ideas. You can ideate. You can create without ever necessarily getting into the deep tools of the product, very quickly.»

But you can’t sell anything yet. With Firefly technology, including what’s produced by Photoshop’s generative fill, «you may not use the output for any commercial purpose,» Adobe’s generative AI beta rules state.

Photoshop’s Firefly AI imperfect but useful

In my testing, I frequently ran into problems, many of them likely stemming from the limited range of the training imagery. When I tried to insert a fish on a bicycle into an image, Firefly added only the bicycle. I couldn’t get Firefly to add a kraken emerging from San Francisco Bay. A musk ox looked like a panda-moose hybrid.

Less fanciful material also presented problems. Text looked like an alien race’s script. Shadows, lighting, perspective and geometry weren’t always right.

People are hard, too. On close inspection, their faces were distorted in weird ways. Humans added into shots could be positioned too high in the frame or otherwise unconvincingly blended in.

Still, Firefly is remarkable for what it can accomplish, particularly with landscape shots. I could add mountains, oceans, skies and hills to landscapes. A white delivery van in a night scene was appropriately yellowish to match the sodium vapor streetlights in the scene. If you don’t like the trio of results Firefly presents, you can click the «generate» button to get another batch.

Given the pace of AI developments, I expect Firefly in Photoshop will improve.

It’s hard and expensive to retrain big AI models, a job that requires a data center packed with powerful hardware churning through data, sometimes for weeks for the largest models. But Adobe plans relatively frequent updates to Firefly. «Expect [about] monthly updates for general improvements and retraining every few months in all likelihood,» Adobe product chief Scott Belsky tweeted Tuesday.

Automating image manipulation

For years, «Photoshop» hasn’t just referred to Adobe’s software. It’s also used as a verb signifying photo manipulations like slimming supermodels’ waists or hiding missile launch failures. AI tools automate not just fun and flights of fancy, but also fake images like an alleged explosion at the Pentagon or a convincingly real photo of the pope in a puffy jacket, to pick two recent examples.

With AI, expect editing techniques far more subtle than the extra smoke easily recognized as digitally added to photos of an Israeli attack on Lebanon in 2006.

It’s a reflection of the double-edged sword that is generative AI. The technology is undeniably useful in many situations but also blurs the line between what is true and what is merely plausible.

For its part, Adobe tries to curtail problems. It doesn’t permit prompts to create images of many political figures and blocks you for «safety issues» if you try to create an image of black smoke in front of the White House. And its AI usage guidelines prohibit imagery involving violence, pornography and «misleading, fraudulent, or deceptive content that could lead to real-world harm,» among other categories. «We disable accounts that engage in behavior that is deceptive or harmful.»

Firefly also is designed to skip over styling prompts like those that have provoked serious complaints from artists displeased to see their type of art reproduced by a data center. And it supports the Content Authenticity Initiative’s content credentials technology that can be used to label an image as having been generated by AI.

Today, generative AI imagery made with Adobe’s Firefly website adds content credentials by default, along with a visual watermark. When the Photoshop feature exits beta testing and ships later this year, imagery will include content credentials automatically, Adobe said.

People trying to fake images can sidestep that technology. But in the long run, it’ll become part of how we all evaluate images, Adobe believes.

«Content credentials give people who want to be trusted a way to be trusted. This is an open-source technology that lets everyone attach metadata to their images to show that they created an image, when and where it was created, and what changes were made to it along the way,» Adobe said. «Once it becomes the norm that important news comes with content credentials, people will then be skeptical when they see images that don’t.»

Generative AI for photos

Adobe’s Firefly family of generative AI tools began with a website that turns a text prompt like «modern chair made up of old tires» into an image. It’s added a couple of other options since then, and Creative Cloud subscribers will also be able to try a lightweight version of the Photoshop interface on the Firefly site.

When OpenAI’s Dall-E brought that technology to anyone who signed up for it in 2022, it helped push generative artificial intelligence from a technological curiosity toward mainstream awareness. Now there’s plenty of worry along with the excitement as even AI creators fret about what the technology will bring now and in the more distant future.

Generative AI is a relatively new form of artificial intelligence technology. AI models can be trained to recognize patterns in vast amounts of data – in this case labeled images from Adobe’s stock art business and other licensed sources – and then to create new imagery based on that source data.

Generative AI has surged to mainstream awareness with language models used in tools like OpenAI’s ChatGPT chatbot, Google’s Gmail and Google Docs, and Microsoft’s Bing search engine. When it comes to generating images, Adobe employs an AI image generation technique called diffusion that’s also behind Dall-E, Stable Diffusion, Midjourney and Google’s Imagen.

Adobe calls Firefly for Photoshop a «co-pilot» technology, positioning it as a creative aid, not a replacement for humans. Yap acknowledges that some creators are nervous about being replaced by AI. Adobe prefers to see it as a technology that can amplify and speed up the creative process, spreading creative tools to a broader population.

«I think the democratization we’ve been going through, and having more creativity, is a positive thing for all of us,» Yap said. «This is the future of Photoshop.»

Editors’ note: CNET is using an AI engine to create some personal finance explainers that are edited and fact-checked by our editors. For more, see this post.

Apple, I’m (Sky) Blue About Your iPhone 17 Air Color

Commentary: The rumored new hue of the iPhone 17 Air is more sky blah than sky blue.

I can’t help but feel blue about the latest rumor that Apple’s forthcoming iPhone 17 Air will take flight in a subtle, light-hued color called sky blue.

Sky blue isn’t a new color for Apple. It’s the featured shade of the current M4 MacBook Air, a shimmer of cerulean so subtle as to almost be missed. It’s silver left too close to an aquarium; silver that secretly likes to think it’s blue but doesn’t want everyone else to notice.

Do Apple employees get to go outside and see a real blue sky? It’s actually vivid; you can check for yourself. Perhaps the muted sky blue reflects the frequent layer of clouds the Bay Area gets in late winter and early spring, like we typically see here in Seattle.

«Who cares?» you might find yourself saying. «Everyone gets a case anyway.» I hear you and everyone else who’s told me that. But design-focused Apple is as obsessive about colors as it is about making its devices thinner. And I wonder if the company’s head is in the clouds about which hues adorn its pro products.

Making the case for a caseless color iPhone

I’m more invested in this conversation than most — I’m one of those freaks who doesn’t wrap my phone in a case. I find cases bulky and superfluous, and I like to be able to see Apple’s design work. Also, true story, I’ve broken my iPhone screen only twice: First when it was in a «bumper» that Apple sent free in response to the iPhone 4 you’re-holding-it-wrong Antennagate fiasco, and second when trying to take long exposure starry night photos using what I didn’t realize was a broken tripod mount. My one-week-old iPhone 13 Pro slipped sideways and landed screen-first on a pointy rock. A case wouldn’t have saved it.

My current model is an iPhone 16 Pro in black titanium — which I know seems like avoiding color entirely — but previously I’ve gone for colors like blue titanium and deep purple. I wanted to like deep purple the most but it came across as, in the words of Patrick Holland in his iPhone 14 Pro review, «a drab shade of gray or like Grimace purple,» depending on the light.

Pros can be bold, too

Maybe the issue is too many soft blues. Since the iPhone Pro age began with the iPhone 11 Pro, we’ve seen variations like blue titanium (iPhone 15 Pro), sierra blue (iPhone 13 Pro) and pacific blue (iPhone 12 Pro).

Pacific blue is the boldest of the bunch, if by bold you mean dark enough to distinguish from silver, but it’s also close enough to that year’s graphite color that seeing blue depends on the surrounding lighting. By comparison, the blue (just «blue») color of the iPhone 12 was unmistakably bright blue.

In fact, the non-Pro lines have embraced vibrant colors. It’s as if Apple is equating «pro» with «sophisticated,» as in «A real pro would never brandish something this garish.» I see this in the camera world all the time: If it’s not all-black, it’s not a «serious» camera.

And yet I know lots of pros who are not sophisticated — proudly so. People choose colors to express themselves, so forcing that idea of professionalism through color feels needlessly restrictive. A bright pink iPhone 16 might make you smile every time you pick it up but then frown because it doesn’t have a telephoto camera.

Color is also important because it can sway a purchase decision. «I would buy a sky blue iPhone yesterday,» my colleague Gael Cooper texted after the first rumor popped up online. When each new generation of iPhones arrives, less technically different than the one before, a color you fall in love with can push you into trading in your perfectly capable model for a new one.

And lest you think Apple should just stick with black and white for its professional phones: Do you mean black, jet black, space black, midnight black, black titanium, graphite or space gray? At least the lighter end of the spectrum has stuck to just white, white titanium and silver over the years.

Apple never got ahead by being beige

I’m sure Apple has reams of studies and customer feedback that support which colors make it to production each year. Like I said, Apple’s designers are obsessive (in a good way). And I must remind myself that a sky blue iPhone 17 Air is a rumored color on a rumored product so all the usual caveats apply.

But we’re talking about Apple here. The scrappy startup that spent more than any other company on business cards at the time because each one included the old six-color Apple logo. The company that not only shaped the first iMac like a tipped-over gumdrop and made the case partially see-through, but then made that cover brilliant Bondi blue.

Embrace the iPhone colors, Apple.

If that makes you nervous, don’t worry: Most people will put a case on it anyway.

Astronomers Say There’s an Increased Possibility of Life on This Distant Planet

Using the James Webb Space Telescope, astronomers are working to confirm potential evidence of life on a distant exoplanet dubbed K2-18b.

Astronomers are nearing a statistically significant finding that could confirm the potential signs of life detected on the distant exoplanet K2-18b are no accident.

The team of astronomers, led by the University of Cambridge, used data from the James Webb Space Telescope (which has only been in use since the end of 2021) to detect chemical traces of dimethyl sulfide (DMS) and/or dimethyl disulfide (DMDS), which they say can only be produced by life such as phytoplankton in the sea. 

According to the university, «the results are the strongest evidence yet that life may exist on a planet outside our solar system.»

The findings were published this week in the Astrophysical Journal Letters and point to the possibility of an ocean on this planet’s surface, which scientists have been hoping to discover for years. In the abstract for the paper, the team says, «The possibility of hycean worlds, with planet-wide oceans and H2-rich atmospheres, significantly expands and accelerates the search for habitable environments elsewhere.»

Not everyone agrees, however, that what the team found proves there’s life on the exoplanet.

Science writer and OpenMind Magazine founder Corey S. Powell posted about the findings on Bluesky, writing, «The potential discovery of alien life is so enticing that it drags even reputable outlets into running naive or outright misleading stories.» He added, «Here we go again with planet K2-18b. Um… there’s strong evidence of non-biological sources of the molecule DMS.»

K2-18b is 124 light-years away and much larger than Earth (more than eight times our mass), but smaller than Neptune. The search for signs of even basic life on a planet like this increases the chances that there are more planets like Earth that may be habitable, with temperatures and atmospheres that could sustain human-like lifeforms. The team behind the paper hopes that more study with the James Webb Space Telescope will help confirm their initial findings.

More research to do on finding life on K2-18b

The exoplanet K2-18b is not the only place where scientists are exploring the possibility of life, and this research is still an early step in the process, said Christopher Glein, a geochemist, planetary researcher and lead scientist at San Antonio’s Southwest Research Institute. Excitement over the significance of the research, he said, should be tempered.

«We need to be careful here,» Glein said. «It appears that there is something in the data that can’t be explained, and DMS/DMDS can provide an explanation. But this detection is stretching the limits of JWST’s capabilities.»

Glein added, «Further work is needed to test whether these molecules are actually present. We also need complementary research assessing the abiotic background on K2-18b and similar planets. That is, the chemistry that can occur in the absence of life in this potentially exotic environment. We might be seeing evidence of some cool chemistry rather than life.»

The TRAPPIST-1 planets, he said, are being researched as potentially habitable, as is LHS 1140b, which he said «is another astrobiologically significant exoplanet, which might be a massive ocean world.»

As for K2-18b, Glein said many more tests need to be performed before there’s consensus on life existing on it.

«Finding evidence of life is like prosecuting a case in the courtroom,» Glein said. «Multiple independent lines of evidence are needed to convince the jury, in this case the worldwide scientific community.» He added, «If this finding holds up, then that’s Step 1.»
