Adobe may have been slightly late to the generative AI party, but it's been dropping bombshell after bombshell in recent weeks. Shortly after releasing its versatile Adobe Firefly generative AI tools in beta and announcing their integration into Google Bard, it has now added the tech to its flagship image editor, Photoshop.
The addition means that users of what remains the industry standard image editing software can now quickly add, transform or remove elements in images using simple text prompts. Adobe Firefly is already top of our list of the best AI art generators and this integration makes it even more practical to use (see our pick of the best AI art tutorials and the best Photoshop tutorials for more on how it works).
Adobe Firefly's generative AI technology is available in Photoshop beta for all subscribers from today. The headline feature is Generative Fill, which allows users to select a portion of an image using the lasso or another selection tool and fill it with new imagery generated from a text prompt (see the tutorial above). The tool automatically matches the perspective, lighting and style of the existing image, adding details like shadows or reflections where it thinks they're needed.
The newly generated content is added in non-destructive layers so that edits can always be reversed without impacting the image (see how layers can be stacked and the importance of their order in the costume makeover tutorial below). Other new additions to Adobe Photoshop beta include around 30 new adjustment Presets. These are filters that users can apply to an image to achieve a particular look and feel.
There's also the new Remove Tool, a brush that uses Adobe Sensei AI to quickly eliminate unwanted objects, saving potentially hours of manual work. Meanwhile, a Contextual Task Bar is designed to make common functions more accessible by recommending relevant next steps in several workflows and Enhanced Gradients introduce new on-canvas controls.
The new features feel like a logical and inevitable development in Adobe's expansion of its generative AI tools. We've already seen that Google Bard will be integrating Adobe Firefly, and it makes sense that Adobe's own flagship products get the same tech. Adobe has already been adding AI-driven tools to Photoshop, such as Neural Filters, which can transform facial features.
"Adobe has a long and established history of AI innovation and the exciting new integration of Firefly into Photoshop (beta) will enable creatives to transform the way they work," Rufus Deuchler, Director of Worldwide Creative Cloud Evangelism at Adobe, told Creative Bloq. "Adobe Firefly is the only AI service currently that produces high quality professional content that is also commercially viable and can be embedded in creative workflows. With Firefly now supporting Generative Fill, Photoshop users will be able to easily extend images and add or remove objects using text prompts, providing a level of control that was unthinkable until today. Generative fill in the Photoshop beta is truly a game-changer."
Photoshop’s Generative Fill looks set to make photo-bashing and collaging a lot quicker and easier, automatically matching the context of an image to save a lot of the more tedious work of adding details like shadows and reflections. It should also save a lot of the time it takes to find the images you want to add to a composition – it might no longer be necessary to trawl through the best stock photo libraries for everything you need.
But perhaps most significantly, it makes experimentation quick and easy, allowing users to test out off-the-wall ideas instantly (or at least as fast as they can type), which will make many users more inclined to try out wild ideas.
Adobe says Firefly has been one of the most successful beta launches in its history, and stresses that Firefly is designed to be safe for commercial use. It's trained on Adobe Stock images rather than on images scraped from the wider web. Generative Fill supports Adobe's new Content Credentials 'nutrition labels' – tags designed to clarify that an image has been created by or edited using AI.
How to use Adobe Firefly generative AI in Photoshop
Adobe says that Firefly Generative Fill will be coming to Photoshop as standard later in the year, but for now, you'll need to use Photoshop's beta features. This is easy to do. First, you'll need a Photoshop subscription of some kind, either as an individual app or as part of Adobe's Photography plan (see how to download Photoshop).
You'll need to open the Creative Cloud desktop app and click on 'Beta apps' in the column on the left. Look for the Photoshop (Beta) app and click the 'Install' button. Once installed, Photoshop beta will appear listed under Installed beta apps. Click the Open button and check you're running the beta version by opening Help > About Photoshop in the menu bar on Windows (on a Mac, you'll see Photoshop (Beta) in the menu bar where it normally just says Photoshop).
Joe is a regular freelance journalist and editor at Creative Bloq. He writes news, features and buying guides and keeps track of the best equipment and software for creatives, from video editing programs to monitors and accessories. A veteran news writer and photographer, he now works as a project manager at the London and Buenos Aires-based design, production and branding agency Hermana Creatives. There he manages a team of designers, photographers and video editors who specialise in producing visual content and design assets for the hospitality sector. He also dances Argentine tango.