

Adobe Firefly — How generative AI is revolutionizing the way marketers work
Last Updated on April 17, 2023 by Editorial Team

Author(s): Thomas Kraehe

Originally published on Towards AI.

At the recent Adobe Summit in Las Vegas, Adobe unveiled a new AI tool called Firefly. Firefly is Adobe’s answer to the latest generation of generative artificial intelligence. Interested parties can test the tool in a public beta phase.

How does Adobe Firefly work?

The way Adobe Firefly works is quite simple. The complexity of the AI model and its underlying computations is hidden behind an easy-to-use interface. You tell the artificial intelligence what you want it to generate via text input, and it produces the desired result in no time at all.

Numerous impressive examples are available for a quick start. Since this is a test phase, all testers can rate a result, comment on it, and suggest it for the gallery of examples. As you can see in the cover image of this post, I used Firefly to generate eye-catching Teams backgrounds for my video calls.

Firefly Gallery

Image generation with Adobe Firefly

Let’s take a closer look at what Firefly currently offers. The text input is only available in English for now. Since I wanted to generate a nice background for my Teams meetings, I instructed Adobe Firefly to create an image on the theme of a “cozy office with wooden furnishings and a stunning mountain view”. As the content type, I chose a photo so that the result would look as realistic as possible.

Adobe Firefly generates images by text input

There are also other content types available, such as graphics or art. In addition, the format can be selected, e.g., square 1:1, 4:3, or 16:9. Firefly then generates four proposals. The results are really impressive, in my opinion.

Adobe Firefly generated image: Cozy office with wooden furniture and stunning mountain view

Besides just entering text, you can also choose various styles, color presets and exposures. In my example, I selected the “Synthwave” style. I liked the result very much.

Synthwave style in Adobe Firefly

Content Credentials with Adobe Firefly

When you are satisfied with the result, you can download the generated image. You first have to accept a notice that the image may not be used commercially, because this is a test phase. In addition, the image is provided with so-called content credentials for the sake of transparency. This is metadata recording that the image is not a real photograph but was generated by artificial intelligence.

Adobe Firefly-generated images are tagged with content credentials to make it clear that they were generated by artificial intelligence.

The content credentials are part of a large-scale effort called the Content Authenticity Initiative, which aims to bring more transparency to the Internet and to fight fake news. On the initiative’s verification site, you can view the metadata of a generated image. A fun aside: in this example, I told Adobe Firefly to generate an image of the restaurant at the end of the universe, which you might know from the book series The Hitchhiker’s Guide to the Galaxy.
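To make the idea of metadata traveling with the file more concrete, here is a minimal toy sketch in Python. It scans raw JPEG bytes for an embedded XMP metadata packet and checks for an AI-generation indicator. The byte layout and the `trainedAlgorithmicMedia` flag below are simplified stand-ins for illustration, not the actual structure of a Firefly download or of the full content-credentials format.

```python
# Toy illustration: content credentials ride along inside image metadata.
# This sketch looks for an XMP packet embedded in JPEG bytes and reports
# whether an AI-generation marker appears in it. The synthetic bytes below
# are a simplified stand-in, not a real Firefly file.

XMP_MARKER = b"http://ns.adobe.com/xap/1.0/\x00"  # standard XMP-in-JPEG header

def find_xmp_packet(jpeg_bytes: bytes):
    """Return the XMP payload embedded in a JPEG byte stream, or None."""
    start = jpeg_bytes.find(XMP_MARKER)
    if start == -1:
        return None
    payload_start = start + len(XMP_MARKER)
    # XMP packets end with the standard closing processing instruction.
    end = jpeg_bytes.find(b"<?xpacket end", payload_start)
    return jpeg_bytes[payload_start:end] if end != -1 else None

# Synthetic example: a fake JPEG fragment carrying a minimal XMP packet.
fake_jpeg = (
    b"\xff\xd8\xff\xe1\x00\x80" + XMP_MARKER +
    b'<?xpacket begin=""?>'
    b'<x:xmpmeta xmlns:x="adobe:ns:meta/">'
    b'<rdf:Description digitalSourceType="trainedAlgorithmicMedia"/>'
    b"</x:xmpmeta>"
    b'<?xpacket end="w"?>' +
    b"\xff\xd9"
)

packet = find_xmp_packet(fake_jpeg)
print(packet is not None and b"trainedAlgorithmicMedia" in packet)
```

The point is simply that the provenance information lives inside the file itself, so any tool that can read the metadata segment can surface it, which is what the verification site does in a far more robust, cryptographically verifiable way.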

Verification of content credentials in an image generated by Adobe Firefly

Text effects with Adobe Firefly

The second feature set currently available in Adobe Firefly is text effects. The process is very similar to image generation. First, you enter the text you want to stylize. Then you use a text prompt to specify how it should look, i.e., what style or texture it should get. In this case, I chose one of the examples from the gallery and generated a kind of snakeskin as the font texture. You can additionally select various effects, colors, and, of course, fonts if you like.

Text generation with Adobe Firefly

What does Adobe plan to do with Firefly?

As I said, Firefly is currently a kind of test, with the goal of developing the product to full market maturity. Once it reaches that point, its features will flow into many of Adobe’s existing products, such as Photoshop, Experience Manager, Campaign, etc.

The use cases are manifold. Designers, for example, can have Photoshop add elements to a project that weren’t there before: if you want a lighthouse in a beach photo, just let Adobe Firefly generate it for you. Of course, this is meant to be context-sensitive, i.e., the lighthouse should fit naturally into the picture.

How Adobe Firefly will change the way marketers work

At the Adobe Summit, several visions of how generative AI will be integrated into Experience Cloud products were presented. For example, when marketers create a new email campaign with Adobe Campaign, they will be able to describe the purpose of the campaign in a short text. Adobe Firefly will then generate matching text and images that can be used for the campaign.

The integration into Adobe’s CMS/DAM, the Experience Manager, is similar. There, new content can also be generated via text prompt. Those assets can then be published on a website, for example. In eCommerce, there will be options for automatically inserting cropped product images into individually generated background images that fit the context of the page.

The new possibilities of generative artificial intelligence will really shake up the everyday life of marketers. Many things that currently require a lot of manual work can be automated. But there are also completely new opportunities for individualization and personalization to offer customers the best possible experience.


Published via Towards AI
