The launch of Midjourney, Stable Diffusion, and DALL-E last year disrupted several creative fields. While these scrappy startups changed the face of design, everyone was watching market leaders such as Adobe and Canva for a response. After nearly a year, Adobe has broken its silence.
The design giant has finally thrown its hat into the GenAI ring with the launch of Adobe Firefly, a set of generative AI models integrated into the Creative Cloud suite. Since the product arrived much later than its competitors, there is plenty of debate over whether it can actually make a difference in an already-saturated market. Is Firefly too little, too late, or does it have something new to offer?
Under Firefly’s hood
Firefly marks an inflection point for Adobe, moving it from a company that used AI sparingly in some of its products to one embracing generative AI in a significant way. In an Adobe demonstration, the model produced three alternate versions of a lighthouse in an artwork, driven entirely by text prompts. It can also be used to generate custom vectors, brushes, and textures.
As part of the Adobe–NVIDIA partnership, Firefly is built on NVIDIA Picasso, a cloud service offering a set of NVIDIA-built models called ‘Edify’, with algorithms for text-to-image, text-to-video, and text-to-3D generation. Edify builds on NVIDIA’s research, which has produced state-of-the-art models such as Magic3D (text-to-3D), eDiff-I (text-to-image), and GauGAN (image generation).
Although generative AI has begun to find its way into design workflows, users have had to pick up a new set of skills to adapt to the AI wave. A market leader like Adobe integrating generative AI into its products, however, is nothing to scoff at, because it could drive an unprecedented normalisation and adoption of the technology.
Playing to the right market
Adobe is playing a numbers game. According to the company, over 90% of the world’s creative professionals use Adobe Photoshop. The majority of newcomers to creative industries are also trained on Adobe’s suite of products. Combined with a handy student discount, this means new talent in creative fields grows up familiar with one suite of products and one name: Adobe.
Reinforcing the idea of an Adobe monopoly, the Creative Cloud suite has nearly 30 million paying subscribers, alongside countless pirated copies. A study by Venngage also found that 75% of job listings for ‘Graphic Design’ required knowledge of Photoshop, Illustrator, or InDesign. Adding Firefly to this suite therefore brings generative AI to one of the world’s biggest captive audiences.
Furthermore, this is an audience uniquely positioned to make the most of generative AI. Users versed in Adobe’s products were already better placed to get results out of standalone models such as Stable Diffusion, thanks to their knowledge of image manipulation techniques. With Firefly, however, the model is embedded even deeper into the workflow, letting designers leverage it meaningfully without needing to learn a new set of skills.
Impact on competitors
Even though the product launched months after designers had already familiarised themselves with foundational generative models, it could still prove to be serious competition for them. Firefly cuts out the middleman, putting generative AI right into Photoshop and Illustrator. This not only removes the extra step of prompt engineering in other AI models, many of which are paid services, but also creates a more organic way of working with generated elements.
According to Adobe, Firefly can cut out a lot of the work designers currently do by hand. On the optimisations Firefly could bring to creative workflows, Rufus Deuchler, Principal Manager of Creative Cloud Evangelism at Adobe, stated in a Reddit thread,
“I see these new AI tools as a support to human creativity and not a replacement. Imagine you need bits and pieces that are “just perfect” for your [sic] design, wouldn’t it be awesome to have that inside of your app? I am actually optimistic that graphic designers do have a future and a whole new set of tools to be creative with.”
Firefly’s launch could also eat into the market share of Stable Diffusion and Midjourney. Designers are more likely to use an integrated model trained responsibly on licensed data than to turn to another application trained on web scrapes of unattributed artwork.
Additionally, even easy-to-use alternatives such as Canva might have to watch their backs. Tools like Canva have been indispensable for those without the time or inclination to learn Adobe’s complex suite of software. However, Adobe has been trying to wrest back the amateur market with Express, and with Firefly integrated, Express might even leapfrog Canva and win back market share among amateur users.