
Find the Real AI Artists with This Powerful New Tool 

Stable Attribution deconstructs an AI image to reveal its source images, addressing artists' copyright concerns.

Text-to-image AI tools such as DALL·E 2 and Stable Diffusion are trained on images scraped off the internet and used without proper attribution. The artists' information is unavailable, and the process remains largely shrouded in anonymity. This has led artists to accuse these platforms of stealing their artwork.

Enter Stable Attribution, which looks to address this by tracing AI-created art back to its human sources. It works on reverse-engineering principles to surface the images that were used to create an AI image, and it allows those images to be attributed to the artists who made them. Born out of concerns over infringement and copyright, the tool helps safeguard the rights of the owner while also building a repository of artists' works.

Stable Attribution was created by Chroma, a startup that builds machine-learning-based AI tools. The application was built predominantly by Jeffrey Huber, alongside Anton Troynikov and others. When you upload an image generated by an AI application such as Stable Diffusion, Stable Attribution surfaces the source images that were used to generate it.

In the example below, Stable Diffusion created an AI image from the text prompt, "Stormtroopers having ice cream in Central Park". Upon uploading that image to Stable Attribution, the source images used in creating it appear, with an option to attribute each picture to the artist via a link.

Screenshot of Stable Attribution tool with source images

Stable Attribution works with models such as Stable Diffusion whose training datasets (like LAION) are publicly available. However, a tool like DALL·E 2 by OpenAI does not expose its training dataset, which blocks Stable Attribution from crawling it. The application is still learning from the datasets it has. The Version 1 algorithm deconstructs an AI image by matching it against the most similar images in its dataset.
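Chroma has not published the exact algorithm, but the "most similar images" matching described above is typically done by embedding every image as a vector and ranking dataset images by cosine similarity to the query. The sketch below illustrates that idea with toy vectors; the function names and the embeddings are purely illustrative, not Stable Attribution's actual code.

```python
import math

def cosine(a, b):
    """Cosine similarity between two embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

def top_k_matches(query, dataset, k=3):
    """Return indices of the k dataset embeddings most similar to the query."""
    ranked = sorted(range(len(dataset)),
                    key=lambda i: cosine(query, dataset[i]),
                    reverse=True)
    return ranked[:k]

# Toy example: four fake "image embeddings" in a 3-D space.
dataset = [
    [1.0, 0.0, 0.0],
    [0.0, 1.0, 0.0],
    [0.9, 0.1, 0.0],
    [0.0, 0.0, 1.0],
]
query = [1.0, 0.05, 0.0]  # embedding of the uploaded AI image
print(top_k_matches(query, dataset, k=2))  # -> [0, 2]
```

In a real system the embeddings would come from a vision model and the search would run over billions of vectors with an approximate-nearest-neighbor index rather than a full sort.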

Because it is still learning, the repository is limited, and the images it pulls up are those that are most influential and most visually similar. Beyond addressing artists' copyright issues, the current version has limited use; unless new features are added, Stable Attribution remains a mere image-deconstruction tool.


Vandana Nair