Getty Images Sues Stability AI, Accusing It of Stealing Images to Train AI System


Stock photo provider Getty Images has sued artificial intelligence company Stability AI, accusing it in a lawsuit made public on Monday of misusing more than 12 million Getty photos to train its Stable Diffusion AI image-generation system.

The lawsuit, filed in Delaware federal court, follows a separate Getty case against Stability in the United Kingdom and a related class-action complaint filed by artists in California against Stability and other companies in the fast-growing field of generative AI.

Getty declined to comment on the Delaware lawsuit. Representatives for Stability did not immediately respond to a request for comment Monday. Reuters News competes with Getty in the market for images for editorial use.

London-based Stability AI released Stable Diffusion, an AI-based system for generating images from text inputs, and image generator DreamStudio last August. The company announced in October that it had raised over $100 million (nearly Rs. 830 crore) in funding, and has been valued at $1 billion (nearly Rs. 8,280 crore).

Seattle-based Getty accused Stability of copying millions of its photos without a license and using them to train Stable Diffusion to generate more accurate depictions based on user prompts.

Getty said its pictures are particularly valuable for AI training because of their image quality, variety of subject matter and detailed metadata.

Getty said it has licensed “millions of suitable digital assets” to other “leading technology innovators” for AI-related purposes, and that Stability infringes its copyrights and competes with it unfairly.

The lawsuit also accuses Stability of infringing Getty’s trademarks, citing images generated by its AI system with Getty’s watermark that Getty says could cause consumer confusion.

Getty asked the court to order Stability to stop using its pictures and requested money damages that include Stability’s profits from the alleged infringement.

© Thomson Reuters 2023