The Authorship Wall: Andersen v. Stability AI
A class-action suit at the epicenter of the war between human creators and generative AI. As discovery concludes, the case is moving toward a landmark trial that could force AI companies to license every image in their training sets.
This class-action suit (Case No. 3:23-cv-00201), filed in the Northern District of California, has become the focal point of that conflict. With discovery concluding in early 2026, the case is moving toward a landmark trial scheduled for April 2027.
The "Compressed Copy" Argument
The legal battle centers on how AI models "learn." Unlike earlier copyright cases, which involved literal, pixel-for-pixel copying, this case examines "statistical copying": whether encoding patterns from millions of images into model weights amounts to reproducing those images in compressed form.
The Derivative Work Claim
Visual artists argue that Stable Diffusion is essentially a $5 billion "collage tool." They contend that because the model was trained on their copyrighted images, every output it generates is an unauthorized "derivative work" of those images.
The Fair Use Defense
Stability AI and Midjourney counter that the models do not store images; they store "mathematical weights." They argue the training process is "transformative," a key pillar of fair use, likening it to a human artist studying a painting to learn its style.
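The "weights, not images" argument can be made concrete with a toy sketch. This is an assumption-laden illustration, not Stable Diffusion's actual architecture: it fits a simple least-squares model to thousands of fake "images" and shows that, no matter how many examples the model sees, its learned state is a fixed-size weight vector far smaller than the training set.

```python
import numpy as np

# Toy illustration (NOT Stable Diffusion): after training, everything a
# model "remembers" lives in a fixed-size set of weights, regardless of
# how many training examples it saw.
rng = np.random.default_rng(0)

n_images, pixels = 10_000, 64                    # pretend each "image" is 64 numbers
training_set = rng.normal(size=(n_images, pixels))
targets = rng.normal(size=(n_images,))

# Least-squares fit: the learned state is this single weight vector.
weights, *_ = np.linalg.lstsq(training_set, targets, rcond=None)

print(f"training data: {training_set.size:,} numbers")   # 640,000 numbers
print(f"model weights: {weights.size:,} numbers")        # 64 numbers
```

Whether such weights nonetheless constitute a "compressed copy" of the training images is exactly the question the litigation puts to the court; the code only illustrates the size asymmetry both sides argue about.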
Why This Matters
For the plaintiffs, this docket represents a fight for the survival of the creative class. A win for the artists would force AI companies to license every single image in their training sets, potentially bankrupting the current "open" model of AI development.
Explore This Case
Use AskLexi to search the actual court documents from this case.