
Warner Bros is suing Midjourney, arguing the image generator pumps out (and even encourages) AI art and video of its characters – think Superman, Batman, Bugs Bunny – while side-stepping guardrails meant to block obvious IP. Midjourney’s response: the classic fair use defense, plus a jab that studios themselves lean on AI across film and TV production. This caught my attention because AI art has quietly crept into every corner of game development – from pitch decks and mood boards to Steam key art, mod thumbnails, and even indie asset packs. A court decision here won’t just hit Silicon Valley; it could change how your favorite games look, how fast they ship, and what you’re allowed to share online.
Warner’s core claim is straightforward: Midjourney can generate images and video that depict distinctive, copyrighted characters, and its system lets users bypass guardrails designed to prevent that. If you’ve ever browsed prompt marketplaces or Discord threads, you’ve seen it — “style” tags, character-adjacent nicknames, or LoRA packs that land uncomfortably close to the real thing. From a rights-holder’s perspective, that looks like an AI-powered infringement vending machine.
Midjourney’s counter is the familiar line: training on publicly available data is transformative and thus fair use; the outputs are new works; and anyway, Hollywood uses AI too. The uncomfortable truth is that U.S. courts haven’t given a definitive ruling on whether training itself is fair use, and recent rulings around “transformative” use have narrowed the safe zone. Even if training survives, outputs that are “substantially similar” to protected characters can still be infringing. That means the battle may hinge less on abstract AI doctrine and more on what the tool actually enables when real users start prompting.
On the AAA side, expect legal teams to clamp down. Studios will update vendor agreements, require provenance for every texture and brush stroke, and push internal tools with heavier filters. The upside is fewer PR blowups; the downside is slower iteration and less flexibility for teams that genuinely used AI as a sketchbook.

For indies, the stakes feel bigger. A lot of small teams leaned on Midjourney for pitch decks, mood boards, placeholder NPC portraits, and even store capsules. If your Steam page "accidentally" echoes a famous cowl or emblem, you're inviting a takedown or a terrifying email. Practical playbook right now: keep prompts generic (no character names, nicknames, or franchise-specific "style of" phrasing); treat AI output as placeholder rather than shipping art for anything customer-facing like capsules and key art; keep records of what was AI-generated and how, so you can show provenance if asked; and disclose AI use wherever platforms require it, as Steam already does.
Engine makers are in the blast radius too. Unity, Unreal, and DCC tools integrating gen‑AI will harden filters and update terms. That means more “blocked prompt” errors and less freedom to riff — inconvenient, but inevitable if lawsuits start pinning liability on tool providers.
Historically, companies tolerated fan art and cosplay because it fueled the fandom. Automated mass production flips that equation. When AI lets anyone churn out thousands of near-official posters in minutes and sell them on Etsy or Patreon, enforcement ramps up. Expect more DMCA waves, more Discord bot filters, and more asset pack removals on indie marketplaces.

Mods and UGC are especially exposed. A Fortnite UEFN island, a Roblox experience, or a community-made skin that leans on AI art of a WB character is low-hanging fruit for moderation. Even if you don’t sell it, platforms will tighten rules to avoid secondary liability. Steam already expects developers to verify rights around AI content; don’t be surprised if creator platforms formalize similar checks across the board.
Early motions could telegraph the court’s appetite: does the judge frame this as a training-data question, or focus on the outputs and guardrails? Discovery might surface how Midjourney handles blocked terms and whether it knowingly tolerated workarounds — details that matter for liability.
If the court draws a line that training can be fair use but output can infringe when it targets protected characters, expect prompt filters to get draconian and platforms to police “character-adjacent” phrasing. If Midjourney wins broadly, brace for a wave of bolder AI features — and a counter-wave of licensing deals as IP holders try to monetize rather than block.

Warner Bros is taking Midjourney to court over AI images of its characters. The legal question of training data isn’t settled, but the immediate impact will be tighter guardrails, tougher platform policies, and more scrutiny on AI-made assets.
If you make or share game content, assume AI outputs that resemble famous characters are radioactive. Keep your pipeline clean, your prompts generic, and your artists paid.