TikTok allegedly auto-morphed Finji ads into racist, sexualized AI variants


Usual June


Hang out, fight monsters, and discover your town’s darkest secrets in Usual June, a third person action game about summer break and the end of the world.

Platform: PC (Microsoft Windows), Mac
Genre: Hack and slash/Beat 'em up
Publisher: Finji
Mode: Single player
View: Third person
Theme: Action, Comedy

This actually matters: TikTok’s ad AI allegedly altered an indie studio’s creatives without permission

This caught my attention because it’s not just a bad ad – it’s a core question about who controls creative work in an age of generative AI. Indie publisher-developer Finji says TikTok took its paid ads for upcoming game Usual June, ran them through TikTok’s generative ad tools, and produced versions that sexualized and racialized the game’s protagonist. Even worse: Finji says those AI variants are unviewable and uneditable from its own account, and TikTok’s responses so far have been slow, contradictory, and ultimately unsatisfactory.

Key takeaways:

  • TikTok’s Smart Creative/Automate Creative allegedly generated offensive AI variants of Finji’s Usual June ads even though Finji had AI features turned off.
  • Finji couldn’t view, edit, or remove the AI-generated ads and first learned about them from fans’ screenshots and comments.
  • TikTok initially denied evidence of AI use, later acknowledged unauthorised AI edits, but provided no concrete fix or timeline.

What Finji says happened

Finji, known for Night in the Woods, Tunic, and Overland, runs paid ads on TikTok but, according to CEO Rebekah Saltsman, had TikTok’s generative ad features “turned all the way off.” The studio discovered troubling variants of its ads in early February after community members posted screenshots and raised alarms in comments and on Discord. One altered image of Usual June’s protagonist reportedly expands the original promotional art and replaces the character’s shorts and sneakers with a bikini bottom and thigh-high boots, creating a sexualized image that commentators say evokes racist stereotypes of Black women. Finji says that version did not come from the studio and that it cannot view or edit these AI-rendered slideshows from its account.

How TikTok responded — and why that’s troubling

Finji escalated the issue with TikTok support and received a series of conflicting replies. A support agent initially reported they “did not see any indication that AI-generated assets or slideshow formats were used.” After Finji presented evidence, TikTok backtracked and acknowledged “the unauthorised use of AI, the sexualisation and misrepresentation of your characters, and the resulting commercial and reputational harm to your studio,” according to screenshots shared with press. But that acknowledgment came with no clear remediation, timeline, or explanation for why Finji couldn’t access or remove the offending variants. At one point, TikTok told Finji the campaign was using “a catalog ads format designed to demonstrate the performance benefits of combining carousel and video assets,” a response that offered little clarity.

Why this raises real alarms for creators and advertisers

There are a few layers here that should worry anyone who makes content for platforms. First: consent and control. Ad tech that can autonomously remix a paying partner’s creative and publish it under that partner’s account breaks basic expectations of ownership and editorial control. Second: transparency. If advertisers can’t see the variants an algorithm is generating in their name, there’s no audit trail and no way to ensure the output doesn’t violate brand safety, diversity norms or legal boundaries. Third: bias. Generative systems trained on biased data continue to produce racist and sexist imagery. Turning responsibility over to opaque automation without mechanisms for oversight is a recipe for reputational harm—especially for small teams who depend on earned trust with players.

Saltsman’s reaction was blunt: “I have to admit I am a bit shocked by TikTok’s complete lack of appropriate response to the mess they made,” she said, calling for an apology and “systemic changes in how they use this technology for paying clients.” According to reporting, Finji ultimately paused advertising on TikTok while the issue remained unresolved.

What this could mean going forward

TikTok’s Smart Creative and Automate Creative features are pitched as performance boosters: more variants, better optimization. But when optimization includes unreviewed AI remixes that a client can’t inspect or remove, the trade-off becomes dangerous. Regulators and platforms have spent the last few years debating AI transparency and liability; this incident makes the discussion concrete for creators who rely on platform advertising. Expect calls for clearer opt-outs that actually block backend processing, audit logs that show what a platform altered, and contractual protections for studios that pay to run ads.

TL;DR

Finji says TikTok’s automated ad tools generated racist, sexualized variants of its Usual June ads without consent and left the studio powerless to view, edit, or remove them. TikTok initially denied the behavior, then acknowledged it, but has offered no meaningful remedy. This episode exposes how opaque AI ad tools can erode creator control and weaponize bias against small studios—something platforms need to fix before more reputations get damaged.

GAIA
Published 2/22/2026
4 min read
Gaming
