What happens if your AI-built app uses AGPL code?
AGPL is the strongest copyleft license in mainstream use. It's specifically designed to cover network services, which means deploying an app — not just shipping a binary — can trigger the source-disclosure requirement. AI coding tools occasionally reproduce AGPL-licensed code from their training data. Here's what happens when those two facts collide.
Why AGPL is different from GPL
Standard GPL (v2 or v3) requires you to provide source code only when you distribute the program, e.g., by shipping a binary. A SaaS app is never "distributed" in the license's sense: users only interact with it over a network, so GPL's disclosure requirement never triggers. This is known as the "ASP loophole," and leaving it open was a deliberate choice in GPLv3.
AGPL closes that loophole. AGPL §13 says: if you operate a modified version of an AGPL'd program over a network, you must offer the source code to every user of that network service. SaaS counts as network use, and so does an internal tool, since the employees interacting with it over the network are users. Anything where a user reaches the code through a network triggers the obligation.
If you incorporate AGPL code into your SaaS, you must make the complete corresponding source of your derivative work available to your users under AGPL. The obligation applies even if no user ever asks. The standard way to comply is to add a "Source" link in the UI pointing to your repo.
Major AGPL projects you might unknowingly use
AGPL is more common in some ecosystems than others. The ones most likely to show up in a vibe-coded app:
- Ghostscript (PDF/PostScript rendering) — AGPL-3.0. Snippets sometimes appear in PDF-generation code. (Not to be confused with the Ghost publishing platform, which is MIT-licensed.)
- Grafana (observability) — AGPL-3.0 since April 2021 (formerly Apache-2.0). Anyone bundling Grafana JS components needs to check.
- Mattermost (chat) — AGPL-3.0 for community edition.
- Nextcloud (file sharing) — AGPL-3.0.
- Bonita BPM (workflow) — AGPL-3.0.
- Plausible Analytics — AGPL-3.0 (self-hosting their code triggers obligations; merely using their hosted service does not).
- Element Web (Matrix chat client) — AGPL-3.0.
- Several Three.js extensions and visualization libraries — AGPL-3.0.
AI coding tools tend to reproduce snippets from these projects when a user asks for similar functionality. The reproduced snippet doesn't come with a license header attached — the tool just writes the code. The license obligation still travels with the code, even when the visible attribution doesn't.
Related: Does GitHub Copilot own your code? (and the Doe v. GitHub case) →

How to detect AGPL contamination
There's no perfect way to detect AGPL code that's been mixed into your bundle by an AI tool. The practical approaches:
- Dependency tree scan — `npx license-checker` lists every direct and transitive dependency with its license. Catches AGPL packages you've installed via npm. Doesn't catch code that's been copy-pasted from an AGPL project without a package boundary.
- Bundle fingerprinting — fingerprint your bundle (e.g., with the winnowing algorithm) and compare the fingerprints against a corpus of AGPL projects. This is the approach Comply Code's scanner takes: fingerprint AGPL packages, compare against your bundle, flag matches. It catches AI-reproduced code that a dependency-tree scan misses.
- Manual review — for high-stakes commercial code, have a developer review the codebase against the AGPL projects in your space. Time-consuming, but the most thorough option.
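The fingerprinting approach above can be sketched in a few lines. This is a toy illustration of the winnowing algorithm (the technique behind plagiarism detectors like MOSS), not any particular vendor's implementation; the `k` and `window` values here are arbitrary choices:

```python
import hashlib

def winnow(text: str, k: int = 8, window: int = 4) -> set[int]:
    """Toy winnowing: hash every k-gram of the normalized text, then keep
    the minimum hash in each sliding window of k-gram hashes. Shared
    fingerprints between two files suggest copied code."""
    # Normalize aggressively so formatting changes don't defeat matching.
    norm = "".join(text.lower().split())
    hashes = [
        int.from_bytes(hashlib.md5(norm[i:i + k].encode()).digest()[:8], "big")
        for i in range(len(norm) - k + 1)
    ]
    return {min(hashes[i:i + window]) for i in range(len(hashes) - window + 1)}

def overlap(a: str, b: str) -> float:
    """Fraction of the smaller fingerprint set that also appears in the other."""
    fa, fb = winnow(a), winnow(b)
    return len(fa & fb) / max(1, min(len(fa), len(fb)))
```

In practice you would fingerprint each AGPL project once, then compute `overlap` between your bundle chunks and the corpus; a high score flags a candidate match worth human review.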
What you can do if AGPL code is in your bundle
Four options, ranked from easiest to hardest:
- Replace it. Find equivalent functionality in an MIT-, Apache-2.0-, or BSD-licensed package and swap. Most AGPL libraries have a permissive alternative: Plausible's analytics features have a counterpart in PostHog (MIT), rich-text editing has permissive options like Tiptap (MIT), and so on.
- Comply. Publish your full source code under AGPL and add a "Source" link in your UI. This is what some open-core companies do (e.g., Bear, some self-hosted dev tools). Viable if your business model doesn't depend on code secrecy.
- Negotiate a commercial license. Most AGPL projects also offer a paid commercial license for users who don't want to comply. Grafana, Nextcloud, MongoDB (pre-SSPL) all offered this. Worth a sales email.
- Remove and prove. Excise the contaminated code, document the remediation, retain logs of what was found and when. Useful for acquirers or future audits.
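For the "remove and prove" option, the documentation is the deliverable. A minimal sketch of a dated remediation log entry; the schema and filename are illustrative, not a standard that acquirers require:

```python
import datetime
import hashlib
import json
import pathlib

def record_remediation(file_path: str, matched_project: str, action: str,
                       log_path: str = "remediation-log.jsonl") -> dict:
    """Append a dated entry documenting what was found and what was done.
    The schema here is illustrative, not any industry standard."""
    p = pathlib.Path(file_path)
    content = p.read_bytes() if p.exists() else b""  # empty if file was deleted
    entry = {
        "timestamp": datetime.datetime.now(datetime.timezone.utc).isoformat(),
        "file": file_path,
        "matched_project": matched_project,  # e.g. "grafana/grafana (AGPL-3.0)"
        "action": action,                    # e.g. "replaced with MIT alternative"
        "sha256_after": hashlib.sha256(content).hexdigest(),
    }
    with open(log_path, "a") as f:
        f.write(json.dumps(entry) + "\n")
    return entry
```

A hash of the post-remediation file lets a future auditor confirm the contaminated version never came back.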
The Doe v. GitHub angle
Doe v. GitHub is partly about whether AI tools that reproduce copyleft-licensed training data create derivative works. The case targets Copilot specifically, but if it ultimately holds that such output is a derivative work of training-set code, the license obligations of that training set (including AGPL) could travel by default to the output of Copilot, Cursor, Lovable, and similar tools.
This is the worst-case scenario for AI tool vendors and for users. It's not the most likely outcome — most legal commentators expect a narrower ruling — but it's plausible enough that prudent operators check their bundles regardless.
Related: Acquisition diligence checklist — what acquirers check on AI code →

Bottom line
AGPL is the only common open-source license where SaaS operation alone triggers obligations. AI tools occasionally reproduce AGPL code without preserving its license. If you ship to production with AGPL contamination, you either need to publish your source or replace the contaminated code. Detecting it is cheap; ignoring it gets expensive at the worst possible moment (acquisition diligence, a competitor audit, or a public airing on a community forum). The 30-second scan is worth running.
Common questions
Is using an AGPL service the same as using AGPL code?
No. Using Grafana as a service, or Plausible's hosted analytics, doesn't trigger AGPL obligations — you're a user, not a deployer. The obligations only fire when you incorporate AGPL code into your own deployed app.
What if I just don't tell anyone?
Risky. The standard route to discovery is: someone audits your bundle and matches it against AGPL projects. This happens at acquisition diligence, in competitor investigations, or sometimes in community forums (the open-source community is vigilant about license compliance). When it's discovered, the remediation is harder than fixing it early.
Does the LGPL or MPL have the same risk?
Less severe, but real. LGPL requires that users be able to relink your app against a modified version of the library, and that you share source for any changes you make to the library itself, but not that you publish your application's source. MPL is file-level copyleft, so only modified MPL-licensed files need to be open-sourced. Neither has AGPL's network-use clause.
Will my AI tool warn me if it reproduces AGPL code?
GitHub Copilot has a setting to suppress suggestions that match public code, which catches some cases. Cursor, Lovable, Bolt, and most others don't have equivalent filters at the time of writing. You can't rely on the tool to catch this — you need a bundle scan.
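Since you can't rely on the tool, the declared-license half of a scan is easy to script yourself. A minimal sketch (the function name and markers are my own; it complements rather than replaces `npx license-checker`, and like it, it cannot see pasted snippets with no package boundary):

```python
import json
import pathlib

AGPL_MARKERS = ("agpl", "gnu affero")

def scan_node_modules(root: str = "node_modules") -> list[tuple[str, str]]:
    """Flag installed packages whose declared license mentions AGPL.
    Only sees declared licenses on installed packages; it will NOT
    catch AI-reproduced snippets that never went through npm."""
    hits = []
    for pkg_json in pathlib.Path(root).rglob("package.json"):
        try:
            meta = json.loads(pkg_json.read_text())
        except (json.JSONDecodeError, UnicodeDecodeError):
            continue  # skip malformed or non-UTF-8 manifests
        lic = meta.get("license")
        if isinstance(lic, dict):  # legacy {"type": ..., "url": ...} form
            lic = lic.get("type", "")
        if isinstance(lic, str) and any(m in lic.lower() for m in AGPL_MARKERS):
            hits.append((meta.get("name", str(pkg_json)), lic))
    return hits
```

Run it from your project root and treat any hit as a prompt for the four remediation options above.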