Replica Theft: unauthorized cloning of face, voice, image, posture, or movement.
Replica Theft treats identity cues as a substitute product, using a synthetic likeness to stand in for the real person in media, marketing, or public attention.
Example: A brand deploys a soundalike or digital body double that captures the market effect of the original source without permission.
Signature Theft: imitation of style, cadence, taste, analysis, or communication patterns.
Signature Theft captures the recognizable way someone creates value: their rhythm, synthesis, language, timing, and judgment architecture.
Example: A system generates commentary that mirrors a founder's distinctive decision language closely enough to borrow authority and audience trust.
Training Theft: use of a person's archive, performance history, or digital trace to train systems that can compete with them.
Training Theft focuses on the extraction layer itself: the use of accumulated work, data, and behavior as raw material for downstream synthetic products.
Example: A creator's entire publishing archive becomes the substrate for a model that can reproduce their tone at industrial scale.
Attribution Theft: synthetic systems capture trust, traffic, recognition, or monetization that should flow to the original source.
Attribution Theft is not only about resemblance. It is about diverted belief, attention, and economic value caused by synthetic confusion.
Example: An AI-generated endorsement or track benefits from the public's assumption that the original person approved it, made it, or stands behind it.
Infrastructure Theft: a person's archives, public identity, or institutional knowledge are treated as a free extraction layer for products.
Infrastructure Theft happens when identity systems are reduced to reusable inputs across data pipelines, archives, platforms, and enterprise knowledge bases.
Example: An organization mines years of public interviews, clips, frameworks, and metadata to construct a synthetic advisor without building a relationship with the source.