
The Erosion of Decency and Human Dignity: When Digital Replicas Replace the Real

By Sally Vazquez-Castellanos, Esq.

Perspectives: Technology, Global Privacy & Data Protection

Published on October 16, 2025. Revised at 4:59 pm.


In Zero to One, Peter Thiel poses a deceptively simple question:

What matters more—the advance of technology or the spread of globalization?

His answer, though often interpreted through the lens of entrepreneurship, speaks directly to the transformation now reshaping the law: technology changes how we live; globalization changes where ideas move. One scales the human mind, the other duplicates it.

In my earlier Perspectives essay, “Personhood and the Law,” I argued that the legal system has long struggled to define what it means to be a person—whether in the context of citizenship, corporate identity, or reproductive rights. Today, artificial intelligence extends that struggle into the digital realm. We have reached the point where technology does not simply amplify human capacity; it replicates human identity.

If globalization once mirrored humanity’s drive to connect, the age of digital replication exposes our impulse to copy. The same forces that democratized communication are now eroding the very individuality globalization once promised to celebrate. In this new order, the question is no longer who owns the means of production—but who owns the image of the human being itself.

This is the frontier that California’s new digital-replica laws attempt to govern. They remind us that progress without boundaries risks hollowing out what Thiel might call the “zero” before the “one”: the moral starting point that gives innovation its human meaning.

I. The Fragile Boundary Between Art and Exploitation

Once upon a time, computer-generated imagery (CGI) represented artistic wonder — the stuff of Hollywood storytelling and creative innovation. It brought us galaxies far away, photorealistic dinosaurs, and emotionally expressive digital worlds that felt human without ever claiming to be.

Today, the line between creation and imitation has grown perilously thin. What began as CGI — a technical tool for visual storytelling — is now bleeding into the territory of digital replicas: highly realistic, AI-driven reproductions of real human beings. In this new world, a person’s face, voice, or presence can be resurrected or repurposed without that person’s consent — and the moral cost is profound.

II. What Is CGI?

Computer-Generated Imagery (CGI) refers to any image or scene created through digital rendering software. It can depict fantastical characters, architectural spaces, or realistic human forms. The purpose of CGI has always been creative — to tell a story, enhance special effects, or visualize the impossible.

CGI becomes controversial only when it crosses into the representation of real people. A digital character modeled loosely after a performer is still an artistic creation; a fully rendered simulation that looks and sounds exactly like that performer — speaking new lines they never recorded — becomes something else entirely.

III. The Rise of the Digital Replica

California law now gives this phenomenon a name: the digital replica.

Under California Labor Code § 927(c)(1) and Civil Code § 3344.1(b) (as amended by Assembly Bills 2602 and 1836, effective January 1, 2025), a digital replica is defined as:

“A computer-generated, highly realistic electronic representation that is readily identifiable as the voice or visual likeness of an individual, embodied in a sound recording, image, audiovisual work, or transmission, in which the actual individual either did not actually perform or appear, or the performance or appearance has been materially altered.”

This definition acknowledges a legal and ethical truth: technology can now reconstruct our likenesses so convincingly that audiences may not distinguish the living from the simulated.

The statutes divide protection along two lines:

Labor Code § 927 protects living performers from contract clauses allowing the use of their digital replicas without informed consent or representation.

Civil Code § 3344.1, as amended by AB 1836, protects deceased personalities — the estates of actors, musicians, or public figures — from unauthorized post-mortem recreations.

Both provisions represent California’s attempt to impose guardrails on the unchecked commercialization of human identity.

IV. The Case of Paul Walker — and the Posthumous Performance Problem

Few examples capture the complexity of digital resurrection better than Paul Walker, the late Fast & Furious actor digitally re-created through CGI and motion capture for Furious 7.

While the Walker family consented to that use, the case revealed how blurred the boundaries had become. Future projects could, in theory, insert his likeness into entirely new films, political messages, or advertising campaigns. Without statutory control, a deceased artist’s face could become a perpetual brand — performing long after the person, and their moral agency, is gone.

California’s amended Civil Code § 3344.1 now prohibits such unauthorized uses. It requires the consent of the estate for any digital replica of a deceased personality. The protection endures for 70 years after death, reflecting the state’s recognition that reputation and legacy survive the grave.

V. The Ordinary Citizen — Unprotected, Yet Exposed

For the regular ol’ Joe — the teacher, nurse, or parent with no celebrity status — there is no explicit “digital replica” statute. Yet the risks are increasingly real.

Deepfake videos, voice cloning, and synthetic impersonations threaten not only privacy but also personal safety. A cloned voice can trick a spouse into transferring money; an AI-generated video can ruin reputations or relationships in minutes.

Ordinary citizens must instead rely on a patchwork of existing laws:

California Civil Code § 3344 (right of publicity for living persons), which prohibits unauthorized commercial use of one’s name, voice, photograph, or likeness.

Common-law privacy torts, including misappropriation, false light, and intrusion upon seclusion.

California Penal Code § 647(j)(4), which criminalizes non-consensual distribution of digitally altered or intimate images (“deepfake pornography”).

California Privacy Rights Act (CPRA), which extends data-protection rights to biometric information, including facial or voice data.

Federal Wire Fraud and Identity Theft statutes (18 U.S.C. § 1343, § 1028) when voice or image cloning is used for deception.

These frameworks provide reactive remedies, not proactive safeguards. They address harm after it happens — not the technological architectures that enable it.

VI. The Broader Ethical Divide: Consent and the Collapse of Dignity

The moral danger is not simply the unauthorized image; it is the illusion of consent.

When a machine generates a likeness so lifelike that viewers believe the subject willingly performed, the very idea of human autonomy is eroded. This threatens two pillars of democratic culture: decency and dignity.

Decency demands restraint — the cultural willingness to stop short of exploitation, even when technology allows it. Dignity demands recognition of personhood — the understanding that a human being is not a raw material for synthetic speech or spectacle.

By reanimating the dead or digitally coercing the living, we begin to treat identity as programmable property. In that sense, the erosion of dignity is not metaphorical; it is encoded — in pixels, algorithms, and neglected consent forms.

The disparity is stark: fame brings stronger legal armor than anonymity. The very individuals most vulnerable to digital impersonation — the ordinary public — remain least protected by statute.

VII. Restoring a Human Standard

California’s new framework is an essential first step, but it is not a cure.

Lawmakers must extend digital-replica consent requirements beyond Hollywood, recognizing that every face and voice carries moral ownership.

The pending federal NO FAKES Act, if enacted, would do precisely that — codifying national rights of consent for both famous and non-famous individuals. Combined with the California Privacy Rights Act and the FTC’s Section 5 authority over deceptive AI practices, the law could begin to rebuild a baseline of accountability.

Until then, the duty of restraint lies with creators, technologists, and consumers alike. Just because we can replicate someone does not mean we should.

VIII. Human Dignity in the Age of the Replica

When we can no longer tell the difference between a man’s art and his algorithm, we risk losing not only truth but empathy itself.

The law can define “digital replica,” but decency — the unwritten contract between creators and the created — must come from us.

Legal References

California Labor Code § 927 (2025) — enacted by Assembly Bill 2602 (2024), regulating contractual use of digital replicas.

California Civil Code § 3344.1 (2025) — amended by Assembly Bill 1836 (2024), protecting post-mortem publicity rights from digital replicas.

California Civil Code § 3344 — right of publicity for living persons.

California Penal Code § 647(j)(4) — prohibition on non-consensual distribution of altered or intimate images.

California Privacy Rights Act (Cal. Civ. Code § 1798.100 et seq.) — protects biometric and image data as personal information. 18 U.S.C. § 1028 (Identity Theft) and 18 U.S.C. § 1343 (Wire Fraud) — federal statutes addressing voice cloning and impersonation used for deception.

NO FAKES Act, S. 2890 (2023-24) (proposed) — federal bill to prohibit unauthorized AI replicas of any individual’s voice or likeness.

A special thank you to ChatGPT, and to source material from the websites of Manatt, Phelps & Phillips, LLP (accessed October 16, 2025 at 5:41 pm) and Davis & Gilbert (accessed October 16, 2025 at 5:45 pm), as well as http://www.law.justia.com and http://www.legiscan.com (both accessed October 16, 2025 at 6:11 pm).

Thank you all!

This article is provided for educational and informational purposes only and does not constitute legal advice. Readers should consult an attorney for guidance regarding specific circumstances.

