Bryan Cranston Praises OpenAI for Tackling Deepfake Misuse on Sora 2

Bryan Cranston, best known for his iconic role as Walter White in Breaking Bad, has publicly thanked OpenAI for taking swift action against deepfakes created using its generative video platform, Sora 2. The actor's likeness and voice had been recreated without his consent, sparking serious concerns about the growing misuse of AI technology in replicating real people.

The issue came to light shortly after Sora 2’s launch, when users were reportedly able to generate videos that included Cranston’s image. One clip even showed a “synthetic Michael Jackson” taking a selfie with a digital version of Cranston—something the actor never authorized. Deepfakes like these, though technologically impressive, raise ethical questions about identity, privacy, and ownership in the digital age.

Cranston immediately reached out to SAG-AFTRA, the actors' union, expressing concern not just for himself but for all performers who could face similar risks. He emphasized that artists' voices and faces, core parts of their professional identity, should never be reproduced without explicit permission. In a statement released through the union, Cranston said he was grateful to OpenAI for improving its guardrails to ensure that such unauthorized generations don't happen again.

Initially, reports from major outlets like The Hollywood Reporter and The Wall Street Journal suggested that OpenAI had asked talent agencies to “opt out” of Sora 2 rather than “opt in,” meaning likenesses could have been used by default. However, OpenAI disputed that claim, saying its policy had always been to require consent before any real person’s voice or image was replicated. The company has since apologized for the “unintentional generations” and strengthened its safeguards.

In response, OpenAI, along with United Talent Agency (UTA), Creative Artists Agency (CAA), SAG-AFTRA, and the Association of Talent Agents, released a joint statement reaffirming their shared commitment to protecting actors' rights. They agreed to collaborate on ensuring performers have full control over whether or not their likeness can be simulated through AI tools.

Sean Astin, the newly elected president of SAG-AFTRA, praised Cranston for taking action, noting that many artists are at risk of having their image and voice misused. He also highlighted the importance of the proposed NO FAKES Act, currently being reviewed by the U.S. Congress. The bill would prohibit the creation and distribution of AI-generated replicas of individuals without their consent, a measure OpenAI has voiced support for.

While Sora 2 still allows the generation of “historical figures,” OpenAI has started to pause or restrict depictions of recently deceased public figures, such as Martin Luther King Jr., after discussions with their estates.

Cranston’s case serves as a pivotal reminder that while AI continues to push creative boundaries, it must do so responsibly—respecting both the artistry and humanity of those it seeks to emulate.
