Tilly Norwood: A Discussion about Sexual Scenes and AI “Actors”

Side profile of a digital human head made from holographic lines.

Photo Credit: blackdovfx

In late September 2025, a new “actress” entered the chat. Tilly Norwood is the first AI creation to be marketed as a performer. She is a photo-realistic depiction of a young woman, and her existence has sparked discussion about what repercussions this could have for the film industry. At RALIANCE, we’re taking this story to open a larger discussion about what AI “actors” could mean for the way stories about sex and sexual abuse are told on screen.

How Performers Feel About Tilly Norwood

When news about Tilly Norwood broke, actors immediately pushed back. Oscar winner Ariana DeBose stated:

“The artistry and dedication and real lived human experiences of SAG-AFTRA performers is what has fueled the motion picture industry for more than a century. Synthetic performers, on the other hand, are created using technology that is only as effective as the content it is trained on. This technology will never create anything original and it’s technology that mines from the work of human creatives without consent or pay.”

Her point is salient. AI is not capable of creating something truly original and artistic, because it does not have a unique perspective. It derives what it knows from what we know. Taking art out of the hands of an artist and giving it to an AI removes its humanity and authenticity, and profits from the uncredited originality of the performers it mimics.

An “Actress” That Can’t Say “No”

In the best filmmaking environments, a performance is an active collaboration between performers and filmmakers. With the rise of the #MeToo and #TimesUp movements, there have been significant improvements and cultural shifts in how the industry discusses ethically creating scenes depicting sex and sexual assault. One of the most important has been bringing intimacy coordinators onto sets. The intimacy coordinator helps choreograph the scene and advocates for the actor if a form of contact is proposed that the actor doesn’t want included. This gives agency to the performer and authenticity to the scene. But what happens when we remove the actor altogether?

Tilly Norwood would be a bad director’s greatest dream come true. Actress Chelsea Edmundson remarked that she was “not surprised that the first major ‘AI actor’ is a young woman that they can fully control and make do whatever they want.” An AI actress opens up the potential to reduce roles for human actors who refuse to do what can simply be generated with AI. As a workplace dynamic, this would widen the power differential between directors, studios, and performers. It would create an environment that erodes people’s ability to advocate for their rights, including the right to a workplace free from sexual misconduct, abuse, and harm.

Taking the actor out of the situation entirely also harms the way we tell stories.

If Tilly Norwood were directed to perform a scene involving sex or sexual assault, she would not push back. If the scene were written in poor taste or were degrading, she would refuse nothing. This is bad for audiences everywhere.

The Ethics of Animated Intimacy

As animation has improved, so have animated depictions of sexual intimacy. Love, Death, and Robots, for example, ventures into graphic sexual territory that has sparked its own discussion about how female characters are represented and what value those intimate images carry. The difference between that and a scene involving Tilly Norwood, however, is the way it’s presented to the audience.

Tilly Norwood looks like us. She looks like she belongs in our environment. Where Love, Death, and Robots is stylized and presented as a make-believe environment, Tilly Norwood fits squarely in our reality and gives the perception of a grounded realness that an audience would unknowingly and unquestioningly absorb.

A badly intentioned male director might use Tilly Norwood to show an audience, “This is what women like. This is how they want to be treated. This is how they will react to you.” In a sexual assault scene, such a director might make the scene as brutal as a studio will sign off on, reasoning that no harm can be done if no real performer is involved. This, of course, removes the agency and negotiation a human performer might exercise to tell a story that is more sensitive to a woman’s body and to the survivors watching at home.

The Real-World Impact of Deepfakes

This is further complicated when an AI is modeled after a real person. Nashville-based musician Stella Hennen has come forward to say how disturbed she is that Tilly Norwood is essentially her doppelgänger. The company that created Tilly Norwood argues that the AI “actress” was an original design and that it does not use anyone’s likeness without consent and compensation. How can we be sure of that, though? AI models learn from vast collections of images gathered online. A designer may give the program certain directions, but the model may still draw heavily, without the designer’s knowledge, on source images they never saw. That theft or exploitation could be embedded in the system itself. To dispel doubt, AI companies creating these “actors” would have to be fully transparent with the public about their creation process and the safeguards they implement to prevent this from happening.

If Tilly Norwood were used in a sexual or sexually degrading scene, Hennen could very well experience it as a violation. As many victims of deepfake pornography have said, watching a photo-realistic version of yourself engage in sexual activity you had no role in is a sexual violation. If AI must enter the equation, then at minimum more voices need to enter the room, not fewer.

Where Do We Go From Here?

As technology continues to advance in film, the industry will have to come together to decide its most ethical uses. RALIANCE’s former grantee, the Hollywood Commission, works on “defining and implementing best practices that eliminate sexual harassment and bias for all workers, especially marginalized communities, and actively promote a culture of accountability, respect, and equality.” Perhaps they, and other outspoken advocates from the Time’s Up movement, can be brought to the table to advocate for human performers and their role in depicting sexual activity ethically and portraying sexual abuse responsibly on screen. They could also help negotiate likeness protections. These performers and filmmakers bring insight that could ensure any use of AI is carried out with compassion, with respect for all industry professionals, and with responsibility to the audiences it creates films for.

What we do know is this: human beings give meaning. As more technology is introduced, what do we want our art and the creation of it to say about the human experience? What does art mean to us, for us, and about us?

RALIANCE is a trusted adviser for organizations committed to building cultures that are safe, equitable, and respectful. RALIANCE offers unparalleled expertise in serving survivors of sexual harassment, misconduct, and abuse, which drives our mission to help organizations across sectors create inclusive environments for all. Visit our website or our grant page for more information.


  
