Artificial Intimacy
AI-Guided Intimacy Choreography
Working at the Edges of Consent, Connection, and Code
A Research Project by: Matt Denney
Click below to play the game:
Is it AI Prompted or Not?
BIG QUESTION:
CAN AI STAGE MOMENTS OF INTIMACY?
SPECIAL THANKS:
This project is funded and supported by the Confluencecenter for Creative Inquiry at the University of Arizona.
All photos by: Lance Thorn
Models: Julia Waters & Benjamin Evans






What is it?
This project brings together two unlikely collaborators: human intimacy choreographers and artificial intelligence. Through a series of portraits and movement workshops, performers explored emotionally charged scenes—once guided by trained intimacy professionals, and then again using choreography generated by AI trained on real intimacy protocols.
The goal? To see what happens when technology steps into the rehearsal room. Can AI help shape moments of connection, care, and vulnerability on stage? This creative experiment opens up bold new questions about the future of performance and how we tell human stories—with a little help from machines.
WHAT IS THE PROCESS?
-
Using Poe AI, we generated three AI prompts, starting with the question:
“Can you create a scenario for a moment of intimacy that would happen in a stage play, using specific choreography for two performers?”
From each prompt, I would also stage the scene the way I would choreograph it as a professional Intimacy Director.
-
When searching for a photographer, I was looking for someone who could capture the essence of moments of intimacy combined with the stillness of portraits. A friend recommended Lance Thorn, and we proved to be great collaborators who were willing to explore this dynamic together.
In terms of the performers, I wanted to work with folks who were familiar with Intimacy Direction and able to physically express emotion with one another. They also had to at least know each other if they were going to be physically vulnerable together. I had previously worked with both Julia Waters and Benjamin Evans, not only in community theatre but also as students at Pima Community College, where I serve as the Resident Intimacy Director.
-
The shoot felt very strange at first, since none of us had seen the prompts prior to the photoshoot (an intentional choice). Staging the more AI-centered choreography moments felt disjointed and cold.
We started by doing a boundary check-in with one another (which is a standard Intimacy Direction practice).
We then used the AI-generated choreography as a base; as the Intimacy Director, I would stage my own version of the choreography the AI provided.
Lance asked a lot of questions about photo style, and I was also, in a sense, art-directing, providing more information than the AI could (see our findings below).
Julia and Benjamin at times felt out of step with the choreography given the limited time they had, but overall they supported one another throughout the process, both physically, with breaks between positions, and as performers, filling in the gaps the choreography left.
-
In conversations with others about this project, much of the narrative around AI and performance has been negative (see our Findings section below).
After the photoshoot, I used journaling and poetic inquiry as my main forms of qualitative data, determining from an outside perspective what makes these moments work and what doesn't.
I enlisted my wife to review the preliminary gallery and identify which photos struck her most. Her selections went to our photographer for further editing and ultimately became the final set, which was mainly human-choreographed.
WHAT DID I STRUGGLE WITH?
-
The University of Alberta calls on us as researchers to weigh the ethical considerations of using generative AI, given AI's large environmental footprint and its impacts on our waterways, particularly around data centers.
Have I weighed my intended use of generative AI against the environmental/ecological impacts?
In choosing an AI system, we decided on Poe AI because of its current fight for data privacy, non-discrimination, and human rights compliance, and its stated aim of ensuring the platform benefits humanity, including giving users the capability to run on their own cloud servers. Many such servers exist, built on code written by many developers.
What are the copyright implications of using others' content in a generative AI tool?
Because the material was original, we did not have to worry about copyright in the storytelling.
How will I verify the generated content to make sure it is credible?
Because this project generated only three AI responses for research purposes, credibility verification was not needed; the responses served as a “point in time” snapshot of the AI's knowledge.
How will I address the biases that may shape the generated content?
As noted in the brief summary section below, the AI-generated responses contained clear biases, specifically around the gender and ability of the performers in the intimacy scenes.
-
In conversations with others about this project, much of the narrative around AI and performance has been negative. Many are quick to put AI and theatre into two separate buckets, with no ability to connect to one another (see the brief summary for what we say about AI assisting in the process vs. the product).
While the majority of professional theatre and higher-education research has skewed negative, there has been some positive research within theatre education on producing artistic methods with AI assistance in the high school theatre setting.
In the era of creating adaptations for students with developmental, mental, or physical disabilities, the arts have struggled to support students who cannot easily ideate or generate more dynamic concepts (see the National Art Education Association's statement on AI in the art classroom).
In short, the intersection of theatre and AI is still misunderstood, and it carries a history of harm.
WHAT DID WE FIND? (brief summary)
LACK OF COMMUNICATION
Something unexpected emerged during the AI-led sessions: a sense of disconnection. While staging the photos and choreographing the movements, it became increasingly difficult to gauge the appropriate distance between performers—how close was too close? How far was too far? Without the ability to ask clarifying questions or receive real-time feedback, the performers found themselves frequently resetting, pausing, and second-guessing. It began to feel like a one-way conversation, with the AI dictating, but not listening.
This revealed a deeper truth about the role of an Intimacy Director. It’s not just about giving instructions—it's about responding, adapting, checking in, and holding space. Human-led intimacy direction is built on trust, communication, and care. In contrast, the AI could only offer static guidance, lacking the emotional nuance and responsiveness that’s essential when navigating scenes of vulnerability.
AI has trouble generating human emotion
AI can offer structure. It can follow patterns, mimic language, even spit out instructions based on intimacy protocols. But what it can’t do—at least not yet—is feel. It doesn’t pick up on hesitation in someone’s breath, the shift in someone’s posture, or the energy between two people navigating vulnerability. When human performers need space to pause, ask questions, or check in, the AI has nothing to offer back. It doesn't read the room; it just delivers commands. The result is something that feels hollow—like staging connection without the connection.
We learned that emotion isn’t just a detail—it’s the core of intimacy. And that’s something no algorithm has been able to replicate.
Still, this project isn't about proving AI can’t be involved in intimate storytelling. It’s about asking better questions. How can tech assist without replacing? How do we keep humanity at the center of innovation? And what happens when we invite the glitch, the silence, and the awkward pause to speak just as loudly as the code?
THE NEED FOR THE THEATRE PROCESS AND AI TO FIND COMMON GROUND
It all begins with an idea. Maybe you want to tell a new kind of story. Maybe you're curious about what happens when creativity meets code. For us, the idea was simple but bold: Can AI help shape emotionally vulnerable scenes on stage?
Much of the current conversation around AI and theatre focuses on harm. And rightfully so. From AI-generated actors replacing human labor to AI-designed sets that erase the nuance of lived experience, there has been a growing awareness of the trauma, erasure, and exploitation tied to how these tools are developed and used. Many of these critiques are rooted in real concerns about consent, authorship, representation, and labor.
But this project asked a different question: What if AI isn’t the product, but a collaborator? What if its role is not to replace humans, but to assist them?
When used not as a substitute but as a supporting voice, AI can spark new ideas and help break creative blocks. In rehearsal, we found that AI could generate frameworks, prompting movement, offering alternative sequences, or remixing existing intimacy protocols. But we also discovered its limits. It struggles profoundly with nuance, emotion, presence, and timing. The kind of embodied wisdom and ethical care that human intimacy directors bring cannot be reproduced by algorithms. When performers needed space to ask, clarify, or feel, the AI fell silent. It gave instructions but didn’t listen. It couldn’t adjust or notice a trembling hand, a caught breath, or a look of hesitation. The result often felt robotic, disconnected, even uncanny.
Still, this doesn’t mean AI has no place in the creative process. Instead, it reveals a deeper opportunity, a justice-oriented goal worth pursuing.
Labs like the Stanford Human-Centered AI Research Lab are exploring how AI can enhance rather than displace human collaboration. In that same spirit, our work asks: How can we design AI tools that serve the artistic process rather than dominate it? How can AI support access, equity, and care in rehearsal rooms?
By repositioning AI as an assistant, not an authority, we open up space for more collaborative, inclusive creation. We can use these tools to generate prompts, test possibilities, and support under-resourced artists while keeping the heart of the work firmly human. This is not about speeding up rehearsal or automating art. It is about imagining how technology might serve justice, deepen empathy, and help us tell stories that feel more alive, not less.