Chiasmic Play: Haptic theatre

2024-25

Genre

Real-time Technology Development

My Role

Creative Game Technologist

C++, Python

Tech Stack

Performance: Unreal Engine 5
Installation elements: Arduino IDE & ESP32

About The Project

Chiasmic Play is an R&D collaboration between Abertay University and Scottish Youth Theatre that merges live theatre, virtual production and cutting-edge haptic technology.

Haptic technology was first developed to simulate the sense of touch for the user in a virtual environment through vibrations or electrical pulses. While this sort of technology isn't particularly new – being used in gaming as early as the 1970s and more recently in medical and aviation training – its use in a theatre context remains in its infancy.

In this R&D, we asked how haptic sensations might enhance the experiences of audience members as they witnessed live performance. Using full-body Teslasuits and custom-built haptic devices, we tested how audiences perceived different elements of storytelling: from movement to narration, music to motion capture.

The performance was shared at a showcase event at Abertay's state-of-the-art production studios at Water's Edge in Dundee on 4 June 2025, inviting industry experts, theatre practitioners and emerging-technology specialists to immerse themselves in an experimental world.

The project has also led to a publication in JDMI, which will be made public in early 2026.

My Role

Technical Lead & Project Initiator
This project was my personal highlight of 2025, defined by a rewarding collaboration with the Scottish Youth Theatre. I spearheaded the project from the early ideation phase, proposing the integration of Teslasuits into live theatre, which we collaboratively refined to secure XR Network+ funding in August 2024.

  • Team & Production: Recruited the lead artist and a multidisciplinary team from Abertay, overseeing the creation of a Virtual Production set within Unreal Engine.
  • Hardware Innovation: Developed "Chiasmic Spheres"—bespoke haptic devices embedded in cushions—to overcome hardware limitations and expand sensory feedback beyond the Teslasuit.
  • Systems Architecture: Engineered a custom TCP-based communication system using multi-threading and distributed processing, ensuring seamless, real-time synchronization between the haptics (Spheres and Teslasuits) and the Unreal scene over a local WiFi network.
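The exact wire format of that TCP system isn't documented here, but the core pattern can be sketched in Python. The snippet below is an illustrative sketch only, not the production code: it assumes a hypothetical length-prefixed JSON framing for haptic events, and uses one thread per device socket so that a single slow device cannot stall delivery to the others.

```python
import json
import socket
import struct
import threading

def encode_event(event: dict) -> bytes:
    """Serialize a haptic event as length-prefixed JSON so the
    receiver can frame complete messages on a TCP stream."""
    payload = json.dumps(event).encode("utf-8")
    return struct.pack(">I", len(payload)) + payload

def decode_event(sock: socket.socket) -> dict:
    """Read exactly one length-prefixed event from a connected socket."""
    header = b""
    while len(header) < 4:
        header += sock.recv(4 - len(header))
    length = struct.unpack(">I", header)[0]
    data = b""
    while len(data) < length:
        data += sock.recv(length - len(data))
    return json.loads(data.decode("utf-8"))

def broadcast(event: dict, device_socks) -> None:
    """Send the same event to every haptic device concurrently,
    one thread per socket, and wait for all sends to finish."""
    msg = encode_event(event)
    threads = [threading.Thread(target=s.sendall, args=(msg,))
               for s in device_socks]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
```

In a setup like this, the Unreal scene would act as the sender, calling `broadcast` each time a story beat fires, while each ESP32 sphere or Teslasuit bridge runs the matching framed read loop on its end of the connection.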

Funders

XR Network+ (2024-2025)
CoSTAR Realtime Lab (Water's Edge Studio)

Links

Documentation