The National Theatre’s recent production, Draw Me Close, blurs the line between audience and performer, using virtual reality and inertial motion capture to portray the intimacy of a mother and her child. Performed at the Young Vic Theatre in London, the piece casts each audience member in the role of a child named Jordan. A VR headset transports them into an animated world with a virtual mother, played by an on-stage actress, who interacts with them physically and verbally to conjure the experience of a childhood memory.
From the outside, the performance takes place in an open room with physical cues and markers, but with the headset on, audience members are making eye contact with a digital human, taking part in a scene that develops uniquely each time. We sat down with the National Theatre’s Head of Digital Development, Toby Coffey, to discuss the inspiration, the production process, and why the team chose inertial motion capture.
“David Hoffenheim is a producer at the National Film Board of Canada. We started a conversation when he was in the UK and he approached me with some project concepts. We wanted to find an opportunity to collaborate, and we were interested in the intersection between theatre, creative non-fiction, and VR,” explained Toby.
The actress wears the Xsens MVN suit, and her movements are captured and animated in real time. With the performance remaining live throughout, the National Theatre needed a motion-capture solution that maintained the freedom and authenticity of a normal on-stage production. Xsens MVN Animate provided the answer.
“When we first started planning the production, we needed to find the right solution for the project’s financial investment. But once our idea started forming concretely, we knew inertial motion capture was the way forward.”
Due to the technological demands, the setup and design of the stage took precedence for the production team. The team uses a Unity environment to create the VR animation, and the headset is tracked by five lighthouses in the room. The audience member wears a tethered HTC Vive Pro, and the mother is dressed in the Xsens suit with an added Vive tracker for spatial positioning.
“The Xsens suit is recording body movements and the tracker is mapping space – the doors and colouring pens are also tracked in the environment. Similar to a standard theatre setup, with a person initiating the sound and lighting cues, we have someone operating this for the sound and lighting experienced in VR.”
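The division of labour Toby describes – the inertial suit capturing the body, the Vive tracker anchoring the performer in the room – can be sketched in broad strokes. The snippet below is a hypothetical illustration (all names and data are assumptions, not the production’s actual code): joint positions arrive relative to the performer’s root from the suit, and the external tracker supplies the root’s world position, so combining the two places the avatar correctly in the shared space.

```python
# Hypothetical sketch of fusing two data sources:
#   - an inertial suit reports joint positions relative to the performer's root
#   - an external tracker reports the root's world position in the room

def world_joints(root_world_pos, joints_local):
    """Offset each root-relative joint position by the tracked world position."""
    return {
        name: tuple(r + o for r, o in zip(root_world_pos, offset))
        for name, offset in joints_local.items()
    }

# Example: the suit says the head sits 1.7 m above the root;
# the tracker places the root at (2.0, 0.0, 3.0) in the room.
pose = world_joints((2.0, 0.0, 3.0), {"head": (0.0, 1.7, 0.0)})
# pose["head"] is now (2.0, 1.7, 3.0): a world-space position for the avatar
```

In a real engine integration the same idea is expressed as parenting the animated skeleton under a transform driven by the tracker, but the principle – body shape from inertia, room position from an optical anchor – is the same.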
“It runs like a traditional show in that sense, because each performance varies in timing due to its immersive and improvised nature. Participants can engage or quietly take part, and we need the cues to be equally fluid.”
As a live production, minimising the amount of tech on stage became a necessary part of the setup to avoid device interference. With only one tech week for testing and one week with the performers for preparation, the performers and props needed to be arranged logistically to maximise the effectiveness of the motion capture.
“We have to work with the performers and tell them, ‘When you’re in this position, you need to do this, and move here’, to stop tracking interference. This is modern-day puppetry, and your performer learns how to operate as the puppet.”
Xsens MVN technology heightens the realism of the experience, bringing the digital mother to life through the eyes of the viewer. Inertial motion-capture suits allow for freedom on stage, and the animation follows the actress in real time, bringing the performance closer to the audience. Draw Me Close is heavily inspired by dreamscapes and memory, attempting to replicate these experiences in VR. You might know you’re in a performance beforehand, but with VR and inertial motion capture driving the visuals, does it feel real?
“There are very strong emotional themes in the piece which strike a chord with many people. You start out as a five-year-old boy and progress to 15 and then 25. People had unique experiences with strong personal attachments that we hadn’t anticipated. There’s a point when you’re five and you get tucked into bed by an animated mother. You fall in love within the space of five minutes!”
“Sometimes, people talked about the way the seasons were integrated through sound and visuals; they heard the music and it was already their favourite song, or somebody else had that song sung to them by their real mother. There were so many real-world parallels that the audience drew upon, it brought the performance to life.”
The intuitive usability of the MVN suits aided the production effort, allowing a seamless transition for the performers.
“Through trialling the technology, we learned first-hand the difference between pre-recorded and live motion capture. But that was all part of finding the boundaries and developing the best possible experience.”
“Once the performers get into the suit, they’re natural – the suits don’t get in the way of the performance at all. As soon as they realised that their acting wasn’t going to be inhibited, it became a great opportunity for them to achieve something fresh and exciting.”
As real-time technologies using VR and inertial motion capture continue to improve and converge, the on-stage curtain separating audience and performer will likewise continue to dissipate; viewers will be able to physically experience stories that previously resided in their imagination. Draw Me Close is a great example of modern theatre embracing motion capture to produce stand-out experiences, and Toby is confident that theatre will continue to expand upon this new-found territory.
“It’s less about the creative development and more about the new technical roles and performance models. It was essentially a preview, because it was the first time something like this had been tested. We’re confident that we’ve produced something great and original, and there’s definitely more to come from us!”