Founded by Andy Serkis, The Imaginarium Studios is dedicated to creating believable, emotionally engaging digital characters using performance capture technology. Needless to say, this studio thrives in our current pop culture landscape of superheroes and apocalyptic action.
We sat down with CEO Matthew Brown to discuss virtual production technology, the future of Shakespeare, and of course, Xsens.
Can you tell me a bit about yourself and your role at the Imaginarium Studios?
We're a performance capture studio with a permanent set-up in London. The company was created in 2012 but I took over about a year ago as CEO. I get involved in all aspects of what happens at the studio, from client relationships to the bidding process, buying in tech, and working with a team on developing pipeline workflow. All that good stuff!
The team has worked on projects ranging from Avengers: Age of Ultron to Planet of the Apes. We even did a music video for Coldplay’s Adventure of a Lifetime single, which I think is their most watched video ever.
Right now we're doing a lot of work in virtual and mixed reality. We're also looking into a range of different film and games projects, which we unfortunately can’t talk about right now.
Can you tell me about the average motion capture session at the Imaginarium Studios?
We have two stages here. Our main stage is about 8x10 meters, complete with 60 Vicon cameras for optical body performance capture. We do stunt work on stage, so we're set up to bring in wire rigs and other stunt equipment.
Beyond optical capture, we have a lot of experience with Xsens. We regularly use the suit for testing and for some capture work. It's almost a rite of passage for every team member to try on an Xsens MVN suit. Whether we’re using optical or inertial capture, we always have a bit of a warm-up with our performers. In a motion capture session, there's no single camera that you’re performing to; your movements are being recorded from all angles. So it’s important to get performers used to this 360-degree scenario and the skin-tight suit on top of that.
We have a full virtual production pipeline too, so we can give actors, producers and directors a real-time view corresponding to their CG character. We use Epic's Unreal Engine for the rendering, along with a bunch of custom plugins and tools that we've developed over the years.
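As a purely illustrative sketch of what a real-time link like this involves (this is not The Imaginarium's actual pipeline, and the packet layout below is a made-up example, not Xsens' real streaming protocol): a virtual production setup typically reads pose packets off the network each frame and hands joint transforms to the render engine. The round-trip below shows the serialise/decode step in isolation:

```python
import struct

# Hypothetical wire format (NOT Xsens' real protocol): a frame id and a
# joint count, then per joint a position (x, y, z) and a rotation
# quaternion (w, x, y, z), all little-endian floats.
HEADER = struct.Struct("<IH")
JOINT = struct.Struct("<7f")

def pack_pose(frame_id, joints):
    """Serialise one frame of pose data into a UDP-sized datagram."""
    payload = HEADER.pack(frame_id, len(joints))
    for pos_quat in joints:
        payload += JOINT.pack(*pos_quat)
    return payload

def parse_pose(datagram):
    """Decode a datagram back into (frame_id, [(x, y, z, qw, qx, qy, qz), ...])."""
    frame_id, count = HEADER.unpack_from(datagram, 0)
    joints, offset = [], HEADER.size
    for _ in range(count):
        joints.append(JOINT.unpack_from(datagram, offset))
        offset += JOINT.size
    return frame_id, joints

if __name__ == "__main__":
    # Round-trip one single-joint frame, as a receiver loop would do per packet
    # before forwarding the transforms to the renderer.
    frame = pack_pose(42, [(0.0, 1.6, 0.0, 1.0, 0.0, 0.0, 0.0)])
    frame_id, joints = parse_pose(frame)
    print(frame_id, joints[0][1])  # → 42 1.6
```

In practice the decoded transforms would be pushed into the engine (e.g. via Unreal's Live Link) rather than printed, but the per-packet decode step is the heart of any such pipeline.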
When did you first bring Xsens into your toolset, and why?
We have two complete systems and loads of MVN suits that have been used in lots of different situations at The Imaginarium Studios.
Our most well-known Xsens project was a collaboration with the Royal Shakespeare Company (RSC) on The Tempest. We worked with the actors and creative director to come up with a feasible way for actor Mark Quartley to turn into all the manifestations of Ariel, which involved wearing an Xsens suit!
For most of the play, Ariel is an invisible whisper in the ear of Prospero, but at times Ariel becomes a sea nymph, a harpy, or whatever form he wishes. We worked with Xsens and the RSC to develop a workflow which allowed Mark to drive those digital characters live on stage. The stage crew could control all the parameters of what those avatars would look like. Basically, we delivered everything the crew needed to run Xsens and manifest a range of fantastical avatars on a nightly basis!
I think that keeping Mark on-stage alongside the digital creations was a great idea as the audience got to see him playing each of these avatars. By wearing the Xsens MVN suit underneath his costume, Mark could drive the digital creature’s movements while also enjoying the freedom of movement and flexibility to have fun with his role.
I saw the show three or four times. It's all set within a ship, so you have this intricate structure along the stage. There’s one scene where Mark eavesdrops on a conversation by hanging upside down from the spars in the hull of the ship. The flexibility of Xsens MVN suits allowed him to get creative like that – and the audience loved it!
What are the advantages of Xsens’ MVN suits in particular?
Performers don't have to wear an optical or active marker suit, giving them great range and flexibility for artistic expression. In terms of applications, the suits work incredibly well in a live setting and allow you to achieve real-time results. What’s more, Xsens provides a level of fidelity that's ideal for in-person events, which is extremely important.
What would you say are the advantages of an on-body system, compared to optical?
The main advantage of Xsens’ on-body system is that you can take it out in a car park, or to any location, and swiftly have one or perhaps two performers streaming data. There are pros and cons to both technologies, however. When it comes to selecting your toolkit for a particular project, we always look at the story our client wants to tell and then find the best method to tell it. That may be inertial or optical.
What's next for The Imaginarium and Xsens?
We’re looking at how inertial capture can fit into our optical workflow, because there is a lot of potential in combining both technologies. Sometimes it just doesn't make sense to haul our optical system and its collection of cameras to an external set, or to invest in the larger crew that setup requires. It's ideal to use Xsens’ inertial system in those situations. For The Imaginarium Studios, it's all about finding more opportunities to do so.
Would you like to know more about the Xsens motion capture solution?
You can request a demonstration here: