Dreamscreen Australia is a Melbourne-based production company specializing in real-time LED virtual production. We spoke with CEO and co-founder Clayton Jacobson about the studio’s conception, its future, and how Xsens has played an integral role.
Clayton has been in the TV and film business for over thirty years, starting his career working on music videos for acts like U2 and Crowded House. From there, he moved on to more commercial work, turning his attention to short film and feature film productions.
COVID-19 restrictions forced Clayton to put a new feature film project on hold, but the lockdown offered a silver lining. With Xsens motion capture technology now matching his production standards, Clayton was able to pursue a vision three years in the making: setting up a specialist LED virtual production studio, Dreamscreen. To achieve this, he collaborated with local and international industry partners such as Tracklab (the Xsens partner for Australia), Unreal Engine, and NVIDIA.
Clayton shared, “When VR exploded in 2016 I saw the potential of Dreamscreen straight away and had been keenly waiting for the gaming environments to improve to a more photoreal standard. As technology catches up, we see Xsens as a crucial and integral part of our arsenal of tools.”
To bring the vision to life, Clayton pulled together a team of experts to test the real-time virtual production of augmented environments, with excellent results.
Take a look at the test showreel:
Real-time virtual production
By enabling directors to place live-action talent into a real-time virtual world without relying on post-production, Dreamscreen sets itself apart from the average production studio.
"In traditional virtual production, motion capture can be a great way to quickly capture baseline animation. But for real-time LED virtual production, you have to find ways to blend the virtual with the physical without the ability to fall back on post-production to fix or alter the movement. Xsens enables us to capture very naturalistic movements in-camera, in real time, which tricks the eye into thinking the background is in fact real," Clayton explains. He adds, "The results Xsens provides are very robust and satisfying, and were crucial to us completing effective in-camera final pixels. To ignore such technology would be foolish."
Xsens within the workflow
Dreamscreen utilizes Xsens technology as part of its performance capture pipeline. The team processes the on-set camera motion data with Unreal Engine in real time to create the artificial environment parallax shifts that one would expect to see on a live-action location shoot.
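The parallax shift described above can be illustrated with some simple pinhole-camera arithmetic. This is a schematic sketch, not Dreamscreen's actual Unreal setup: for a lateral camera move, a point at depth z appears to shift on the sensor by roughly focal_length × move / z, so near objects slide far more than distant ones, which is the cue that sells an LED wall as a real location.

```python
def screen_shift(focal_length_mm: float, camera_dx_m: float, depth_m: float) -> float:
    """Approximate horizontal image shift (mm on the sensor) of a point at
    `depth_m` metres when the camera translates laterally by `camera_dx_m`
    metres. Simplified pinhole model; real engines re-project the full 3D set."""
    return focal_length_mm * camera_dx_m / depth_m

# A 0.5 m dolly move seen through a 35 mm lens:
near = screen_shift(35.0, 0.5, 2.0)    # a prop 2 m from camera
far = screen_shift(35.0, 0.5, 50.0)    # a backdrop element 50 m away
# The near object shifts 25x more than the far one; rendering that
# depth-dependent difference live on the wall is what the tracked
# camera data makes possible.
```

The engine does this re-projection for the entire virtual environment every frame, driven by the tracked camera position.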
"When using LED virtual production, we can stream live Xsens data into the Unreal Engine for real time interaction and rigging with a 3D character/avatar directly on screen. We can also combine the body capture with facial data to deliver a complete performance."
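The live streaming Clayton describes can be sketched as a small UDP listener. This is an illustrative sketch, not Dreamscreen's pipeline: it assumes MVN Animate's network streamer is broadcasting datagrams that begin with a six-character ASCII message ID (such as "MXTP02" for pose data) followed by a big-endian sample counter, and that 9763 is the default port; verify the exact header layout and port against the current MVN real-time streaming protocol documentation before relying on them.

```python
import socket
import struct

def parse_header(datagram: bytes) -> dict:
    """Pull the message ID and sample counter from a streamed datagram.
    Assumed layout: 6-byte ASCII ID, then a 4-byte big-endian counter."""
    msg_id = datagram[:6].decode("ascii")
    sample_counter = struct.unpack_from(">I", datagram, 6)[0]
    return {"id": msg_id, "sample": sample_counter}

def listen(port: int = 9763) -> None:
    """Receive live mocap datagrams and hand them to the engine side."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.bind(("", port))
    while True:
        data, _addr = sock.recvfrom(4096)
        header = parse_header(data)
        # The remainder of the datagram carries the per-segment pose data;
        # this is where an engine plugin would update the character rig.
        print(header)
```

In practice a studio would use the vendor-supplied engine plugin (e.g. Live Link in Unreal Engine) rather than hand-rolled parsing, but the data flow is the same: sensor suit, network stream, real-time character rig.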
Dreamscreen’s particular approach to virtual production is already picking up a lot of attention, both locally and globally. The studio hopes to work on bigger and increasingly ambitious projects as it refines its workflows and continues to grow. Clayton looks forward to utilizing Xsens for future projects.
“The marriage of real-time performance interacting with live avatars opens up enormous possibilities for creative production, and the technology we use for this has reached a level that provides data or assets at the standard you would expect from a professional production. As this will only continue to improve, the future for us is very exciting!”
Find out more about Dreamscreen at their website.
Xsens MVN Animate
Dreamscreen used Xsens MVN Animate as their solution. It enables you to mocap anywhere, at any time. Want to know what we can do for you? Get in touch!
Xsens Motion Capture Data Files
Are you actively looking for a motion capture system and want to compare data? Download Xsens motion capture data files and see the quality of our data for yourself.