The Perfect Virtual YouTuber Setup

Virtual YouTubers (or Vtubers) have taken the internet by storm. The Virtual YouTuber is here to stay after becoming a trend in Japan at the beginning of 2018. Many of them have a big following, some with millions of subscribers, which makes them very interesting for merchandise such as Nendoroids, official promotional material, and collaborations with other YouTubers.

But what is the technology behind these virtual anime girls?

The setup of a Virtual YouTuber mostly involves facial recognition, gesture recognition and animation software, and combining these technologies can be tricky. The best-known mishap with this technology is the accidental reveal of Noracat's true identity during a live broadcast.

Well-known companies in the Virtual YouTuber space, like Cygames and CAPTUREROID, use this typical Vtuber setup. In this post we describe their Perfect Virtual YouTuber setup.

One of the most famous Vtubers at the moment is Kizuna Ai, with millions of subscribers to her channel. She is also a spokeswoman for the Japan National Tourism Organization. English-speaking Vtubers like Codemiko are also coming up.


The Perfect Virtual YouTuber Setup:

1. 3D Avatar

First of all, you need a Vtuber avatar, and this sounds easier than it actually is. A full-body avatar needs to act and move naturally, and making a unique avatar is not easy: it needs to be fully 'rigged' before it can move in a natural way. The easiest way to start is to download a model from sites like TurboSquid, Sketchfab or CGTrader.
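
To give a feel for what 'rigged' means: the mesh is bound to a hierarchy of joints (bones), and every joint the tracking data drives has to exist in that skeleton. As a small illustration, here is a Python sketch that prints the joint tree from a BVH file, one of the simplest text formats that stores a skeleton; the file name is a placeholder, and downloaded models will more often ship as FBX or glTF.

```python
# Minimal sketch: list the joint hierarchy from a BVH file's HIERARCHY
# section, standard library only. "avatar.bvh" is a hypothetical name.

def print_bvh_skeleton(path: str) -> None:
    depth = 0
    with open(path) as f:
        for line in f:
            token = line.strip()
            if token.startswith("MOTION"):
                break                                    # animation frames start here
            if token.startswith(("ROOT", "JOINT")):
                print("  " * depth + token.split()[-1])  # joint name
            elif token.startswith("End Site"):
                print("  " * depth + "(end)")
            elif token == "{":
                depth += 1
            elif token == "}":
                depth -= 1

print_bvh_skeleton("avatar.bvh")   # e.g. Hips, Spine, Neck, Head, ...
```

A full-body Vtuber rig typically needs at least the standard humanoid joint set (hips, spine, head, arms, legs) so the mocap data has somewhere to land.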

2. 3D animation software

To pull it all together, you need 3D animation software such as Unreal Engine, Unity 3D or iClone. It is easy to stream Xsens' motion capture data into all major 3D animation software packages. An overview of the Xsens integrations can be found here.
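
Under the hood, 'streaming' usually means the mocap software sends pose packets over the network and a plugin inside the engine picks them up. As a hedged sketch (not Xsens' official API): the MVN network streamer sends UDP datagrams whose payload starts with a six-byte ASCII ID such as "MXTP02", and the listener below simply prints those IDs. Port 9763 is, to my knowledge, the MVN default, but verify both the port and the header layout against the MVN manual for your version.

```python
# Hedged sketch: listen for MVN network-streamer UDP datagrams and print
# the leading 6-byte message ID of each packet. Port and header layout
# are assumptions to verify against the Xsens documentation.

import socket

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.bind(("0.0.0.0", 9763))            # assumed default MVN streaming port

while True:
    packet, sender = sock.recvfrom(4096)
    msg_id = packet[:6].decode("ascii", errors="replace")
    print(f"{sender[0]} -> {msg_id} ({len(packet)} bytes)")
```

In practice you would not write this yourself, since the Unity and Unreal plugins handle it for you; the sketch is only meant to demystify what travels over the wire.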

3. The full-body Xsens motion capture system

In this setup you see the full-body motion capture system from Xsens, including the 'MVN Link' hardware (suit). The live motion capture data can be streamed into Unity using Xsens' MVN Animate software, giving you the best live-quality data you can get.
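
If you are curious what a live pose packet contains: for the quaternion message type, each body segment is encoded as a segment ID plus a 3D position and an orientation quaternion. The sketch below unpacks the first segment of such a datagram; the 24-byte header size, big-endian byte order and field layout are recollections of the published streaming protocol, so treat every constant here as an assumption to double-check.

```python
# Hedged sketch: unpack the first segment of an assumed "MXTP02"
# (quaternion pose) datagram. All offsets and the byte order are
# assumptions; check them against the MVN streaming protocol docs.

import struct

def first_segment(packet: bytes):
    body = packet[24:]                      # skip the assumed 24-byte header
    seg_id, px, py, pz, qw, qx, qy, qz = struct.unpack(">i7f", body[:32])
    return seg_id, (px, py, pz), (qw, qx, qy, qz)
```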


4. An iPhone X with face recognition software attached to a head mount

There are several online resources explaining how to get facial capture data into Unity, iClone or Maya.
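
Most of these pipelines work the same way: an app on the phone reads the ARKit blendshape weights (values from 0 to 1 for shapes like "jawOpen" or "eyeBlinkLeft") and forwards them over the network. The sketch below assumes a hypothetical app that sends one JSON object per UDP packet; real apps such as Live Link Face each define their own wire format, so adapt the decoding to whichever app you pick.

```python
# Hedged sketch: receive face blendshape weights as JSON over UDP from a
# hypothetical phone app. The port and the JSON format are made up for
# illustration; real capture apps define their own protocols.

import json
import socket

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.bind(("0.0.0.0", 11111))               # arbitrary example port

while True:
    packet, _ = sock.recvfrom(65535)
    weights = json.loads(packet)            # e.g. {"jawOpen": 0.42, ...}
    # Drive the avatar's face rig with these values, e.g. morph targets.
    print(weights.get("jawOpen", 0.0))
```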

5. Gloves

The ManusVR finger tracking data is integrated into MVN Animate and can be streamed into Unity or Unreal. The same goes for the StretchSense gloves.
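
Finger tracking usually arrives as normalized curl values per finger rather than raw joint angles, and it is up to the retargeting step to spread each curl across the finger's joints. Purely as an illustration (this is not the Manus or StretchSense API), a simple mapping might look like this:

```python
# Illustrative only: spread a normalized finger curl (0.0 = open,
# 1.0 = fist) across a three-joint finger chain, bending the knuckle most.
# Weights and the 90-degree maximum are made-up tuning values.

JOINT_WEIGHTS = (1.0, 0.8, 0.6)     # proximal, intermediate, distal
MAX_BEND_DEG = 90.0

def finger_angles(curl: float) -> tuple[float, ...]:
    curl = max(0.0, min(1.0, curl))
    return tuple(curl * w * MAX_BEND_DEG for w in JOINT_WEIGHTS)

print(finger_angles(0.5))   # -> (45.0, 36.0, 27.0)
```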


There is of course more technology available to get your YouTuber avatar live on screen, and the Vtuber market is developing rapidly. Xsens motion capture has been a proven technology for many years and has a long track record in live and streaming animation.

Xsens has many Vtuber users; to give you an example of what they do, check out these stories:

Codemiko setup with Live Link in Unreal Engine

Codemiko is a Vtuber mostly active on Twitch; go check her out here. She has shared many videos showing her full tech setup.


Kite & Lightning setup with an iPhone X, Unreal Engine, IKINEMA and Xsens

Kite & Lightning's live full-body performance setup uses Xsens mocap in tandem with an iPhone X, live-streamed via IKINEMA LiveAction to Unreal Engine in real time.


One Piece Vtuber

One Piece voice actors Mayumi Tanaka and Kappei Yamaguchi wear the Xsens MVN motion capture system to perform a live Vtuber show as their characters Luffy and Usopp.


More information about Cygames and CAPTUREROID can be found in the special Vtuber edition of CGWorld.jp.