The genesis of the VR Investors Day – Avatars

For the Virtual Reality Investors Day hosted by Montega and Donner & Reuschel on 8 June 2021, six presenters will pitch to investors in a virtual environment. Montega handled the planning and the concept in close cooperation with our developer team to achieve the best possible result and to make the virtual Investors Day a unique experience.

The six presenters from freenet, home24, wallstreet:online, va-Q-tec, q.beyond and NeXR, as well as the host, Pierre Groening of Montega, were scanned in our 3D Instagraph Pro body scanner. The Instagraph Pro is our powerful body scanner designed for creating lifelike avatars for games, VR and AR applications; unlike the 3D Instagraph Fusion III, it does not measure body metrics.

The scanner takes more than 100 photos of the scanned person, which our AI then assembles into an avatar.
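As an illustration of this photos-to-mesh step, the sketch below uses Apple's RealityKit Object Capture API, since our own reconstruction pipeline is not public; the folder paths and settings are assumptions, not the scanner's actual configuration.

```swift
import RealityKit  // Object Capture photogrammetry (macOS 12+), used here only as an illustration

// Assumed input/output locations for one scanning session.
let photos = URL(fileURLWithPath: "/scans/session_001/photos")
let avatar = URL(fileURLWithPath: "/scans/session_001/avatar.usdz")

do {
    var config = PhotogrammetrySession.Configuration()
    config.featureSensitivity = .high   // preserve fine detail such as skin and fabric

    // Feed the ~100 photos to the reconstruction and request a full-detail textured mesh.
    let session = try PhotogrammetrySession(input: photos, configuration: config)
    try session.process(requests: [.modelFile(url: avatar, detail: .full)])
    // Progress updates and the finished model arrive asynchronously via session.outputs.
} catch {
    print("Reconstruction failed: \(error)")
}
```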

Postprocessing of the avatars

Depending on how well our AI has assembled the avatars, our 3D artists may need to rework or color correct some areas to achieve an accurate result.

Through our work on NeXR Fashion, we achieve very good results when digitally changing an avatar’s clothing. For this reason, we let people be scanned in comfortable clothes and change the avatar’s outfit after completion, so presenters can receive suits in different cuts and colors, or even ties, shoes and accessories.

Preparing the avatars for the event

The avatars do not need to be pre-animated for the VR Investors Day, as the presenters animate their avatars themselves during the event using motion capture technology.

However, to bring the avatars to life, they need a skeleton so that the software knows exactly where the joints and bones are located on the avatar. This is called rigging.
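As a minimal sketch of what such a rig stores, the snippet below models a named joint hierarchy with rest-pose transforms; the joint names and structure are illustrative, not the actual avatar rigs used for the event.

```swift
import simd

// One joint in the rig: a name, a parent, and its bind ("rest") pose relative to that parent.
struct Joint {
    let name: String                 // e.g. "lowerarm_l" for the left elbow region
    let parent: Int?                 // index of the parent joint, nil for the root
    var restTransform: simd_float4x4 // placement in the bind pose
}

// A tiny excerpt of a humanoid hierarchy; a production rig has dozens of joints,
// including one per finger segment.
let skeleton: [Joint] = [
    Joint(name: "pelvis",     parent: nil, restTransform: matrix_identity_float4x4),
    Joint(name: "spine_01",   parent: 0,   restTransform: matrix_identity_float4x4),
    Joint(name: "clavicle_l", parent: 1,   restTransform: matrix_identity_float4x4),
    Joint(name: "upperarm_l", parent: 2,   restTransform: matrix_identity_float4x4),
    Joint(name: "lowerarm_l", parent: 3,   restTransform: matrix_identity_float4x4),
    Joint(name: "hand_l",     parent: 4,   restTransform: matrix_identity_float4x4),
]
```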

Points are placed at the various joints, such as the elbows, shoulder blades and knees, but also at the individual finger joints. On the motion capture suit, reflectors sit exactly where the avatar’s joints are marked, so the software recognizes exactly how the person is moving and transfers those movements to the avatar.
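The sketch below illustrates that marker-to-joint idea using the same illustrative naming as the rig above: each suit reflector is associated with one rig joint, and the tracked rotation for that joint is copied onto the avatar every frame. It is a simplified stand-in, not the studio’s actual solver.

```swift
import simd

// Which suit reflector drives which joint of the (illustrative) skeleton.
let markerToJoint: [String: String] = [
    "MARKER_SHOULDER_L": "clavicle_l",
    "MARKER_ELBOW_L":    "lowerarm_l",
    "MARKER_HAND_L":     "hand_l",
]

// Apply one captured frame: the tracked per-marker rotations are written
// onto the avatar's pose, keyed by joint name.
func applyFrame(_ trackedRotations: [String: simd_quatf],
                to pose: inout [String: simd_quatf]) {
    for (marker, rotation) in trackedRotations {
        guard let joint = markerToJoint[marker] else { continue } // ignore unmapped markers
        pose[joint] = rotation  // transfer the performer's movement to the avatar
    }
}
```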

For the facial expressions, our motion capture studio works with Apple’s ARKit, which delivers fast results of very good quality for this project. The software runs on an iPhone mounted in front of the presenter’s face. The camera recognizes the face and divides it into polygons; this mesh allows the avatar to reproduce exactly the facial expression the person is making at that moment. The same technology is used on the iPhone to create Animojis (animated emojis), so anyone with an iPhone X or newer can try it out and lend their facial expressions to animals or even their own configured avatar.
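In ARKit terms, the TrueDepth camera delivers a face mesh plus a set of named blend-shape weights every frame; forwarding those weights to the avatar’s face rig is what drives its expressions. The sketch below shows the tracking side; the forwarding function is a hypothetical stand-in, not our production code.

```swift
import ARKit

final class FaceCaptureDelegate: NSObject, ARSessionDelegate {
    func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
        for case let faceAnchor as ARFaceAnchor in anchors {
            // ARKit reports 50+ coefficients in 0...1, e.g. how far the jaw
            // is open or how closed the left eye is.
            let jawOpen   = faceAnchor.blendShapes[.jawOpen]?.floatValue ?? 0
            let blinkLeft = faceAnchor.blendShapes[.eyeBlinkLeft]?.floatValue ?? 0
            forwardToAvatarFaceRig(jawOpen: jawOpen, blinkLeft: blinkLeft)
        }
    }

    // Hypothetical hand-off to the avatar's face rig / streaming layer.
    func forwardToAvatarFaceRig(jawOpen: Float, blinkLeft: Float) { /* ... */ }
}

// Requires an iPhone with a TrueDepth camera (iPhone X or newer).
let delegate = FaceCaptureDelegate()
let session = ARSession()
session.delegate = delegate
session.run(ARFaceTrackingConfiguration())
```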

This animation of facial expressions and gestures is streamed in real time into the virtual environment via the Unreal Engine and from there to the various channels: Oculus Quest 2 (VR) and Twitch (2D).
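To give an idea of what such a real-time stream carries per frame, the sketch below serializes a timestamp, the body pose and the face weights into one packet. The field names and transport are assumptions; Unreal’s Live Link is a common way to ingest a stream like this, but the actual setup for the event is not shown here.

```swift
import Foundation

// One frame of animation data on its way to the engine.
struct AnimationFrame: Codable {
    let timestamp: TimeInterval
    let jointRotations: [String: [Float]] // joint name -> quaternion (x, y, z, w)
    let blendShapes: [String: Float]      // ARKit weight name -> 0...1
}

let frame = AnimationFrame(
    timestamp: Date().timeIntervalSince1970,
    jointRotations: ["lowerarm_l": [0, 0, 0.38, 0.92]],
    blendShapes: ["jawOpen": 0.35, "eyeBlinkLeft": 0.0]
)

// Encode the frame; in practice such a payload is pushed over the network
// to the rendering machine dozens of times per second.
let payload = try? JSONEncoder().encode(frame)
```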

For more info on this event, as well as ways to participate in it virtually, click here.

In the upcoming blog post, you will learn how our virtual environments are created and what possibilities they offer for presenting and interacting.