09 Dec 2016

FaceRig Studio is now Live and more!

Awesome People,

We are proud to announce that FaceRig Studio is now live!
This year has been amazing and challenging for us, and we have so much more planned for next year.
Here are the highlights:

2016 was about:

  • Extra tracking options for hands
  • Mobile FaceRig (Android and iOS)
  • FaceRig being used for far more than online funsies (health, theatrical plays, live events and more)
  • Amazing user-made Live2D avatars on the Steam Workshop
  • FaceRig Studio 1.0 going live

2017 will be about:

  • FaceRig Studio improvement: Time-Line Editor
  • FaceRig Studio improvement: full-body tracking with Perception Neuron
  • Migrating our efforts to Unity for easier 3D custom avatar import

And now about the details:

This past year we focused on a lot of FaceRig improvements for streamers and YouTubers, such as adding new optional tracking methods for hands (LeapMotion, Intel® RealSense™). We also worked hard to get FaceRig running on mobile (available on Android and iOS), and that version runs on the Unity engine (more about that later).
FaceRig did more than generate laughs and giggles: it was featured at the Live2D Alive 2016 conference in Japan this summer, and it placed in the top five at the worldwide finals of the Creative Business Cup in Copenhagen, Denmark this November.
It has also been used in innovative ways at various live events. For instance, Lord Burgor starred as a virtual actor in an actual theatrical play in Poland (depicting a computer virus, of all things :) ). It is even being used in health: we have been told that FaceRig avatars are being used to reach out to children with autism, who open up and talk to avatars but do not talk to humans, which is, we think, absolutely amazing. The FaceRig Live2D module was also featured at Tokyo Comic-con by the Kyoto Institute of Design, and the list could go on.

Now the time has arrived to turn our focus to more complex creation efforts, namely FaceRig Studio and its motion capture capabilities. FaceRig Studio 1.0 is now live, and its key differentiator is FBX animation export from a recorded performance, so you can actually use it to bring facial motion capture data into other software (Max, Maya, Blender, etc.) with just a laptop and a webcam.
Unlike FaceRig Classic and Pro, FaceRig Studio output (video or animation) has no commercial usage limitations: you can use it in any kind of project, commercial or not, online or offline, broadcasting, medical, educational, cultural, marketing and so on. More info on that on the Store page here. Naturally, all the backers who contributed at a Studio tier will receive licenses for this standalone version by e-mail.
The FBX animation export is just the start of the FaceRig Studio-centric features; we have more in the works, but they are not yet polished enough for release. Here are the highlights of what will be available in January/February 2017.

– An animation time-line editor that will allow you to adjust and clean up recorded performances, and even mix them together, before exporting them as an FBX or as a video. At first you will be able to tweak the avatar's animation data on this time-line, but we will then extend its functionality to include lighting parameters, post-processing parameters, camera motion parameters (position, target, field of view, depth of field) and so on, to enable you to create far more polished results.

Time-Line Editor

– We are working hard to add full-body support for the excellent and inexpensive mo-cap suit Perception Neuron. So far, the kind of performances that can be captured in real time in FaceRig by combining a head-mounted webcam for expression tracking with the Perception Neuron suit for body tracking are nothing short of amazing. The level of character embodiment felt by the actor driving such an avatar is unlike anything we have experienced so far. Take a look here.

– We are starting with FBX animation export at 30 Hz, but we are also working on Collada export, which will support animation capture and export at 60 Hz.
After the above are done, we are considering ways to streamline the 3D avatar import process (we are committed to making the import process simpler, and to sorting out support for avatars that use blend-shapes for expressions instead of bone-driven motion). This should make FaceRig compatible with a plethora of humanoid creation software out there. One way of achieving this is migrating the underlying engine to Unity. That will probably come later though, around mid-2017.
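For readers curious what doubling a capture rate involves, here is a minimal generic sketch that upsamples a 30 Hz animation curve to 60 Hz by linear interpolation. The function name and sample data are purely illustrative; this is not FaceRig's exporter code, which may resample differently.

```python
# Sketch: double a 30 Hz sample sequence to 60 Hz by inserting the
# linear midpoint between each pair of consecutive samples.
# Illustrative only -- not FaceRig's actual export pipeline.

def upsample_2x(samples):
    """Return the sequence with a midpoint inserted between neighbors."""
    out = []
    for a, b in zip(samples, samples[1:]):
        out.append(a)
        out.append((a + b) / 2.0)  # the new in-between 60 Hz tick
    out.append(samples[-1])       # keep the final keyframe as-is
    return out

# A tiny made-up curve (e.g. one blend-shape weight over three frames).
curve_30hz = [0.0, 1.0, 0.5]
print(upsample_2x(curve_30hz))  # → [0.0, 0.5, 1.0, 0.75, 0.5]
```

A real exporter would interpolate rotations differently (e.g. quaternion slerp), but the resampling idea is the same.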
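For those unfamiliar with the term above: a blend-shape (morph-target) avatar stores whole target meshes and mixes them by weights, instead of rotating a bone skeleton. A minimal sketch of the idea, with hypothetical names and data that are not FaceRig's actual format:

```python
# Sketch of blend-shape ("morph target") deformation: each shape
# contributes its weighted delta from the rest pose. All names and
# vertex data below are made up for illustration.

def apply_blend_shapes(base, shapes, weights):
    """Offset each rest-pose vertex by the weighted sum of shape deltas.

    base    -- list of (x, y, z) rest-pose vertices
    shapes  -- dict: shape name -> list of (x, y, z) target vertices
    weights -- dict: shape name -> weight, typically in [0.0, 1.0]
    """
    result = []
    for i, (bx, by, bz) in enumerate(base):
        dx = dy = dz = 0.0
        for name, target in shapes.items():
            w = weights.get(name, 0.0)
            tx, ty, tz = target[i]
            dx += w * (tx - bx)
            dy += w * (ty - by)
            dz += w * (tz - bz)
        result.append((bx + dx, by + dy, bz + dz))
    return result

# Toy example: one vertex, one hypothetical "smile" shape at half strength.
base = [(0.0, 0.0, 0.0)]
shapes = {"smile": [(0.0, 2.0, 0.0)]}
print(apply_blend_shapes(base, shapes, {"smile": 0.5}))  # → [(0.0, 1.0, 0.0)]
```

Supporting this alongside bone-driven motion means mapping tracked expressions to shape weights rather than to joint rotations.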
You can find the best practices for FaceRig Studio mo-cap by clicking here, and some sample data to test out the FBX export by clicking here.
We will also update the FAQ section on the website with FaceRig Studio information.
That’s about it. We’re always available if you have more questions, either on social channels or at info[at]facerig[dot]com.

Best Regards,

FaceRig Team