We are proud to announce that FaceRig Studio is now live!
This year has been amazing and challenging for us, and we have so much more planned for next year.
Here are the highlights:
2016 was about:
- Extra tracking options for hands
- Mobile FaceRig (Android and iOS)
- FaceRig being used for far more than online funsies (health, theatrical plays, live events and more)
- Amazing user-made Live2D avatars on the Steam Workshop
- FaceRig Studio 1.0 going live
2017 will be about:
- FaceRig Studio improvement: a timeline editor
- FaceRig Studio improvement: full-body tracking with Perception Neuron
- Migrating our efforts to Unity for easier 3D custom avatar import.
And now about the details:
This past year we focused on a lot of FaceRig improvements for streamers and YouTubers, such as adding new optional tracking methods for hands (Leap Motion, Intel® RealSense™). We also worked hard to get FaceRig running on mobile (available on Android and iOS); that version runs on the Unity Engine (more about that later).
FaceRig did more than generate laughs and giggles: it was featured at the Live2D Alive 2016 conference in Japan this summer, and it placed in the top five at the worldwide finals of the Creative Business Cup in Copenhagen, Denmark, this November.
It has also been used in innovative ways at various live events; for instance, Lord Burgor starred as a virtual actor in an actual theatrical play in Poland (depicting a computer virus, of all things :) ). It is even being used in healthcare: we've been told that FaceRig avatars are being used to reach out to children with autism, who open up and talk to avatars even when they do not talk to humans, which we think is absolutely amazing. The FaceRig Live2D module was also featured at Tokyo Comic-Con by the Kyoto Institute of Design, and the list could go on.
Now the time has arrived to turn our focus to more complex creation efforts, namely FaceRig Studio and its motion capture capabilities. FaceRig Studio 1.0 is now live; its key differentiator is FBX animation export from a recorded performance, so you can actually use it to bring facial motion capture data into other software (Max, Maya, Blender, etc.) with just a laptop and a webcam.
Unlike FaceRig Classic and Pro, FaceRig Studio output (video or animation) will have no commercial usage limitations: you can use it in any kind of project, commercial or not, online or offline (broadcasting, medical, educational, cultural, marketing, etc.). More info on that on the Store page here. Naturally, all the backers who contributed at a Studio tier will receive licenses for this standalone version by e-mail.
The FBX animation export is just the start of the FaceRig Studio-centric features; we have more of them in the works, but they are not polished enough for release yet. Here are the highlights of what will be available in January/February 2017:
- An animation timeline editor that will allow you to adjust and clean up recorded performances, and even mix them together, before exporting as an FBX file or a video. At first you will be able to tweak the avatar's animation data on this timeline, but we will then extend its functionality to include lighting parameters, post-processing parameters, camera motion parameters (position, target, field of view, depth of field) and so on, enabling you to create far more polished results.