Mocap Fusion [ VR ]

Motion Capture Fusion [VR] is an immersive roomscale mocap sandbox for artists and animators who wish to create and export motion capture animations, or create live content, using conventional VR hardware. With as little as a single VR HMD and two controllers, users may create mocap on their own avatars. Advanced users may create more detailed motion capture, including full body tracking, and can combine additional sensors (e.g. the Apple iPhone TrueDepth sensor and Oculus Quest 2 optical finger tracking) to connect avatars to simultaneous inputs. This fusion of multiple sensors can combine many layers of motion capture in a single take, including full body tracking, face capture, lipsync, gaze tracking, and optical finger tracking.

Highlights

The term “getting into character” applies almost literally here: users connect themselves to an avatar as completely as possible and see themselves in a (VR) mirror while acting out a script.

Users may include their own completely custom characters (avatars) and use the same avatar throughout the production workflow. This eliminates the need for retargeting and ensures the mocap data always fits 1:1 without causing any offsets in the final results.

One of the unique features of Mocap Fusion is its ability to export motion capture data and reconstruct the scene in Blender, making it available for final rendering in minutes.
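The exported mocap data mentioned later on this page uses the .bvh (Biovision Hierarchy) format, which is plain text and is why tools like Blender and Daz 3D can ingest it directly. As a rough illustration of the format only (the joint names and frame data below are made up for the example, not actual Mocap Fusion output), a minimal Python sketch that extracts joint names and frame timing from a BVH file might look like:

```python
# Minimal sketch of reading a BVH (Biovision Hierarchy) mocap file.
# The sample hierarchy below is illustrative; a real export will contain
# a full avatar skeleton and many more frames.

def parse_bvh(text):
    """Return (joint_names, frame_count, frame_time) from BVH text."""
    joints = []
    frame_count = 0
    frame_time = 0.0
    for line in text.splitlines():
        line = line.strip()
        if line.startswith(("ROOT", "JOINT")):
            joints.append(line.split()[1])   # joint name follows the keyword
        elif line.startswith("Frames:"):
            frame_count = int(line.split()[1])
        elif line.startswith("Frame Time:"):
            frame_time = float(line.split()[2])
    return joints, frame_count, frame_time

sample = """HIERARCHY
ROOT Hips
{
  OFFSET 0 0 0
  CHANNELS 6 Xposition Yposition Zposition Zrotation Xrotation Yrotation
  JOINT Spine
  {
    OFFSET 0 10 0
    CHANNELS 3 Zrotation Xrotation Yrotation
    End Site
    {
      OFFSET 0 5 0
    }
  }
}
MOTION
Frames: 2
Frame Time: 0.0333333
0 0 0 0 0 0 0 0 0
0 1 0 0 0 0 5 0 0
"""

joints, frames, dt = parse_bvh(sample)
print(joints, frames, dt)  # ['Hips', 'Spine'] 2 0.0333333
```

In a BVH file the HIERARCHY section defines the skeleton (joint offsets and animated channels) and the MOTION section is one row of channel values per frame, which is what makes 1:1 retarget-free transfer between tools straightforward.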

Compatible Headsets (VR HMDs)

  • Valve Index.

  • HTC Vive (and Vive Pro Eye).

  • Oculus Quest (1 and 2).

Optional Tracking Hardware

  • SteamVR Vive trackers.

  • iPhone TrueDepth sensor (face capture and eye tracking).

  • Oculus Quest 2 (full optical finger tracking).

Capabilities

  • Export mocap and create scenes in Blender™ instantly.

  • Full body tracking with HTC™ Vive Trackers (up to 11 optional tracking points).

  • Ability to record, play back, pause, slow-mo, and scrub mocap in VR.

  • Customizable IK profiles and avatar parameters.

  • SteamVR Knuckles support for individual finger articulation.

  • Quest 2 optical finger tracking app for individual finger articulation and finger separation.

  • Vive Pro Eye blink and gaze tracking support.

  • Sidekick iOS face capture app (TrueDepth markerless AR facial tracking).

  • User-customizable worlds, avatars, and props may be built for mocap using the APS_SDK.

  • Compatible with existing Unity3D™ avatars and environments.

  • Supports custom shaders on mocap avatars.

  • DynamicBone support for adding hair, clothing and body physics simulation to avatars.

  • Breathing simulation for added chest animation.

  • Add/record/export VR cameras for realistic camera mocap (e.g. VR cameraman effect).

  • Optimization for exporting mocap (.bvh) data to Daz 3D.

  • Placement of “streaming” cameras for livestreaming avatars to OBS or as desktop overlays.

  • Microphone audio recording with lip-sync visemes and recordable jaw bone rotation.

  • Storyboard mode, save mocap experiences as pages for replaying or editing later.

  • Animatic video player, display stories and scripts, choreograph movement.

  • Dual-handed weapon IK solvers for natural handling of carbines.

  • Recordable VTOL platform for animating helicopter flight simulation (e.g. news choppers).

  • VR Camcorders and VR selfie cams may be rigidly linked to trackers.

  • VR props and firearms may be rigidly linked to trackers.

  • Ghost curves for visualizing the future locations of multiple avatars in a scene.

Gameplay

The experience depends on the user’s PC and the tracking hardware used. The recommended SteamVR headsets are the Valve Index or the HTC Vive; a Quest HMD may also produce reasonable results. It is also possible to use the software without an HMD (e.g. when livestreaming). Full body tracking is only available when using feet and hip trackers (plus optional elbow, knee, and chest trackers). Users may achieve more realistic tracking results when using body trackers, but body trackers are optional and standing mocap is supported. Further realism may be achieved on compatible avatars by also enabling face capture or using a Vive Pro Eye for gaze and blink tracking.

History

Originally, the software was designed as an intuitive way for users to create virtual training videos and presentations in an immersive VR environment for added realism, and then export their animations for rendering. The project was made available to a community for beta testing and has since received feedback and many feature requests, which have helped extend the utility of the software for a variety of different creators.


Read More: Best Free to Play Animation & Modeling Games.


Mocap Fusion [ VR ] on Steam

Fuse

NOTE: This is a review for version 1.0 of FUSE (though I’ve been toying with it since its initial release)

TO POTENTIAL BUYERS:

In truth, this product is a mixed bag of nuts. Mixamo is trying to do something truly great here, and if they adjust their approach to customer management and marketing with this product, they just might build a very powerful indie-focused tool for quickly building out a cast of characters on a fairly reasonable budget. There is certainly a market for this, and with enough customer buy-in they could become an industry staple in this regard. With that said, the tool isn’t there just yet.

Real player with 248.2 hrs in game



First of all - almost all the negative reviews I see here are wildly misinformed! You should know what you’re getting with this software (unlimited free character models with textures - and 2 free rigs a week) - and it is an ENORMOUS time and money saver for those who need custom character models on a budget.

Let me clear a couple things up:

1. You can ABSOLUTELY export your characters as OBJ files and load them in anywhere

2. If you want to rig/skin your characters for animation, you upload them to Mixamo - and get 2 free per week.

Real player with 236.8 hrs in game

Fuse on Steam