Autorigging

Part 1 - Skeletonization

Pitch: Develop a way to build a skeleton and rig a mesh with no skeleton or rig.

Design Problem: During the initial rounds of the AutoRigging investigation, the goal was to let users upload any (bipedal) character they wanted. Since our existing avatars were built around a very specific skeleton, we needed a way to fit that skeleton into a given avatar mesh, regardless of whether the uploaded asset had its own rig.

What I Did:
  • Retopologized the given mesh to ensure the entire avatar was represented as a single continuous mesh (since some characters were badly modeled or had separated limbs)
  • Created a voxel representation of the combined mesh
  • Treated the voxel positions as a point cloud, and extracted a “skeleton” (a centerline representation of the mesh) using voxel thinning
  • Aligned and fitted the Genies Avatar skeleton to the voxelized “skeleton”, to determine each joint’s position as it would be on the mesh
  • Skinned the mesh to the refitted Genie skeleton!
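The voxelization step above can be sketched in a few lines of numpy. This is a toy illustration under assumed names (`voxelize`, `slice_centerline` are not the production API), and per-slice centroids only approximate what true voxel thinning produces:

```python
import numpy as np

def voxelize(points, voxel_size):
    """Map each vertex to an integer voxel coordinate; return unique occupied voxels."""
    grid = np.floor(points / voxel_size).astype(int)
    return np.unique(grid, axis=0)

def slice_centerline(voxels):
    """Crude 'skeleton': one centroid per horizontal slice of the voxel grid.
    (The real pipeline used voxel thinning; this only illustrates the idea
    of collapsing the occupied volume toward a centerline.)"""
    centers = []
    for z in np.unique(voxels[:, 2]):
        layer = voxels[voxels[:, 2] == z]
        centers.append(layer.mean(axis=0))
    return np.array(centers)

# Toy example: four vertices of a vertical "limb" spanning two slices.
pts = np.array([[0.0, 0.0, 0.0], [1.0, 0.0, 0.0],
                [0.0, 0.0, 1.0], [1.0, 0.0, 1.0]])
vox = voxelize(pts, voxel_size=1.0)       # 4 occupied voxels
center = slice_centerline(vox)            # one centerline point per slice
```

Because everything stays in numpy arrays, a sketch like this runs without any DCC tool in the loop, which is what made the headless pipeline practical.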
The Result:
  • While the prototype was built in Blender, most of the algorithm ran in numpy, making it very fast and easy to run headless! This made it the core of our Game-ready Avatar Processing (GAP) pipeline, allowing fast testing and user turnaround.


Part 2 - Autorigged Avatars into Unity

Pitch: Bring the autorigged avatars, represented as USDs, into Unity.

The Problem: The version of Unity we were working with didn’t support USDs — while we were in the process of updating the Unity version, as well as developing a new Avatar pipeline, it would still be a long time before we’d be able to see the results of the GAP pipeline in-app.

What I Did:
  • Converted the USD assets to glTFs, which also involved converting the textures and rebuilding shaders accordingly
  • Extended the Genie Avatar system to support a new “sub-species” — the original system was set up to use only one Avatar mesh across all Avatars in the game, and needed to be updated to support mesh switching
  • Saved out and stored information for each additional Avatar that’s added (e.g. the binding matrices, proxy mesh, and refitting data), to avoid recalculating them
  • Updated the refitting mechanism to ensure clothing items change fit when the body mesh is changed, so that the Avatar retains their outfit
  • Recalculated blendshapes for different avatars so that facial/body variation can be applied to uploaded body meshes
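The per-avatar storage idea can be sketched as a content-addressed cache, so expensive rig data (binding matrices, proxy mesh, refitting results) is computed once per uploaded mesh and reused afterward. All names here are illustrative assumptions, not the actual pipeline API:

```python
import hashlib
import json

# Hypothetical cache keyed by a hash of the mesh data, so rig data is
# computed only the first time a given avatar mesh is seen.
_avatar_cache = {}

def mesh_key(vertices):
    """Stable key for an avatar mesh (illustrative; a real pipeline
    would hash the binary mesh data, not a JSON dump)."""
    return hashlib.sha1(json.dumps(vertices).encode()).hexdigest()

def get_avatar_data(vertices, compute_fn):
    """Return cached rig data for this mesh, computing it only on first sight."""
    key = mesh_key(vertices)
    if key not in _avatar_cache:
        _avatar_cache[key] = compute_fn(vertices)
    return _avatar_cache[key]

# Usage: expensive_rig stands in for binding-matrix / proxy-mesh computation.
calls = []
def expensive_rig(v):
    calls.append(1)                 # track how often we actually compute
    return {"binding_matrices": None}  # placeholder payload

d1 = get_avatar_data([[0, 0, 0]], expensive_rig)
d2 = get_avatar_data([[0, 0, 0]], expensive_rig)  # cache hit, no recompute
```

The same keying scheme also gives each uploaded avatar a stable identifier for storing its data alongside the original upload.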
this website was made with love by me!