Originally a Guildhall@SMU project introducing students to skeletal animation, it kickstarted my interest in animation processing and blending.
At Obsidian, I wrote a custom vertex blending solution, controlled via weights driven through FaceFX. This allowed us to do hit reacts, detailed facial animation, and finely tuned effects. Additionally, using Granny, I rewrote the animation system to be more efficient, more flexible, and easier for both animators and gameplay programmers to use. It supported blends based on acceleration and full 3D direction, so we could achieve effects such as leaning forward or back when running up and down hills, or leaning left and right when making turns. Combined with the bone masking system, this gave us full, fluid movement blended with attacks, dodges, hit reacts, and partial to full ragdolls.
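As a rough illustration of what an acceleration-driven lean blend can look like, here is a minimal sketch in C++. The names (LeanWeights, ComputeLeanWeights, maxAccel) and the exact mapping are hypothetical, not the shipped Obsidian/Granny code; the point is simply projecting acceleration onto the character's local axes and turning the result into normalized pose weights.

```cpp
// Hypothetical sketch: blend weights for lean poses driven by acceleration
// and movement direction. Names and parameters are illustrative only.
#include <algorithm>
#include <cstdio>

struct Vec3 { float x, y, z; };

struct LeanWeights {
    float forward;   // lean-forward pose (accelerating / running uphill)
    float back;      // lean-back pose    (braking / running downhill)
    float left;      // lean-left pose    (turning left)
    float right;     // lean-right pose   (turning right)
};

// Project world-space acceleration onto the character's local axes and map it
// to normalized pose weights. maxAccel controls how quickly the lean saturates.
LeanWeights ComputeLeanWeights(const Vec3& accel,
                               const Vec3& facing,   // unit forward vector
                               const Vec3& rightDir, // unit right vector
                               float maxAccel)
{
    const float fwd  = accel.x * facing.x   + accel.y * facing.y   + accel.z * facing.z;
    const float side = accel.x * rightDir.x + accel.y * rightDir.y + accel.z * rightDir.z;

    auto clamp01 = [](float v) { return std::max(0.0f, std::min(1.0f, v)); };

    LeanWeights w;
    w.forward = clamp01( fwd  / maxAccel);
    w.back    = clamp01(-fwd  / maxAccel);
    w.right   = clamp01( side / maxAccel);
    w.left    = clamp01(-side / maxAccel);
    return w;
}

int main()
{
    // Character facing +X, accelerating forward and slightly to the left.
    LeanWeights w = ComputeLeanWeights({6.0f, -2.0f, 0.0f},
                                       {1.0f, 0.0f, 0.0f},
                                       {0.0f, 1.0f, 0.0f},
                                       8.0f);
    std::printf("fwd %.2f back %.2f left %.2f right %.2f\n",
                w.forward, w.back, w.left, w.right);
}
```

The resulting weights would then feed the lean poses through whatever masked blend layer the animation system provides, so only the spine and upper body are affected while locomotion continues underneath.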
At Robot Entertainment, I wrote an animation system that decoupled the programmers who play the animations from the animators and technical artists who create them and insert them into the game. This allowed the artists to complete their full work cycle without programmer intervention. I also wrote several custom blending methods, including an N-directional blend for seamless movement, a data-driven bone mask definition, and custom state and action nodes for the animation tree.
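For the N-directional blend, the core idea is picking the two directional clips that bracket the desired movement direction and weighting them by angular distance. The sketch below assumes clips authored at evenly spaced yaw angles; the names (DirectionalBlend, ComputeDirectionalBlend) are hypothetical and not the actual Robot Entertainment implementation.

```cpp
// Hypothetical sketch of an N-directional blend: given clips authored at evenly
// spaced yaw angles, pick the two clips that bracket the desired movement
// direction and weight them by angular distance. Names are illustrative only.
#include <cmath>
#include <cstdio>

struct DirectionalBlend {
    int   clipA;    // index of first bracketing clip
    int   clipB;    // index of second bracketing clip
    float weightB;  // blend factor toward clipB (weightA = 1 - weightB)
};

// clipCount clips are assumed to sit at angles 0, 360/N, 2*360/N, ...
// desiredAngleDeg is the movement direction relative to the character's facing.
DirectionalBlend ComputeDirectionalBlend(float desiredAngleDeg, int clipCount)
{
    const float step = 360.0f / static_cast<float>(clipCount);

    // Wrap the angle into [0, 360).
    float a = std::fmod(desiredAngleDeg, 360.0f);
    if (a < 0.0f) a += 360.0f;

    const int   lower = static_cast<int>(a / step);     // clip at or below the angle
    const int   upper = (lower + 1) % clipCount;        // next clip, wrapping around
    const float t     = (a - lower * step) / step;      // 0 at lower, 1 at upper

    return { lower, upper, t };
}

int main()
{
    // 8-way locomotion set, character moving at 100 degrees off its facing.
    DirectionalBlend b = ComputeDirectionalBlend(100.0f, 8);
    std::printf("blend clip %d with clip %d, t = %.2f\n", b.clipA, b.clipB, b.weightB);
}
```

Because the weights vary continuously with the movement angle, the result is a seamless transition as the character's direction changes, with no visible pops between the authored directional clips.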