Sunday, 14 December 2014

Rigging The Human

The first step in rigging the Captain Atom model was to create the skeleton. The bones for the legs, the arms and the spine were created separately; the legs were then parented to the pelvis bone, and an extra bone was created as a clavicle for each arm, to which the arm was parented. Each clavicle was then parented to a bone part way up the spine. The head needed a neck bone, a bone for the centre of the head, one for the top of the head and, finally, two for the jaw. This chain was then parented to the top of the spine. The fingers were also created individually before all being parented to the arm.
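The parenting described above can be sketched as a simple child-to-parent table. This is only an illustration of the hierarchy, not the scene's actual joint names, which I've invented here:

```python
# Sketch of the skeleton's parenting as child -> parent pairs.
# Joint names are stand-ins for the ones in the actual scene.
hierarchy = {
    "pelvis": None,                 # root of the skeleton
    "spine_mid": "pelvis",
    "spine_top": "spine_mid",
    "left_leg": "pelvis",  "right_leg": "pelvis",
    "left_clavicle": "spine_mid",  "right_clavicle": "spine_mid",
    "left_arm": "left_clavicle",   "right_arm": "right_clavicle",
    "neck": "spine_top",
    "head": "neck",
    "jaw": "head",
}

def chain_to_root(hierarchy, joint):
    """Walk parent links from a joint up to the skeleton root."""
    chain = [joint]
    while hierarchy[joint] is not None:
        joint = hierarchy[joint]
        chain.append(joint)
    return chain

# An arm is reached via its clavicle, then the spine, then the pelvis.
print(chain_to_root(hierarchy, "left_arm"))
```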
The first parts I rigged were the legs. Three IK handles were created: one from the hip joint to the ankle, one from the ankle to the ball and one from the ball to the toes. I then created a controller underneath the foot, and this was point and orient constrained to the ankle IK. This allowed the whole leg to be moved and the ankle to be rotated. The last thing needed to make the leg work correctly was a locator for the knee. Constraining the ankle IK to this locator with a pole vector meant the knee direction could be easily set by the user.
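A common way to place that knee locator is to project the knee onto the hip-to-ankle line and push the locator out along the knee's bend direction. This is a minimal vector-maths sketch of that placement (with made-up joint positions), not the script used in the actual rig:

```python
# Sketch: computing a pole-vector locator position for a knee
# from hypothetical hip/knee/ankle positions (plain 3-D tuples).

def sub(a, b):   return tuple(x - y for x, y in zip(a, b))
def add(a, b):   return tuple(x + y for x, y in zip(a, b))
def scale(v, s): return tuple(x * s for x in v)
def dot(a, b):   return sum(x * y for x, y in zip(a, b))

def pole_vector_position(hip, knee, ankle, offset=2.0):
    """Project the knee onto the hip->ankle line, then push the
    locator out along the knee's bend direction by `offset`."""
    leg = sub(ankle, hip)
    to_knee = sub(knee, hip)
    t = dot(to_knee, leg) / dot(leg, leg)   # parametric projection
    closest = add(hip, scale(leg, t))       # nearest point on the line
    bend_dir = sub(knee, closest)           # points out of the knee
    length = dot(bend_dir, bend_dir) ** 0.5
    unit = scale(bend_dir, 1.0 / length)
    return add(knee, scale(unit, offset))

# A leg bent forward at the knee: the locator lands in front of it.
hip, knee, ankle = (0, 10, 0), (0, 5, 1), (0, 0, 0)
print(pole_vector_position(hip, knee, ankle))  # -> (0.0, 5.0, 3.0)
```

Placing the locator this way keeps the pole vector from flipping the knee when the leg is posed.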
     








Next, four attributes were added to the foot controller: Ball Lift, Toe Lift, Toe Roll Y and Toe Roll X. Set Driven Keys (SDKs) were then used to add the different foot movements, which are all shown below.

Ball Lift Attribute


Toe Lift Attribute


Toe Roll X



Toe Roll Y
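A Set Driven Key essentially stores keyframes on a driver attribute and interpolates the driven attribute between them. The sketch below is an illustrative stand-in for that behaviour, not Maya's `setDrivenKeyframe` API, and the key values are invented:

```python
# Sketch: an SDK as a keyed mapping from a driver attribute to a
# driven attribute, linearly interpolated between the set keys.
from bisect import bisect_right

def driven_value(keys, driver):
    """keys: sorted (driver_value, driven_value) pairs.
    Linearly interpolate; clamp outside the keyed range."""
    if driver <= keys[0][0]:
        return keys[0][1]
    if driver >= keys[-1][0]:
        return keys[-1][1]
    i = bisect_right([k[0] for k in keys], driver)
    (d0, v0), (d1, v1) = keys[i - 1], keys[i]
    t = (driver - d0) / (d1 - d0)
    return v0 + t * (v1 - v0)

# e.g. Ball Lift: driver 0 -> ball joint at 0 deg, driver 10 -> 45 deg.
ball_lift_keys = [(0.0, 0.0), (10.0, 45.0)]
print(driven_value(ball_lift_keys, 5.0))  # halfway between the keys
```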

The next parts I rigged were the arms. Only one IK handle was needed for each arm, running between the shoulder joint and the wrist joint. A controller was created at the wrist and the arm IK was parented to it, allowing the arm to move. This controller was also orient constrained to the wrist bone to allow the hand to rotate properly. Just like the leg, a locator with a pole vector constraint was used to keep the elbow pointing in the right direction.



Next I rigged the fingers and thumb. Each finger needed to curl individually and all the fingers needed to spread out, so another controller was created above the hand and attributes for controlling the fingers were added to it. Attributes were also added to let the thumb curl and move up and down. The fingers and thumb were then rigged using SDKs.

Finger Curl


Finger Spread

Thumb Curl and Up/Down
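The curl SDKs fan one driver attribute out to every knuckle joint in a finger. A rough sketch of that fan-out, with made-up rotation maximums rather than the values used in the rig:

```python
# Sketch: one "Finger Curl" attribute driving all three knuckle
# joints of a finger at once. Per-joint maximums are illustrative.

def finger_curl(curl, per_joint_max=(70.0, 90.0, 60.0)):
    """Map a 0..1 curl attribute to rotations (degrees) for the
    base, middle and tip joints of one finger."""
    curl = max(0.0, min(1.0, curl))        # clamp the driver value
    return tuple(curl * m for m in per_joint_max)

print(finger_curl(0.0))  # open hand: no rotation on any joint
print(finger_curl(1.0))  # full fist: each joint at its maximum
```

Finger Spread works the same way, except the driver rotates each finger's base joint sideways by a different amount.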


The next thing I rigged was the spine. First, a spline IK handle was added from the top joint to the bottom joint of the spine. Then two clusters were added: one to the top two control vertices on the spine curve and one to the bottom two. The pivot for the top cluster was moved to the spine joint just below the pecs, and the pivot for the lower cluster was moved to the pelvis bone. Two controllers were created around the torso, one at the pelvis and one just below the pecs, and the corresponding clusters were parented to them. These controllers allow for a lot of movement in the torso at two different points.
Lower Spine Controller

Upper Spine Controller
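The cluster-and-controller setup can be sketched as a transform applied to a weighted subset of the curve's control vertices; the spline IK then bends the joints along the reshaped curve. CV positions and weights below are invented examples:

```python
# Sketch: a cluster as a translation applied to a weighted subset
# of the spine curve's control vertices (CVs). Data is illustrative.

def apply_cluster(cvs, weights, offset):
    """Translate each CV by `offset`, scaled by its cluster weight."""
    return [tuple(p + w * o for p, o in zip(cv, offset))
            for cv, w in zip(cvs, weights)]

# Four CVs up the spine; the "upper" cluster fully owns the top two.
spine_cvs = [(0, 0, 0), (0, 3, 0), (0, 6, 0), (0, 9, 0)]
upper_weights = [0.0, 0.0, 1.0, 1.0]

# Moving the upper spine controller forward drags only the top CVs.
print(apply_cluster(spine_cvs, upper_weights, (0, 0, 2)))
```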

Next I rigged the clavicle bones between the spine and the arms. An IK handle was added between each clavicle bone and shoulder bone. This was then parented to a controller created at the shoulder, allowing for a small amount of movement.


Next were the head and neck. These were more straightforward as they didn't require any IKs. The neck was rigged by parent constraining the neck bone to a controller, allowing for a small amount of movement. The head was rigged by orient constraining the head bone to a separate controller, allowing the head to rotate in all directions.

Neck Movement

Head Rotation

The jaw was rigged by orient constraining the jaw bone to the jaw controller. This was then limited so the jaw couldn't open unnaturally far.
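Limiting the jaw amounts to clamping its rotation into an allowed range, the way Maya's transform limits do. A minimal sketch, with an invented range rather than the rig's actual limits:

```python
# Sketch: clamping the jaw rotation so it can't open unnaturally far.
# The -5..35 degree range is a made-up example.

def limit_rotation(angle, lo=-5.0, hi=35.0):
    """Clamp a jaw rotation (degrees) into its allowed range."""
    return max(lo, min(hi, angle))

print(limit_rotation(60.0))  # trying to over-open snaps to the limit
```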

The final bones added were loose bones above the eyes, to be used for adding expressions to the face. Controllers for these were created above the head and parented to the central head joint so that they always move with the head. SDKs were then used between the controllers and the loose bones to allow some expression to be added.

Loose Bones

The final things that needed to be rigged were the eyes themselves. To make the eyes open and close, a blend shape was used. This involved making a copy of the mesh and moving its vertices into their new positions.
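Under the hood a blend shape is a per-vertex linear interpolation between the base mesh and the sculpted target. A minimal sketch, with made-up eyelid vertex positions:

```python
# Sketch: a blend shape as linear interpolation between the base
# mesh and the "eyes closed" target, per vertex. Data is invented.

def blend(base, target, weight):
    """Move every vertex `weight` of the way towards the target."""
    return [tuple(b + weight * (t - b) for b, t in zip(bv, tv))
            for bv, tv in zip(base, target)]

eyelid_base   = [(0.0, 2.0, 0.0), (1.0, 2.0, 0.0)]  # eyes open
eyelid_closed = [(0.0, 1.0, 0.0), (1.0, 1.0, 0.0)]  # sculpted copy

print(blend(eyelid_base, eyelid_closed, 1.0))  # fully closed
print(blend(eyelid_base, eyelid_closed, 0.5))  # half closed
```

Animating the blend-shape weight from 0 to 1 then closes the eyes smoothly.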

Once all of the movement rigging was done, the final step was to skin the model. After the automatic skinning, there were many areas where the mesh didn't move correctly. To fix this I used the component editor: by selecting the vertices that needed work and changing how they were influenced by the different joints, I was eventually able to adjust most of the skin weights to a point where I was happy with how they moved. Below is a selection of images showing different parts of the model with the skin weightings fixed.
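The per-vertex edits in the component editor amount to changing one joint's influence and letting the other influences renormalise so the total stays at 1. A sketch of that idea, with invented joint names and weights:

```python
# Sketch: editing one joint's skin weight on a vertex and
# renormalising the rest, as the component editor effectively does.
# Joint names and weight values are illustrative.

def set_weight(weights, joint, value):
    """Set one joint's influence on a vertex, then rescale the
    remaining influences so the weights still sum to 1.0."""
    others = sum(v for j, v in weights.items() if j != joint)
    scale = (1.0 - value) / others if others else 0.0
    return {j: (value if j == joint else v * scale)
            for j, v in weights.items()}

# A shoulder vertex pulled too strongly by the spine:
vert = {"spine": 0.6, "shoulder": 0.3, "upper_arm": 0.1}
fixed = set_weight(vert, "shoulder", 0.7)
print(fixed)                # shoulder now dominates the vertex
print(sum(fixed.values()))  # still normalised to 1.0
```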









Overall, rigging the human was challenging, with many issues encountered along the way. The arms and legs were fairly simple to rig correctly, as was the spine. The head and neck area were fairly easy to rig after some initial trouble caused by controllers affecting the wrong joints. The skinning was the single most time-consuming part. The component editor was difficult to work out at first and required a lot of experimentation, but once I'd worked out how to use it, getting the majority of the skin to look right went well.





