To recap, in my last post I decided that a combination of face keying and LiveFace facial mocap would be best suited to the lipsync portion of my animation. I recorded the lipsync separately, in a different project file, over a few different attempts, and saved it as an animation. This is a great feature of iClone 7; any asset you build (whether it be a piece of furniture, a movement, or a lipsync track) can be exported and saved in your "custom" folder for future use. So that's what I decided to do for my facial mocap recording - perfect it as much as I could in a separate project file to be used at a later date.
Concurrently, in my main project file (that includes my Jamie avatar and living room interview set-up) I started on movement animation. Before beginning, as always, I found a few helpful tutorials to get me started on the basics of how to animate using iClone's timeline keyframes.
Real Illusion "iClone's Beginner's Guide: Animating with Motion Clips" (00:18:05)
Good overview of the pre-existing motion assets that come with the program. These don't cover ALL the motion you'll probably want to use, but they do cover basic movements such as sitting, walking, talking hand gestures, and posing.
Real Illusion "iClone 7 Tutorial - Timeline: Enhanced Timeline Features" (00:13:12)
Though the title of this tutorial suggests these are "enhanced" features (i.e. you may or may not use them), I would argue this is still very much a "basics" overview (i.e. essential if you want to do ANY animation in iClone 7).
I must say that although I did dig into the aforementioned tutorials, a lot of the timeline features came pretty naturally to me because I have worked in video editing for over 10 years now. I was a 17-year-old self-taught Final Cut Pro and Adobe Photoshop user, and I credit both platforms for giving me a wide range of transferable skills in video editing, photo editing, web design, animation, and beyond. So iClone's timeline and keyframe features were barely a learning curve. That being said, the process of body animation still proved challenging for someone like me who never took an interest in the art of drawing and painting. There's a reason why animators study human anatomy and take life drawing classes... and here I am trying to recreate photorealistic footage having never taken one. That was an entirely different challenge, one that had nothing to do with technical skills in virtual production platforms.
Though I found this aspect of animation really neat, it was still a hurdle for an artistically-challenged individual like myself who still cannot draw a human hand.
Almost every single bone in your avatar's body can be animated; you can manipulate a single toe if you want to. Although keyframe interpolation does help smooth out the animation, if you don't set enough keyframes you lose a lot of movement information and your output ends up looking overly smooth and robotic. Trying to mimic the subtle, jittery movements I could see in my reference footage took a lot of trial and error. I wanted to strike the right balance between natural and smooth, and unfortunately that is quite a tedious and time-consuming task.
Timelapse of head-turning animation and eye-tracking keyframes
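To make that "not enough keyframes" problem concrete, here is a minimal Python sketch of the underlying idea; it isn't iClone code, and the frame numbers and rotation values are made-up example figures. Anything between two keys gets interpolated, so a sparse set of keys produces a perfectly even (and therefore robotic-looking) ramp, while extra keys are what preserve the small overshoots and jitters you see in real footage.

```python
# A conceptual sketch (not iClone code) of why sparse keyframes look "too smooth":
# values between keyframes are interpolated, so any jitter that falls between
# two keys is simply lost.

def interpolate(keyframes, frame):
    """Linearly interpolate a value at `frame` from a dict of {frame: value} keys."""
    frames = sorted(keyframes)
    if frame <= frames[0]:
        return keyframes[frames[0]]
    if frame >= frames[-1]:
        return keyframes[frames[-1]]
    for f0, f1 in zip(frames, frames[1:]):
        if f0 <= frame <= f1:
            t = (frame - f0) / (f1 - f0)
            return keyframes[f0] + t * (keyframes[f1] - keyframes[f0])

# Head rotation (degrees) keyed only at frames 0 and 60: everything in between
# is a perfectly straight ramp, which reads as robotic.
sparse = {0: 0.0, 60: 25.0}

# The same move with extra keys capturing small, natural overshoots and jitters.
dense = {0: 0.0, 15: 7.0, 22: 6.3, 40: 19.5, 48: 21.2, 60: 25.0}

for frame in (10, 22, 48):
    print(frame, round(interpolate(sparse, frame), 2), round(interpolate(dense, frame), 2))
```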
In addition to keyframing animation, iClone also boasts a "puppeting" tool that lets you animate motion in real time. You select one part of your avatar's body (i.e. an arm), then play through your animation timeline and use your mouse to "puppet" the limb. Your puppeteering is recorded directly onto the animation timeline. Although one might assume this would save a ton of time, I found that using my computer mouse as a puppet string was incredibly faulty. I don't know how anyone would successfully use this feature unless they bought a specialized joystick (similar to what some Machinima artists use to puppet their avatars). I also found a few articles about people using game controllers for animation puppeteering, though I didn't have much time to look into this workflow too deeply. It might be an option I revisit further down the line if keyframing becomes too tedious and time-consuming for my production timeline. Some of the pre-existing animations (motion clips) will also come in handy later, when I shoot more complex scenes in my virtual production pipeline. For the moment, all I had to focus on was head-turning and raising my hands.
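For anyone curious why mouse puppeteering comes out so rough, here is a rough Python sketch of what real-time puppeteering amounts to; again, this is not the iClone API, and `read_mouse_axis()` is a hypothetical stand-in for whatever device you puppet with. Because every frame of input is written straight onto the timeline as a keyframe, any hand or mouse noise gets baked into the animation unless something smooths it first.

```python
# Conceptual sketch of real-time puppeteering: one keyframe is recorded per frame,
# so input noise ends up in the animation unless it is smoothed on the way in.

import random

def read_mouse_axis(frame):
    # Hypothetical input: an intended slow arm raise plus hand/mouse noise.
    intended = min(frame * 0.5, 30.0)
    noise = random.uniform(-2.0, 2.0)
    return intended + noise

def record_puppet_pass(num_frames, smoothing=0.0):
    """Record one puppeteering pass; optional exponential smoothing tames jitter."""
    keys = {}
    value = read_mouse_axis(0)
    for frame in range(num_frames):
        raw = read_mouse_axis(frame)
        value = smoothing * value + (1.0 - smoothing) * raw
        keys[frame] = value  # one keyframe per frame, like a live capture
    return keys

raw_pass = record_puppet_pass(90)                       # jittery: every noisy sample becomes a key
smoothed_pass = record_puppet_pass(90, smoothing=0.8)   # same input, a far calmer curve
```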
Here is what 7 hours' worth of keyframing animation looks like (without the facial lipsync):
There are still a few jitters to smooth out before the final output. Also, some of this animation may need to change depending on what the lipsync animation does to the head/face.