EP – Animation and Rendering test

After creating the geometries, I imported an HDRI that I captured in my room with a Theta camera to create the reflections on the character’s body, especially on the face. To match the lighting direction of the footage, I rotated the HDRI slightly.

When I started animating, I noticed that keyframing the bend deformer directly took too long. To save time, I asked Manos how to link a controller to the bend deformer’s settings, and he showed me how to connect them with the Expression Editor. Following his tutorial, I successfully drove the curvature attribute with the controller’s Y translation. In total I added three controllers: one for each ear and one for the tail.
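The kind of link described above can be sketched as a one-line MEL expression in Maya’s Expression Editor. The node names and the scaling factor here are my own placeholders, not the exact ones from the scene:

```mel
// Sketch only — bend_earL, ctrl_earL, and the factor 40 are hypothetical.
// Entered in the Expression Editor, this drives the bend deformer's
// curvature from the controller's Y translation every time it changes.
bend_earL.curvature = ctrl_earL.translateY * 40;
```

One expression like this per controller (each ear and the tail) means the animator only ever keys the controllers, never the deformer nodes themselves.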
Finally, I began animating with the latest version of the model file. This is the result of my first animation test for this project. I rendered the CG and composited it in Nuke to check how the rendered elements sat on the footage. I realized the geometries I had modelled from the video weren’t perfectly aligned, so the shadows didn’t match. Additionally, when the character emerges from the screen, the part still inside the monitor should be masked by the screen geometry, but it wasn’t. To solve this, I found an option in the Arnold renderer that lets a piece of geometry act as a matte (mask) object; after enabling it, the character was masked correctly.
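As a rough sketch of that fix: Arnold for Maya exposes a Matte flag on shape nodes, and enabling it makes the geometry hold out whatever is behind it in the render’s alpha. This only runs inside a Maya session with the MtoA plug-in loaded, and the shape name below is a hypothetical placeholder:

```python
# Sketch only — assumes a Maya session with Arnold (MtoA) loaded.
import maya.cmds as cmds

screen_shape = "screenGeoShape"  # hypothetical name of the monitor-screen mesh
# Enable Arnold's Matte flag so the screen cuts a hole in the alpha,
# masking the part of the character that is still "inside" the monitor.
cmds.setAttr(screen_shape + ".aiMatte", 1)
```

The same flag can be toggled by hand in the shape’s Arnold attributes in the Attribute Editor.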
When I showed it to Manos, he suggested extending the shot where the character is inside the monitor looking around. That made sense, so I’ll try to extend that scene, though I’ll need to recheck the matchmove settings and the animation. For now, I’m going to continue animating, and once it’s done I’ll render it again and test it over the footage in Nuke.