"Richard the Stork" © 2017 Knudsen & Streuber, Ulysses, Walking The Dog, Mélusine Productions, Den siste skilling. All rights reserved.

“Richard the Stork” Making-of

Are you already looking for a suitable Christmas present for your children? “Überflieger – Kleine Vögel, großes Geklapper” – a beautiful animated film for little film fans – was released on DVD and Blu-ray in October. In this article by DP author Rayk Schroeder, you can find out how the film was made at Rise.

Compositing

Compositing plays a different role in an animated film than in a live-action feature, where it is often just a matter of adding a lens flare, some depth of field or a little extra fog, or correcting the colour of certain areas. In animation the work is more technical, and due to time constraints the correction of many rendering errors is often shifted into compositing.

Some of these problems have already been addressed in the lighting/shading section. Rather than fixing them laboriously in 3D, however, it was usually quicker for the compositing artists simply to patch in a missing leg, for example. This always had to be done carefully, though, because the project was stereoscopic and every quick fix had to work for both eyes. In addition, flickering highlights often appeared in some renderings.

As the quality settings in Houdini’s Mantra could not be raised indefinitely if render times were to stay within a tolerable range, the only option in the end was to denoise or, in the worst case, paint away the flickering highlights. Due to the parallax, these highlights often appeared only in the left or the right rendering. In most cases only individual passes were affected, so it was possible to intervene there and remove the noise or distracting highlights. One of the directors’ main requests was to remove all highlights from the eyes unless they were caused by a direct light source such as the sun.

Although the film was a stereo project, the left eye served as the main view, and each shot was initially comped on the left only. The artists also rendered only the left eye until the look was approved, in order to save render time and storage space on the server. In compositing, however, the setups were built from the start with the later right eye in mind: the Nuke scripts were set up as stereo setups, and the naming conventions were designed so that the right-eye renders were integrated automatically once they existed. Several small automations were developed along the way to streamline the work and make the artists’ lives easier.
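The naming-convention idea can be sketched in a few lines. Nuke can substitute the current view name into file paths via its %V token; the plain-Python stand-in below shows the same principle with an assumed "left"/"right" token in the filename (the path and token names are invented for illustration):

```python
# Hypothetical sketch: derive the right-eye render path from the left-eye
# one via a naming convention, so the right eye drops in automatically
# once it has been rendered. In Nuke itself this is typically done with
# the %V view token in the file path rather than a string replace.

def right_eye_path(left_path, left_token="left", right_token="right"):
    """Return the right-eye file path for a left-eye render path."""
    if left_token not in left_path:
        raise ValueError(f"no '{left_token}' token in: {left_path}")
    return left_path.replace(left_token, right_token)

print(right_eye_path("/renders/sh010/beauty_left.0101.exr"))
# /renders/sh010/beauty_right.0101.exr
```

The convention only works if the eye token appears exactly once in the path, which is why strict naming rules were worth enforcing across the project.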

In order not to make the subsequent stereo compositing unnecessarily difficult, roto shapes were used only as rough holdout masks – if at all – and paint strokes were avoided as far as possible, because every roto and every paint stroke would have had to be offset for the right eye. Instead, the artists created masks for colour corrections using the world position pass, which maps the X, Y and Z coordinates of each pixel to RGB values, so the position of a specific pixel in 3D space can be determined retrospectively. The advantage of these masks was that they did not have to be adapted for the right eye, because the right eye had its own world position pass with the same colour values. It also meant that masks could be reused across multiple shots in the same set without recreating them each time, and there was no need to animate them, as the coordinates are absolute values in 3D space and remain the same regardless of the camera position. Many shots took place on the same set. Unfortunately, the set was sometimes moved from shot to shot during modelling, so the masks no longer fitted.
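The principle behind a world-position mask can be illustrated with a simple distance test, shown here per pixel in plain Python rather than as Nuke nodes (the function, centre point and radius are made up for the example): each pixel of the pass stores its X/Y/Z scene coordinate in R/G/B, so a spherical mask is just a comparison against a point in 3D space.

```python
# Illustrative sketch (not the production setup): build a soft spherical
# mask from a world position pass. world_pos is the (X, Y, Z) value a
# pixel carries in its R/G/B channels.
import math

def sphere_mask(world_pos, center, radius, falloff=0.0):
    """Return 1.0 inside the sphere, fading to 0.0 over 'falloff' units."""
    d = math.dist(world_pos, center)
    if d <= radius:
        return 1.0
    if falloff > 0.0 and d < radius + falloff:
        return 1.0 - (d - radius) / falloff
    return 0.0

# A pixel whose world position lies near the chosen centre is fully masked:
print(sphere_mask((1.0, 2.0, 3.0), center=(1.0, 2.0, 2.5), radius=1.0))  # 1.0
```

Because the test runs on absolute scene coordinates, the same mask works for both eyes, for every frame and for every shot on the same set, exactly as described above.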

This could be remedied, however, with a grade node created by a Python script – if the world position pass was graded accordingly, the masks fitted again. The compositing artists received from the 3D department the X, Y and Z offset by which the set had been shifted in 3D space. In Nuke they could enter these values in a dialog and, with one click, attach a grade node carrying the offset values to the read node of the world position pass. The RGB values were then correct again and the same masks could be reused. For many shots, the render camera was also needed in compositing; for the lighting/shading artists this would have meant the extra work of exporting the camera every time.

The render settings in Houdini were therefore adapted so that the camera information was written into the metadata of the utility passes. This metadata could be read with a Python script in Nuke and used to create a stereo camera that corresponded exactly to the render camera. Many other small automations were created over the course of the project: for example, a gizmo with default settings to denoise particularly problematic render passes, skydomes that used the shot camera to display the correct section of the sky, and a script to replace all file paths in the CG read nodes so that complete Nuke setups could easily be copied from shot to shot. This meant that the basic look of many similar shots could be set up very quickly, and details could be polished much faster.
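The metadata-to-camera step can be sketched as follows. In production the values were written into the EXR headers by Houdini and read back in Nuke; here a plain dict stands in for the metadata, and all key names and values are invented for illustration:

```python
# Hypothetical sketch: rebuild a stereo camera pair from rendered-pass
# metadata. The metadata keys ("cam/...") are made up; a real setup would
# use whatever keys the Houdini render settings wrote into the EXR header.

def stereo_camera_from_metadata(meta):
    """Build left/right camera descriptions from pass metadata."""
    focal = float(meta["cam/focal"])
    aperture = float(meta["cam/haperture"])
    tx, ty, tz = (float(v) for v in meta["cam/translate"])
    interaxial = float(meta["cam/interaxial"])
    left = {"focal": focal, "haperture": aperture, "translate": (tx, ty, tz)}
    # The right eye is the same camera shifted sideways by the interaxial:
    right = dict(left, translate=(tx + interaxial, ty, tz))
    return left, right

meta = {
    "cam/focal": "35.0",
    "cam/haperture": "23.76",
    "cam/translate": ("0.0", "1.5", "10.0"),
    "cam/interaxial": "0.06",
}
left, right = stereo_camera_from_metadata(meta)
print(right["translate"])  # (0.06, 1.5, 10.0)
```

Because the metadata travels with every utility pass, the compositing camera can never drift out of sync with the camera that actually rendered the shot, and the lighting/shading artists are spared a separate export.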

Information on the stereo workflow and the communication process can be found on page 5.