VFX supervisor Martyn Culpitt, asset supervisor Barry Poon and R&D lead Andrew Kaufman all gained experience creating large CG recreations of long-extinct creatures on “Jurassic World”. Here, the three reveal the challenges the fantastic beasts presented.
DP: What key skills did the production team use to decide that Image Engine were right for the job?
Martyn Culpitt: After “Jurassic World”, Warner Bros. approached us with the “Fantastic Beasts” enquiry. We had created almost all the raptor shots and they were very impressed with the result. Warner Bros. was looking for a studio that was able to realise complex creatures that required a lot of thought to bring them to life. Something we love to do at Image Engine.

DP: What specific challenges did the Graphorn family present you with?
Martyn Culpitt: Many, across all departments. Mainly because they were so large in scale but still had to fit into the “Fantastic Beasts” landscape. The original concept for the Graphorns was drastically changed during the animation process. It was difficult to make the CG animals look like a rhino and a lion at the same time. Finding the balance was complicated, but also a lot of fun.
Barry Poon: From an asset point of view, the complexity of the skin and its look was a big challenge. The family had to look predominantly like rhinoceroses in terms of body strength, muscles and skin. Ultimately, however, the Graphorns were to be based on many different real animals, and mixing all these species was very challenging. Fortunately, even though each member of the Graphorn family is unique, we were able to reuse some elements for the three family members.
DP: How did working on the Graphorns differ from working on the Jurassic World dinosaurs? What was more difficult and why?
Martyn Culpitt: In some ways this project was more complicated, because on “Jurassic World” we were partly working from existing dinosaur designs: we had a finished 3D model and rebuilt our own asset for our pipeline. In terms of the movements and the science behind them, we could use the first film as a reference; with the Graphorns, on the other hand, we had to start from scratch. So it took some time to get everything right. We played a lot with the concepts to give the family the right look – including complicated details such as the male’s horn, which, funnily enough, is based on that of a horned caterpillar. During the design process we picked up many small elements from the real world. The most interesting part was making something nobody has ever seen before appear real.
DP: Did you use the same pipeline for “Fantastic Beasts” as for “Jurassic World”? Was it expanded?
Andrew Kaufman: Yes, we used Houdini, Jabuka, Shotgun, Caribou, Gaffer, Nuke and Maya – all of these tools were used on “Jurassic World”. We also tested new software, such as the Ziva muscle simulation system and the Vital Skin system. Ziva, a Maya plug-in developed and distributed by Ziva Dynamics, is a muscle system based on the finite element method; we were among its beta testers, and in the course of testing decided to use it for the Graphorns. Vital Skin, developed by Vital Mechanics, is also a Maya plug-in for which we were beta testers.
DP: How did the animation of the Graphorns go, and how did the plug-ins perform?
Martyn Culpitt: The Graphorns were built in three stages: we did the animation in Maya, then ran the muscle and connective tissue simulation over it based on our muscle cache, and then added further simulation as secondary motion. I think the plug-ins helped increase the believability of the muscles, the simulations and their interaction. Where the muscles needed to read more clearly, we added sculpting on top. The plug-ins are great products, but not yet perfect – once the simulation was running, the team spent a lot of time cleaning up certain areas of a shot. What’s great about the system is that you get natural variation on top of the animation. The animals still have a very complex setup, though, as the fantasy creatures are made up of elements from real animal models, whose body parts had to move relative to each other in a believable way.
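The three-stage approach Culpitt describes – an animation cache as the base, a muscle pass driven by it, and a secondary pass for lagging flesh motion – can be sketched in a few lines. This is a toy illustration of the layering idea only; the function names and maths below are invented for the example and are not Image Engine's actual pipeline.

```python
# Layered creature build, in miniature: each pass reads the result of the
# previous one and adds its own offsets. All numbers are illustrative.

def animation_pass(frame):
    # Base animation: a single vertex moving on a path (stand-in for a Maya cache).
    return [float(frame), 0.0, 0.0]

def muscle_pass(base, frame):
    # Muscle pass reads the animation cache and adds a bulge offset
    # (a toy stand-in for an FEM muscle solve like Ziva's).
    bulge = 0.1 * (frame % 4)
    return [base[0], base[1] + bulge, base[2]]

def secondary_pass(current, previous):
    # Secondary motion: flesh lags behind the muscle result
    # (a simple inertia blend toward last frame's position).
    lag = 0.3
    return [p + lag * (q - p) for p, q in zip(current, previous)]

prev = animation_pass(0)
for frame in range(1, 5):
    pos = muscle_pass(animation_pass(frame), frame)
    pos = secondary_pass(pos, prev)
    prev = pos
```

Each pass is cheap to re-run in isolation, which is why caching the result of the stage before it (as with the muscle cache mentioned above) keeps iteration fast.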
DP: What challenges did the Swooping Evil present?
Martyn Culpitt: The rigging and construction of this creature were particularly complex. The Swooping Evil had to be able to unfold from the size of an avocado to a wingspan of over a metre, and it took us some time to work out how the rig could switch between these two states.
Barry Poon: Its head consists of a skull – blending this into the rest of the body was not easy. Finding the right transparency and look for the wings was also an interesting part of realising the Swooping Evil.
Martyn Culpitt: It also has a tail that wraps around Newt’s fingers so he can fling it like a yo-yo. Together with the client, we went through an extensive iteration process and many variations of movement were discarded until we finalised how he should act. I think the idea that he can suck people’s brains out is very funny. We are all very happy with his performance in the film.

DP: How did you approach Porpentina’s memory sequence in the water chair?
Martyn Culpitt: It was created using Houdini and the 3Delight render engine. In this sequence we spent the most time designing the magic liquid. It needed to be a bubbling, tar-like mass that was highly reflective. The client also wanted the simulation to run backwards so that the bubbles would reverse – which was a lengthy process. Two months before the end of the project, the client decided that the look should be changed again: In the final, approved sequence, the black tar potion was to become mercury-like. Achieving these characteristics presented us with major challenges. In particular, the desired timing and the rotation of the liquid around the chair proved to be complex to realise. The previous, mainly simulation-based approach was relatively difficult to control and therefore slow – which was not compatible with the time frame available to us. Instead, we solved the problem in our own way using procedural, noise-based modelling techniques, the output of which was then used to steer the simulations in the desired direction. Once the timing and scale of the potion was finalised, we added secondary simulations to make it look like drops of water running down faces.
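The noise-steered approach Culpitt describes – a procedural target field guiding the simulation toward art-directed timing instead of re-running a full, hard-to-control solve – might look something like this in miniature. Every name and formula here is invented for illustration; this is not the studio's Houdini setup.

```python
import math

# Sketch: steer a simulated velocity toward a procedural, noise-perturbed
# swirl field. The "noise" is a cheap sine product, the swirl a rotation
# about the origin (stand-in for the chair axis); both are illustrative.

def swirl_target(x, y, t):
    # Procedural target velocity: controllable rotation plus noise
    # for organic variation. Artists tweak this directly, not the solver.
    n = math.sin(3.1 * x + t) * math.cos(2.7 * y - t) * 0.2
    return (-y + n, x + n)

def steer(vel, x, y, t, strength=0.5):
    # One solver step: blend the simulated velocity toward the procedural
    # target, so timing is art-directable without a full re-simulation.
    tx, ty = swirl_target(x, y, t)
    return (vel[0] + strength * (tx - vel[0]),
            vel[1] + strength * (ty - vel[1]))

v = (0.0, 0.0)
for frame in range(10):
    v = steer(v, 1.0, 0.5, frame * 0.04)
```

Because the target field is evaluated procedurally, retiming the whole effect is a parameter change rather than a fresh simulation – which matches the motivation given above for abandoning the purely simulation-based approach.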

DP: Why did you choose 3Delight for the rendering?
Andrew Kaufman: The main reason Image Engine has been using 3Delight for so long is that our relationship with the development team is very good. They provide us with exceptionally good support, including bug fixes or complete feature rebuilds, depending on the schedule. Nowadays most renderers take a physically based shading approach – so sometimes efficiency is decided less by the technology itself than by the right relationships.
DP: Which Image Engine CG creations will we be seeing in the cinema soon?
Martyn Culpitt: We are currently working on “Power Rangers” and have completed the Wolverine film “Logan”. You can see creatures from us there too: humans. Readers will understand what we mean when they watch the films. The complexity of the effects for “Logan” in particular pushed our pipeline to its limits compared with our previous projects.