We spoke to Alex Nowotny, one of the FX Supervisors at Weta FX. Based in New Zealand for over a decade now, he previously worked as an Effects TD at Dneg and as a 3D Technical Director at Arri, among other roles. Over the years his work has earned several awards – including an Annie Award for “Outstanding Achievement for Animated Effects in an Animated Production” in 2023. Since graduating from the University of Applied Sciences Wiesbaden in 1999, he has worked on the three “Lord of the Rings” films, “John Carter”, “The Hobbit”, “Avatar 2”, “King Kong” and a handful of Marvel films. As well as the last two “Planet of the Apes” films, of course.

DP: Hello Alex! When did you join the team for “Kingdom of the Planet of the Apes”?
Alexander Nowotny: I joined the Planet of the Apes team in October 2023, when the production was slowly moving from the development phase into full production mode.
DP: How big was your team for this film?
Alexander Nowotny: Due to the size of the project (around 1,500 VFX shots), we split the FX work between three FX supervisors. My team was mainly responsible for three major sequences of the film and consisted of about 25 FX technical directors at peak times. At Weta, an FX supervisor usually handles the FX tasks for one VFX supervisor. The TDs can move back and forth between teams depending on current needs, but the FX supervisors generally stay with one VFX supervisor, allowing them to build a close, creative working relationship over the course of a long project.

DP: You’d think that with the fourth film in the same universe, everything would run on its own by now. What did you do differently this time? Were you able to reuse tools, rigs or simulations/environments from “Avatar: The Way of Water” or the previous “Planet of the Apes” films?
Alexander Nowotny: Since completing work on the third film in the Apes series, the working methods and tools in the FX department have changed fundamentally. In 2017, we were still largely a Maya-based department; since then, we have moved all our FX workflows over to Houdini. The other fundamental change came from the development of Loki, Weta’s framework for simulation tasks, which is used for both effects simulation and creature FX. Loki was developed during the long production phase of “Avatar: The Way of Water” and offers the user a range of very stable solvers for almost all types of effects: fluid simulations, elastic materials, combustion and solid-body simulations. The approach for all these tools in Loki has always been to use physically correct simulation models. For the work on “Kingdom of the Planet of the Apes” we were in the fortunate position of being able to benefit from these established tools and build on their development. The “Planet of the Apes” films have always aspired to go a step further and aim for an even more convincing level of photorealism, which is also needed to integrate the artificial ape characters into a real-looking environment. We therefore had to tease an even higher level of detail and precision out of our simulations for fire, plants and especially water.
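Loki itself is proprietary and not publicly documented, so the following is only a minimal, generic sketch of what “very stable, physically correct” solvers means in practice: for a single damped spring (a toy stand-in for an elastic material), a backward-Euler step stays well behaved at a time step where a plain explicit step blows up. All constants and function names here are illustrative and have nothing to do with Weta’s code.

```python
# Toy 1D damped spring: m*x'' = -k*x - c*x'. Purely illustrative -- not Loki.
M, K, C = 1.0, 500.0, 1.0   # mass, stiffness, damping (arbitrary units)

def explicit_step(x, v, dt):
    a = (-K * x - C * v) / M
    return x + dt * v, v + dt * a

def implicit_step(x, v, dt):
    # Backward Euler, solved in closed form for this 1D system:
    # v1 = (v - dt*K*x/M) / (1 + dt*C/M + dt^2*K/M),  x1 = x + dt*v1
    v1 = (v - dt * K * x / M) / (1.0 + dt * C / M + dt * dt * K / M)
    return x + dt * v1, v1

for name, step in (("explicit", explicit_step), ("implicit", implicit_step)):
    x, v = 1.0, 0.0
    for _ in range(200):
        x, v = step(x, v, dt=0.1)   # deliberately large time step
    print(name, x)                  # explicit blows up, implicit decays towards 0
```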
DP: As far as I can see, there’s a lot of environmental storytelling this time around, with overgrown and run-down locations – and no item that hasn’t passed through at least 50 hands. Given the time jump in the storyline, how did you create the “very, very, very used look”?
Alexander Nowotny: For most of the projects I’m involved in at Weta, my main interest is working on artificial environments. I was therefore fascinated by how much attention was paid to the treatment of the “lost world” to make it look as if centuries had passed, in which buildings and streets could fall into disrepair and be overgrown by plants. Weta has built up a rich collection of plant assets over the years, and our modelling tools also allow us to simulate artificial plant growth based on physical parameters.
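Weta’s vegetation and growth tools are in-house, so as nothing more than a rough illustration of the general idea of rule-based plant growth, here is a tiny L-system sketch in Python; the axiom, rules and cycle count are entirely made up and only show how repeated growth cycles can produce the dense, overgrown structures described above.

```python
# Minimal L-system: each pass over the string simulates one growth cycle.
# Purely illustrative -- not Weta's vegetation tools.
RULES = {
    "A": "A[B]A",   # a trunk segment extends and sprouts a side branch
    "B": "BB",      # existing branches lengthen
}

def grow(axiom, cycles):
    state = axiom
    for _ in range(cycles):
        state = "".join(RULES.get(symbol, symbol) for symbol in state)
    return state

overgrown = grow("A", cycles=6)
print(len(overgrown), "symbols,", overgrown.count("["), "branch points")
```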

DP: Moving on to the actual production: Can you run us through the Weta FX standard pipeline for a moment? What are your standard tools for FX, and what extensions, renderers, simulation tools, databases and asset managers are involved?
Alexander Nowotny: In our FX team, Loki has now established itself as the standard tool for all our simulation tasks. The integration into Houdini works very well, and the many templates and HDAs that we have developed make it very easy for the artists to use these tools in the same way as all the other components in Houdini. In some situations we also use Loki in combination with standard Houdini solvers like Bullet or Vellum. An important feature of Loki is the ability to link several different solvers together and run coupled simulations. It is therefore possible, for example, to combine an RBD simulation with a fluid solver, so that when an asset is destroyed its fragments drop into the water and subsequently float on the surface. For the sake of simplicity, our TDs often do their test renderings in Mantra, but in most cases the final rendering of the effects is done using Weta’s own renderer, Manuka.
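Loki’s coupled solvers are proprietary, so the following is only a toy sketch of the idea behind the example of fragments floating on water: rigid debris pieces feel gravity, Archimedes buoyancy from a (here drastically simplified, flat) water surface and some drag, so they fall in, splash and settle bobbing at the surface. Every class, constant and parameter below is made up for illustration and is not part of any Weta or Houdini API.

```python
import random

GRAVITY = -9.81          # m/s^2, downward
WATER_LEVEL = 0.0        # a flat stand-in for the fluid sim's surface
WATER_DENSITY = 1000.0   # kg/m^3

class Fragment:
    """A debris piece, treated as a small cube for buoyancy purposes."""
    def __init__(self, height, size, density):
        self.y = height          # centre height (m) relative to the water level
        self.vy = 0.0            # vertical velocity (m/s)
        self.size = size         # cube edge length (m)
        self.density = density   # kg/m^3 (less than water, so it should float)

    def submerged_fraction(self):
        bottom = self.y - self.size / 2.0
        depth = min(max(WATER_LEVEL - bottom, 0.0), self.size)
        return depth / self.size

    def step(self, dt):
        volume = self.size ** 3
        mass = self.density * volume
        frac = self.submerged_fraction()
        # gravity + Archimedes buoyancy on the submerged part + crude drag
        buoyancy = -GRAVITY * WATER_DENSITY * volume * frac
        drag = -5.0 * frac * mass * self.vy
        self.vy += (GRAVITY + (buoyancy + drag) / mass) * dt
        self.y += self.vy * dt

# Drop a handful of fragments from above the water and let them settle.
fragments = [Fragment(height=3.0 + random.random(), size=0.3, density=400.0)
             for _ in range(5)]
for _ in range(2000):            # 20 seconds at 100 substeps per second
    for f in fragments:
        f.step(0.01)
print([round(f.y, 2) for f in fragments])   # pieces end up bobbing near y ~ 0.03
```

In a real coupled setup the water would of course be a full fluid simulation that also reacts to the fragments; the fixed water level here is only the crudest possible one-way stand-in.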
DP: And did you adapt or develop anything specifically for this film? If not (or if you’re not allowed to talk about it), what milestones in the current setup define the way you work – a new version of Odin, Manuka, Facets and so on?
Alexander Nowotny: There were a few cases in this project for which we had to rethink our usual approaches or expand them with new methods. In this respect, the river sequence was particularly challenging. To create a continuous stream of water with large waves and very specific formations, we had to build our simulation in several stages of low, medium and high resolution and then combine and blend them depending on the camera angle of the respective shot. For a scene involving a chase through a field of tall pasture grass, we had to significantly optimise our Loki plant solver so that it could handle the large number of plants in those shots while still giving us access to Loki’s high-quality collision handling.
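The interview doesn’t spell out how those low-, medium- and high-resolution stages were weighted per shot, so the snippet below is only a hypothetical, much-simplified version of a camera-dependent blend: cached levels are mixed by distance to camera, with the band edges and linear ramps being pure assumptions on my part.

```python
def lod_weights(distance, near=15.0, far=60.0):
    """Blend weights (high, mid, low) for three cached resolutions of a sim,
    based on distance from the camera: close-ups lean on the high-res stage,
    distant water on the cheap low-res pass, with a mid band in between.
    Band edges and ramps are illustrative guesses, not production values."""
    t = min(max((distance - near) / (far - near), 0.0), 1.0)
    high = max(1.0 - 2.0 * t, 0.0)
    low = max(2.0 * t - 1.0, 0.0)
    mid = 1.0 - high - low
    return high, mid, low

# Example: a point 26.25 m from camera gets an even high/mid mix.
print(lod_weights(26.25))   # (0.5, 0.5, 0.0)
```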

DP: What creative tasks did you have on this film, and what was particularly interesting? Were there any shots and elements that were challenging?
Alexander Nowotny: The river sequence was certainly one of the biggest challenges in this project, both technically and creatively. This river had to tell its own story and be developed accordingly. It had to look convincingly real and give the impression of a dangerous environment. We spent a lot of time reviewing reference footage of rushing rivers and looking for templates for the specific formations we needed. The TDs in my team began by simulating individual dramatic-looking waves and rapids and then developed techniques to plug these building blocks into our large river simulation and blend the transitions cleanly. A simplified geometry version of this “riverscape” was then sent to our animation team, giving them a 3D reference for the terrain in which to animate the characters. Once the animation was approved, we were able to run the actual hero simulation in which the characters interact with the water. The result of this hero sim was then passed on to the creature department, who used it to determine the behaviour of the characters’ long hair in the water flow. In a final step, our FX TDs ran a very high-resolution simulation in which we calculated the behaviour of the water and small droplets over the fur. Because of this long chain of dependencies, the whole process was very time-consuming and each new iteration took a long time. But the result was worth all the effort: the shots look great and are rich in detail. Another side effect of this method was that we accumulated a huge amount of data. During the production phase, the head of our data management team was always happy to point out to me that my team used more than a petabyte of disk space for this river sequence alone. That’s roughly equivalent to what was needed for the entire production of the first Avatar film.
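To make that chain of dependencies a little more tangible, here is a hypothetical sketch of the stages as a small dependency graph (the stage names are mine, not Weta’s): any change upstream, such as a new river layout or revised animation, invalidates every stage after it, which is why each iteration was so expensive.

```python
from graphlib import TopologicalSorter   # standard library, Python 3.9+

# Hypothetical names for the stages described above -- not Weta's pipeline terms.
stages = {
    "riverscape_sim": set(),
    "simplified_river_geo": {"riverscape_sim"},
    "character_animation": {"simplified_river_geo"},
    "hero_water_sim": {"character_animation", "riverscape_sim"},
    "creature_hair_sim": {"hero_water_sim"},
    "droplets_over_fur_sim": {"creature_hair_sim", "hero_water_sim"},
}

# Any valid execution order has to respect every edge, so touching
# "riverscape_sim" forces all downstream stages to be rerun.
print(list(TopologicalSorter(stages).static_order()))
```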

DP: What does an ideal shot look like? Imagine you can tell all the other trades and departments how you want it prepared – what’s on your wish list?
Alexander Nowotny: Of course, with every project you always hope for the ideal situation in which all processes mesh smoothly. The FX department receives assets and data from almost all the other teams at Weta. We process animations, creatures, models, layouts and camera information. We work closely with shader developers, lighting TDs and compositing artists to achieve the desired look of a shot. In a perfect world, all these departments would always work together in perfect harmony and timing. Every model would be optimised and tailored for simulation purposes, and all data formats would flow smoothly through a pipeline that everyone could easily understand. But making a film is a highly creative process that requires a lot of people to work together. With every new film we try to push the boundaries of what is technically possible; decisions are sometimes made on the spur of the moment, and new creative ideas only emerge as the project progresses. This process can therefore never be completely perfect, and perhaps that’s what makes this profession so unique and interesting.
DP: Unfortunately, this probably happens rather rarely – do you have any tips on how to organise collaboration between the departments?
Alexander Nowotny: It may sound like a truism, but in my experience, the most important factor is communication. If you want hundreds of people from different departments to work well together, you have to constantly make sure that all the information is channelled to the right places so that the ship as a whole sails in the right direction. I always endeavour to stay in contact with the leads and supervisors of the individual teams and to manage this flow of information. Since I’ve been working at Weta for quite a while and know the company structures and many of the super-talented artists there, I sometimes find it a little easier to set things in motion or ask for a favour.

DP: If you could start the film again, what would you do differently? Is there anything where you say to yourself “we stumbled”, or something you will definitely do differently on the next film (when it comes)?
Alexander Nowotny: In retrospect, of course, you’re always smarter and see things that could have gone better or been more optimised. But I don’t think I would do things very differently. On the whole, our FX strategies worked out quite well and the team worked extremely well together. Some of the minor issues we had to iron out along the way even resulted in new creative approaches, and these happy accidents sometimes outweigh the extra effort.
DP: What was your personal favourite shot?
Alexander Nowotny: I don’t want to give too much away about the content of the film because it’s a pivotal moment for one of the most important characters in the story. But the shots of Raka in the river turned out really great. My team did a fantastic job.
DP: Can you reveal what you’re working on next? Or is there something you’re currently putting into the toolbox with your developers/R&D people?
Alexander Nowotny: On the technical development side, we’re entering a new phase with some of our Loki simulation tools and I’m excited to be involved in that work. Unfortunately, I’m not allowed to talk about the next film project I’m working on yet, but there will be more on that soon.
