Originally launched as a comic series in 2003, the zombie world of “The Walking Dead” can now be consumed in numerous forms: as a series, game, animated film, book or board game, for example. At the end of 2018, a new “The Walking Dead” game was released, developed by studio Overkill Software and published by Starbreeze and 505 Games.
Overkill’s The Walking Dead is a first-person shooter with co-operative gameplay. Each of the four characters (Maya, Aidan, Grant and Heather) has special abilities that they must use together to achieve their goals. The cinematics created by Goodbye Kansas, with their extremely high photorealistic quality, raised great expectations among fans of “The Walking Dead” games; unfortunately, the game itself did not live up to them. The version released for Microsoft Windows received mixed reviews and sent publisher Starbreeze, which had taken over Swedish developer Overkill Software in 2012, into a financial crash landing. The game community criticised the weak gameplay and technical problems in particular. There is currently still confusion surrounding the console version: while Sony announced in January that the game would be cancelled for consoles, publisher 505 Games announced that the launch had merely been postponed, albeit indefinitely.

Goodbye Kansas Studios’ outstanding achievement for the release trailers is not affected by this. “Grant” won various awards: In addition to the animago, it received silver at the Epica Awards and bronze at Eurobest. “This was our first animago, but hopefully not our last,” says Director Fredrik Löfberg. “We are extremely proud of the award, because the competition was strong and the bar was set very high.”
Gradual increase in efficiency
Publisher Starbreeze had already realised other full-CG trailer projects with Goodbye Kansas, so discussions within the team about the “Walking Dead” cinematics also began at an early stage. Right at the beginning of the process, the decision was made to realise four trailers, each providing an insight into one of the playable characters. Starbreeze did not want to reveal too much about the game world in these trailers, instead focussing on specific moments that would help define the characters. “The client initially provided us with scripts from which we took the stories and then refined them. Starbreeze had confidence in our storytelling skills, creativity and technical solutions and gave us a lot of creative freedom,” says Director Fredrik about the initial phase. As with other projects, the team first felt their way forward with a storyboard and produced board-o-matics. This process ran in parallel with a very rough temporary sound design and a music score that supported everyone involved in the realisation of the final film.

In the subsequent pre-visualisation process, the team translated these 2D boards into 3D. The in-house motion capture studio was a great advantage here, as it meant Goodbye Kansas could simply bring in its stunt team and play through each film once in advance. “Each film had its own day of shooting, so we were able to find the beat and timing of the overall performance and action. We filmed scenes with different body language and action in order to work as thoroughly as possible in advance and eliminate questions,” explains Fredrik. The camera layout from the storyboards and the previs edit were a great help on set when setting up the cameras and finding the best angles.
In-house performance capture
After Starbreeze had approved the previs of the four trailers, Goodbye Kansas started shooting in the in-house performance capture studio. A full-time technical team on set for all projects ensures that the team is working with the latest technology, pipeline and workflows at all times. The MoCap stage has existed for ten years, so the team is well-rehearsed. Anton Söderhäll, Executive Producer in the Capture Division, knows how the stage is equipped: “It has a passive optical system with retroreflective body markers. 65 high-end motion capture cameras track the markers and translate them into digital 3D space.” Using a replica skeleton, the team transfers the movements to the target rig. The facial expressions are simultaneously captured by head-mounted cameras. The videos are then tracked and translated to the movement of the facial rigs, which were created by scanning numerous Facial Action Coding System (FACS) expressions of the actors. “Our rigs are designed to be very user-friendly and optimised for our workflow. Everything we record in the studio – reference videos, body data, audio – is synchronised with the master timecode clock. We are very interested in capturing the best possible data quality. This ensures that no nuances are lost when digitising the movement,” explains Anton.
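The timecode synchronisation Anton describes can be illustrated with a small, generic Python sketch (not the studio’s actual pipeline code): every recording of a take carries the same master clock, so converting the SMPTE timecodes to frame counts yields the offsets needed to line the sources up. The device names, frame rate and timecode values below are invented for the example.

```python
# Generic illustration of timecode-based alignment; all values are placeholders.

def timecode_to_frames(tc: str, fps: int = 24) -> int:
    """Convert a non-drop-frame SMPTE timecode 'HH:MM:SS:FF' to a frame count."""
    hours, minutes, seconds, frames = (int(part) for part in tc.split(":"))
    return ((hours * 60 + minutes) * 60 + seconds) * fps + frames

# One capture take: every device stamps its recording with the same master clock.
take = {
    "body_data":  "10:14:03:12",   # optical marker solve
    "face_video": "10:14:03:12",   # head-mounted camera
    "audio":      "10:14:03:10",   # dialogue recorder
    "reference":  "10:14:03:15",   # witness camera on set
}

# Align everything to the earliest start so offsets can be applied downstream.
start = min(timecode_to_frames(tc) for tc in take.values())
offsets = {name: timecode_to_frames(tc) - start for name, tc in take.items()}
print(offsets)   # {'body_data': 2, 'face_video': 2, 'audio': 0, 'reference': 5}
```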

A lot of data comes together during filming, but space can be saved on the reference videos, which can be highly compressed for the post-production process. “The animation data is also not as large as you might think, and we have never had a problem with hard drives filling up during a day of filming,” says Anton. After filming, the team saved everything, including a careful backup, to the main server so that all departments could access all the data.
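As a rough illustration of this kind of data handling (an assumed workflow, not Goodbye Kansas’ actual scripts), a backup step like the one described could look something like the following in Python, with checksums to verify that every department reads an intact copy; the paths are placeholders.

```python
# Hypothetical sketch: copy a day's capture data to a main server with checksums.
import hashlib
import shutil
from pathlib import Path

def sha256(path: Path) -> str:
    digest = hashlib.sha256()
    with path.open("rb") as fh:
        for chunk in iter(lambda: fh.read(1 << 20), b""):
            digest.update(chunk)
    return digest.hexdigest()

def backup_take(source_dir: Path, server_dir: Path) -> None:
    """Copy every file of a take and verify each copy against its checksum."""
    for src in source_dir.rglob("*"):
        if not src.is_file():
            continue
        dst = server_dir / src.relative_to(source_dir)
        dst.parent.mkdir(parents=True, exist_ok=True)
        shutil.copy2(src, dst)
        assert sha256(src) == sha256(dst), f"copy mismatch: {src}"

# Example call with placeholder paths:
# backup_take(Path("/capture/day_03"), Path("/server/projects/twd/capture/day_03"))
```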
A balanced character
The actors for the four game characters were cast by the team in London. There was one day of rehearsals and one day of filming for each trailer. “Because we knew what worked and what didn’t thanks to the previs shooting days, we were able to focus entirely on the actors’ performances and immerse ourselves in the characters,” recalls Fredrik. William Hope, known as Lieutenant Scott Gorman from the film “Aliens”, plays grandfather Grant. He seems to have experienced so much in his long life that even a zombie apocalypse can only shock him to a limited extent. With a determined face, he trudges through the gloomy, foggy landscape and finds shelter in an abandoned car. His only company is a zombie in the passenger seat who can’t bite him because he has no lower jaw. So Grant tells him about his past while indulging in the alcoholic leftovers he has scavenged. Meanwhile, the zombie hordes gather around the car.
As it is a speaking role, the director wanted to cast Grant with an actor whom Starbreeze would also use for the voice in the game:
“During rehearsal, we asked William about his vision for the character. We talked about who Grant was before the apocalypse and how it changed him. The aim was to find the right balance of tension, humour and kindness for the grandfather. We created lighter, contrasting moments within the dark situation of him sitting next to a zombie.” A balancing act in the design was to make Grant seem tough enough to cope with the approaching zombie horde, while at the same time making it believable to the audience that he might take his own life. “We deliberately made the ending ambiguous. But Walking Dead fans know that zombies are blind and react to sound, so Grant can fall asleep there during the storm without it being too dangerous for him,” Fredrik reveals.
Game models as a basis
Grant’s asset was provided to Goodbye Kansas by the client. The team used the high-resolution ZBrush model and retopologised his clothing and equipment for an accurate cloth simulation. “It was important to the client that the look of Grant’s face was preserved. Therefore, we transferred the basic face shape of the client’s model into our standard face topology and created an additional pass with skin-surface details. With the help of our in-house tools, we were able to adapt the acting performance to Grant’s CG face,” says Daniel Bystedt, Lead Character Artist.
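The face-shape transfer Daniel describes was done with in-house tools; as a hedged, generic illustration of the underlying idea, a shrinkwrap projection in Blender can wrap a standard face topology onto the surface of a source model. The object names below are assumptions for the example, not the production assets.

```python
# Generic Blender sketch (not the in-house tools mentioned above):
# project a standard face topology onto the client's face shape.
import bpy

game_face = bpy.data.objects["grant_game_head"]       # assumed: client's face shape
standard_topo = bpy.data.objects["studio_face_topo"]  # assumed: standard face topology

# Wrap the standard topology onto the surface of the source model.
mod = standard_topo.modifiers.new(name="wrap_to_game_face", type='SHRINKWRAP')
mod.target = game_face
mod.wrap_method = 'NEAREST_SURFACEPOINT'

# Bake the result so the mesh itself now carries Grant's facial proportions.
bpy.context.view_layer.objects.active = standard_topo
bpy.ops.object.modifier_apply(modifier=mod.name)
```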

Grant’s head and facial hair were created using the Yeti plug-in. The artists shaded his skin with V-Ray alSurface skin shaders; the higher-resolution texture detail was built on top of the game character’s textures in Substance Painter. The zombies were also based on high-resolution ZBrush files from Starbreeze. The artists retopologised the geometry to give it more definition and a topology better suited to simulation. “We used Substance Painter to add some extra detail to the zombies. The hair grooming was done in Blender, then we created a procedural hair system based on the groom in Yeti,” says Daniel. The team also built some “killed zombies” with holes in their heads from which brains spill out. This effect required additional sculpting and texturing work. Each zombie body, including clothing, was posed, with the artists grooming the hair to suit the individual pose. “We created the zombie with the missing jaw from scratch; it’s a one-off. We used Blender for the previs simulation of the torn flesh. This gave us a feel for how much flesh had to be attached to the jaw without it being too distracting.”
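Since the grooms started in Blender, a minimal sketch of the kind of hair setup an artist would then comb and refine might look like the following; the mesh name and all values are assumptions, and the production grooms and the Yeti transfer are of course artist work beyond this.

```python
# Assumed starting point for a Blender groom; names and values are placeholders.
import bpy

zombie = bpy.data.objects["zombie_head"]   # placeholder mesh name
bpy.context.view_layer.objects.active = zombie

# Add a hair particle system that an artist can then comb in particle edit mode.
bpy.ops.object.particle_system_add()
settings = zombie.particle_systems[-1].settings
settings.type = 'HAIR'
settings.count = 2000          # guide hair count
settings.hair_length = 0.08    # metres
settings.child_type = 'INTERPOLATED'
settings.child_nbr = 20        # children per guide in the viewport
```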
Case-dependent animation work
For the animation of the CG characters, the animation artists tried to preserve as much of the performance capture as possible, because the acting of the talented performers is the starting point for the photorealistic facial expressions in the final trailer. “If this foundation is missing, it doesn’t matter how great the facial rig is. What matters is a believable source performance. We also always built the rigs with the performer in mind. To preserve as much of the acting as possible, we scanned the performers and tried to integrate as many of their features as possible into the digital character. So the characters still look like Maya or Grant from the game, but when they smile or cry, we see the actors’ emotions – modelled as closely as possible to the source material,” explains animation director Jonas Ekman.

But even though the actors’ performance is the most important source for digitally recreating the many components of such a trailer, the decision in favour of keyframe animation remains case-dependent. The team looked at the storyboards as well as the performance and considered in which cases it made sense to invest time in keyframing, motion editing or a combination of the two. In the Grant trailer, performance capture was used for the grandfather’s face. The zombie next to him barely has a face, so keyframing with a simple face rig made more sense. Another example of keyframe animation is the teddy bear that Grant picks up in one scene: “It would have been too much effort to build a real prop with limbs for the shoot, which is why we used keyframe animation instead,” says Jonas.
Environment for the right mood
“During the previs phase, we went through many iterations until we found out where the trailer should be set. We then decided how much rain and fog was the right amount and what the lighting should look like. To do this, we designed some key concepts that came close to the look we wanted,” says Fredrik. Grant’s backstory, a long journey in search of his granddaughter that takes him to Washington DC, was familiar to the team when choosing the location. Google Maps’ 3D view helped the artists a lot in finding a suitable motorway location around DC. “I wanted to place Grant under an overpass, surrounded by rain. Our environment was based on a real location north of DC, which we adapted according to our needs and vision,” says the director.
One of the biggest challenges in terms of the environment was to create a seemingly empty motorway that was still interesting and engaging. In the final scene of the first episode of the “Walking Dead” series, Rick Grimes rides towards Atlanta: the motorway is empty on his side, while the opposite lanes are jammed with abandoned cars and the Atlanta skyline is visible in the distance. The director did not want to copy this well-known shot, but instead wanted to tell a subtle story with the two crashed cars. “In the car Grant is sitting in, the previous owner’s family left a lot of clues. Most people don’t consciously pay attention to these details, but they add much-needed life to a world that would otherwise feel empty and boring.”
For the look and lighting, the team looked at numerous scenes from films and TV series and analysed how their camera work differed. The team then discussed what time of day it should be, how overcast the sky should be and how much fog there should be. In the end, it came down to a natural look with beautiful lighting that emphasised certain important elements. It was important to keep the film toned down in a cinematic way. “As we move around the environment, we show it from many different angles, which was a challenge, because each of them had to look great while also holding the trailer together,” says Fredrik. Nothing was scanned for the environment; pre-production only included concept art.
The team created the rain using a variety of techniques, as VFX supervisor Henrik Eklundh explains: “Houdini was used to create the drops in the air, which then hit the ground. We created a kind of rain mist and real fog for the feeling of high humidity. For the raindrop movement in the puddles, we used an animated displacement map, which we merged with the shading and texturing of the tarmac and puddles. We also created moving wet maps on some surfaces. For the rain interaction with the car surface, we used special setups consisting of droplets sliding over the surface and some that stay in place. We put all these elements together, added some more for variety and blended them together perfectly.”
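The “animated displacement map” for the puddle ripples can be illustrated with a small, generic Python/NumPy sketch (not the production setup): expanding, fading rings are generated per frame and could then be picked up by a shader to displace the puddle surfaces.

```python
# Generic ripple-displacement sketch; resolution, counts and decay values are arbitrary.
import numpy as np

RES, N_DROPS, FPS = 512, 60, 24
rng = np.random.default_rng(7)
drops = rng.uniform(0.0, 1.0, size=(N_DROPS, 3))      # x, y, impact time (seconds)
xs, ys = np.meshgrid(np.linspace(0.0, 1.0, RES), np.linspace(0.0, 1.0, RES))

def ripple_map(t: float) -> np.ndarray:
    """Height map (RES x RES) of all ripples active at time t."""
    height = np.zeros((RES, RES))
    for x0, y0, t0 in drops:
        age = t - t0
        if age <= 0.0:
            continue                                     # drop has not landed yet
        r = np.hypot(xs - x0, ys - y0)
        ring = np.sin(60.0 * (r - 0.3 * age))            # expanding wave front
        falloff = np.exp(-8.0 * age) * np.exp(-40.0 * r) # fade over time and distance
        height += ring * falloff
    return height

# One second of maps; each frame would be written out for the shader to use.
frames = [ripple_map(frame / FPS) for frame in range(FPS)]
```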

The team achieved the foggy look of the scenery with a very cloudy lighting base in which the direction of the sun is only hinted at; most of the elements were illuminated this way. “Within the car, we added additional light sources to capture the characters and align the look with the direction given by our in-house art direction. Together with the muted colours in the colour correction, this resulted in the final look,” says Henrik.
The perfect engine for this project
According to Henrik, the V-Ray render engine was perfect for this project. “The hair in particular was a challenge, but with the new hair shader in V-Ray Next, I think we could have done even better. The water in the air was also difficult to sample with correct shading, so we had to play around with the shading to make it work. Additionally, we rendered some passes in Houdini’s Mantra, and simple tricks like doubling the size worked wonders.”
The motorway environment is based on a real location north of DC, which the team found via Google Maps
For an optimised render process, Goodbye Kansas kept the number of reflections and the reflection depth as low as possible without compromising the look. The team set up the render layers in such a way that not much ray tracing was necessary; in the comp, they then layered back everything that had a big impact on the look. The team also solved many other issues in compositing: “We always look at what can be moved from 3D to Nuke. We try to reproduce details without having to go back to the asset and therefore through the entire pipeline,” says Henrik. Distant CG elements were created as matte paintings, which were seamlessly blended with the 3D renders.
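The comp strategy Henrik describes, rebuilding the image from separately rendered passes so that expensive contributions such as reflections can be re-balanced without re-rendering, can be sketched in a few lines of Nuke Python; the file paths and pass names below are placeholders, not the actual production script.

```python
# Hedged sketch of pass reassembly in Nuke (placeholder paths and pass names).
import nuke

passes = [
    nuke.nodes.Read(file="renders/grant_diffuse.####.exr"),
    nuke.nodes.Read(file="renders/grant_specular.####.exr"),
    nuke.nodes.Read(file="renders/grant_reflection.####.exr"),
]

# Sum the light contributions back together; the reflection branch can be
# graded or gained here without touching the 3D scene.
beauty = passes[0]
for render_pass in passes[1:]:
    beauty = nuke.nodes.Merge2(operation="plus", inputs=[beauty, render_pass])

nuke.nodes.Write(file="comp/grant_beauty.####.exr", inputs=[beauty])
```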
Tough, but controlled
“For us, this was the first trailer that relied so heavily on performance capture. Our character and facial departments did a fantastic job creating this believable and endearing character in a very specific trailer setting. There is usually a lot of action in cinematics, but here the pace was slow, which made it a super interesting project. Actor William Hope’s performance ultimately perfected Grant,” summarises Henrik.
“Grant” was the second trailer to be produced, after “Aidan”, because the facial rigs and animation workflow first had to be in place. After that, “Maya” and “Heather”, the latter of which had the most individual shots of all the trailers at 46, could be produced all the faster and with even more convincing results. “It paid off that our pipeline and workflow were already working very efficiently at this point. The Heather project was a tough one, but a controlled one. Our outstanding character modellers and look developers pushed their craft on these films and delivered very high-quality results,” sums up producer Thomas Oger.


