Texturing Hero Assets on the Movie Pacific Rim: Uprising with DNEG
Hi, thanks for having me. My name is Marc Austin, I am a Build Supervisor at DNEG Vancouver. I’ve been with DNEG for 7 years now, starting as a CG Generalist and working my way up to Build Supervisor. Most recently I’ve been working on Alita: Battle Angel and Pacific Rim: Uprising.
DNEG is one of the world’s leading visual effects, animation and stereo conversion companies for feature film and television, with studios in London, Vancouver, Mumbai, Los Angeles, Chennai, Montréal, Chandigarh, Hyderabad, and Goa.
We’re currently working on projects like First Man, Venom, Fantastic Beasts: The Crimes of Grindelwald, Avengers 4, and Wonder Woman 1984.
Working in VFX for Feature Films
We need to produce photo-real images with decreasing timescales and increasing complexity.
No two shows have the same asset requirements; we create anything from a giant robot destroying a city, to a magical forest, to a microscopic car chase. This is what makes the job fun and interesting, but also very hard to pipeline. We need our tools to keep pace and the main requirements of the toolsets we use are quality, speed, and flexibility.
How we started using Substance at DNEG
I started to use Substance Designer on Assassin’s Creed (2016 film) to make the Animus. This was a great chance to test out the software in a production environment. The addition of Iray to Substance Designer meant I could start to texture and do look development of assets all within one application. This reduced the creative cycle, and the need to bounce between different packages.
The knowledge I gained during this project was crucial; it gave me and DNEG the confidence to use it for far more complex assets on Pacific Rim: Uprising.
Our use of Substance in the Movie Pacific Rim: Uprising
We used Substance products to create all the Jaegers for Pacific Rim: Uprising. Work started on the film around October 2016. From the start, we knew this film was going to be a challenge to complete using traditional methods.
New shots were continuously being designed to sell the scale of the Jaegers. We didn’t want the quality of the assets to dictate how a shot would be realized, so we opted to make the Jaegers hold up to scrutiny from any camera angle, and within a few meters from the surface.
Mechanical geometry usually has a high surface area, which in turn demands high UV coverage. The number of textures needed to cover an asset was massive, up to 1000 UDIMs per Jaeger.
In total we made 11 Jaegers, each using 4K texture resolution over their whole UDIM range. The hero Jaegers also have multiple weapon configurations and were progressively damaged throughout the film, adding to the complexity.
Whilst technically difficult to realize, our approach gave texture coverage for nearly all the shots in the film; only a handful of shots needed custom up-resing. This saved us lots of costly per-shot work, especially near the end of the show.
How Substance Integrated into our Workflow
We used Substance Designer to design and build our materials, with very little work needing to be done inside our normal look development software. The materials were designed to be reusable and tweakable so they would work across multiple assets.
Reusing these material presets across multiple Jaegers had the added benefit of making everything feel as if it all belonged in the same world.
A challenge in traditional texturing techniques is to maintain quality and artistic consistency over large assets. Using Substance Designer removed these worries, as every piece of a Jaeger would be sourced from the same material.
At the beginning of the project, it was unknown if Substance Designer alone would be enough. We planned for a hand-painting finishing stage to get assets completed and signed off.
During development, the material quality was pushed much further than we thought possible. We were delighted to find that the quality of the Jaegers straight from Substance was so good that little to no touch-ups were needed to finalize the assets. This saved us lots of time, allowing our artists to focus on other assets and briefs.
We used Clarisse to bake out data maps like AO, curvature etc., so we could leverage our render farm and version tracking software. The data maps closely matched textures generated by Substance products in case we wanted to bake with Substance Painter or Substance Designer.
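To make the idea of a baked curvature map concrete, here is a minimal sketch, with the caveat that production bakes (in Clarisse or Substance) work from mesh geometry, not a 2D grid. This toy version approximates curvature from a height field with a discrete Laplacian: positive values mark cavities, negative values mark exposed edges, which is exactly the information wear and dirt masks key off.

```python
def curvature_map(height):
    """Approximate curvature from a 2D height field via a discrete
    Laplacian. Positive = concave (cavities), negative = convex (edges).
    Illustrative only -- not DNEG's baker, which works on real geometry."""
    h, w = len(height), len(height[0])
    out = [[0.0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            c = height[y][x]
            # Clamp neighbours at the borders.
            up = height[max(y - 1, 0)][x]
            dn = height[min(y + 1, h - 1)][x]
            lf = height[y][max(x - 1, 0)]
            rt = height[y][min(x + 1, w - 1)]
            out[y][x] = (up + dn + lf + rt) - 4.0 * c
    return out
```

A single raised bump produces a strongly negative (convex) value at its peak and positive (concave) values around it, which a material graph can then threshold into edge-wear or dirt masks.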
The Substance materials were built upon the same common set of data maps and hand-painted maps, allowing us to design a standard set of interchangeable features. Simple hand-painted maps were made in Mari (Substance Painter didn’t support UDIMs at the time) and included masks for decals, panel lines, and substructures. We altered these maps in Substance Designer to create features like decal wear and panel warping.
The Jaegers' surface area was huge; coupled with the need to allow any framing a shot required, we ended up with 1000+ UDIMs at 4K. Dealing with this amount of data was the main reason to use Substance Designer.
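For readers unfamiliar with UDIMs: it is the standard Mari-style convention for addressing many texture tiles in one UV layout. A small sketch of the numbering (this is the general convention, not DNEG-specific code):

```python
def udim_tile(u, v):
    """Return the UDIM tile number for a UV coordinate.

    Convention: tile 1001 covers UV (0-1, 0-1); tiles increase by 1
    along U (10 tiles per row) and by 10 along V.
    """
    u_tile = int(u)  # assumes 0 <= u < 10
    v_tile = int(v)
    return 1001 + u_tile + 10 * v_tile
```

At 1000 tiles, a Jaeger spans 100 full rows of the UDIM grid (tiles 1001 through 2000), each tile carrying its own 4K texture set.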
Our Material Creation Workflow
I started working on the toolset near the end of 2016. At that time, the Substance tools were not as developed as they are today.
Early on in the project, I settled on a strategy of developing four classes of tools:
- Navigation
- Core tools
- Features
- Materials
To help standardize our approach and conventions I designed the base template, where we could build upon each material. This defined where inputs would be located, and how the data would be exported to be compatible with our renderer.
Since the graphs could easily get messy and unintelligible, tools were developed to help manage node flow and structure data throughout our materials.
A default starting template helps us to maintain a common set of inputs and outputs. Here you can also see the range of tools we made for the project.
The core tools were the missing building blocks upon which many higher-level tools were built, similar to the filters found in Substance Designer’s library. They were designed to be as generic as possible, allowing them to be used in a wide variety of applications.
Speed was important for these tools, so they were assembled from a mixture of atomic nodes, pixel processors and FX map nodes. These included simple tools like exposure, vibrancy and histogram control tools, as well as more complex tools like scatterers, 2D vector generation and trails.
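As a flavour of what such atomic, pixel-level core tools do, here is a minimal sketch of two of them, written as per-pixel Python functions in the spirit of a Substance pixel processor. The function names and ranges are illustrative assumptions, not DNEG's actual implementations:

```python
def exposure(value, stops):
    """Exposure adjustment: scale a [0,1] pixel value by 2**stops,
    then clamp. A photographic-style control, like an exposure node."""
    return min(max(value * (2.0 ** stops), 0.0), 1.0)

def histogram_range(value, lo, hi):
    """Remap a [0,1] input into the [lo, hi] range, like a
    histogram-range filter used to compress a mask's contrast."""
    return lo + value * (hi - lo)
```

In Substance Designer these would run on the GPU over every pixel of an input; wrapping them as reusable graph nodes is what lets higher-level feature tools stay simple.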
The feature tools were designed for certain elements like water drips, edge chips or decal wear. They were assembled from default Substance nodes and our core tools. The UI design of these tools was tailored to quickly dial in similar but varied patterns so the Jaegers would feel as if they were all part of the same world.
We designed a set of vector control tools and a new trails tool. With these two core tools, we were able to make realistic drips and erosion which flowed over the Jaeger’s body.
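To illustrate the kind of logic a trails tool encodes, here is a toy stand-in: a drip traced down a height field, always stepping to the lowest of the three pixels in the row below. This is a simplified assumption of how gravity-following trails can be generated; the production tool also handled fading, jitter, and arbitrary direction fields.

```python
def drip_trail(height, seed_x):
    """Trace a single drip down a 2D height field, starting at column
    seed_x on the top row. Returns a binary trail mask. Toy example --
    not DNEG's trails tool."""
    h, w = len(height), len(height[0])
    mask = [[0] * w for _ in range(h)]
    x = seed_x
    for y in range(h):
        mask[y][x] = 1
        if y + 1 < h:
            # Step to the lowest reachable neighbour in the next row.
            cands = [nx for nx in (x - 1, x, x + 1) if 0 <= nx < w]
            x = min(cands, key=lambda nx: height[y + 1][nx])
    return mask
```

Scattering many seeds over panel edges and compositing the resulting masks is how a drips feature can stay procedural yet look hand-placed.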
The material graphs are where most of our surfacing artists worked, using the default template as a base. Feature tools were used in conjunction with Substance default tools to get the look we needed for each Jaeger.
It was important that we re-use as many cool-looking elements as possible between artists. For example, if an artist made a very good-looking oil pattern for their material, we would try to package up this sub-portion of their graph as a feature tool. By tagging this new feature correctly, the Substance library would be auto-updated, allowing the whole surfacing crew to use this new tool and keep it up to date.
This structured material creation process helped us strategize which tools needed developing and which didn’t, and allowed us to maximize the little time we had.
Here is a production graph for a paint material. Inputs were altered and fed into many different features. These feature masks were used to alter the main material’s properties and features. Whilst complex, this graph was still readable, allowing new artists to quickly use and update materials.
How we Developed Automapic, our Batch Processing Tool
The Substance Automation Toolkit had not yet been released during the development of Pacific Rim: Uprising. We did, however, develop a similar tool more suited to our needs called Automapic.
This tool was our primary means of rendering out our Substance materials. From Clarisse, we stored a description of which materials were assigned to which UDIMs. By loading this configuration file, the baked UDIM inputs, and our Substance materials, the tool could batch process over all the materials for an asset, delivering a tidy set of material textures ready for our production renderer.
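Automapic itself is DNEG-internal, but the batch idea can be sketched with the since-released Substance Automation Toolkit, whose `sbsrender` CLI renders .sbsar materials headlessly. Everything below is an assumption for illustration: the config format, file paths, and material names are invented, and the `sbsrender` flags should be checked against your toolkit version. The sketch only builds the command lines; it does not execute them.

```python
# Hypothetical material-to-UDIM assignment, as might be exported from
# the lighting package. Format and names are invented for this sketch.
assignments = {
    "1001": "paint_worn.sbsar",
    "1002": "paint_worn.sbsar",
    "1003": "metal_bare.sbsar",
}

def build_commands(assignments, bake_dir="bakes", out_dir="textures"):
    """Build one sbsrender invocation per UDIM, feeding that tile's
    baked data maps into its assigned material."""
    cmds = []
    for udim, sbsar in sorted(assignments.items()):
        cmds.append([
            "sbsrender", "render",
            "--input", sbsar,
            # Wire this tile's baked AO into the material's 'ao' input.
            "--set-entry", f"ao@{bake_dir}/ao.{udim}.exr",
            "--output-path", out_dir,
            "--output-name", f"{{outputNodeName}}.{udim}",
        ])
    return cmds
```

Each command can then be dispatched to a farm job, which is how a 1000-UDIM asset becomes an embarrassingly parallel render rather than an interactive session.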
A custom tool called Automapic was developed to automate the material output. This tool used info from the asset and the Substance materials to generate UDIM texture sets for all materials.
Substance for our Future Projects
With new shows like Avengers 4 and Wonder Woman 1984, we will be using Substance products more and more extensively.
It’s a case of choosing the best tool for the task. Substance Painter excels at quick, responsive surfacing and has gained wide adoption at DNEG. We also continue to use the Pacific Rim: Uprising workflow where we have very large assets, or where we can iterate in a batch-processing way.