The Mill: Creating Kia’s Super Bowl Ad with Substance

Claudia Vance on April 22 2017 | Stories, Film/VFX

Visual effects studio The Mill creates visual narratives for some of the biggest names in industry and media, including carmaker Kia, whose Super Bowl LI ad “Hero’s Journey” featuring actress Melissa McCarthy was crowned most popular by USA Today’s Super Bowl Ad Meter. The commercial was produced in collaboration with creative agency David&Goliath and directed by Matthijs Van Heijningen of MJZ. We had the privilege of interviewing The Mill’s Sharlene Lin, who used Substance in the asset creation for the spot.

Hi Sharlene, could you introduce yourself to the community?

My name is Sharlene Lin. I’m a VFX Artist at The Mill in Los Angeles.

What’s your background?

I grew up in China, where I studied at a top-five fine art school, and later moved to New Jersey. When I immigrated to the U.S., I went to college and graduated from the School of Visual Arts in NYC. It was there that I was able to intern and work with several amazing artists and studios before moving to California.

I began my career in visual effects and film production around 2009. I was able to contribute to several movies, including Snow White and the Huntsman and its sequel, The Huntsman: Winter’s War; The Avengers; Jack Reacher; Looper and The Twilight Saga: Breaking Dawn. At The Mill, I’ve worked on many commercials for brands including Hay Day, Game of War, Halo, Honda, and Kia’s 2017 Super Bowl spot, ‘Hero’s Journey’. I also worked on HELP, the VR short film for Google Spotlight Stories.

Tell us about your role at The Mill.

Working for a VFX company like The Mill means I get to work on a variety of projects from week to week. As an artist here, it’s important to be equipped with a diverse skill set. For example, one day I could be creating a photo-real car, the next day I could be building a robot for a science fiction piece, and another day I could be making cute creatures for a mobile game.

Why did you choose to use Substance Designer/Substance Painter for the Kia ad?

As VFX artists, we end up using different tools all the time to achieve the highest quality look. What makes Substance Painter uniquely attractive is the ability to see an accurate material representation of what I’m working on in real time – especially when calibrating roughness values, which used to involve much more back-and-forth tweaking. Some of my look-dev colleagues who focus on the final shader and render have said that this has made the process a lot easier for them. I started to really see the value of the Substance toolset – especially Substance Painter – through the Allegorithmic tutorials.

Our Modeling and Texturing Supervisor at The Mill, Felix E. Urquiza, was fortunately very open to looking into this new tool, which gave me the time to work with it properly. I started using Substance Painter and Substance Designer in my spare time, which also helped me provide more examples of how we could use them. Kia’s ‘Hero’s Journey’ was one of the main projects at The Mill to make heavy use of Substance Painter.

Can you walk us through the texturing pipeline you developed for the boat?

On the Kia project, which was shot mostly on green screen, our responsibility was to create believable environments and creatures that would hold up the realism of the shots. Once we had the previsualization, we started modeling all of our assets from the provided references. The boat model took one week, and the main texture took three days in Substance Painter.

Due to some changes, the entire boat asset took around two weeks. The important thing about Substance Painter is how quickly you can work: I had a rough material pass within one day, which went out for approval to the director, Matthijs Van Heijningen. We assigned base materials to all the geometry to establish the look. After Matthijs approved it, we worked on the details of the material, such as dirt, rust, grime, and water. After that, it was really just quick iterations; we ended up changing the body color a few times before landing on a final version. The iterations and process would have taken longer in other programs. The boat also spanned multiple texture UDIMs – around 40 of them.

After all the maps were exported using the Arnold UDIM preset, the asset went back to the look-dev artists, who connected the maps to the Arnold shaders.
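To illustrate that hand-off, here is a minimal sketch of how Metal/Rough maps exported with a UDIM token might be wired to an aiStandardSurface in Maya. The file paths, map names, and shader name are placeholders, not The Mill’s actual setup.

```python
# Minimal sketch: wiring a Substance Painter Arnold (UDIM) export into an
# aiStandardSurface in Maya. All paths and names below are hypothetical.
import maya.cmds as cmds

def make_file_node(path):
    """Create a file texture node set to Mari-style UDIM tiling."""
    node = cmds.shadingNode('file', asTexture=True)
    cmds.setAttr(node + '.fileTextureName', path, type='string')
    cmds.setAttr(node + '.uvTilingMode', 3)  # 3 = UDIM (Mari) tiling
    return node

shader = cmds.shadingNode('aiStandardSurface', asShader=True, name='boat_mtl')

base = make_file_node('textures/boat_BaseColor.<UDIM>.tx')
cmds.connectAttr(base + '.outColor', shader + '.baseColor')

rough = make_file_node('textures/boat_Roughness.<UDIM>.tx')
cmds.setAttr(rough + '.colorSpace', 'Raw', type='string')  # scalar map: no color transform
cmds.connectAttr(rough + '.outColorR', shader + '.specularRoughness')

metal = make_file_node('textures/boat_Metalness.<UDIM>.tx')
cmds.setAttr(metal + '.colorSpace', 'Raw', type='string')
cmds.connectAttr(metal + '.outColorR', shader + '.metalness')
```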

Next, the compositing artists took our scenes from the Arnold renderer and made them look nicer! They composited the live-action footage, the CG environment, and the particles.

Was this the first time you used the Substance toolset? If so, what were the challenges you faced and how did you overcome them?

I had used Substance Painter and Substance Designer on some other projects, but this asset was one of the more complicated ones. Using the toolset on the Kia project helped us develop our company workflow. The boat had 40 texture UDIMs and required high-quality, photorealistic texturing in a short period of time.

One of the main challenges for me was that the model kept changing as I textured, and the main scene became really large and heavy for our systems. I decided to split the scene into two sections to speed up the process. When it came time to present the finished texture of the ship, I could not render all of it together in Iray, so the solution was to connect all the texture maps into a look-dev scene and render a turntable to present the final look.

Given what you learned, and that updates have been released since then (making the tools more suited for VFX), what would you look forward to doing next with the Substance toolset?

I would like to use it more within our jobs, of course. We are working on building a workflow around Substance Painter when it comes to texturing, and I plan to help in that process. I would also like to use it more in my spare time and make some new projects.

There is obviously a long list of feature requests coming from the VFX world, and users can be reassured that we are working on them. What would be your top 3?

  • The ability to paint across multiple UDIMs, so we could develop a Substance Painter character workflow and use it on organic assets.
  • Support for non-square images as stencils – many of the reference photos we took were not square, and it takes time to edit all of them.
  • A master layer – a layer where I could add a material or effect across all UDIMs at once would be very helpful, especially when dealing with an object that is mostly made of one type of material.

Is there anything you would like to add?

I would love an auto-UV or UV-less workflow for Substance Painter, so we could use it like ZBrush with KeyShot for concepting, and maybe bake out later.

The Man in the High Castle: A TV Series with Substance

Alexandre Bagard on March 16 2017 | Substance Designer, Substance Painter, Stories, Film/VFX

We had the pleasure of interviewing Theory Studios and Barnstorm VFX, the studios behind the visual effects in the TV series “The Man in the High Castle”. The series, produced by Amazon Studios, takes place in a dystopian alternate history in which the Axis powers won World War II.

In the following interview, the two studios detail how they used Substance in their production pipeline by focusing on a particular asset.

 

 

[Video: The Man in the High Castle, Season 2 VFX]

Lawson Deming (co-founder Barnstorm VFX, Visual Effects Supervisor) – Lawson Deming was on set for the majority of the visual effects sequences and worked closely with the art department on “The Man in the High Castle” Season 2 to help fit the visual effects into the world of the show. He also contributed his modeling, texturing, and lighting experience to the 3D team and introduced the Substance Painter-to-Cycles workflow that was utilized over the course of the season.

Cory Jamieson (co-founder Barnstorm VFX, Visual Effects Producer) – Cory Jamieson oversaw the management of the visual effects team and coordinated the work between Barnstorm VFX and Theory Studios, as well as being involved in the supervision and execution of key effects sequences.

David Andrade (co-founder Theory Studios) – David Andrade managed the team of artists at Theory Studios, which built and shaded many of the CGI assets used in “The Man in the High Castle” and created custom tools to improve the workflow and output of artists. He was involved in everything from asset design and modeling to character animation to the rendering pipeline used on the show.

[Images: final scenes alongside work-in-progress versions]

On Season 2 of “The Man in the High Castle”, Barnstorm Visual Effects was tasked with supervising the creation of roughly 400 effects shots over the course of the season, nearly all of which were completed in-house.

For assistance in creating many of the large setpieces, Barnstorm turned to their partners at Theory Studios, who were responsible for modeling and texturing the multitude of structures in Germania, including the massive Volkshalle (both inside and out). Theory, which employs a distributed network of artists around the world, also designed custom tools to facilitate the secure sharing of assets and to execute cloud rendering on Amazon’s AWS for the large number of complex 3D shots that needed to be created in a short period of time.

Regardless of where they were initiated, cloud renders of CGI elements were automatically pushed to Barnstorm’s server, where compositors incorporated the material into the final shots. Throughout the process, Barnstorm and Theory artists collaborated via Shotgun and Theory’s proprietary MakeTheory software.

[Images: final scenes alongside work-in-progress versions]

Early on in the process, the team decided to utilize a PBR-based shader workflow and made Substance Painter the cornerstone of their texture design. Nearly every 3D object, from smaller elements like cars all the way up to the largest structures, such as the Volkshalle, utilized a combination of Substance Designer procedural materials and Substance Painter texture painting.

Barnstorm and Theory’s 3D pipeline is built around Blender, and since Blender does not natively support Substance materials, a custom uber-shader was created to make it easy to apply them and render in Blender’s Cycles engine. With so many assets spread across a distributed workforce, the Substance workflow ensured a consistent look and consistent physical behavior among materials designed by different artists.
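Barnstorm’s uber-shader itself isn’t shown in the article, but the core idea can be sketched with Blender’s Python API in a modern (2.8+) build: load a Metal/Rough texture set and wire it into a Principled BSDF for Cycles. All names and paths below are hypothetical.

```python
# Sketch of a Substance-to-Cycles hookup in Blender's Python API.
# This is an illustration of the idea, not Barnstorm/Theory's actual shader.
import bpy

def substance_material(name, basecolor, metallic, roughness, normal):
    mat = bpy.data.materials.new(name)
    mat.use_nodes = True
    nodes, links = mat.node_tree.nodes, mat.node_tree.links
    bsdf = nodes['Principled BSDF']

    def tex(path, non_color=False):
        node = nodes.new('ShaderNodeTexImage')
        node.image = bpy.data.images.load(path)
        if non_color:  # data maps must skip the color transform
            node.image.colorspace_settings.name = 'Non-Color'
        return node

    links.new(tex(basecolor).outputs['Color'], bsdf.inputs['Base Color'])
    links.new(tex(metallic, True).outputs['Color'], bsdf.inputs['Metallic'])
    links.new(tex(roughness, True).outputs['Color'], bsdf.inputs['Roughness'])

    nmap = nodes.new('ShaderNodeNormalMap')
    links.new(tex(normal, True).outputs['Color'], nmap.inputs['Color'])
    links.new(nmap.outputs['Normal'], bsdf.inputs['Normal'])
    return mat

mat = substance_material('sign_mtl', 'sign_BaseColor.png', 'sign_Metallic.png',
                         'sign_Roughness.png', 'sign_Normal.png')
```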

PBR Workflow with Substance Painter: A Case Study from The Man in the High Castle, Season 2

The original shot included a tall practical sign, in multiple languages, against a dark night sky, which would eventually need to be replaced with a towering CGI Nazi embassy. Additionally, it was requested that the sign be shortened and only include English, which led us to pursue an additional CGI solution. This project would later prove important in helping us develop our internal Substance Painter-to-Blender workflow.

Modeling, UV unwrapping, and initial material assignment take place in Blender. Once prepared, each object that will receive a Substance Painter pass is exported individually as an OBJ. Objects with multiple materials also have a UV Color ID map exported as a texture resource for Substance, where it drives the Color ID picker mask.
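As a rough illustration of that export step, a per-object OBJ batch export in Blender might look like the following (the operator shown is the 2.8x–3.x one; in Blender 4.x it became bpy.ops.wm.obj_export, and the export directory is a placeholder):

```python
# Hypothetical sketch of the per-object OBJ export step in Blender.
import os
import bpy

export_dir = '/project/substance_in'  # placeholder path

for obj in bpy.context.scene.objects:
    if obj.type != 'MESH':
        continue
    # Select only this object so the exporter writes it alone.
    bpy.ops.object.select_all(action='DESELECT')
    obj.select_set(True)
    bpy.ops.export_scene.obj(
        filepath=os.path.join(export_dir, obj.name + '.obj'),
        use_selection=True,
        use_materials=True,  # keep material slots for the Color ID pass
    )
```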

Our project configuration is always set to the Metal Rough preset, which is the map set that fits most accurately into Blender’s PBR calculations. The project’s normal format is also set to OpenGL to further accommodate the export into Blender. The model is imported into Substance and the texture maps are baked immediately. Custom Color ID maps are also imported and assigned at this time so that material groups can be mapped.

A roughly equivalent metal Smart Material is applied as a base and modified to something more appropriate for the given model. This provides a strong physically-based foundation for the custom painted details in the finishing steps.

Small height, normal, and roughness details are painted by hand onto the base metal material, so that the subsequent procedural and particle effects simulating paint and weathering respond accurately.

A fill layer with adjusted roughness, metallic, height, and normal values is applied over the metal base to simulate paint. This is set to a lower opacity to allow some of the base’s worn metal look to come through.

Localized grunge is applied using a combination of custom paint strokes and color/roughness adjustments driven by generator masks and curvature/AO maps. When painting custom grunge or damage, we take the full context of the shot into account – which areas are the most exposed to the elements, weather directionality, where they might be touched or brushed the most often by pedestrians, and similar factors. Authentic details like that are critical to selling a photoreal element, especially when it’s the focal point of a shot.

Paint bumps and chips are created using custom brush strokes on a height-only layer. The effect can be further modified to create general surface displacement by adding a blur filter to the strokes or generator mask, which makes the model feel more hand-crafted and real.

Overly dark edges where the AO map has masked in grunge too aggressively are lifted using a partially-transparent color and low roughness adjustment driven by a generator edge mask.

Leak and rain particle effects are added over the top of the pole material to give it the feeling of having been outside in the elements. This adds a final touch of environmental authenticity to the material before the maps are exported.

The PBR Metal Rough map set is exported at 2K or 4K from Substance and imported into Blender, where the maps are plugged into our internally developed Substance node. This shading process is repeated on all foreground, midground, and background assets until the scene is complete and everything interacts accurately.

In some instances – here, with the sign itself – we use image textures created in other software in conjunction with Substance maps, mixed together in our custom PBR node. Here, the image of the sign (left) has been mixed with the Metal Rough set exported from Substance (right) to build the material on top of the sign image. The material in Substance was built in black and white with this usage specifically in mind:

Lastly, a custom greyscale displacement map (created in Nuke or After Effects) is mixed over the top of the Substance height maps to create indentations for the bolt models.

This wireframe, taken from Blender, shows the complete extent of the shot replacement done in CGI.

The final shot, rendered in Cycles and composited in Nuke.

Here are some comparisons between the early blockouts and the final scenes textured with Substance:

[Images: early blockouts alongside final textured scenes]

A big thanks to both Barnstorm VFX and Theory Studios for this very detailed user story!

Cutting-Edge Texturing for VFX: Substance at Double Negative

Alexandre Bagard on April 6 2017 | Substance Designer, Stories, Film/VFX

After over a year of collaboration, we are thrilled to share that Double Negative has been developing innovative texturing workflows using Substance for feature film VFX! In this post, Marc Austin (Lead TD Generalist at Double Negative) tells us about the texturing pipeline he built for one of the central assets of the recently released Assassin’s Creed movie.

Tell us about your role at Double Negative.

As a generalist here at Double Negative, my day-to-day tasks can change quite a lot between shows. I work mainly on hero or complex build tasks, my specialty being surfacing. I also work closely with lighting and pipeline departments to set up sequences.

Why did you choose to use Substance Designer for the Assassin’s Creed movie?

We always keep an eye on promising new technologies and products here at Double Negative. Late in 2015, we were looking into how to approach the complex surfacing challenge of the Animus (the device Cal uses to relive his ancestors’ memories).

The concept model of the Animus was made of thousands of mechanical links, pipes, and pneumatic muscles. The modeling task was daunting as all these mechanisms had to be fully articulated with a wide range of possible poses.

In a normal build pipeline, a model sign-off would trigger a surfacing task to paint and develop the look of the asset for shots.

However, for the Animus, we needed a way to surface the asset in parallel to meet the deadline. The modeling and rigging were going to be in constant flux during the show and surfacing couldn’t wait for a finished model.

During this early stage, Substance Designer was showcasing features such as Iray integration and the pixel processor, which opened up possibilities for surfacing the Animus in Substance Designer.

With some testing, we came up with a methodology of batch baking Substances per UV tile (UDIM). Using parametric Substances, we could design a surfacing pipeline that created a new set of textures far quicker than hand painting, and we could re-run these Substances each time there was a modeling update. Coupled with the types of materials used on the Animus (metals and plastics), Substance Designer seemed like a perfect fit.

You can see the Animus in action at 0:48 and 1:39 in the official trailer.

Can you walk us through the texturing pipeline you developed for the Animus?

The basic stages we settled on focus largely on working around the absence of UDIM support in Substance Designer.

Model preparation
The Animus was prepared in Maya so we could preview small parts of the model in Substance Designer. We subdivided the model and exported the geometry based on which UV tile it occupied – more than 200 UDIMs in total.
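For readers unfamiliar with the convention: each UDIM tile covers one unit square of UV space, numbered from 1001 upward in rows of ten. A small illustrative helper for bucketing geometry by tile (not Double Negative’s pipeline code) might look like:

```python
# The UDIM numbering convention: tile 1001 covers u,v in [0,1), 1002 covers
# u in [1,2), and each row of ten tiles adds 10.
def udim_tile(u, v):
    return 1001 + int(u) % 10 + 10 * int(v)

assert udim_tile(0.5, 0.5) == 1001
assert udim_tile(1.2, 0.0) == 1002
assert udim_tile(0.3, 1.7) == 1011  # second row of tiles

def bucket_faces_by_tile(faces):
    """faces: list of (face_id, [(u, v), ...]) pairs; keyed by first UV."""
    buckets = {}
    for face_id, uvs in faces:
        u, v = uvs[0]
        buckets.setdefault(udim_tile(u, v), []).append(face_id)
    return buckets
```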

Mesh Info Baking
We used Clarisse from Isotropix to bake the mesh data maps. We chose Clarisse because it was easier to develop custom shaders for, and it fit inside our publishing pipeline.

Substance Designer
A Substance was created per material; these ranged from plastics and metals to rust and dirt. Our proprietary shaders needed different inputs than the native Substance shaders, so all outputs were converted to Double Negative-friendly outputs when we baked the Substances.

Baking script
We made a batch baking script that iterated through each Substance, changing the inputs and random seeds per UDIM. The result was a texture UDIM sequence per Substance material.
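The script itself isn’t shown, but the general shape of such a batch bake, driving the sbsrender command-line renderer from the Substance Automation Toolkit, might look like this. The .sbsar path, input identifiers, and output naming are hypothetical, so treat it as a sketch of the approach rather than Double Negative’s actual tool.

```python
# Rough sketch of a per-UDIM batch bake driving sbsrender.
import subprocess

SBSAR = 'materials/animus_metal.sbsar'   # placeholder material
UDIMS = range(1001, 1201)                # 200 tiles, as described

for seed, udim in enumerate(UDIMS):
    subprocess.run([
        'sbsrender', 'render',
        '--input', SBSAR,
        '--set-value', '$randomseed@{}'.format(seed),  # vary the seed per tile
        '--set-entry', 'curvature@bakes/curvature.{}.exr'.format(udim),
        '--output-path', 'textures/baked',
        '--output-name', 'animus_metal.{}'.format(udim),
        '--output-format', 'exr',
    ], check=True)
```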

Nuke clean up
Nuke was used to reduce the number of outputs our renders needed. Since each UDIM could contain multiple materials, we combined the textures to optimize our texture I/O. Material ID masks generated in Katana were used to blend between the Substances; these ID maps matched the shader assignments Katana used.
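As a hypothetical sketch of that cleanup step, a terminal-mode Nuke script could combine two baked texture sequences for one UDIM using a material ID mask roughly like this (all file names are placeholders):

```python
# Illustrative Nuke sketch: comp one material bake over another using a
# material ID mask. Run with `nuke -t combine.py`.
import nuke

metal = nuke.nodes.Read(file='baked/animus_metal.1001.exr')
rust  = nuke.nodes.Read(file='baked/animus_rust.1001.exr')
ids   = nuke.nodes.Read(file='masks/material_id.1001.exr')

# Copy the ID mask's red channel into the rust branch's alpha...
copy = nuke.nodes.Copy(inputs=[rust, ids], from0='rgba.red', to0='rgba.alpha')

# ...then comp rust over metal wherever the mask is white.
merge = nuke.nodes.Merge2(inputs=[metal, copy], operation='over')

write = nuke.nodes.Write(file='textures/animus_combined.1001.exr',
                         inputs=[merge])
write['file_type'].setValue('exr')
nuke.execute(write, 1, 1)  # render frame 1
```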

Render
These maps were used unaltered as the inputs to our shaders inside of Katana and Clarisse.

Although this was a new tool to work with in your field, could you tell us what was actually easy?

Matching the materials between the Substance previews and our shaders was very easy. Our shaders use the same underlying maths, so a direct conversion was possible. This enabled us to use Substance Designer’s viewport as a very accurate preview tool, and few or no adjustments were needed in our renders.

Custom nodes were also easy to implement. One example was a set of cylindrical and spherical mappers, which let all of the mechanical parts of the Animus be textured without seams.

All the textures for the Animus were procedural; no bitmap textures were used. It was surprising how quickly I could match onset references using a combination of shipped Substance Designer nodes and custom ones I made.

And what was hard? What solutions did you find?

Working around the lack of UDIM support in Substance Designer was the biggest challenge: the batch baking methodology meant managing and re-iterating over a large data set each time the model was updated.

We have made good progress automating a lot of these steps since we finished the Animus. Hopefully, in the future, it will be easier and quicker to do updates.

Given what you learned, and that updates have been released since then (making the tools more suited for VFX), what would you look forward to doing next with the Substance toolset?

I would love to use Substance Painter – I can imagine a massive use case for it – but until UDIMs are implemented, managing hero assets is too difficult. We are, however, using it for smaller assets, with promising results.

I’m currently developing a new suite of tools that will form the framework for far more complex Substances. One example is a dripping-rust Substance that always flows down the model under gravity, regardless of UV orientation or scale.

There is obviously a long list of feature requests coming from the VFX world, and users can be reassured that we are working on them 🙂 What would be your top 3?

UDIMs – It’s a must for VFX. I would love to see this supported in all of Allegorithmic’s products.

Full API – Scripting inside Substance Designer is limited. The external API is really nice, but a fully featured API inside Substance Designer would let us do so much more.

LUTs and an advanced viewer – Not as important as the first two, but support for custom LUTs would mean surfacing could be authored directly in Substance Designer. Additional viewer features like custom backplates, A/B wipes, image history, and temporarily viewing a single channel would be very nice to have.

Anything you would like to add?

Working with Substance has been a joy so far and it helped massively for the Animus. With the addition of a few more features, I can definitely see VFX studios using Substance as a backbone of their surfacing pipelines in the future.

We would like to give a special thanks to Marc and the team at Double Negative London for taking the time to share their experience with Substance and look forward to our ongoing collaboration. You can expect more news from the VFX/Animation side of things in 2017 🙂

Logan: Texturing Workflows for VFX with Rising Sun Pictures

Alexandre Bagard on May 4 2017 | Stories, Film/VFX

X-Men fans, mutants, and VFX aficionados: we are stoked to share our interview with Rising Sun Pictures, a leading Australian VFX company. Andrew Palmer tells us about the studio’s experience with Substance Painter on some of their latest feature film projects, with a focus on the recently released Logan.

(If you haven’t seen Logan yet, this article contains spoilers)

Thanks for taking the time for this interview. Could you introduce yourself?

My name is Andrew Palmer, and I’m a Senior Look Development Artist at Rising Sun Pictures. I have led several projects, including X-Men: Apocalypse and Logan, and am currently on Thor: Ragnarok.

How did you start using Substance and on which projects?

We started using Substance Painter at RSP in July 2016 as an alternative texturing solution for digital assets. Our texturing and shading workflow at that time was fairly involved, which resulted in slow turnaround times. We saw immediate gains with the Substance Painter workflow, so we contacted Allegorithmic and worked with them on the Linux beta release. The first project where we used Substance in production was xXx: Return of Xander Cage. Projects since then include Logan, Alien: Covenant, Nest, and currently Thor: Ragnarok.

We used Substance Painter heavily on Logan because of its ability to create photorealistic looks very quickly whilst maintaining a large amount of flexibility. For example, when geometry gets updated or if UVs change, Substance handles those changes well compared with other packages that require map transfer baking or re-painting.

Could you describe how you used Substance Painter on a particular asset?

Substance Painter worked especially well for us on many sequences in Logan. All assets required photorealism and fast turnarounds. Most notably, we did the look-dev for Logan’s and Laura’s claws in Substance Painter. The ability to see layered effects such as blood in real time was a game changer for us, as Substance took care of the complexity that normally comes with such effects. PBR (Physically Based Rendering)-compliant texture maps were also a huge win: plugging the generated textures into our shaders not only gave the anticipated response, it simplified the shader network itself.

For Logan’s claws, I started with base maps generated from a previous show (The Wolverine) and worked up a new “Adamantium” smart material (very cool!).

I then started to build layers for blood-smear variations, using a combination of smart masks and hand-painted layers. I found this workflow fast and intuitive, and I was able to iterate on different looks quickly and respond to feedback with flexibility. For the “Charles’ death” scene, Substance Painter was especially good because we could use the shot HDR directly in the viewport and preview the lighting whilst painting dripping blood smears in real time. There were many “oooo’s” and “aaaah’s” around the office as people strolled past my desk and saw the results.

What were the main obstacles you overcame on this project?

There were a few teething issues at first. The lack of Alembic support meant we had to paint on alternate geometry exported at the model cache stage. We also changed the way we lay out UVs and UDIMs to work better with Substance Painter’s UDIM implementation. For example, on the El Paso bridge sequence in Logan, we had to replace the bridge, cars, fences, light poles, and more. We ended up separating material types per UDIM, which works well for background and mid-level assets: all metal components in UDIM 1001, all concrete in 1002, and so on. Smart materials were assigned per UDIM, updating geometry was straightforward, and we could reuse the smart materials on other shots. We use an all-linear color pipeline, so some color conversion was needed prior to rendering; the introduction of LUTs in Substance Painter 2.5 is a welcome update.

Are there any tips and tricks you would like to share with the community?

Smart materials are really powerful! Being able to create custom looks, save them to a library and reuse them with no paint work required is a huge time saver.

How do you see the future of Substance in VFX? What are your expectations?

Substance Painter is the future standard workflow for PBR-based texture creation. Its ability to handle multiple shading components whilst remaining extremely flexible about geometry updates puts it ahead of other software packages. While Rising Sun Pictures has firmly integrated Substance Painter into our pipeline, we would love to see the UDIM implementation improved, open-source file formats such as Alembic supported, and Python scripting added.