Creating Ceramic Creatures For Adobe’s Project Aero with Stuart Lynch

Pierre Bosset on August 25, 2018 | Substance Painter, Stories, Cinema 4D, V-Ray, ZBrush, AR, Design

Today we interview Stuart Lynch, a freelance 3D artist currently working on Adobe’s Project Aero. For this project, Stuart created a series of outstanding ceramic creatures, which will be used to demo an upcoming augmented reality application developed by Adobe.

My Background

Hello, I’m Stuart Lynch, a 3D generalist living in Emeryville, California. I’m originally from the UK, but I’ve lived in the US for 18 years.

I’ve been working in the industry in one form or another since I was 18, starting out as a graphic designer in a print shop. I made the transition to full-time 3D artist in 2001, when I landed a job as an animator in the pharmaceutical industry, creating animations of interactions within the body. I stayed in that role for three years, until I’d gained enough technical knowledge to confidently go it alone as a freelance artist.

Freelancing has allowed me to experience a lot of different industries: broadcast motion graphics, gaming, architecture, and concept design, as well as a significant amount of digital installation and projection-based work.

For the past couple of years, I’ve focused almost exclusively on virtual and augmented reality, working full-time as a director of immersive technologies, then moving to Google. Most recently I have been working for Adobe, assisting them creatively with the launch of their upcoming software ‘Project Aero’.

How I Discovered Substance

Google invited me to join their UX design team but provided little information about the actual role. They did, however, question my texture painting abilities, which I claimed were “sufficient”.

The truth was that for years I’d been creating all of my work without painting or UV mapping. While some of my designs certainly have the typical edge wear and damage associated with painting apps, those effects were always the result of high-poly counts, projection techniques, layered materials and creative ways to blend them all together. Sadly, none of this can be easily baked down into a meaningful cross-platform result, so I realized that I needed to learn some new techniques, and fast.

With two weeks to go before the start date of the job, and after conducting a little research, I determined that Substance Painter was the best choice for a solo artist. I dove straight in with the Allegorithmic tutorials and was up and running, creating rusty metal spheres, in a matter of hours. Although slightly intimidating at first, the workflow quickly became second nature, the UI is highly intuitive, and stunning end results were much easier to achieve than with the methods I described earlier.

Fast forward eight months, and I now consider Substance Painter an essential part of my creative pipeline.

Adobe’s Project Aero

Thanks to a friend, I was recommended for a creative position at Adobe to assist them in creating an asset for their upcoming AR platform. Working with Stefano Corazza (founder of Mixamo and Senior Director of Engineering at Adobe), we collaborated on creative ideas and settled on a metallic vine surrounded by interactive butterflies. The resulting artwork was used during the ‘Project Aero’ announcement at Apple’s WWDC keynote and later used to promote ‘The Festival of the Impossible’, an Adobe-sponsored 3-day event showcasing work from creative talent in the emerging fields of AR and VR.

In the second phase of the project I’m working on creating a variety of additional assets that are also designed specifically to work in AR. Substance Painter continues to play a very important role in the delivery of these designs, often as the last step in the process before final deliverables are due.

The Ceramic Statues Textured with Substance Painter

I had been struggling for inspiration when I stumbled upon a tutorial regarding the use of ID maps in Substance Painter. Typically, if the creative juices aren’t flowing, I at least find the time to up my technical game.

I whipped up a quick and dirty creature in ZBrush, exported it to Substance Painter and started playing around with color assignments. About an hour later, I realized I was stumbling into the ‘happy accident’ zone: that place where you’ve been aimlessly clicking buttons and then something that resembles art begins to emerge on the screen.

So, taking inspiration from that first model, I decided to spend my week-long vacation creating a whole series of similar designs, instead of camping and enjoying nature as originally intended.

My Workflow

I begin simply by freeforming some straightforward models in ZBrush, which are then auto-retopologized to a manageable poly-count with a quick UV map for good measure.

In the second phase, I start cutting the model with ZBrush’s slice tool, converting the shape into meaningful groups. Typically, I’m trying to follow the flow of the topology or trying to isolate parts of the body. These groups are later converted to a color texture, which in turn provides Substance Painter with ID maps that help to isolate material effects.

Again in ZBrush and now working with a 2-3 million poly object, I add slightly extruded panel loops, some wear and tear, followed by procedurally-generated height maps. The finished design is intended to have the appearance of a ceramic glaze, so I try to keep the height detail to a minimum.

Moving over to Substance, I follow the standard baking procedures using a low/high-res mesh, import the generated ID map, and then I’m ready to assign colors. I prefer to start out in the simplest way possible, building the look over time, working with simple diffuse colors and patterns, and assigning those to each specific ID channel. This is a process I typically go through until I land on an aesthetically-pleasing color combination that feels right to the eye.
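As a rough illustration of what this ID-driven color assignment does under the hood, here is a minimal sketch written outside any DCC tool: each flat ID color in the baked map becomes a mask, and each mask is filled with a base color. The file names, ID colors and ceramic base colors below are purely illustrative; in practice Substance Painter handles all of this interactively.

```python
# Minimal sketch (outside Substance Painter) of what an ID-map mask does:
# every pixel matching an ID color receives that group's base albedo.
# File names and the color/albedo tables below are purely illustrative.
import numpy as np
from PIL import Image

id_map = np.array(Image.open("creature_id.png").convert("RGB"))

# ID colors painted via ZBrush groups -> flat ceramic base colors.
groups = {
    (255, 0, 0): (226, 221, 210),   # body   -> off-white glaze
    (0, 255, 0): (58, 104, 148),    # panels -> cobalt blue
    (0, 0, 255): (188, 142, 62),    # horns  -> ochre
}

albedo = np.zeros_like(id_map)
for id_color, base_color in groups.items():
    mask = np.all(id_map == id_color, axis=-1)   # boolean mask per ID region
    albedo[mask] = base_color                    # fill the region with its color

Image.fromarray(albedo).save("creature_basecolor.png")
```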

Once I’m happy with this process, I start blending those colors within smart material channels, adding subtle wear here and there, finally layering with some dust and dirt to achieve the desired final look.

The end results are brought back into Cinema 4D and rendered with VRayForC4D.

Future projects

In an ideal world, this time next year I’ll be directing a kids’ film about saving the environment. Pipe dreams aside, I always have personal projects planned but rarely the time to commit to them. Such are the joys of having a full-time job and raising two children!

That said, I believe Substance will play a part in both my professional and my personal work forevermore. I don’t see myself ever putting it down. Substance is to the painting workflow what ZBrush is to modeling: it’s highly intuitive, very unique, stable, and it has really changed my perception of texture painting.

Thanks for the opportunity to talk about this fun project. If you’d like to see additional work from this project, please visit this page.

SUBSTANCE SOURCE AUTOMOTIVE MATERIALS: PROTOTYPING

Hi, material lovers!

This is it. The moment when we follow up on our promise to release a massive amount of materials dedicated to automotive texturing. The team has been busy, and the result is our most substantial material package so far! We thought we’d deliver a bit over 300 materials and that’s what we promised; in fact, it’s going to be almost 500.

This release is also our broadest collection of materials so far, as we cover all the materials needed for the construction of a car: paints, leathers, plastics, textiles, and composites. Each of these materials is a fully tweakable procedural, scan, hybrid or MDL shader. And it’s all on its way to Substance Source.

We know that it’s a lot. So we decided to give you the time to dive into the various categories and material types. This is why we planned a release every week in June. And a few surprises to spice it up, because, why not?

Friends from everywhere: we worked on offering maximum customizability to the materials to ensure you’ll find useful stuff for any texturing project.

And friends from the automotive world, you’ll be pleased to know that for this release we partnered with talented professionals from the automotive industry. We stepped into their shoes to understand the constraints and opportunities in designing useful digital assets for the industry.

In short, the spirit of this release is a blend of universes and usages.

So have fun! Browse more than 1,000 presets or play with endless variations within each material! And why stop there? Get (technically) creative – hack into the SBS and MDL graphs; mix, layer or expose your magic!

Car Prototyping Materials

Today, we introduce the Substance Source automotive release, with a selection of the first 45 materials associated with concept car prototyping.

Artists and designers can have a photorealistic vision of pre-concepts at the earliest stage of the design process. The materials we offer are those used in the creation of mock-ups.

The production of a car model is a long process. It begins with an extended period of iteration around the shapes and volumes of the car. Once this has been validated, and to get an impression of the final vehicle, modelers build 1:4, then 1:1 scale models.

It is important to note that mock-ups aren’t always created using the same process. In fact, since they anticipate the validation of visual cosmetics, the process varies according to the part of the car constructed. Designers need to use rapid prototyping techniques, automated multi-axis machining and hand modeling of plasticine.

Clay is the reference material in the design process of a car. Its neutral aspect is ideal for the comparison of different concept-car shape proposals.

We created digital clays reproducing the material surface, taking into account the impact of the various tools used to shape, cut and smooth the model. You will get an ultra-realistic look and feel.

Digital artists can now use digital clay textures over 3D speedforms at early stages of design to compare the digital model to the physical clay mock-up.

Full-scale models are created in clay. They usually consist of a wooden or iron frame, which modelers then cover with styrofoam. They then smooth clay over the foam. After this, modelers use various tools and slicks to finalize the shape of the car.

Camouflage Stickers

Later on in the design process, prototype testing takes place with the cars wrapped in crazy checkerboard or swirl patterns.

The level of preparation that goes into such seemingly haphazard patterns is considerable. Car makers assign engineers to be in charge of developing bespoke camouflage for each new model. They work in conjunction with the vehicle’s designers to erase character lines almost as soon as they are drawn. Future models must be kept secret.

So today, in Substance Source, we introduce vinyl camouflage materials, with eight completely procedural patterns. This allows artists to adjust patterns directly on the 3D car model – or even to design new patterns with the .sbs graph.
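The .sbs graphs themselves are authored in Substance Designer, but the core idea of a fully procedural pattern with exposed parameters can be sketched in a few lines. The snippet below is only an illustration under that assumption: it generates a tileable checkerboard wrap, with the cell count and the two colors exposed as parameters, loosely analogous to the tweakable inputs of a Source material.

```python
# Illustrative only: a tileable checkerboard "camouflage" texture with exposed
# parameters, loosely mimicking the tweakable inputs of a procedural .sbs graph.
import numpy as np
from PIL import Image

def checker_camo(size=1024, cells=16, color_a=(20, 20, 20), color_b=(235, 235, 235)):
    """Return a (size, size, 3) uint8 checkerboard with `cells` squares per side."""
    y, x = np.indices((size, size))
    cell = size // cells
    mask = ((x // cell) + (y // cell)) % 2 == 0
    return np.where(mask[..., None], color_a, color_b).astype(np.uint8)

Image.fromarray(checker_camo()).save("camo_checker.png")
```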

Composite

While clay might be the material of choice for the main body of the car, prototypes of additional parts may be produced using composite materials.

The composites have several uses. You can create communication visuals to demonstrate the design process of a show car, for instance, or you could get a glimpse of the component before even launching fabrication of the mock-up.

Artists now have access to materials such as carbon and glass fibers, woven composite textiles, and felt. And each of these materials is completely procedural – and therefore customizable.

Metals and Coatings

Finally, prototyping calls for metallic materials. These reproduce the surfaces of cast and machined metals, as well as the baked metallic paints used in low-volume manufacturing processes.

That’s it for the first part of our massive Substance Source automotive release! Download the selection of free materials on Substance Source and start experimenting.

See you next week for the next release: we’ll be focusing on exterior materials. In the meantime, drive safely!


CHOCOLATE TRIBE BRINGS LIFE TO ROBOT & SCARECROW

Chocolate Tribe was responsible for over 90% of the 140 VFX shots on Robot & Scarecrow, the latest short film from acclaimed UK director Kibwe Tavares. The film was co-produced by London-based production houses DMC Film (owned by Michael Fassbender and Conor McCaughan) and Nexus Studio. Starring Holliday Grainger and Jack O’Connell as the titular Robot and Scarecrow, this beautiful film tells the tale of a robot pop performer who meets a lonely scarecrow at a festival, where they embark on a whirlwind of experiences.

The production team approached Chocolate Tribe, a VFX studio based in Johannesburg, South Africa, to produce the cutting-edge visuals needed for Robot & Scarecrow. While this may seem like a radical geographical departure, Chocolate Tribe was uniquely suited to handle the work required for the film. The principal team from Chocolate Tribe on Robot & Scarecrow, Rob Van den Bragt and Tiaan Franken, have decades of experience working in both the South African and London visual effects industries; Rob was a VFX supervisor with The Mill for close to 10 years. They were able to bring the best of both these communities together for the eight-month production schedule.

CRAFTING THE ART


Chocolate Tribe showed its versatility from the inception of the process, as the original plate photography had been shot three years before, at the Secret Garden Party festival in the UK. Working with pre-existing plates meant that Chocolate Tribe needed a proven and robust render solution.

The combination of GPU-powered speed and a biased workflow made the decision to go with Redshift an easy one, as Tiaan explains: “Redshift’s turnaround time on rendering and testing made a dramatic difference: you’re crafting your art rather than waiting on the renderer.”

Rob continues: “The capability of the renderer was of utmost importance. Yes, Redshift reduced our rendering overhead, but more importantly the render output was amazing. We ran render tests with various rendering solutions. Together with the HDRIs from the set, the lighting model of Redshift with its physically correct shaders matched the unbiased renders in the majority of cases. The truth is that biased renderers have been overshadowed a little over time by unbiased renderers, but in 95% of all cases the difference is so marginal that it often isn’t worth the substantial extra render time. What matters more is the physical correctness of the shaders and lights.”

Although Chocolate Tribe had eight months to deliver the project, Rob was aware of the challenge ahead due to the sheer volume of shots required. “It seems like a lot of time, but it’s not like you have two weeks to craft every shot. So we obviously needed something that worked fast, without sacrificing quality. Redshift delivered frames in two minutes; sometimes in less than a minute.”

These quick frame times were not the result of Chocolate Tribe making compromises with Redshift. “These were with all the bells and whistles: we had depth of field, motion blur, GI, ray-tracing, SSS and high sampling even in our test renders,” Rob explains.

MAKING A SAVING

Chocolate Tribe found Redshift added economies of scale to workstations as well. Running dual NVIDIA GTX 980 Tis would yield double the render speed of the significantly more expensive NVIDIA Titans running in single-card configurations.

Quite early on in production, when more GPUs were needed, NVIDIA released its new Pascal-based graphics cards. Chocolate Tribe found that it only needed the ‘mid-range’ NVIDIA GTX 1070 when adding more cards, at half the price of the original GTX 980 Tis.

“The 1070s actually had a bit more memory, so we could push the envelope even further….and you saved a few grand,” notes Tiaan.

Having worked with the GPU-based renderer on previous productions, the team decided that Redshift would suit Chocolate Tribe’s production methodology and requirements at this level of production. Integration was seamless, and when compared with alternative render solutions, the biased renderer’s output held its own comfortably.

“We were using Maya and immediately [Redshift] felt right at home. It is all about workflow for us. We program our pipeline using Python and MEL; and it was easy to accommodate Redshift,” explains Tiaan.

While Chocolate Tribe has used the Redshift plugin with Houdini, the team didn’t need it for Robot & Scarecrow. Tiaan singled out an easy way artists can switch between applications that use Redshift without the need to buy additional, application-specific plugins:

“That’s the amazing thing we love about [Redshift]; I could use my same license and jump to another platform. I didn’t have any licensing issues. It gives you the ability to explore other platforms without costing you double.”

TIGHT INTEGRATION

Redshift’s ability to integrate into a visual effects pipeline was also critical when it came to the final look of Robot & Scarecrow. Allegorithmic’s Substance Painter was used to create the textures. “The workflow between Substance Painter and Redshift was mind-blowing! After you had tested your shaders with the preview in Substance Painter, you would get exactly the same result [in Redshift] by just hooking up the textures properly,” says Tiaan.
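As a very rough sketch of that “hooking up the textures properly” step in Maya, the snippet below wires a set of Painter exports into a Redshift material. It assumes the Redshift plugin is loaded; the attribute names (diffuse_color, refl_roughness, refl_metalness), node names and file paths are assumptions that may differ between Redshift versions and export presets, so treat it as an illustration rather than Chocolate Tribe’s actual setup.

```python
# Hypothetical sketch: connect Substance Painter exports to a Redshift material in Maya.
# Attribute names and paths are assumptions and may differ between Redshift versions.
import maya.cmds as cmds

def painter_to_redshift(mesh, tex_dir):
    shader = cmds.shadingNode("RedshiftMaterial", asShader=True, name="robot_MAT")
    sg = cmds.sets(renderable=True, noSurfaceShader=True, empty=True, name=shader + "SG")
    cmds.connectAttr(shader + ".outColor", sg + ".surfaceShader", force=True)

    # Painter export suffix -> (file node output plug, assumed Redshift input).
    channels = {
        "BaseColor": ("outColor",  "diffuse_color"),
        "Roughness": ("outColorR", "refl_roughness"),
        "Metalness": ("outColorR", "refl_metalness"),
    }
    for suffix, (out_plug, attr) in channels.items():
        tex = cmds.shadingNode("file", asTexture=True, name="tex_" + suffix)
        cmds.setAttr(tex + ".fileTextureName",
                     "%s/robot_%s.png" % (tex_dir, suffix), type="string")
        cmds.connectAttr("%s.%s" % (tex, out_plug), "%s.%s" % (shader, attr), force=True)

    cmds.sets(mesh, edit=True, forceElement=sg)  # assign the shading group to the mesh
```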

Chocolate Tribe was able to utilise every part of its creativity when designing shaders for Robot & Scarecrow as Tiaan explains: “We had 4K textures on multiple outputs, normal maps, displacement maps – whatever you needed. On the Robot alone, she had 800+ pieces of geometry. We were concerned at render time that Redshift may fall over with all the textures, and it never happened.”

Rob agrees: “We never down-resolved our textures. We had 8K and 4K textures, we were throwing 1000s of textures at Redshift…I actually still don’t understand what Redshift was doing. It just handled it, I couldn’t believe it!”

Even when the shots were close-ups of Robot’s face, which consisted of a dynamic facial screen driven by a sub-surface scattering shader, render times were only slightly affected. “Normally if you go extremely close with sub-surface scattering on the face, you would expect to jump into greater render times. But it never really went beyond ten minutes a frame,” says Tiaan.

Although Scarecrow’s render times were much quicker than Robot’s, there was a large amount of texture-based displacement in the model. According to Tiaan, Redshift’s displacement tools were up to the task, especially the global parameter for displacement settings: “The simple thing of just selecting your geometry and controlling all of your displacement within one node. It’s such a timesaver.”

REAL-TIME ADJUSTMENTS

When it came to final render, Tiaan programmed a solution within Maya and Redshift to adapt the render samples needed per shot.

“We programmed our default preset to a base level. On a close-up shot, if we needed more samples we would turn it up. Off the bat, 80% of the time, our default preset held up,” he says.

“This was with global illumination, depth of field and motion blur,” adds Rob.

“That was a huge thing for us,” Tiaan explains. “Working with rendering you often get stuck with doing motion blur in post, and it’s horrible. Same thing with depth of field. We could tweak all of this [using Redshift] in real time in our camera view. Doing everything on the fly was just so much fun.”
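The sample-preset approach Tiaan describes can be sketched in a few lines of Maya Python. This is only a minimal illustration: the redshiftOptions attribute names (unifiedMinSamples, unifiedMaxSamples) and the preset values are assumptions that may differ by Redshift version, and this is not Chocolate Tribe’s actual tool.

```python
# Minimal sketch of a per-shot Redshift sample preset in Maya.
# The redshiftOptions attribute names and values are assumptions, not production settings.
import maya.cmds as cmds

DEFAULT  = {"min": 16, "max": 256}    # base preset that holds up for most shots
CLOSE_UP = {"min": 32, "max": 1024}   # bumped preset for close-ups

def apply_samples(is_close_up=False):
    preset = CLOSE_UP if is_close_up else DEFAULT
    cmds.setAttr("redshiftOptions.unifiedMinSamples", preset["min"])
    cmds.setAttr("redshiftOptions.unifiedMaxSamples", preset["max"])
```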

This approach meant that Chocolate Tribe minimised the number of passes required when it came to compositing. “It was five passes max. We tried to do everything straight in the beauty. The main reason for this is that the final render just looks better when you do it all for real,” says Rob.

TESTING AND EXPERIMENTATION


At every one of the weekly production Skype calls, Chocolate Tribe delivered a new edit with at least ten revised shots, rising to more than fifty shots as production ramped up.

“We would continually keep them in the loop; it really helps to keep the client’s confidence levels high, with clear communication as they are seeing progress,” says Rob.

Tiaan also noted that as the pipeline got more streamlined towards the end of the production, Chocolate Tribe’s technical capability really came into its own: “We could start with a clean plate, with a final track, pulling in the rigs, caching them out and auto-assigning the shaders and light rigs. This is a whole process that we have automated. Literally from nothing to something, you could get there in ten minutes, kick a render off and see where you are at. This quick turnaround allowed us to preview a multitude of shots in hours, or even minutes.”
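A stripped-down sketch of such an automated shot build in Maya might look like the snippet below. The paths, namespaces and the GEO-to-shading-group naming convention are entirely hypothetical stand-ins for whatever conventions the studio’s pipeline actually uses.

```python
# Hypothetical shot-build sketch: reference the track and light rig, import caches,
# then auto-assign shaders by a naming convention. Paths and names are illustrative.
import maya.cmds as cmds

def build_shot(shot, root="/prod/robot_scarecrow"):
    # Bring in the published camera track and the lookdev light rig as references.
    cmds.file("%s/%s/track/camera.ma" % (root, shot), reference=True, namespace="track")
    cmds.file("%s/lookdev/light_rig.ma" % root, reference=True, namespace="lights")

    # Import the cached character geometry for this shot (Alembic plugin assumed loaded).
    cmds.file("%s/%s/cache/chars.abc" % (root, shot), i=True, namespace="chars")

    # Auto-assign shaders: a mesh named "robot_head_GEO" picks up "robot_head_SG".
    for mesh in cmds.ls("chars:*_GEO", type="transform") or []:
        sg = mesh.split(":")[-1].replace("_GEO", "_SG")
        if cmds.objExists(sg):
            cmds.sets(mesh, edit=True, forceElement=sg)
```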

When Kibwe Tavares visited Chocolate Tribe’s offices in Johannesburg, he was amazed at the speed that Chocolate Tribe was able to iterate through shots. Redshift was naturally a big part of this. “He was knocking doors down going, ‘Where is the render farm that you are hiding?’,” laughs Tiaan.

“I don’t think Redshift realise what a ‘shift’ they have created in the industry. They have redefined how the rendering process and pipeline can work,” says Rob. “Redshift has opened up a world of creativity, experimentation and artistic finessing.”

Chocolate Tribe chose its name to reflect the fact that it is a community of dedicated and passionate creatives from all walks of life with a proud African vibe. This pride is embodied in the work created for Robot & Scarecrow, not just for its artistry and technical achievement, but also because it proves that the South African visual effects industry can support the best directors and production companies anywhere on the planet.

Working with Gods: Tendril create a unique story for American Gods

Neil Gaiman’s bestselling, Hugo Award-winning epic novel American Gods has entranced readers across the globe with its blend of classic American road trip and supernatural folklore. Now it’s the turn of Canadian animation studio Tendril to entrance us with its five-minute animated opening sequence for the American Gods series.

In the fifth episode, which was directed by Vincenzo Natali, the ‘Coming to America’ animated sequence tells the tale of the holy woman, Atsula, and her relationship with her God, Nunyunnini, as her tribe arrive on the new continent after being led across the land bridge from Asia.

With a background in animation, Vincenzo chose to work with the design-driven studio Tendril to create a striking, character-based story that relied on stylised designs for the main protagonists set within naturalistic environments.

Tendril was brought on board in June 2016, during pre-production, which enabled the concept and development teams to collaborate on the overall design process. This time also allowed the studio to test potential rendering solutions before production ramped up in September for a December delivery.

While Tendril had been using Redshift for design pitches – where the speed of delivering styleframes in five minutes had been a boon – the team was unsure about using a GPU-based solution for the detailed landscape environments that were a key component of Coming to America.

Tendril’s lighting and texture artists, Christian Hecht and Alex Veaux, worked on the style frame scene that was used to test Redshift. “It was quite a complex scene right off the bat! It was a full-on forest with ground cover, displacement on the bark, full volumetrics. Everything basically,” says Christian. “I was like, ‘GPU? I’m not sure!’”

“Christian took some assets and quickly put together a forest style frame [in Redshift] and we were like, ‘I think it’s going to work!’” adds Alex.

While speed is a common reason that many artists use GPU rendering, Redshift was chosen for Coming to America as it could combine speed with reliable handling of the large datasets that the creative team would generate.

“Because Redshift has a very clean and stable implementation in Maya, it was much more reliable than any of the other render engines that we had used,” Alex explains.

EASY MATTES AND TILING

Due to the highly stylised nature of Coming to America, even though Redshift’s interactive preview allowed the artists to quickly see the scene, compositing was still a large factor. “It was probably one of the heaviest comp projects that we had done,” explains Alex. “Tons of AOVs and mattes. The good thing is that it was easy to set up.” Mattes especially had been difficult to create in Tendril’s CPU-based rendering software. “It was a nice surprise [with Redshift] – simple AOVs.”

The texture detail of the stylised animation is a standout feature of Tendril’s work. The characters had up to 90GB of textures each, spread over six UDIMs for each part of the model, including all of the authentically detailed clothing and props.

“Redshift did a really great job of tiling the textures,” adds Alex. “With previous renderers of course you have that, but it is not really tightly integrated. So it was great to see this in Redshift.”

WHEN THE CLOUD IS NOT ENOUGH

As the project schedule was tight, Tendril had to improvise a GPU-hardware solution.

“We tried using cloud-based GPU rendering, but the scenes were just way too big to consider a cloud solution. So it had to be in-house with workstations managed by Deadline render manager,” says Chris Bahry, creative director at Tendril.

Tendril set about filling as many workstations as they could with dual GPUs. “We scaled up from just having a few workstations with this spec to outfitting 18 workstations,” says Chris. Most of the workstations were running dual NVIDIA GTX 980 Tis, with a few running NVIDIA Titans for good measure.

“I had a Titan in my box with one 980 Ti,” Alex explains. “It was great, as I could see if we were going out of core [with the GPU memory] or not. It was a great benchmark. I found it very useful to have one [card] that is a little bit bigger, so that I could have some breathing room if needed.”

ENHANCED QUALITY AND CAPABILITIES

At the outset, Tendril had calculated that an hour per frame was the acceptable time needed for the show. “The Bison god was easily an hour and a half,” comments Alex. This was offset by a lot of the character closeups, which took only 15-20 minutes per frame.

Tendril’s previous frame time on shows using a CPU-based render solution had also been around an hour per frame, but when comparing the output, Alex noticed that there was a marked difference with rendering in Redshift: “These scenes were fifty times bigger than anything we had done before with our previous render solution. Even though the render time was the same, the density and quality of the frame was so much more… We had never done hair before, or direct volume rendering with the beauty [pass].”

Another benefit above CPU-based rendering was the ability to easily add effects. “We had never done in-camera depth of field before either, as it was just too slow. We always did depth in post. In this, it was all in camera. That’s huge!” says Alex.

HELP AT HAND

Now that there has been time to reflect on the process for Coming to America, Tendril has invested in a dedicated GPU farm and is moving forward with Redshift on other projects.

Tendril kept Redshift’s support team on its toes, and was pleased with the fast responses and fixes that were delivered. “The guys were submitting bug reports and getting fixes during the project,” says Chris. “In a matter of days you would get stuff fixed, so that was really nice, and the forum is really active,” adds Christian.

“[The Redshift forum] has become the best Maya forum!” jokes Alex. “Everyone on there is pretty talented.”

READY AND ROBUST

The GPU renderer managed the resource-rich project with no problems: “If we can create an animation like that [in Redshift], it speaks to how robust [Redshift] is,” says Christian, reflecting on how Redshift performed with the heavy datasets. This robustness did not just rely on the proven implementation of Redshift within Maya; many of the effects, such as the snow, smoke, fire and embers, were done using Houdini.

Using the alpha of the Redshift Houdini plugin, the team at Tendril rendered out the sims directly within Houdini. This saved time on production as it meant there was no need to translate the Houdini files into Maya.

“The very last shot is a Redshift alpha in Houdini; the one with the snow blowing. Because it was totally isolated, we were able to say, ‘Alright, let’s do it all in Houdini’,” explains Alex.

Redshift had enabled Tendril to create the dark and brooding Coming to America animation within the production schedule, with minimal creative or technical compromises. This was noticed by the director Vincenzo Natali when he arrived for a review session. Molly Willows, Tendril’s director of communication, remembers his comment: “Vincenzo said, ‘I couldn’t believe it, I came in to check on the progress and everything was rendered!’”