Creating photoreal graphics with Megascans rendered in Redshift


We were really impressed by the photorealistic results achieved with Quixel’s Megascans textures rendered in Redshift 2.0.


When we first took a peek at the images, we had trouble identifying the real one! There’s a ton of detail there: make sure to check out the high-res versions in 4K!

With Megascans’ extensive scan library and Redshift, you too can create similar environments.

“As Redshift is the world’s fastest renderer, this 4K render finished in less than an hour with a single GeForce GTX 970 on a mid-tier computer — many times faster than traditional CPU rendering techniques.”

Muse VFX Creates Explosive Visuals for Teen Wolf



Hollywood-based Muse VFX is no stranger to producing epic visual effects for prestigious clients including Sony Pictures, Warner Bros and DreamWorks. They’ve risen to become one of the top VFX houses for episodic television, creating complex sequences for staple shows including Star Trek: Voyager, Star Trek: Enterprise and Lost. They’ve garnered quite a few loyal clients over the years, in large part due to their ability to continuously evolve their skillset and pipeline, all while maintaining an unwavering passion to deliver quality work with impressively quick turnarounds. As Fred Pienkos, the studio’s founder and VP, states, “together, we solve problems and create beautiful imagery under extremely short deadlines and tight budgets. At the end of the day, we still absolutely love what we do.”

VFX pros across the industry understand all too well that their choice of rendering solution can make the final process either efficient or infuriating. As Muse does not shy away from large-scale projects, they are rightfully picky about the software they rely on to generate high-powered effects. The studio came to a crossroads in their rendering workflow when they were tasked with delivering a large-format project consisting of a dozen 20-second sequences, all at 12K and 30 FPS. To raise the stakes even higher, they had only a two-month turnaround, which amounts to a blink of an eye for a project of this scale. As Lead Artist/TD Stefan Bredereck states, “I knew that with the render workflow and the CPU-based render engine, we would not be able to finish the project within the proposed schedule.” The scope of the project was simply too large, and the team did not yet have enough render power to rely on. As it was, their CPU rendering pipeline was enough to elicit a “big audible sigh when needing to render legacy assets.”

Not being the type to give up easily, Bredereck began seeking out alternative options. While researching, he happened upon Redshift and was quickly impressed by its GPU rendering capabilities, the first he had seen produce production-grade results. “I thought the ideas were incredibly smart, and this is the way it should be done,” states Bredereck. After a phone call from Pienkos, it didn’t take long for a relationship to emerge between Redshift and Muse VFX. The project was delivered to satisfaction, as many others would be, once the studio integrated Redshift as their main render solution, with 100 GPUs replacing most of their former CPU nodes.

Recently, Muse VFX was brought onboard MTV’s hit series Teen Wolf, where the team was tasked with two especially challenging sequences to bring the series to a close. For the first, the goal was to help break the show’s heroes out of stone shells in which they were encased by the show’s villain, the Anuk-ite. This sequence was complex, requiring a combination of rigid body, particle and fluid simulations. It would warrant multiple iterations and client revisions but, to fit with the studio’s running theme, also had to be completed quickly. The second sequence dealt with the defeat of the Anuk-ite, in which a substance called Mountain Ash would surround the villain in a “dark creeping vortex cloud,” and eventually turn him to stone. The shot required expertise in creating a variety of dynamic effects that interacted cohesively. Bredereck knew that Redshift and its recently released Houdini integration would be perfect for the job.


The combination of Redshift and Houdini was especially suited for these difficult sequences because it allowed the team to directly render volumes of fluid simulations at incredible speeds. Using Redshift’s native volume shader, the team was able to render everything in one engine with one lighting setup. Since they knew any completed work would likely need further iteration or art direction, this added a refreshing efficiency to the look development process that allowed them to direct their focus towards the quality of the work, rather than towards worrying if it would be completed on schedule.

The quick turnaround was one of the most significant obstacles facing the team, as they were granted only two weeks of post-production per episode. John Gross, the studio’s founder and VFX Supervisor on the show, would address any aesthetic changes with his team on site, then watch as his requests were implemented right before his eyes, in near real time. This enabled Gross and the client to choose, without delay, the option that best fit the show. With Redshift, Bredereck states, they could “render 3 different looks overnight, where with our CPU renderer we would have barely finished one version.” The artists were also able to push the boundaries of their FX workflows with Redshift’s capacity to handle complex lighting and global illumination alongside volume effects. As Bredereck explains, “using global illumination in combination with glossy surfaces, frosted reflections and complex volume renders of fluid simulations… and not waiting 5 hours for a frame or getting a super noisy result, that is just amazing.”


The effects industry is ever-evolving, and with it, prominent VFX houses like Muse face growing requirements for powerful, effective rendering solutions. Studios will need more machines, more render power and more room, yet will likely keep facing the same tight turnaround times. Grabbing more machines might seem like a quick fix, but eventually space runs out and studios are left facing the same issues, perhaps on an even grander scale. Muse VFX has effectively resolved this by switching over to GPU rendering. The studio currently runs quad-GPU render nodes, each with four Pascal Titans, and is finding that Redshift on a single Titan can still outperform as many as nine dual-CPU nodes. According to Bredereck, “the benefit of using any number of large textures and billions of polygons and not being limited to just the GPU memory are the features that get us through the work day and enable us to deliver on time.”


Since the wrap of Teen Wolf, Muse continues to take on high-volume projects with Redshift. The studio has taken the time to shop around for any and all GPU rendering options to make sure that they move forward with the one that is best for them. “At the end of the day,” says Bredereck, “our philosophy is to use the right tool for the job. Based on the project, that may be Maya, 3ds Max or Houdini, but in all cases, it is always with Redshift these days.”

All images in this article are Copyright 2017 MTV.

Working with Gods: Tendril create a unique story for American Gods


Neil Gaiman’s bestselling, Hugo Award-winning epic novel American Gods has entranced readers across the globe with its blend of classic American road trip and supernatural folklore, and now it’s the turn of Canadian animation studio Tendril to entrance us with its five-minute opening sequence for the American Gods series.

In the fifth episode, which was directed by Vincenzo Natali, the ‘Coming to America’ animated sequence tells the tale of the holy woman, Atsula, and her relationship with her God, Nunyunnini, as her tribe arrive on the new continent after being led across the land bridge from Asia.

With a background in animation, Vincenzo chose to work with the design-driven studio Tendril to create a striking, character-based story that relied on stylised designs for the main protagonists set within naturalistic environments.

Tendril was brought on board in June 2016 during pre-production which enabled the concept and development teams to collaborate on the overall design process. This time also allowed the studio to test potential rendering solutions before production ramped up in September for a December delivery.

While Tendril had been using Redshift for design pitches – where the speed of delivering styleframes in five minutes had been a boon – the team was unsure about using a GPU-based solution for the detailed landscape environments that were a key component of Coming to America.

Tendril’s lighting and texture artists, Christian Hecht and Alex Veaux, worked on the style frame scene that was used to test Redshift. “It was quite a complex scene right off the bat! It was a full-on forest with ground cover, displacement on the bark, full volumetrics. Everything basically,” says Christian. “I was like, ‘GPU? I’m not sure!’”

“Christian took some assets and quickly put together a forest style frame [in Redshift] and we were like, ‘I think it’s going to work!’” adds Alex.

While speed is a common reason that many artists use GPU rendering, Redshift was chosen for Coming to America as it could combine speed with reliable handling of the large datasets that the creative team would generate.

“Because Redshift has a very clean and stable implementation in Maya, it was much more reliable than any of the other render engines that we had used,” Alex explains.


Due to the highly stylised nature of Coming to America, even though Redshift’s interactive preview allowed the artists to quickly see the scene, compositing was still a large factor. “It was probably one of the heaviest comp projects that we had done,” explains Alex. “Tons of AOVs and mattes. The good thing is that it was easy to set up.” Mattes especially had been difficult to create in Tendril’s CPU-based rendering software. “It was a nice surprise [with Redshift] – simple AOVs.”

The texture detail of the stylised animation is a standout feature of Tendril’s work. The characters had up to 90GB of textures each, spread over six UDIMs for each part of the model, including all of the authentically detailed clothing and props.

“Redshift did a really great job of tiling the textures,” adds Alex. “With previous renderers of course you have that, but it is not really tightly integrated. So it was great to see this in Redshift.”


As the project schedule was tight, Tendril had to improvise a GPU-hardware solution.

“We tried using cloud-based GPU rendering, but the scenes were just way too big to consider a cloud solution. So it had to be in-house with workstations managed by Deadline render manager,” says Chris Bahry, creative director at Tendril.

Tendril set about filling as many workstations as they could with dual GPUs. “We scaled up from just having a few workstations with this spec to outfitting 18 workstations,” says Chris. Most of the workstations ran dual NVIDIA GTX 980 Tis, with a few running NVIDIA Titans for good measure.

“I had a Titan in my box with one 980 Ti,” Alex explains. “It was great, as I could see if we were going out of core [with the GPU memory] or not. It was a great benchmark. I found it very useful to have one [card] that is a little bit bigger, so that I could have some breathing room if needed.”


At the outset, Tendril had calculated that an hour per frame was an acceptable time for the show. “The Bison god was easily an hour and a half,” comments Alex. This was offset by many of the character closeups, which took only 15-20 minutes per frame.

Tendril’s previous frame time on shows using a CPU-based render solution had also been around an hour per frame, but when comparing the output, Alex noticed that there was a marked difference with rendering in Redshift: “These scenes were fifty times bigger than anything we had done before with our previous render solution. Even though the render time was the same, the density and quality of the frame was so much more… We had never done hair before, or direct volume rendering with the beauty [pass].”

Another benefit above CPU-based rendering was the ability to easily add effects. “We had never done in-camera depth of field before either, as it was just too slow. We always did depth in post. In this, it was all in camera. That’s huge!” says Alex.


Now that there has been time to reflect on the process for Coming to America, Tendril has invested in a dedicated GPU farm and is moving forward with Redshift on other projects.

Tendril kept Redshift’s support team on its toes, and was pleased with the fast responses and fixes that were delivered. “The guys were submitting bug reports and getting fixes during the project,” says Chris. “In a matter of days you would get stuff fixed, so that was really nice, with a really active forum,” adds Christian.

“[The Redshift forum] has become the best Maya forum!” jokes Alex. “Everyone on there is pretty talented.”


The GPU renderer managed the resource-rich project with no problems. “If we can create an animation like that [in Redshift], it speaks to how robust [Redshift] is,” says Christian, reflecting on how Redshift performed with the heavy datasets. This robustness did not rely solely on the proven implementation of Redshift within Maya; many of the effects, such as the snow, smoke, fire and embers, were done using Houdini.

Using the alpha of the Redshift Houdini plugin, the team at Tendril rendered out the sims directly within Houdini. This saved time on production as it meant there was no need to translate the Houdini files into Maya.

“The very last shot is a Redshift alpha in Houdini; the one with the snow blowing. Because it was totally isolated, we were able to say, ‘Alright, let’s do it all in Houdini’,” explains Alex.

Redshift had enabled Tendril to create the dark and brooding Coming to America animation within the production schedule, with minimal creative or technical compromises. This was noticed by the director Vincenzo Natali when he arrived for a review session. Molly Willows, Tendril’s director of communication, remembers his comment: “Vincenzo said, ‘I couldn’t believe it, I came in to check on the progress and everything was rendered!’”



Chocolate Tribe was responsible for over 90% of the 140 VFX shots on Robot & Scarecrow, the latest short film from acclaimed UK director Kibwe Tavares. The film was co-produced by London-based production houses DMC Film (owned by Michael Fassbender and Conor McCaughan) and Nexus Studio. Starring Holliday Grainger and Jack O’Connell as the titular Robot and Scarecrow, this beautiful film tells the tale of a robot pop performer who meets a lonely scarecrow at a festival, and the two embark on a whirlwind of experiences.

The production team approached Chocolate Tribe, a VFX studio based in Johannesburg, South Africa, to produce the cutting-edge visuals needed for Robot & Scarecrow. While this may seem like a radical geographical departure, Chocolate Tribe was uniquely suited to handle the work required for the film. The principal team from Chocolate Tribe on Robot & Scarecrow, Rob Van den Bragt and Tiaan Franken, have decades of experience in both the South African and London visual effects industries; Rob was a VFX supervisor with The Mill for close to ten years. They could bring the best of both these communities together for the eight-month production schedule.


Chocolate Tribe showed its versatility from the inception of the process, as the original plate photography had been shot three years before, at the Secret Garden Party festival in the UK. Working with pre-existing plates meant that Chocolate Tribe needed a proven and robust render solution.

The combination of Redshift’s GPU-powered speed and biased workflow made the decision an easy one, as Tiaan explains: “Redshift’s turnaround time on rendering and testing made a dramatic difference: you could spend your time crafting your art, rather than waiting on the renderer.”

Rob continues: “The capability of the renderer was of utmost importance. Yes, Redshift reduced our rendering overhead, but more importantly the render output was amazing. We ran render tests with various rendering solutions. Together with the HDRIs from the set, the lighting model of Redshift with its physically correct shaders matched the unbiased renders in the majority of cases. Truth is, biased renderers have been overshadowed a little over time by unbiased renderers, but in 95% of all cases the difference is so marginal that it often isn’t worth the substantial extra render time. What matters more is the physical correctness of the shaders and lights.”

Although Chocolate Tribe had eight months to deliver the project, due to the sheer volume of shots required, Rob was aware of the challenge that they had ahead. “It seems like a lot of time, but it’s not like you have two weeks to craft every shot. So we obviously needed something that worked fast, without sacrificing quality. Redshift delivered frames in two minutes; sometimes in less than a minute.”

These quick frame times were not the result of any compromises when using Redshift. “These were with all the bells and whistles: we had depth of field, motion blur, GI, ray-tracing, SSS and high sampling even in our test renders,” Rob explains.


Chocolate Tribe found Redshift added economies of scale to workstations as well. Running dual NVIDIA GTX 980 Tis would yield double the render speed of the significantly more expensive NVIDIA Titans running in single-card configurations.

Early on in production, when more GPUs were needed, NVIDIA released its new Pascal-based graphics cards. Chocolate Tribe found that they only needed the ‘mid-range’ NVIDIA GTX 1070 when adding cards, at half the price of the original GTX 980 Tis.

“The 1070s actually had a bit more memory, so we could push the envelope even further….and you saved a few grand,” notes Tiaan.

Having worked with the GPU-based renderer on previous productions, the team decided that Redshift would suit Chocolate Tribe’s production methodology and requirements well at this level of production. Integration was seamless, and when compared with alternative render solutions, the biased renderer’s output held its own comfortably.

“We were using Maya and immediately [Redshift] felt right at home. It is all about workflow for us. We program our pipeline using Python and MEL; and it was easy to accommodate Redshift,” explains Tiaan.
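The kind of Python pipeline glue Tiaan mentions can be sketched in a few lines. This is an illustrative assumption, not Chocolate Tribe's actual code: inside Maya the setter would be `maya.cmds.setAttr`, and `defaultRenderGlobals.currentRenderer` is Maya's standard render-globals attribute; the setter is passed in as a parameter so the logic can be exercised outside a Maya session.

```python
# Hypothetical sketch of pipeline glue for switching a Maya scene to Redshift.
# Inside Maya, pass maya.cmds.setAttr as set_attr; the attribute name below is
# Maya's standard render-globals setting. The wrapper itself is an illustrative
# example, not the studio's pipeline code.

def set_current_renderer(set_attr, renderer="redshift"):
    """Point Maya's render globals at the given renderer."""
    set_attr("defaultRenderGlobals.currentRenderer", renderer, type="string")
```

Because the Maya call is injected, the same function can be driven from a shelf button, a batch script or a render-submission tool without modification.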

While Chocolate Tribe has used the Redshift plugin with Houdini, the team didn’t need it for Robot & Scarecrow. Tiaan singled out an easy way artists can switch between applications that use Redshift without the need to buy additional, application-specific plugins:

“That’s the amazing thing we love about [Redshift]; I could use my same license and jump to another platform. I didn’t have any licensing issues. It gives you the ability to explore other platforms without costing you double.”


Redshift’s ability to integrate into a visual effects pipeline was also critical when it came to the final look of Robot & Scarecrow. Allegorithmic’s Substance Painter was used to create the textures. “The workflow between Substance Painter and Redshift was mind-blowing! After you had tested your shaders with the preview in Substance Painter, you would get exactly the same result [in Redshift] by just hooking up the textures properly,” says Tiaan.

Chocolate Tribe was able to utilise every part of its creativity when designing shaders for Robot & Scarecrow as Tiaan explains: “We had 4K textures on multiple outputs, normal maps, displacement maps – whatever you needed. On the Robot alone, she had 800+ pieces of geometry. We were concerned at render time that Redshift may fall over with all the textures, and it never happened.”

Rob agrees: “We never down-resolved our textures. We had 8K and 4K textures, we were throwing 1000s of textures at Redshift…I actually still don’t understand what Redshift was doing. It just handled it, I couldn’t believe it!”

Even when the shots were close-ups of Robot’s face, which consisted of a dynamic facial screen driven by a sub-surface scattering shader, render times were only slightly affected. “Normally if you go extremely close with sub-surface scattering on the face, you would expect to jump into greater render times. But it never really went beyond ten minutes a frame,” says Tiaan.

Although Scarecrow’s render times were much quicker than Robot’s, there was a large amount of texture-based displacement in the model. According to Tiaan, Redshift’s displacement tools were up to the task, especially the global parameter for displacement settings: “The simple thing of just selecting your geometry and controlling all of your displacement within one node. It’s such a timesaver.”



When it came to final render, Tiaan programmed a solution within Maya and Redshift to adapt the render samples needed per shot.

“We programmed our default preset to a base level. On a close-up shot, if we needed more samples we would turn it up. Off the bat, 80% of the time, our default preset held up,” he says.

“This was with global illumination, depth of field and motion blur,” adds Rob.

“That was a huge thing for us,” Tiaan explains. “Working with rendering you often get stuck with doing motion blur in post, and it’s horrible. Same thing with depth of field. We could tweak all of this [using Redshift] in real time in our camera view. Doing everything on the fly was just so much fun.”
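The per-shot preset logic Tiaan describes might look roughly like this. The shot categories and sample counts are illustrative assumptions, not the studio's actual values; in production the chosen numbers would be written into Redshift's sampling settings in Maya.

```python
# Hypothetical sketch of a per-shot sampling preset, loosely modeled on the
# approach described in the article: start from a studio-wide default and only
# raise samples for shot types (e.g. close-ups) that need it. All values here
# are illustrative assumptions.

DEFAULT_PRESET = {"min_samples": 4, "max_samples": 64}

# Overrides for shot types the default preset does not hold up on.
OVERRIDES = {
    "closeup": {"min_samples": 8, "max_samples": 256},
}

def samples_for_shot(shot_type):
    """Return the sampling settings to use for a given shot type."""
    preset = dict(DEFAULT_PRESET)
    preset.update(OVERRIDES.get(shot_type, {}))
    return preset
```

The override table keeps the common case (the default holding up "80% of the time") untouched while making the exceptions explicit and easy to audit.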

This approach meant that Chocolate Tribe minimised the number of passes required when it came to compositing. “It was five passes max. We tried to do everything straight in the beauty. The main reason for this is that the final render just looks better when you do it all for real,” says Rob.


At every one of the weekly production Skype calls, Chocolate Tribe delivered a new edit with at least ten revised shots, a number that grew to more than fifty as production ramped up.

“We would continually keep them in the loop; it really helps to keep the client’s confidence levels high, with clear communication as they are seeing progress,” says Rob.

Tiaan also noted that as the pipeline became more streamlined towards the end of the production, Chocolate Tribe’s technical capability really came into its own: “We could start with a clean plate, with a final track, pulling in the rigs, caching them out and auto-assigning the shaders and light rigs. This is a whole process that we have automated. Literally from nothing to something, you could get there in ten minutes, kick a render off and see where you are at. This quick turnaround of shots allowed us to get a quick preview of a multitude of shots in hours, or even minutes.”
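The automated "nothing to something" build Tiaan describes can be sketched as an ordered list of steps run per shot. The step names and the callable-injection pattern are illustrative assumptions; the real pipeline is Maya-specific Python/MEL.

```python
# Hypothetical sketch of an automated shot build: plate and track in, rigs
# cached, shaders and light rigs auto-assigned, a preview render kicked off.
# Step names are illustrative assumptions, not the studio's actual tooling.

SHOT_BUILD_STEPS = [
    "load_clean_plate",
    "apply_final_track",
    "import_character_rigs",
    "cache_rig_animation",
    "auto_assign_shaders",
    "attach_light_rig",
    "submit_preview_render",
]

def build_shot(shot_name, runners):
    """Run each pipeline step in order; `runners` maps step name -> callable."""
    log = []
    for step in SHOT_BUILD_STEPS:
        runners[step](shot_name)  # execute the step for this shot
        log.append(step)
    return log  # ordered record of what ran, useful for pipeline logging
```

Keeping the step order in one data structure makes it trivial to insert, skip or reorder stages as the pipeline evolves, which matches the quick-iteration workflow described above.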

When Kibwe Tavares visited Chocolate Tribe’s offices in Johannesburg, he was amazed at the speed that Chocolate Tribe was able to iterate through shots. Redshift was naturally a big part of this. “He was knocking doors down going, ‘Where is the render farm that you are hiding?’,” laughs Tiaan.

“I don’t think Redshift realise what a ‘shift’ they have created in the industry. They have redefined how the rendering process and pipeline can work,” says Rob. “Redshift has opened up a world of creativity, experimentation and artistic finessing.”

Chocolate Tribe chose their name to reflect the fact that they are a community of dedicated and passionate creatives from all walks of life with a proud African vibe. This pride is embodied in the work they have created for Robot & Scarecrow, not just for its artistry and technical achievement, but also because it proves that the South African visual effects industry can facilitate the best directors and production companies anywhere on the planet.