Substance Designer: Making Incredible Materials with Daniel Thiger aka “Dete”

Pierre Bosset on August 3 2017 | Substance Designer, Stories, Game, Film/VFX

Daniel Thiger is an experienced artist in the game industry, but he only recently started picking up Substance Designer. By dedicating his spare time solely to learning it, he managed to master the software and achieve mind-blowing results pretty fast. In this interview, you will learn more about his workflow, and maybe even get a tip or two!

Hey, Daniel! Thanks for agreeing to do this interview. Could you introduce yourself to the community?

My pleasure, thanks for having me. My name is Daniel Thiger, I grew up in Gothenburg, Sweden, and am currently residing in Seattle with my wife Bella.

What is your background?

I’ve been working in the game industry since 2005. My career started at DICE in Stockholm, Sweden. Over the years I held many different roles, from Environment Artist and Concept Artist to Technical Art Director.

I worked on most Battlefield games released during that time, the last one being Battlefield 3.

In 2011, I moved to Seattle to work for Bungie as a Lead Environment artist on Destiny and Destiny 2.

What are your sources of inspiration?

Being an environment artist, I can get inspired by almost anything around me from nature to architecture to artwork on websites like ArtStation. Natural materials offer an infinite diversity when it comes to shape and material response, so personally, those have always been the most interesting to create.

I’m also fortunate in that I work with a bunch of very talented artists at Bungie, which is a continuous stream of inspiration.

How did you discover the Allegorithmic tools?

I was introduced to the tools during an Allegorithmic presentation at work a few years ago. At that time, we evaluated it mostly as a tool to help us automate certain tasks and not really as a content creation tool.

Substance Designer and Substance Painter kind of fell off my radar until I ‘rediscovered’ them not too long ago, and was blown away by the art that people were creating with it.

So in my spare time, I decided to finally try to learn Substance Designer properly. I started by watching all the video tutorials and breakdowns that I could find, even ones where the subject matter wasn’t of particular interest to me. There is always some nugget of information you can pick up.

You recently created an ArtStation channel just to post creations made with Substance Designer. Could you tell us more about it?

I pretty much set up the ArtStation account to document my own learning experience and to help me stay consistent in quality. In order to learn and understand the capabilities of Substance Designer, I wanted to use it exclusively to create my materials and not rely on other pieces of software to bail me out.

Most of my career has been focused on creating natural environments and terrain, so I was curious about recreating those kinds of materials.

With so much 3D scan data around these days, there is a unique opportunity to use it as reference for nuances in shape, diffuse, roughness, and normals. You can really zoom in on details and try to match your Substance textures more closely to real-life examples. 3D scan quality is almost impossible to match, but it’s been an interesting experiment and challenge to see how close I can get with 100% Substance Designer generated materials.

To me, the advantage of the Substance Designer workflow is that it’s scalable. Once you have a finished substance, just by tweaking a few parameters, you can easily create endless diversity. It opens up new opportunities for giving and receiving feedback. For example, art directors now have the ability to give feedback even after something is finished, which was previously almost impossible when dealing with baked or sculpted textures.

How much work do you usually put in creating one Substance material?

It depends. Since it’s my pastime entertainment, and I’m using my evenings and weekends, I don’t really keep track of time. But typically something like 10-15 hours, depending on the complexity of the material. There is a lot of trial and error involved, since what I like one day I might dislike the next, but ultimately it’s based on when I think it’s completed.

When a Substance material is finished, I like putting together a simple scene in Marmoset to contextualize my material. It’s a useful tool to help me catch smaller issues that I may have overlooked.

Could you make a breakdown of one material you find particularly interesting?

Sure! Most of my materials are rocks, so let’s walk through one of those. This material is a desert rock slab with small rocks and pebbles scattered around. The original reference was taken by the Curiosity Rover on Mars.

Slab

The first step is to create a separate graph for the main slab shape that will be the base for this material.

Instead of setting up a series of shape control parameters, I rely on the ‘Random Seed’ function to create the diversity I need here.

In the main graph, I use three instances of the newly created Slab node. Just by tweaking the Random Seed slider, I quickly generate shapes that I like. These get connected as pattern inputs on a Tile Sampler node. Tweaking scale, position, rotation, and color generates my main shape.
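As a rough illustration of the general idea (not Dete’s actual graph, and certainly not Substance Designer’s internals), here is a minimal numpy sketch of what a Tile Sampler-style scatter driven by a single seed might look like. The `tile_sample` helper and all of its parameters are hypothetical, and rotation and mask inputs are left out for brevity:

```python
import numpy as np

def tile_sample(pattern, out_size=512, grid=4, seed=0,
                scale_jitter=0.3, pos_jitter=0.25, value_jitter=0.4):
    """Stamp a square grayscale `pattern` onto a grid, randomizing scale,
    position, and value per cell -- a crude stand-in for a Tile Sampler."""
    rng = np.random.default_rng(seed)          # the 'Random Seed' slider
    out = np.zeros((out_size, out_size), dtype=np.float32)
    cell = out_size // grid
    for gy in range(grid):
        for gx in range(grid):
            size = max(2, int(cell * (1.0 - scale_jitter * rng.random())))
            # nearest-neighbour resize of the pattern to the stamp size
            idx = np.linspace(0, pattern.shape[0] - 1, size).astype(int)
            stamp = pattern[np.ix_(idx, idx)] * (1.0 - value_jitter * rng.random())
            # jittered placement, clamped to the canvas
            x = int(gx * cell + pos_jitter * cell * (rng.random() - 0.5))
            y = int(gy * cell + pos_jitter * cell * (rng.random() - 0.5))
            x = int(np.clip(x, 0, out_size - size))
            y = int(np.clip(y, 0, out_size - size))
            out[y:y + size, x:x + size] = np.maximum(out[y:y + size, x:x + size], stamp)
    return out
```

The point of the sketch is the single `seed` argument: re-rolling it regenerates every random decision at once, which is the spirit of leaning on the Random Seed slider for shape variation.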

Pebbles

I then create another separate graph for the small pebbles using the same technique as I did for the slab shapes. I use three different Tile Samplers to generate different scale pebbles around the slab shape. The Mask Map input is heavily in use to make sure there is minimal overlap of pebbles onto the main shape.

Sand

The last element to this material is the sand component, which is just an inverted cell noise with some warping applied.

Detailing

Warping, detailing, and layering are added to the slab shapes before combining all of the components together using the Height Blend node. This node is extremely useful, as it doesn’t just output the blended result, but also a mask for how the inputs intersect. These masks can then be used when working on diffuse, roughness etc.

Diffuse

For Diffuse, I create the sand and rock detail separately and then blend them together using the output masks from the Height Blend nodes.
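For readers unfamiliar with the node, a height blend of this kind can be pictured as a per-pixel maximum of two heightmaps plus a soft mask describing which input ends up on top. The sketch below only approximates the idea (it is not the node’s actual math), and `height_blend` with its `contrast` parameter is invented for illustration:

```python
import numpy as np

def height_blend(height_a, height_b, contrast=0.1):
    """Blend two heightmaps and also return a mask describing where B sits
    above A, loosely mirroring the Height Blend node's extra mask output."""
    blended = np.maximum(height_a, height_b)
    # soft mask: ~1 where B dominates, ~0 where A dominates, with a small transition
    mask = np.clip((height_b - height_a) / max(contrast, 1e-6) + 0.5, 0.0, 1.0)
    return blended, mask

# The same mask can then drive the color blend, e.g.:
# diffuse = rock_diffuse * (1.0 - mask[..., None]) + sand_diffuse * mask[..., None]
```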

Which nodes do you usually use in your workflow?

Almost all of my materials start with the Tile Sampler to generate the main shapes. With all of its input parameters, you can customize the output in any way desirable. It’s crazy powerful. I tend to use all of the available warp nodes and also Slope Blur for details and shape manipulation.

When I need to blend shapes together, I find Height Blend to be among the most useful nodes.

For diffuse and roughness creation, the obvious superstar is the Gradient Map; it can sometimes be the only thing you need. I also tend to use Dust a lot; it can be used to brush surfaces with a layer of sand, snow, or dirt.
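Conceptually, a gradient map is just a lookup: grayscale values (often the height) are run through a color ramp. Here is a minimal numpy sketch of that idea, with a made-up `gradient_map` helper and purely illustrative ramp colors:

```python
import numpy as np

def gradient_map(gray, ramp):
    """Map a grayscale image (values in 0..1) through a color ramp.
    `ramp` is an (N, 3) array of RGB keys, evenly spaced along 0..1."""
    n = ramp.shape[0]
    pos = np.clip(gray, 0.0, 1.0) * (n - 1)
    lo = np.floor(pos).astype(int)
    hi = np.minimum(lo + 1, n - 1)
    t = (pos - lo)[..., None]
    return ramp[lo] * (1.0 - t) + ramp[hi] * t   # linear interpolation between keys

# e.g. an illustrative sand-to-rock ramp applied to a height output:
# ramp = np.array([[0.82, 0.69, 0.52], [0.55, 0.42, 0.33], [0.35, 0.28, 0.24]])
# base_color = gradient_map(height, ramp)
```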

What tips would you give to artists who want to start out with Substance Designer?

Since my job includes building most of the environment art shaders we use at Bungie, I have a strong node-based background, so I felt right at home in Substance Designer. But even so, I pretty much started learning the program from scratch.

I guess for a new user, it might be daunting to see all of the nodes and confusing to interpret how they are best combined. Watching tutorial videos of anything made in Substance Designer was a very useful learning tool for me. You will quickly get exposed to many of the nodes and different ways of working with them. Another great way to quickly learn is to dissect other artists’ graphs; Substance Share is a great resource for this.

Once you start building your own materials, I think it’s important to focus on the shape (height) first. Try not to get distracted by details like diffuse, roughness, etc. until later on. That’s why when creating a new material, my base setup is just medium gray for diffuse and roughness, which helps me focus before diving into details.

It’s important that your height information is as accurate as possible, as inaccuracies can hurt you later down the line. Subtle differences in height can be hard to catch by just looking at the height information. However, with tessellation enabled in the viewport, they are very easy to catch. That’s why I recommend enabling tessellation in the 3D viewport even if you’re not considering using it as part of your project. It will help you understand how different shapes relate in terms of height. For instance, if you have vines growing on top of a rock, you want them to hug the surface and not sit meters above it.

Are there any features you would love us to implement?

Something I find myself wanting to do is to warp/bend shapes or details in a specific direction. There are quite a few ways to do this already, but they don’t really offer any precise control. I find myself wanting to use some kind of lattice tool or the ability to bend something along a curve, kind of like the Puppet Warp tool in Photoshop.

It would also be interesting to be able to do World Machine-style erosion. I recently took a stab at creating something like it, which turned out OK but still pretty far from what can be achieved with World Machine.

Terrain creation is already possible within Substance Designer, but what sets World Machine apart is its erosion filters. I think with that capability integrated, Allegorithmic could seriously tap into the terrain generation market.

What are your next projects?

I think I’m barely scratching the surface of what Substance Designer is capable of, so to further explore and learn, I plan to continue making more projects like the ones you can find on my ArtStation account.

Other than that, I’m working with a couple of friends on a hobby project for mobile devices. It’s a turn-based strategy game set in World War 2 called Day of Victory. Being the only artist on the project, I tried looking for ways where Substance Designer could boost my output.

For instance, we needed a few medals to be built. Traditional 3D modeling/rendering would have been sufficient but time-consuming. Instead, with a Substance I created, we can now generate as many medal permutations as needed with very little effort.

Which Substance artist inspires you and would you like us to interview next time?

The high quality work of Chris Hodgson, Bradford Smith, and Peter Sekula really caught my eye when I first picked up the program. They continue to be great sources of inspiration.

An old colleague of mine just started an ArtStation account dedicated to Substance Designer experiments. His name is Eric Wiley and he is someone to look out for.

Lastly, could you send us a picture of you and your working desk?

Chris Hodgson, One With The Substance

Vincent Gault on November 22 2016 | Substance Designer, Stories, Tutorials

While you see objects and colors, Chris can directly read the Matrix code: his ability to interpret patterns and reproduce them via Substance Designer is simply impressive. So let’s talk a bit more with Chris Hodgson, the man who reads the code: we may learn one or two tricks during the process 😉

You can find his ArtStation right here.

Hey Chris, it’s a pleasure to talk with you! As we always do before delving into the heart of the interview, can you introduce yourself to the Substance community?

My pleasure. I’m Chris Hodgson, I grew up in Pontefract, West Yorkshire, in the UK and studied computer animation at Teesside University. After graduating I got my first job in the games industry and since then I’ve worked as an environment artist on a number of projects large and small including Watch_Dogs, Tom Clancy’s The Division and Payday 2.

While my actual job title is still environment artist, I have become more and more interested in texturing over the years, and now it is easily my favorite aspect of what I do.

You are one of our avid Substance Designer power users, creating awesome Substances on a regular basis, but do you remember what first led you to discover and use our tools?

I had known about the software for a year or two but it wasn’t quite on my radar because at the time it wasn’t industry standard and I simply didn’t know many of my colleagues or friends using it. (That has all changed now, of course!)

Then just over two years ago, a lot of high quality texture work using Substance Designer started to appear on various forums and game art community sites. Hugo Beyer and Josh Lynch are two of the culprits behind this impressive work and I quickly realized how powerful the software actually was and that now was the time to start working with Substance Designer.

After months of using Substance Designer in my spare time after work and at lunch, I posted my first finished Substance on ArtStation and I have never really looked back since. Now I can’t walk ten feet outside without seeing a potential material I can replicate lurking in a patch of mossy grass or a cobbled street.

What do you think are the main strength(s) of Substance Designer compared to other texturing applications?

I’d say that the major strength that keeps me coming back to Designer is its very non-destructive nature. Placing and arranging various 3D assets into a tiling pattern and then baking the result to a flat plane has been the default way of creating tiling textures for a good number of years now. Whilst I enjoy this approach and the high quality results it delivers, once complete it is very difficult to change aspects of your texture. Just adding an extra row of bricks in a tiling brick material for example can be a lengthy task as you have to delete and then manually replace your individual brick meshes into position once again.

Substance Designer allows these kinds of modifications to become as simple as moving a couple of sliders left and right. This increase in speed enables me to be more creative with the composition and content of my textures and the work is very easily reused in other substance materials in the future. I also just find it to be really good fun seeing the results of your work so quickly.

Do you think there are still some misconceptions about Substance Designer and procedural art in general?

Definitely. There is a lingering misconception that procedural art is somehow less creative and I’ve even heard the opinion that it is somehow taking control away from the artist. Personally, I’ve never had more control or felt as creative as I do when I am creating textures with Substance Designer. Luckily the proliferation of successful Substance pipelines in top studios around the world speaks for itself and is slowly putting these misconceptions to bed.

I believe these kinds of attitudes can arise from a fear of change rather than something inherent to Substance Designer specifically.

With a good amount of industry experience and a developed workflow, people can find it hard when a new piece of software drastically increases productivity and quality and makes a part of their workflow obsolete. This is understandable to a degree, but I don’t believe that it is a sustainable attitude to have in the games industry due to the speed at which it evolves. I really enjoy the constant development of the tech and tools, and it is something that is no doubt here to stay.

While you could sometimes decide to rely on bitmaps for small details (like leaves), you generally go 100% procedural: what are the pros (and potentially the cons), in your experience?

In production I use both methods for creating the smaller details but it really comes down to if it will be quicker to build them in Substance Designer or in an external application. Usually it’s best to build complex shapes externally and bring them in as height maps, but with simpler details you can go ahead and build them right in Designer.

The art that I post to ArtStation usually uses a more 100% procedural approach because I really like the challenge of keeping everything inside Substance Designer. I always learn new things when I take this approach and every new technique I learn by pushing myself in this way helps everything I make going forward.

What do you do when you don’t texture?

I like to spend time with my friends in the center of Stockholm and when I’m not doing that I like to play games and watch films. I’ve also recently gotten into photography and when I get the time I like to travel to new places to put the camera to good use.

Who would you like us to interview next?

I’d love to hear from Bradford Smith, especially something about his workflow!

Finally, what about making a breakdown of one of your Substances for the community to learn from your workflow?

Over the course of creating more than a few rock Substances, I have discovered a reliable way to get a good base to start with. I use simple gradient maps multiplied together to get a shard that can then be scattered around with Tile Sampler nodes. If you blend the two gradient maps with the darken (min) blend mode rather than multiply, it gives slightly different but still pleasing results.
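To see why the two blend modes read differently, here is a tiny, purely illustrative numpy comparison using two plain linear ramps in place of real gradient map outputs: multiply gives a smooth falloff toward the edges, while darken (min) simply keeps whichever gradient is lower and produces a harder crease:

```python
import numpy as np

x = np.linspace(0.0, 1.0, 256)
grad_a = np.tile(x, (256, 1))            # left-to-right ramp
grad_b = np.tile(x[:, None], (1, 256))   # top-to-bottom ramp

shard_multiply = grad_a * grad_b              # 'multiply' blend: smooth falloff
shard_darken = np.minimum(grad_a, grad_b)     # 'darken (min)' blend: harder crease
```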

In order to give the base rock a little more realism, I warp the shard I made in the previous step with “crystals 1b”, a custom node that allows me to alter the size of the crystals normally generated with the regular “crystals” node. I actually warp 3 times using this method with different crystal scales in order to get good variety in the details.

Now I use the tile samplers to position the shards with various amounts of randomness in their scale, value, and rotation. The two tile samplers position the shards at slight angles to each other to give good directional flow to the rock surface going forward. The first tile sampler is then used to cut into the second using the darken blend mode. I use a Histogram Range node to position the cut at the height that looks the most visually pleasing. I take the result, then offset and darken again in a similar way as before to get another cheap set of details into the rock’s highest points.
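As a loose illustration of the cutting step: the darken blend is just a per-pixel minimum, and the Histogram Range node can be thought of as remapping values into a narrower window so the cut lands at the desired height. The helper below is a made-up approximation of that behavior, not the node’s exact implementation:

```python
import numpy as np

def histogram_range(img, position=0.5, width=0.5):
    """Very loose stand-in for a Histogram Range node: squeeze the values
    into a window of the given width, centred around `position`."""
    low = np.clip(position - width * 0.5, 0.0, 1.0)
    high = np.clip(position + width * 0.5, 0.0, 1.0)
    return low + np.clip(img, 0.0, 1.0) * (high - low)

# layer_a, layer_b: two tile-sampler-style height layers (2D arrays in 0..1)
# cut = np.minimum(layer_b, histogram_range(layer_a, position=0.7, width=0.6))
```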

This next step includes more cutting away at the rock shape. However, this time I also add some rock details into the lower areas of the height map using the lighten (max) blend mode. I also do some slight blurring in the lower areas to keep the texture from having such harsh height changes. It helps the rock surface look more worn and eroded.

With the large scale shapes complete, I move onto smaller and smaller surface details. I blend in a small amount of grunge map 4, subtract some scratches and I build a complex noise pattern with medium to small details that is then used right at the end to directional warp and slope blur the rock height. This is the stage where the rock surface really starts to feel realistic with a range of shape variations from large to small.
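For anyone curious what a directional warp amounts to: it pushes each pixel along a chosen direction by an amount driven by a second grayscale input. The numpy sketch below is a simplified, integer-offset approximation of that idea (the real node interpolates and handles tiling), and `directional_warp` with its parameters is hypothetical:

```python
import numpy as np

def directional_warp(img, intensity_map, direction=(1.0, 0.0), strength=12.0):
    """Shift pixels of `img` along `direction` by an amount driven per pixel
    by `intensity_map` (values in 0..1). Integer offsets keep the sketch short."""
    h, w = img.shape
    yy, xx = np.meshgrid(np.arange(h), np.arange(w), indexing="ij")
    dx = (direction[0] * strength * intensity_map).astype(int)
    dy = (direction[1] * strength * intensity_map).astype(int)
    src_x = np.clip(xx - dx, 0, w - 1)
    src_y = np.clip(yy - dy, 0, h - 1)
    return img[src_y, src_x]
```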

Next comes some more cutting into the shape, using the darkening technique from earlier.

This is the last stage editing the height information before it is output to the normal map. I always add the smallest details right at the end. I use moisture noise to directional warp the surface a tiny amount and add in some final micro details with black and white spots 2.

Now onto the rock color. I warp the final height data using the complex noise pattern from earlier in the graph and pass the result through two different gradient maps. I then use a mask created from two of the preset grunge nodes to blend between the two. I also create a simple dirt gradient map in a similar way that will be blended in the next step.

In the final steps I combine the dirt and the rock base color using the Dust node. This allows the dirt to only sit on the upward-facing parts of the rock surface.
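The “upward-facing only” behavior can be approximated by looking at the slope of the heightmap: where the surface is flat, dust settles; where it is steep, it doesn’t. The sketch below is a rough stand-in for that idea rather than the Dust node’s actual implementation, and `dust_mask` with its thresholds is invented for illustration:

```python
import numpy as np

def dust_mask(height, threshold=0.02, softness=0.02):
    """Mask the flat, upward-facing parts of a heightmap by looking at its slope."""
    gy, gx = np.gradient(height)
    slope = np.sqrt(gx * gx + gy * gy)
    # 1.0 on flat areas, fading to 0.0 as the surface steepens
    return np.clip(1.0 - (slope - threshold) / max(softness, 1e-6), 0.0, 1.0)

# m = dust_mask(height)
# dusty_color = rock_color * (1.0 - m[..., None]) + dirt_color * m[..., None]
```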

To finish the base color I add some very subtle amounts of curvature with some level adjustments. This just makes the final color pop a little more than it normally would.

The roughness is really simple and constructed from the results of previous stages of the map with a small amount of grunge added on top.

From there I output all the maps and present them in Marmoset Toolbag.

Chris Hodgson

Chris just released some of his most famous Substance files on Gumroad: don’t hesitate to support him 😉

You can find Chris’s Artstation right here.

And here is a look at his personal work space right here:

The Substance Art of Peter Kolus

Pierre Maheut on June 6 2016 | News, Stories, Design

Who are you?

Peter Kolus

 

Where are you based?

I was born in Slupsk (a small city in the north of Poland) and currently I’m based in Warsaw.

 

Website and portfolio:

You can visit my website at: www.peterkolus.com
ArtStation profile: https://www.artstation.com/artist/pionier
Behance: https://www.behance.net/pionier
CGSociety gallery: http://pionier.cgsociety.org/

Hey Peter, can you tell us more about yourself?

I’m Peter Kolus, a shy guy from Poland who has been in love with 3D graphics since childhood. I don’t have a traditional art background; in fact, I graduated from humanities in high school. Like many in the VFX business, I’m self-taught and learned everything by myself at home.

 

 

What do you do?

For the past two years I’ve been working as a freelance artist, doing lots of different projects for my clients, typically realistic images for advertising. In February I decided to try something new and joined 3D4Medical, a company outside the advertising/VFX business, as a senior 3D generalist.

 

What are your specialties?

I think of myself as a generalist artist. My professional path started at an architecture company, then led through VFX/cinematics studios, and ended up at an advertising agency. I definitely prefer to work on realistic and cartoony images (you can see plenty of these in my portfolio). But if you ask me to point out the three stages I like the most in production itself, those would be compositing, lighting, and, recently, texturing with Substance Designer.

Render of the New Balance Visaro shoes

“The ability to easily swap assets by one click and to generate new sets of maps saved me a huge amount of time in a freelance job with a very tight deadline.”

What are your sources of inspiration?

Definitely the Old Masters (ha, you didn’t expect that, did you? ;)). Everything that you can see on the Rijksmuseum website has that ‘thing’ which drives my creativity and inspires me on a daily basis. The second source of inspiration is the art of photography. I regularly check different photographers’ portfolios online, and I go to photography exhibitions and study good lighting. It’s amazing how much it can improve your work.

 

How did you discover the Allegorithmic tools? Which ones have you used on this project?

The very first video I saw was one on YouTube that shows Substance Painter with its particle brushes, but back then I didn’t have time to explore it more. Then somewhere around the release of Substance Designer 4, I spent some time learning more about Allegorithmic’s tools. The ability to easily swap assets in one click and to generate new sets of maps saved me a huge amount of time in a freelance job with a very tight deadline. After only a few days of watching tutorials and reading forums, I knew I would buy the license. It was totally mind-blowing compared to my old Photoshop texturing approach: Substance Painter with painting on each map at the same time, and Substance Designer with its node-based system.

“The big challenge was to recreate all of the patterns from the shoes, and this was the part where Substance Designer showed its true power with a procedural approach.”

Tell us more about the New Balance and Airtox projects.

I did the New Balance project for my friends’ company HelloMono, and it was actually one project with some additional small tasks around it. The original idea was to create an interactive experience where you can discover both shoes by viewing them in 3D on the website. Another part was related to displaying different color variations and different sole materials. Once the commercial projects ended, I decided to revisit the Visaro model and make the renders for my portfolio. Therefore, I did shading, lighting, and postproduction all by myself. I also helped a bit with modeling the main shell and the general proportions of the shoe. The same goes for the Furon model, which is currently at the WIP stage, where I’m testing new bump maps made in Substance Designer.

 

The second shoe project – Airtox – was done for a friend’s company, Alchemiq Studio. The goal here was to provide an asset ready for hi-res print, where the shoes would be split in half to show off their features.

 

What was your biggest challenge on these projects?

The biggest challenge was to create a correct model in the first place. We didn’t have a scan, and the client provided only the CAD model of the sole (which required retopology). The rest of the model needed to be done manually. I took tons of reference pictures for modeling to make sure that the proportions were correct. The big challenge was to recreate all of the patterns from the shoes, and this was the part where Substance Designer showed its true power with a procedural approach. It was a real lifesaver on both the New Balance and Airtox shoes.

 

What are the different tools you use?

Whenever I have to or the client wants me to use maps bigger than 4k, I’m forced to use Photoshop, obviously. Apart from that, I’m a 3ds Max and Modo user. For rendering, I choose Corona or the native Modo renderer. For compositing it’s Fusion or Nuke (again, I prefer node-based systems over ones that are layer-based).

What was your production pipeline on this project and how did Substance integrate into it?

It’s simply a one-man production pipeline. After modeling and UVs were done, I brought everything into Substance Designer to start making all the maps. I exported only bump and normal maps out of Substance Designer; however, I set up reflection values based on the guide to Physically Based Rendering available from Allegorithmic’s website. I also set up tiling for each map in the 3D software instead of doing it inside Substance Designer. That way I was not limited to 4K output at the rendering stage.
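In other words, the repetition is handled at shading time by wrapping the UVs, so the same 4K Substance output is reused across a large surface instead of being baked out at a higher resolution. A trivial sketch of that idea (a hypothetical helper, not tied to any particular renderer):

```python
def tiled_uv(u, v, tiles=4.0):
    # Wrap the UVs so the same 4K map repeats `tiles` times across the surface,
    # keeping the effective detail independent of the exported map size.
    return (u * tiles) % 1.0, (v * tiles) % 1.0
```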

 

Do you have some techniques with Substance to share with the community?

I don’t have any special tricks, to be honest. I highly recommend playing with each node to find out how it works. After that practice, you should realize how to use them in combination with different nodes to produce what you want.

The Airtox shoes

“I wasn’t a big fan of texturing before, but since Substance Designer that has changed.”

How did your use of Substance change your approach to texturing?

I wasn’t a big fan of texturing before, but since Substance Designer that has changed. I know it sounds cheap, but I really enjoy this process now. You can make variations very quickly, it’s all interactive, it’s easy to manage projects and switch source meshes while maintaining all texture details. You don’t generate work files that are big in size, which means you don’t wait on writing/reading the files as much.

Tell us more about your next projects.

Currently I’m working together with a very talented photographer, but I can’t really talk much about it at this stage. I’m also preparing the Reebok classic shoes render (my portfolio project). Obviously I will use Substance Designer and Substance Painter for texturing.

 

 

All images courtesy of Peter Kolus

Substance for Architecture with Gastón Suárez Pastor

Pierre Maheut on August 23 2016 | News, Stories, Design, Architecture

Who are you?

Gastón Suárez Pastor

 

What do you do?

I’m a 3D artist from Buenos Aires, Argentina, and I specialize in architectural visualization.

What is your background?

I studied architecture at the University of La Plata, Argentina.

 

Where can we find you online?
www.timemachinecg.com

www.ronenbekerman.com/portfolio/best-of-week-272016

 

What is the state of archviz in Argentina and in South America in general? What future do you see? 

There are many good artists in my country but I think it is a complex market. There is good architecture and there are important firms, but when it comes to visualization, not many studios are willing or able to pay for the services of these artists. This generates uneven competition between professional artists and beginners, and in the end, the value of our work takes a direct hit. This is why most [professional archviz artists] work for the US and Europe – Europe in my case.

 

How did you discover Allegorithmic’s Substance software? 

I have known about Allegorithmic’s software for quite a while. I’m always on the forums and browsing dedicated pages. Also, I’m a hardcore gamer and a huge nerd, so I like to know how games are made. In that respect, Allegorithmic is all over the place. With all the hype there is around archviz in Unreal Engine, I came to rediscover the software (and understand it) and integrate it into my pipeline to seamlessly work between real-time and passive rendering.

 

You use the Corona renderer. Tell us more about your workflow between Substance and Corona. How do you get great results going from one to the other? 

I have read (among other sources) the PBR guide you guys have published and tried to understand the principles and workflow as deeply as I can. I stick to that and things run pretty smoothly. I always need to tweak glossiness and specular maps, but it works pretty well right out of the box. Even with the required tweaks, I get the results I’m looking for way faster than if I were to create my materials inside 3DS Max. I think the guys at Render Legion are changing the way glossiness works in Corona 1.5, so I hope the update is a change for the better. I would really like it if Corona, V-Ray, et cetera got rid of the specular/glossiness model and embraced metallic/roughness. That would be so great.
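For context on why those tweaks are needed, the two conventions only map onto each other approximately: glossiness is essentially the inverse of roughness, and in a metal/rough workflow non-metals share a fixed specular level of roughly 4%. The snippet below is a heuristic first-order mapping, not an exact conversion:

```python
def gloss_to_roughness(glossiness):
    # common first-order approximation: roughness is the inverse of glossiness
    return 1.0 - glossiness

DIELECTRIC_F0 = 0.04  # typical specular reflectance assumed for non-metals in metal/rough
```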

“I cannot think of texturing without Substance software anymore.”

Tell us more about the project you posted on the Allegorithmic forums. Can you describe your workflow on this project? 

The scene was rendered with Corona Renderer 1.4 in 3DS Max 2017. Assets were created with ZBrush, and the most important textures were created with Substance Painter and Substance Designer. This is a personal project for exploring new techniques – kind of a sandbox – mainly for the purpose of fully integrating Allegorithmic products into my pipeline/workflow. As far as the completion of this project is concerned, it was a success. I cannot think of texturing without Substance software anymore. Here is a close-up of the flowers. I created them with ZBrush, unwrapped them with UV Master, and then painted them in Substance Painter:

Here are the models, UV layouts and the maps from Substance Painter, as well as one of the models in Substance Painter:

Here is a screen capture of the Painter layer stack for the metal structure and a render done with Corona. (I came up with that material in no more than an hour.)

Layer stack for metal structure (rendered in Corona)

The entire metal structure was created completely in Substance Painter and needed no tweaks with Corona.

Metal structure in Substance Painter

Metal structure rendered in Corona

Trying to do that the traditional way in 3DS Max would have taken me hours, even a day or so of tweaking, blending, and editing textures, and it wouldn’t have turned out so well.

 

On top of that, I can have a very specific material for every piece of geometry. All of them will be different but at the same time alike: you can see the metal structure as an element made of one single type of material, but every single piece in that model is unique because of the accumulated dirt, scratches, edge wear, and all of the great effects that Substance Painter puts at your fingertips, mostly at the push of a button. That is amazing. Doing it the traditional way is a pain.

 

This image features materials created in Substance Designer and painted with Substance Painter:

The floors were done with Substance Designer. Again, these are super-simple materials (my first ones), but I insist on the fact that I was able to achieve these results with almost zero knowledge of the software, which I think is really cool.

Kitchen floor materials created in Substance Designer (with graphs)

This one is a great example of what I said in the forum about recreating very low-res textures from the only original reference we could find. For the floor, it was a small sample in its original size: 205x205. Thanks to Substance Designer, we ended up with a beautiful 4K texture with a lot of details and all of its maps. Just wonderful.

 

This was going to be the tiles for the kitchen, and then we decided to go with brand-new metro tiles, but I kind of liked this material. I exposed some parameters to make it newer and more deteriorated. I’ll surely use it somewhere else.

 

 

From reference material sample (205x205) to 4K texture

Finally, I can show you two super-simple assets: the table and the kitchen island. These are very simple assets, but one thing I realized with them relates to something I have already mentioned about consistency and uniqueness: all the wood should look the same, yet each piece is unique in the way it wears and is used. Smart materials are so great.

This was pretty much what I did with this project. It was a seamless and painless implementation into my pipeline. It is super-easy to pick up and is a true miracle worker. Once you texture with Substance and see the results, you won’t be able to think of texturing in any other way. Besides, I now have a solid texturing hub for my real-time and passive rendering workflow. If I do a project with Corona and then have to make it interactive with Unreal Engine 4, the most important thing to achieve is consistency: the results must look the same. This way, that issue is solved.

“It was a seamless and painless implementation in my pipeline.”

Do you know of other examples of Substance usage in the archviz community? 

Most people in archviz don’t know about the Substance software, and when they do, they believe it’s just for games. However, I truly believe that this is going to change soon because Allegorithmic is starting to be known among archviz artists. Every day I see artists posting work here and there using Substance Designer and Substance Painter. SOA Academy, an archviz school, is planning a master class featuring Substance Designer. Ronen Bekerman, who runs an archviz blog, is also interested in this software. Both of these references are very important in our field.

 

The other reason I believe that Substance for archviz is imminent on a wider scale is realtime archviz (Unreal Engine 4 and Unity 5). If you’re close to anything related to games, sooner or later you’ll meet Substance and you will love it.

 

Do you have some cool tips and tricks you want to share with the community? 

I’m a newcomer to Substance, so pretty much everyone knows a lot more than I do! In talking to archviz guys using Substance, though, I did learn a really cool trick from one of your streams (the one about creating wood in Designer) with the pixel processor node. In archviz we are always creating tile patterns so it is very important that the surface details don’t continue from one tile to another, and this one does it beautifully. I recommend you watch the video.

In your opinion, what is the future of archviz in architecture? 

I think (or at least I hope) that PBR will become a standard at some point and realtime archviz will take a bigger role not only for animation or walkthroughs but for rendering still images as well. If you see the work of Koola or UE4Arch, among many others, their quality is amazing and there are some shots where you won’t be able to tell the difference between Unreal Engine and a passive renderer. Working in real time is such a great thing.

 

What are your future projects?

I’m actually beginning a new sandbox archviz project where the main goal is to fully texture the scene within Substance Designer and render both in Corona and Unreal Engine 4 to compare the quality.  I’m also setting up a store where assets compatible with Corona/Vray and Unreal Engine, with complete texture sets and LODs, will be available. There will also be a section with free stuff.

 

Finally, I’m currently working on several commissioned projects from clients in Norway, Finland and Sweden with a great number of open environments and nature. I am putting a lot of Substance into them.

 

What do you do besides 3D archviz? 

I like gaming, and I also like photography, so I enjoy going out with my camera to take pictures and scan nice trees, rocks, and ground to have in 3D. Astrophotography is something that I like a lot, too. As you can see, I am a huge nerd, as I mentioned before 🙂

 

Is there a local architecture, design or even game/VFX 3D artist that inspires you? 

In architecture I really love the work of Thiago Lima from Brazil.

 

One last question – can you show us a picture of your workspace? 

Here it is!

 

 

Texturing Epic Games’ Robo Recall: Substance Painter for VR Workflows

Pierre Bosset on May 18 2017 | Stories, Game

Epic Games’ Robo Recall is arguably one of the most fun arcade shooter games for Oculus Rift out there. Besides that, it’s also one of the best looking VR games to date. We interview Edward Quintero, who was responsible for texturing the characters and weapons for the game.

Hi Edward, thanks for taking the time to do this interview! Can you tell us more about your background?

I’ve been working in the visual effects, animation, and video game industries for well over 17 years now. I’ve done a little of everything, including surfacing and material/texture work, matte painting, concept art, and recently creative direction.

I’ve been lucky to have worked at great companies like Tippett Studios, Industrial Light and Magic (ILM), Dreamworks Animation and Epic Games.

I’ve also had experience as an entrepreneur. Back in 2003, I co-founded a design studio called Massive Black with some friends and most recently, started Mold3D.

How does your VFX background influence your work in the game industry?

I had experience working as an art manager for the games industry in the past, but it wasn’t until recently that I had the opportunity to work in a creative role in games. About a year ago I got a call from an old friend and mentor, Kim Libreri, who is currently CTO at Epic Games. He asked me if I would be interested in coming on board to help out on some of their projects.

Of course, I said yes! So over the course of a year, I worked on Paragon and Robo Recall as a contract artist.

The quality of video game art has been kicking some serious butt in the past decade or so, especially with the advent of real-time lighting and PBR workflows in games. I felt it was a perfect time to try video games, as I could now use an artistic approach similar to the one I’m used to in VFX and animation.

In regards to painting textures, I’d say the biggest difference in games vs. VFX is the number of maps you need to create in order to “drive” a material shader toward the look or effect you want. In film work, you really have to think ahead and be able to visualize the look of the final product in advance, since you only know the final result at render time.

This is a disadvantage in many ways. With programs like Substance Painter, you get a “what you see is what you get” approach which makes creating so much more enjoyable, not to mention more efficient.

What I do credit my experience in film for is my approach to painting. I think a program like Substance Painter makes it too easy, sometimes resulting in a procedural, uniform look. So I’d like to think that my experience in VFX helps me approach my work more carefully from an aesthetic standpoint, and add the subtlety and realism that I’m used to creating for film assets.

What was your role in the VR game Robo Recall?

I was brought onto the team to help develop the look for the robots. I textured all the in-game characters, weapons, gloves and some props.

How did you start using Substance Painter? How did it change your workflow?

My first time using Substance Painter was actually on Robo Recall, and now I’m hooked! I fell in love with how the app uses generators and smart materials to quickly create wear and tear on the models, and the rest of the tools are amazing as well.

I’d say the biggest change to my workflow was how closely the look of the assets in Painter mirrored what I was seeing in VR using Unreal Engine. The advantage of painting inside a realistic lighting setup is a huge plus!

What this meant was less back and forth trying to nail your look. In VFX you have to rely on your test renders to make sure you are going in the right direction. Substance Painter was able to help me see my results in real time.

Could you describe your workflow with Substance Painter on this project in detail?

Sure. I used Substance for painting, but more importantly, I was able to develop an internal material pipeline to make my work more efficient.

The first step was to develop the look of my first character, which resulted in a library of smart materials that were reused on every asset moving forward.

From then on, it was only a matter of custom edits to make sure the materials were assigned to the correct parts, and to control where grime and scratches were placed.

So developing these smart materials as a first step saved me a ton of time moving onto new characters, especially since most of the assets shared similar materials.

After painting, I’d save out the textures and import them into Unreal Engine, where they would be assessed in VR using an Oculus headset. This was an amazing experience for me. Something about walking around your asset, in real time and at real scale, is indescribably fun.

Working in VR is something I definitely look forward to going back to, and I personally can’t wait for the future where we can do final paint and look development completely in VR.

Are there any tips or tricks you can share with the community?

I would say the biggest tip is to use a paint layer inside your masks to customize your Substance Painter generated smart masks. I see a lot of texture work online where you can easily tell a smart mask was used. The result is a procedural look that is easy to spot.

While smart masks are amazing, they aren’t 100% accurate. So always go back and edit your scratches and grime, or add your own custom work. I use photographs and my own masks in combination with the Painter generated ones, to give me a realistic and custom look.

What is different about creating materials and texturing for VR games?

Aside from what I’ve already mentioned above, I’d say game shaders are not as complicated as what you’d see in a VFX or film pipeline, especially when you start going deep into things like hair or skin. You just don’t have the same level of control yet, as real-time engines have to do a lot of heavy lifting. This means lighter shaders and less resolution in your maps. 2K maps are common in games, whereas in film you can be working with 8K+ resolutions for extreme detailing in close-up work.

What this means to the game artist is less texture map creation and output. In film, you rely more on creating custom alphas and maps that drive and control many material attributes. I spent more time painting black and white maps in film than anything else. The advantage is greater control and authorship over the final look.

In games, you have fewer maps to worry about but you do get the added plus of working in real time and with a “what you see is what you get” approach to painting. This makes it feel more artistic and enjoyable and you worry less about the technical aspects of creating images.

There is a pro and con to each discipline but every year games are getting closer to what we are used to seeing in a VFX pipeline. Without a doubt, real-time technology and creating for real-time engines is the future.

You are co-founder of Mold3D. Tell us more about it!

I started Mold3D about 4 years ago with a friend and colleague Robert Vignone. It was our intent to create a brand/website that would focus on art and education. At its inception, we were focused on 3D modeling and 3D printing but recently have started to cover emerging technologies and design.

One of the results of creating Mold3D was our online school, Mold3D Academy. We offer online classes taught by professional artists and are happy to announce two upcoming Substance classes in our summer term lineup.

We are developing a Substance Painter class taught by Christophe Desse of Naughty Dog fame, as well as an in-depth Substance Designer class taught by Pete Sekula. Also in our lineup will be classes that focus on real-time character and environment creation. So VR and video games are definitely a focus for us this year.

Will you be using Substance for future projects?

Yes! At the moment I am working on a secret VR project for an upcoming VR platform. I try to occasionally take on side projects in order to stay relevant in the industry and I’d like to think the experience reflects in the type of classes we develop for Mold3D as well as my personal work.