That was a pretty good sales pitch, right? That previous post, being all emotional and unsuccessful, until I discovered Substance Designer and Painter. But in all seriousness, this was just what I needed to boost my lacking 2D texture skills, and more importantly, to bring back some joy. Being proud again. “Look dad! I made a rusty pipe texture!”
Revelations
I’m not sure how long these “Substance” programs have been around, but I at least had never heard of them until I came across an interview with an Environment Artist veteran who wished he had learned to work with more modern software like this earlier on. He got stuck in old habits while the damn neighbour kids passed by on their fancy turbo-roller-skate-future-things. Environment Artist? Check. Tired of traditional workflows? Check. Sounds like me, goofing around without too much success in Paint Shop Pro and Lightwave for many years. Except that this guy was much better, and found solutions.
So, being triggered, I gave it a try but didn’t expect too much. You know, at some point you get rusty and prefer to stick with whatever techniques you learned. If you have programmed Delphi for 20 years, it sucks to restart and have to learn everything from scratch again in another language. Nonetheless, such “escapades” can be eye-openers. Not often, but sometimes, sometimes it’s like love at first sight.
I had that with Sculptris some years ago. After playing with primitive cubes and cylinders like a baby in “normal modelling software”, who would have thought I could make a one-million-poly organic thing that actually kinda looked like a humanoid? Malformed and monstrous, sure, but that was the assignment after all. Using Sculptris was a revelation in two ways. First because I succeeded for the first time, second because it worked completely differently from any other 3D modeller I had seen so far. Even the UI was completely different… awkward, I’d say; even a simple task like saving required a cold brain-reboot with Sculptris. It’s as if that program wasn’t made for a Windows computer, but for some unknown alien OS. Normally I would give up fast on weird programs, but “digital claying” was so much fun. And once you know the weird spots and hotkeys, it’s actually pretty simple to work with.
Now I had the same kind of experience with Substance Designer. While downloading the trial, I watched some YouTube videos. The results were very promising, but the way of working was… nothing like Photoshop (or Paint Shop Pro in my case, I just grew up with that – and probably missed a whole lot of 21st century 2D-witchcraft, still using a very old version of it). How to explain it… have you ever watched a 64K demo?
Sometimes you fall in love... with strange programs. Like Substance Designer.
64K madness
A 64K demo is just a program where a camera flies through an (often abstract / sci-fi themed) world, complete with 3D stuff, animations, particles, and a layer of audio on top. All pretty normal, but there is a small catch: as the name suggests, the BIG challenge is to fit the ENTIRE program into a TINY 64 kilobytes of storage…
We all know you’ll have to feed a game engine textures, 3D geometry and audio waves. Otherwise it won’t poop out anything. Typically these resources are packed into ready-to-rumble files, like TV dinner lasagne. You would draw textures with a digital brush and photo snippets. You would model 3D geometry with primitive shapes and triangles, dragging their corner vertices into the right positions. You would make audio by recording samples and mixing them together. All “pre-processing” work.
But a 64K demo doesn’t use any pre-processed files. Just generate an empty 256x256 pixel bitmap in MS Paint, and you’ll notice you already exceeded the limit, three times over (a raw 24-bit 256x256 bitmap weighs about 192 kilobytes). In a 64K demo, everything happens on the fly. Textures, geometry and audio are computed mathematically. Bubutbut, how?!
You could make a shader that generates random pixel colours, but that would result in total chaos and epileptic attacks. There need to be certain patterns. Patterns you can describe in a mathematical way. Math and natural, organic creations sound like an unhappy match, but if you believe in a God, he or she must be the greatest mathematician of all time. Mountains or ocean waves can be simulated using sines, Fourier transformations, and a proper set of input constants. Bathroom or pavement tiles have rectangular shapes. Rust, mould and moss start in gaps and grow in certain patterns. Trees turn into grass, grass into rock, and rock into snow as we climb higher on a mountain. Wind blows in a certain direction.
Think in (noisy) patterns
Nothing happens just randomly in nature. And by observing, you can translate that “behaviour” into formulas, choices, or semi-random noise that delivers values within certain constraints. Everything you see and hear in a 64K demo is based on that, and thus computed in real-time. Substance Designer borrows some of those principles. You won’t be drawing tile lines, skin pimples and rust layers here. You’ll be calculating them.
Here is my very first attempt to compute a wood grain pattern (thank you Daniel J.R. for showing how!). Banding is created with a gradient, horizontal blinds, and Perlin noise. Next this banding gets warped semi-randomly, based on another Perlin noise. The result (below) looks trippy, but hey, ever looked closely at a plank while on drugs?
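For the programmers among us, here is roughly the same recipe outside of Substance Designer, as a little Python/NumPy sketch. The value_noise helper is just a crude stand-in for SD’s Perlin noise node, and the numbers (12 bands, a ~40-pixel warp) are made-up values, not taken from the actual graph.

```python
import numpy as np

def value_noise(size, cells, seed):
    """Crude value noise: random values on a coarse grid, bilinearly upsampled."""
    rng = np.random.default_rng(seed)
    coarse = rng.random((cells + 1, cells + 1))
    t = np.linspace(0.0, cells, size)
    i0 = np.minimum(t.astype(int), cells - 1)
    f = t - i0
    rows = coarse[i0] * (1 - f)[:, None] + coarse[i0 + 1] * f[:, None]
    return rows[:, i0] * (1 - f)[None, :] + rows[:, i0 + 1] * f[None, :]

size = 256
x = np.linspace(0.0, 1.0, size)

# Banding: a repeating gradient ("horizontal blinds"), with some noise blended over it.
bands = (x * 12.0) % 1.0                         # 12 repeating ramps across the image
bands = np.tile(bands, (size, 1))                # same ramp on every row -> vertical bands
bands = bands + 0.3 * value_noise(size, 8, 1)    # blend noise over the blinds

# Warp: shift every pixel sideways by a second, lower-frequency noise field.
warp = value_noise(size, 4, 2)
shift = ((warp - 0.5) * 40.0).astype(int)        # offsets of roughly -20..+20 pixels
cols = (np.arange(size)[None, :] + shift) % size
wood = bands[np.arange(size)[:, None], cols]     # the trippy, wood-grain-like height pattern
```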
Substance Designer is a texture “composer”, “calculator”… I’m not sure what to call the process, but you won’t be drawing, that’s for sure (though you can import photos or handmade textures and throw them into the formula). You’ll be connecting various “base patterns”, noises and mathematical operators such as blends, contrasts, warps or high-pass filters. Sounds very hard, but the node-based system allows a fast trial & error approach. Just connect X and Y, and see what happens. Soon enough you’ll be generating useful patterns that can be used in your material, or in a future one, as you can easily build your own library. So no, you really don’t have to be a scientist to use this.
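If you’re curious what “connecting nodes” boils down to, here is a loose Python sketch. It has nothing to do with SD’s real API or file format; it just shows the idea that every node is a function from images to images, and a graph is nothing more than wiring those functions together.

```python
import numpy as np

# Toy "nodes": each one is just a function that takes image(s) and returns an image.
def blend(a, b, mask):
    """Mix two patterns using a third one as a mask (think of a Blend node)."""
    return a * (1.0 - mask) + b * mask

def levels(img, lo, hi):
    """Crude contrast adjustment: remap the [lo, hi] range to [0, 1]."""
    return np.clip((img - lo) / (hi - lo), 0.0, 1.0)

def high_pass(img, radius=4):
    """Keep only the fine detail by subtracting a blurred copy of the image."""
    kernel = np.ones(radius) / radius
    blurred = np.apply_along_axis(lambda row: np.convolve(row, kernel, mode="same"), 1, img)
    return np.clip(img - blurred + 0.5, 0.0, 1.0)

# A "graph" is then nothing more than wiring these calls together. The three inputs
# below are random stand-ins; in SD they would come from pattern/noise generator nodes.
rng = np.random.default_rng(0)
pattern_a, pattern_b, dirt_mask = (rng.random((128, 128)) for _ in range(3))
result = high_pass(levels(blend(pattern_a, pattern_b, dirt_mask), 0.2, 0.8))
```

Because each “node” is just a function, swapping an input or a parameter only means rewiring one call, which is exactly what makes the trial & error approach so fast.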
In the end, the results can be converted to typical (PBR) textures. In a lot of cases you’ll be making masks and a heightmap first. That heightmap can be converted to a NormalMap and an AmbientOcclusionMap, but the bumpiness can also tell something about the roughness/glossiness. Sub-computations can be linked to other sections that generate dirt, damaged edges, rust, or whatever.
Now you understand what the trippy pattern in the picture above is doing. It is used to generate colors, but also has a subtle effect on the NormalMap and RoughnessMap.
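To give an idea of what “converting a heightmap to a NormalMap” means in practice, here is a minimal sketch using finite differences. This is the generic technique, not necessarily the exact filter SD runs internally, and the strength parameter is just a made-up knob for exaggerating the bumps.

```python
import numpy as np

def height_to_normal(height, strength=2.0):
    """Turn a [0,1] heightmap into a tangent-space normal map via finite differences."""
    # Slopes in x and y; np.roll wraps around the edges, so a seamless heightmap
    # stays seamless in the normal map too.
    dx = (np.roll(height, -1, axis=1) - np.roll(height, 1, axis=1)) * strength
    dy = (np.roll(height, -1, axis=0) - np.roll(height, 1, axis=0)) * strength
    nz = np.ones_like(height)
    length = np.sqrt(dx * dx + dy * dy + nz * nz)
    normal = np.stack([-dx, -dy, nz], axis=-1) / length[..., None]
    # Remap from [-1, 1] to [0, 1] for the usual blueish RGB encoding.
    return normal * 0.5 + 0.5
```

Feed it the wood pattern from the earlier sketch and you get exactly that kind of blueish map; a RoughnessMap could then be derived as just another function of the same height data (say, the deeper the groove, the rougher the surface).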
But is it useful?
Now I can hear you think... that floor above sucks, and also, I can do that in Photoshop just as well - and probably a lot quicker. Indeed, I doubt SD is any faster than a good old paint program. It took me hours to generate the texture above, and yeah, the saturated colours are a bit weird. In my defence, a horror game needs weird textures. But more honestly, of course I lack skill (and patience). Just check YouTube, and you’ll see some truly amazing (and very informative) results.
If you master the brush, I bet you can do the same - and (much) quicker. But at some point, SD takes over the torch when it comes to micro-detail, *correct* results, and making tweaks afterwards. Let me explain.
One major power of the Substance Designer workflow is that the various output maps keep a correct relation to each other. If you want to change a texture set generated with Photoshop, you’ll typically have to break out certain parts and redo multiple layers. In SD, the entire graph automatically adapts when you change a formula or input pattern. This allows you to quickly experiment with different settings, and to generate multiple output variants (see the wall-paint texture below).
In the long run, this gains speed. And as said, every change is done correctly. Adding a tiny speckle will not only update the colors, but also the normals, height, metalness, roughness, occlusion, and so on. Also, if we decide the texture resolution is too small, we can increase the whole damn thing with a few clicks. Not just a cheap upscale operation; as with vector-based graphics, the actual detail and variation will increase. And... all done seamlessly of course.
Making the dirty variant of the wall texture was pretty easy once the base graph was made. With Vertex-Painting you can mix these two textures in the engine.
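To illustrate why a graph beats a stack of baked layers when tweaking, here is a toy sketch. None of the names or formulas come from SD; the point is only that when every output map is derived from one parametrised recipe, changing a parameter (more dirt, higher resolution) rebuilds all maps consistently instead of forcing you to repaint layers.

```python
import numpy as np

def make_wall_texture(size=512, speckle_amount=0.1, seed=7):
    """One parametrised 'recipe' that derives all output maps from the same data."""
    rng = np.random.default_rng(seed)
    base = 0.5 + 0.2 * rng.random((size, size))               # stand-in for the base pattern
    speckles = (rng.random((size, size)) < speckle_amount).astype(float)
    height = np.clip(base + 0.3 * speckles, 0.0, 1.0)
    # Every map is derived from the same height/speckle data, so they can't drift apart.
    albedo = np.stack([height * 0.8, height * 0.7, height * 0.6], axis=-1)
    roughness = 1.0 - 0.5 * speckles                           # speckles become shinier spots
    return {"height": height, "albedo": albedo, "roughness": roughness}

clean = make_wall_texture(speckle_amount=0.02)     # clean variant
dirty = make_wall_texture(speckle_amount=0.25)     # "dirty" variant from the very same recipe
bigger = make_wall_texture(size=2048)              # higher resolution: new detail, not a cheap upscale
```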
I can do it
Maybe most important, for me at least, is that I could make these textures in the first place. It took some time, patience and copying ideas from YouTube tutorials, but the results were better than any other texture I ever generated. And that is what counts, right?
I could copy realistic photo snippets and smear some poop dirt in a layer on top. But making (correct) NormalMaps or turning them seamless without a cheap lossy approach was beyond my Harry Potter powers. So yeah, I'm quite happy :) Of course I still need some true artists to turn the Tower22 demo into something beautiful, but at least Substance Designer gave me some tools to do it myself again, for the time being... and for as long as the trial version still works, that is ;)