Yo yo, Pythagoras in da house. F%ck you b!tch, MC^2 on the mic, spitting square roots that make you poop in your boots. What goes up must come down, I'll bombard you and your homies with F = G*m1*m2 / r^2. I don't talk algebra operators, I shout integrators. Respect, subtract, smoke my crack, Blaise Pascal out.
Yeah, that would teach them... Oh, pardon me, that was the first thing going through my mind when I just heard this little news item on the radio. All right, let's rap about something else: Geometry Shaders. Have a moment for Snoop GPU, Easy E++, Dizzy Pascal, dirty old Register, and Dr. Hashpipe.
Geometry Shaders? Who what where?
-----------------------------------------------------------------
"A Geometry Shader can generate, adjust or discard primitives(triangles, dots, lines, ...)"
Even if you never touched a keyboard, you've probably heard of "Ssshaders": vertex- and fragment (pixel) shaders. What do they do? They are tiny programs running on the video card's GPU, telling it how to compute its output: vertex coordinates and pixel colors. Vertex shaders can move the vertex / UV coordinates; fragment shaders calculate the pixel colors, usually based on quick & dirty lighting physics that approximate the real thing. Some practical examples (from the streets yo):
Vertex-Shaders
- Transform 3D coordinates to screen space / eye-space / world-space / cyber-space
- Animations; apply matrices from 1 or more bones to each vertex ("skinning")
- Water ripple physics
- Displacement mapping (read a value from a texture such as a heightMap)
- GPU cloth / particle physics
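To give an idea of what one of those vertex-shader jobs boils down to, here's a tiny CPU-side sketch of "skinning": blend the matrices of the bones that influence a vertex, weighted by how strongly each bone pulls on it. Plain Python lists, no GPU, and the matrices/weights are made up for illustration:

```python
def transform(matrix, v):
    """Apply a 3x4 (rotation + translation) matrix to a 3D point."""
    v4 = v + [1.0]  # homogeneous coordinate, so the translation column works
    return [sum(matrix[row][col] * v4[col] for col in range(4))
            for row in range(3)]

def skin_vertex(vertex, bones, weights):
    """Linear blend skinning: weighted sum of each bone's transform."""
    result = [0.0, 0.0, 0.0]
    for matrix, weight in zip(bones, weights):
        p = transform(matrix, vertex)
        result = [r + weight * c for r, c in zip(result, p)]
    return result

identity = [[1, 0, 0, 0], [0, 1, 0, 0], [0, 0, 1, 0]]
shifted  = [[1, 0, 0, 2], [0, 1, 0, 0], [0, 0, 1, 0]]  # bone moved 2 units along X

# Vertex pulled halfway between a resting bone and a moved bone:
print(skin_vertex([1.0, 0.0, 0.0], [identity, shifted], [0.5, 0.5]))
# -> [2.0, 0.0, 0.0]
```

The real shader does exactly this per vertex, just with float4x4 matrices from a bone array and weights stored per vertex.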
Fragment-Shaders
- Diffuse, specular, ambient and emissive lighting computations
- ShadowMapping
- Reflection / refraction computations
- Cel-shading (cartoon look)
- Drawing specialized textures required for other techniques (depth buffer for example)
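And to show what the first item on that list means in practice, here's the core of a diffuse (Lambert) lighting computation as a CPU sketch: brightness = max(0, N . L), with N the surface normal and L the direction to the light. The colors and vectors are just example values:

```python
def normalize(v):
    length = sum(c * c for c in v) ** 0.5
    return [c / length for c in v]

def diffuse(normal, light_dir, light_color, surface_color):
    """Lambert diffuse: scale the surface color by max(0, N dot L)."""
    n = normalize(normal)
    l = normalize(light_dir)
    n_dot_l = max(0.0, sum(a * b for a, b in zip(n, l)))
    return [lc * sc * n_dot_l for lc, sc in zip(light_color, surface_color)]

# Surface facing straight up, light straight above: full brightness
print(diffuse([0, 1, 0], [0, 1, 0], [1.0, 1.0, 1.0], [0.8, 0.2, 0.2]))
# -> [0.8, 0.2, 0.2]
```

A fragment shader runs this (plus specular, ambient, and so on) for every pixel on screen, which is why the GPU's massive parallelism matters.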
Well, if you already did some shaders, that's nothing new. Hey, they've existed for about 10 years now! The word "next-gen" can be replaced by "common-tech" again; time flies. The "Geometry Shader" program isn't really that new either anymore; it appeared somewhere in 2007 or 2008, I believe. How come I didn't notice it back then? Hmmm, let's just say its usage is pretty limited in most cases.
The "GS" is an optional third program that can be executed after the vertex shader. Unlike the vertex shader, the GS does not take single vertex points, but complete primitives as input. Primitives? You know: dots, lines, triangles, the stuff you pass with a glBegin() command. So, when working with triangles you get arrays that contain 3 positions/texcoords/tangents or whatever you pass on in the vertex shader.
The GS passes this data further on to the rasterizer (and then the pixel shader). But you want to perform some custom actions here, right? As said, you can decide here whether to "emit" a primitive or not, so you could perform a sort of culling here. Of course, you can adjust all the coordinates before passing them on as well. But even more spectacular is the ability to generate new primitives. When processing a simple line as input, you could break it up into 10 pieces and bend it into a curved line. It's kinda like an advanced vertex shader, with more input data and flexible output. And just like in any other shader, you can use custom parameters, including texture reads. You could for example render a terrain with rather large quads, then let the GS subdivide each quad based on the camera distance, using a heightMap texture to apply the correct height value for each new vertex.
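A quick sketch of the distance-based LOD rule such a terrain GS could use: nearby quads get subdivided a lot, far ones barely at all. The function name, the falloff distance, and the step counts below are all made up for illustration; the idea is just "halve the detail every so many units of distance":

```python
def subdivision_steps(camera_dist, max_steps=16, min_steps=1, falloff=50.0):
    """Halve the subdivision level for every 'falloff' units of distance."""
    steps = max_steps
    d = camera_dist
    while d >= falloff and steps > min_steps:
        steps //= 2
        d -= falloff
    return max(steps, min_steps)

for dist in (10, 60, 120, 500):
    print(dist, subdivision_steps(dist))
# Nearby quads get 16 subdivisions per edge; distant ones drop to 1.
```

On the GPU this would be a few lines of Cg at the top of the geometry program, computed from the quad center and the camera position.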
Some applications
- Handling sprite/particle clouds
- Tessellation / Level Of Detail
- Beziers, curves
- Fur / fins / grass
- Generating silhouettes (stencil shadows, light shafts, …)
- Render cubeMaps in one pass (this is why I got interested finally)
WIP, Work-in-Progress. Very simple 3D geometry, lots of stuff missing, not very special so far... Except for the fact that all textures are made by our own man Julio so far, instead of stealing them somewhere. Oh, did it use Geometry Shaders? No, but since lots of objects are reflecting the environment here, a single-pass cubeMap technique can boost the performance pretty well...
Simple demo Code?!
-----------------------------------------------------------------
Let's do an example with Cg shaders. Oh, GLSL and some other shader languages support the GS as well of course, as long as you have a card that supports Shader Model 4 or higher (I believe). In Cg, creating a GS program is pretty much the same as creating and using vertex or pixel programs:
const
  CG_GL_GEOMETRY = 10;

// Setup
gpProfile := cgGLGetLatestProfile( CG_GL_GEOMETRY );
prog      := cgCreateProgram( cgContext.context, CG_SOURCE, pchar(code),
                              gpProfile, 'main',
                              pArgs );

// Use it
cgGLBindProgram( prog );
cgGLEnableProfile( gpProfile );
// --> render something
Now let's find a good & simple application for a Geometry Shader... Uh... That's difficult. Maybe the GS isn't that useful yet... Wait, how about an electro / lightning bolt? There are more ambitious demos, but for those I'd refer to the nVidia SDKs (check OpenGL SDK 10 for example). All right, here's the idea:
- Render 1 line (glBegin( GL_LINES ), 2 points)
- Let the shader break it up into 20 pieces, then apply a random offset to each sub-point to make a sort of zig-zag.
The CPU can do that just as well, I know. But it's just to demonstrate how a GS program looks, smells, and works.
LINE void main(
	// Input arrays: one entry per point of the incoming line
	AttribArray<float3> position  : POSITION,	// Coordinates
	AttribArray<float2> texCoord0 : TEXCOORD0,	// Custom stuff. Just pass that
	uniform sampler2D noiseTexture,
	uniform float     boltMadnessFactor		// Pass params as usual
)
{
	// The bolt is just a simple line with 2 points:
	// point 1 is the origin, point 2 is the target (impact point).
	// Now apply "The Zig-zag Man"
	const int steps = 20;	// You could use distance LOD here, although that
				// won't be needed for huge lightning bolts
	float3 beginPos  = position[0];	// Get input for simplicity
	float3 targetPos = position[1];
	float2 beginTX   = texCoord0[0];
	float2 targetTX  = texCoord0[1];

	// Interpolation values
	float3 deltaPos = (targetPos - beginPos) / steps;
	float2 deltaTX  = (targetTX  - beginTX ) / steps;

	// Output first (begin) point, don't modify it.
	emitVertex( beginPos : POSITION,
	            beginTX  : TEXCOORD0 );

	// Generate 19 random points in between
	for (int i = 1; i < steps; i++)
	{
		// Interpolate position and texcoord custom data
		float3 newPos = beginPos + deltaPos * i;
		float2 newTX  = beginTX  + deltaTX  * i;

		// Just pick some random value from our helper texture,
		// then use it to modify the position.
		// Reduce noise strength as the bolt approaches the target.
		float3 randomFac = (2 * tex2D( noiseTexture, newTX ).rgb - 1)
		                   * boltMadnessFactor * (steps - i);
		newPos += randomFac;

		// Generate extra line point
		emitVertex( newPos : POSITION,
		            newTX  : TEXCOORD0 );
	} // for i

	// Output last (target) point, don't modify it.
	emitVertex( targetPos : POSITION,
	            targetTX  : TEXCOORD0 );

	// Of course this shader sucks. In fact, I didn't even try it :#
	// But it roughly shows what you can do here.
} // Roger out
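To sanity-check the numbers, here's the same zig-zag logic as a plain CPU function: identical linear interpolation, with a random generator standing in for the noise texture. The endpoints stay untouched, and the offset shrinks while approaching the target, just like in the shader. Function and parameter names are made up for this sketch:

```python
import random

def make_bolt(begin, target, steps=20, madness=0.05, rng=random.random):
    """Break a line into 'steps' pieces with shrinking random offsets."""
    delta = [(t - b) / steps for b, t in zip(begin, target)]
    points = [list(begin)]                       # first point: unmodified
    for i in range(1, steps):
        p = [b + d * i for b, d in zip(begin, delta)]
        # random value in [-1, 1], scaled down as i approaches 'steps'
        offset = [(2 * rng() - 1) * madness * (steps - i) for _ in p]
        points.append([c + o for c, o in zip(p, offset)])
    points.append(list(target))                  # last point: unmodified
    return points

bolt = make_bolt([0.0, 0.0, 0.0], [10.0, 0.0, 0.0])
print(len(bolt))          # 21 points -> 20 line segments
print(bolt[0], bolt[-1])  # endpoints unchanged
```

Handy for eyeballing the shape in a plot before blaming the GPU when the bolt looks wrong.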
Still messing around with SyntaxHighlighter. It reminds me why I hate HTML and such. How do you shrink the gap between lines?! Why do the tabs suck so much?!
The next part of this Geometry Shader adventure will follow in two weeks: more about rendering stuff in a single pass, one of the best reasons to use that weird Geometry Shader :)
Oh, for all of us forgotten Delphi souls: can someone explain where to upload a small code file so it can be downloaded here? The last Cg DLL headers for Pascal I could find were for Cg 1.5, written by Alexey Barkovoy in 2006. So I updated them a little bit. Warning, I haven't tested the whole thing though!!!
Another WIP. Same stuff, different light.