OpenGL/Shaders

Modern graphics cards support a more flexible rendering pipeline than the original OpenGL fixed-function model.  The key feature is the programmable "shaders", which allow complex tweaks to be made to the colour and position of rendered objects at various points in the rendering pipeline.

Shader programs


The main shaders are the Vertex shader (working on the vertices of an object) and the Fragment shader (working on the individual pixels produced during rasterization).  Shaders in OpenGL are written in GLSL, which has a basic C-like syntax.

Example vertex shader program:
      /* Global variables - shared with fragment shader */
      varying vec3 normalPassOn;

      /* Simple vertex shader program */
      void main() {
          vec4 vertex  = gl_Vertex + vec4( -1.0, -1.0, 0.0, 0.0 );
          gl_Position  = gl_ModelViewProjectionMatrix * vertex;
          normalPassOn = gl_NormalMatrix * gl_Normal;
      }

GLSL syntax


The shader language supported by OpenGL (GLSL) is simple.  Each shader must have a "void main()" function, with its inputs and outputs defined externally as global variables.

The basic defined types are:
  float, int, bool
  vec2, vec3, vec4 [vectors - also ivec and bvec variants]
  mat2, mat3, mat4 [matrices]
  sampler2D etc.   [textures]
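
As a quick illustration (a made-up sketch, not part of any particular shader on this page - the variable names are invented), the types can be declared, constructed and "swizzled" like so:

   /* Hypothetical vertex shader, for illustration only */
   varying vec3 exampleColour;                  /* passed on to the fragment shader */

   void main() {
       vec4 v = vec4( 1.0, 2.0, 3.0, 1.0 );     /* construct a vector from scalars */
       vec3 w = v.xyz;                          /* "swizzle" to pick out components */
       mat3 m = mat3( 1.0 );                    /* 3x3 identity matrix */
       exampleColour = m * w;                   /* matrix * vector multiplication */
       gl_Position   = gl_ModelViewProjectionMatrix * gl_Vertex;
   }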

The vertex shader programs include the globals:

Inputs:
  vec4 gl_Vertex;
  vec3 gl_Normal;
  vec4 gl_Color;
  mat4 gl_ModelViewProjectionMatrix;
  mat3 gl_NormalMatrix;

Outputs:
  vec4 gl_Position;


The fragment shader programs include the globals:

Outputs:
  vec4  gl_FragColor;
  float gl_FragDepth;

Additional globals ("varying" variables) can be defined to pass custom information from the vertex shader to the fragment shader.  Note that fragment programs usually run far more often than vertex shaders (per pixel, rather than per vertex), and the passed values cannot be modified in the fragment shader.  Each pixel processed by the fragment shader sees values interpolated across the primitive from the vertex shader outputs (see the discussion further down this page).
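
For example (a minimal sketch - the "brightness" variable is invented for illustration), a per-vertex value is declared identically in both shaders, written by the vertex shader and read, interpolated, by the fragment shader:

   /* Vertex shader */
   varying float brightness;                /* declared identically in both shaders */

   void main() {
       gl_Position = gl_ModelViewProjectionMatrix * gl_Vertex;
       brightness  = max( dot( gl_NormalMatrix * gl_Normal, vec3( 0.0, 0.0, 1.0 ) ), 0.0 );
   }

   /* Fragment shader */
   varying float brightness;                /* arrives interpolated across the triangle */

   void main() {
       gl_FragColor = vec4( brightness, brightness, brightness, 1.0 );
   }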

OpenGL Integration


The OpenGL library APIs allow shader programs to be compiled at run-time specifically targeted at the hardware in use. 

First, we hard-code the vertex and fragment shaders as strings (they can also be loaded from files):

   const GLchar * vertexProgram[] =
    {
      "/* Global variables - shared with fragment shader */          ",
      "varying vec3 normalPassOn;                                    ",
      "                                                              ",
      "/* Simple vertex shader program */                            ",
      "void main() {                                                 ",
      "  vec4 vertex  = gl_Vertex + vec4( -1.0, -1.0, 0.0, 0.0 );    ",
      "  gl_Position  = gl_ModelViewProjectionMatrix * vertex;       ",
      "  normalPassOn = gl_NormalMatrix * gl_Normal;                 ",
      "}                                                             "
    };


   const GLchar * fragmentProgram[] =
    {
      "/* Global variables - shared with vertex shader */          ",
      "varying vec3 normalPassOn;                                  ",
      "                                                            ",
      "/* Simple fragment shader program */                        ",
      "void main() {                                               ",
      "  gl_FragColor = vec4( 1.0, 1.0, 0.0, 0.0 ) * dot( vec3( 1.0, 0.0, 0.0 ), normalPassOn );",
      "}                                                           "
    };


Each shader is compiled using the glCompileShader() call...
   GLuint createShader( GLenum type, const GLchar ** program, int programLines )
   {
     GLenum error;
     GLint  compileStatus;
     GLuint shader;

     /* Create a shader object of the requested type */
     shader = glCreateShader( type );

     /* Install the shader program source */
     glShaderSource( shader,
                     programLines,
                     program,
                     NULL );

     /* Compile and check status */
     glCompileShader( shader );
     glGetShaderiv( shader, GL_COMPILE_STATUS, &compileStatus );

     /* Compilation error handling */
     error = glGetError();
     if ( error != 0 || !compileStatus )
       {
         GLchar  message[1024];
         GLsizei messageSize;

         glGetShaderInfoLog( shader,
                             sizeof( message ),
                             &messageSize,
                             message );
         printf( "Error compiling shader.  Error code %x: %s\n", error, message );

         glDeleteShader( shader );
         return 0;
       }
     printf( "Success\n" );
     return shader;
   }


Finally, the Vertex and Fragment shaders are "linked" into a single program, ready for execution when signalled by the glUseProgram( program ) call.

   GLuint createShaderProgram( void )
    {
      GLuint program        = glCreateProgram();
      GLuint vertexShader   = createShader( GL_VERTEX_SHADER, vertexProgram,
                                            sizeof( vertexProgram ) / sizeof( vertexProgram[0] ) );
      GLuint fragmentShader = createShader( GL_FRAGMENT_SHADER, fragmentProgram,
                                            sizeof( fragmentProgram ) / sizeof( fragmentProgram[0] ) );
      glAttachShader( program, fragmentShader );
      glAttachShader( program, vertexShader );
      glLinkProgram( program );
      return program;
    }
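
For completeness, a rough sketch of how the program might then be used (drawScene() here stands in for whatever rendering code you already have):

   /* Hypothetical usage - drawScene() is your own rendering code */
   GLuint shaderProgram;

   void initShaders( void )
   {
     shaderProgram = createShaderProgram();
   }

   void render( void )
   {
     glUseProgram( shaderProgram );   /* route subsequent drawing through the shaders */
     drawScene();                     /* draw geometry as normal */
     glUseProgram( 0 );               /* revert to the fixed-function pipeline */
   }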

Implementing the standard Model view with Shaders



Well, with older versions of OpenGL you can cheat and ask it "What would the fixed-function pipeline do?"
 void main(void)
{
    gl_Position = ftransform();
}

But the transform is something very like:
  gl_Position = gl_ProjectionMatrix * gl_ModelViewMatrix * gl_Vertex;

Most of the rest of the vertex shader does basic processing, before passing through to the fragment shader (TODO figure out why some bits are in the vertex shader when they appear to be useful only in the fragment shader).
In general it is much much cheaper to do work in the vertex shader, because there are far fewer vertices than pixels.  So anywhere that plain old interpolation is good enough, you want to do it in the vertex.  Calculate as early as possible, rather than as close to when the value is needed as possible, unless you know you're going to texkill it more often than not.  (Of course, if you read a texture in, for bump mapping perhaps, then some stuff really has to wait until the pixel shader.)  --Vitenka (Everything is only useful in the pixel shader, when it comes down to it - that's the thing that draws on the (render surface) screen, after all.)
I can understand that I may want to do processing in the vertex shader, but why aren't gl_Color and friends *also* available in the fragment shader?  I would expect that everything I could read in the vertex shader would also automatically be accessible in the fragment shader.  Maybe this is an architectural hardware issue...  Also, I am confused by vec2( gl_MultiTexCoord0 ): surely this value changes on a pixel-by-pixel basis as it seems to be the XY offset into the texture buffer? - DDD
I believe it is deliberate that you have to specify which elements you pass to the fragment shader - in general if some are going to be ignored you want to not pass them, since register space is limited.  (Or, more usefully, you can process more pixels in parallel if you use fewer registers.)  However yes - everything changes on a per pixel basis.  But the hardware interpolates for you between the values passed in by the vertex shader.  You don't try to specify every pixel "You are this pixel in the texture", you specify each vertex's position in the texture and let the hardware lerp for you.  (Same for normals, lighting direction, world-position etc. etc. etc. though the exact methods of interpolation vary and sometimes need to be done manually in the fragment shader)  --Vitenka
Thanks - that makes more sense (although it does imply a certain level of "magic" to specify exactly what is in textureOffset each time the Fragment shader runs).  I assume that if I specify a triangle, there are implicitly three "textureOffsets" (one from each vertex), and the fragment shader sees a textureOffset somewhere in the middle of them.  Can I access the three vec2() values and dynamically choose which bit of texture to display should I happen to not want linear interpolation (e.g. if I want to draw the texture to look as though it is mapped onto a curved surface)? -- DDD
Right.  Your vertex format can have many different bits of data - typically you want to pass in a position (xyz) and a texture offset (uv) - that's how you get basic texture mapping.  I *think* you can specify several common types of interpolation in case you want to do something complex.  For what you are asking, what you actually want to do is work out a normal for each vertex (which could be uploaded as part of your vertex definition) that gets interpolated and you use that as part of a calculation done in the pixel shader.
If you want direct access to the vertex data... I'm not sure you can easily do that.  You can certainly FAKE it by, for example, sampling a texture in the pixel shader (textures contain images only by convention - you can have any data you want in there...)  I'm pretty sure that you can specify which interpolation method you want as well, though I'm not sure of the syntax.  --Vitenka 

Don't think of it as 'magic'.  Think of it like this:
(position, normal, tangent, colour, uv) per-vertex streams + big pile of shader constants
--> For every vertex, the Vertex shader transforms all of that into a new set of streams of your own defining
(screen position, z, uv, colour, whatever)
--> Now, the rasteriser works out, for every triangle, what each pixel should have
(doing frustum culling etc.)
and then interpolates to get the inputs to your pixel-shader.
--> For every pixel, the pixel shader runs - producing a final output colour and alpha and depth value (and maybe some other outputs, but usually just a colour)




Aha... I think I therefore understand the meaning of the "varying" keyword.  Each triangle results in three runs of the vertex shader, which means three normals, texture offsets, colours and lightings.  When the fragment shader is called, I have only one of each - representing a weighted average of the three vertex shader runs' outputs.  The sampler2D is uniform, because it is the same texture no matter where in the triangle I am currently rendering.  I probably need to start playing with the textureOffsets in the fragment shader to see what happens if I try to deform a texture.  From what you are saying, I could even go as far as having two textures - one describing local curvature, one describing a flat image - and do some more complex mathematics to do non-flat image mapping.  Cheers! -- DDD
Indeed.  The search term you are looking for is "Bump mapping"  --Vitenka

I have pulled out the interesting features (texture, material colour and lighting) to pass into a simple 20% bias colour model...

Vertex Shader


   /* Global variables - shared with fragment shader */
  varying vec3 normalVector;
  varying vec2 textureOffset;
  varying vec4 colour;
  varying vec3 lighting;

   /* Simple vertex shader program */
  void main() {
    vec4 vertex;
    vec4 lightPosition = gl_LightSource[0].position;

     /* Do some basic position processing */
    vertex = gl_Vertex + vec4( -1.0, -1.0, 0.0, 0.0 );
    gl_Position = gl_ModelViewProjectionMatrix * vertex;

     /* Set up lighting and normal vectors */
    lighting = vec3( lightPosition.x,
                     lightPosition.y,
                     lightPosition.z );
    normalVector = gl_NormalMatrix * gl_Normal;

     /* Set up colour and texture positioning vectors */
    colour = gl_Color;
    textureOffset = vec2( gl_MultiTexCoord0 );
  }

Fragment Shader


   /* Global variables - shared with vertex shader */
  varying vec3 normalVector;
  varying vec2 textureOffset;
  varying vec4 colour;
  varying vec3 lighting;

   /* Passed parameters */
  uniform sampler2D texture;

   /* Simple fragment shader program */
  void main() {
    vec4 pixelColour = texture2D( texture, textureOffset );
    pixelColour = ( pixelColour * 4.0 + colour ) / 5.0;
    gl_FragColor = pixelColour
                   * ( 4.0 + dot( lighting, normalVector ) ) / 5.0;
  }
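
The "texture" sampler above also has to be wired up from the C side.  A minimal sketch (shaderProgram and textureId are assumed to already exist; the names are invented):

   /* Hypothetical set-up code - shaderProgram and textureId are assumed to exist */
   GLint samplerLocation;

   glUseProgram( shaderProgram );
   samplerLocation = glGetUniformLocation( shaderProgram, "texture" );
   glUniform1i( samplerLocation, 0 );            /* the sampler reads from texture unit 0 */

   glActiveTexture( GL_TEXTURE0 );               /* select texture unit 0 */
   glBindTexture( GL_TEXTURE_2D, textureId );    /* bind the image to that unit */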

Common shader techniques


Most techniques used in shaders are simple optical effects, with little attempt to model the underlying physics accurately.

Bump mapping         - Modifying pixel colours to simulate non-smooth surfaces by applying a "normal vector" texture (see the sketch below).
Parallax mapping     - An addition to bump mapping where the image texture is also warped according to the bump map.
Displacement mapping - Similar to bump mapping, but dynamically applied to vertex points (requiring a fine vertex mesh).

Fogs and particles   - Altering pixel colours (e.g. fading towards a fog colour) based on the z distance from the viewer.
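
As a rough illustration of the bump mapping idea, a simplified fragment shader sketch (illustrative only: it assumes the normal map and the lighting vector are already in the same space, which a real implementation would arrange with tangent-space transforms; the sampler names are invented):

   /* Simplified bump-mapping fragment shader - illustrative only */
   varying vec2 textureOffset;
   varying vec3 lighting;                  /* light direction, as in the example above */

   uniform sampler2D imageTexture;         /* the flat image */
   uniform sampler2D normalTexture;        /* per-pixel normals stored as colours */

   void main() {
       /* Unpack the normal from the [0,1] colour range to the [-1,1] vector range */
       vec3 bumpNormal = normalize( texture2D( normalTexture, textureOffset ).xyz * 2.0 - 1.0 );

       /* Light the flat image using the per-pixel normal rather than the interpolated one */
       float shade  = max( dot( normalize( lighting ), bumpNormal ), 0.0 );
       gl_FragColor = texture2D( imageTexture, textureOffset ) * shade;
   }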
