[OpenGL Part 3] Shaders in OpenGL

Shaders in OpenGL


In the previous part we looked very briefly at shaders. Shaders are small pieces of code that run on the GPU, and they enable us to render graphics in lots of fancy ways. But before we look closer at the shaders themselves, let's have a look at the sequence they run in :

The rendering pipeline


OpenGL goes through several steps in order to draw something on the screen. This sequence of steps is known as the rendering pipeline. It looks like this :

The gray-ish parts are programmable, and these are what I'll be referring to as shaders. The ones with the dotted lines are optional, while the ones with full lines we have to program if we want to render something. At least that's what the specification says; some implementations might not require it. So, on some systems, you might be able to skip them, but that's not guaranteed to work on all systems. And it's much more fun writing the shaders ourselves anyway!

This part will teach you a little bit about each of the steps, what they do and how they work together. Each one of these steps is quite involved, so I'll most likely dedicate an entire post to each of them.

Vertex Specification


In the previous part we set up VBOs and VAOs so that we could later use the VAO for rendering. That is the vertex specification stage of the rendering pipeline. More specifically, it's how OpenGL sets up the VBOs and VAOs when we tell it to. Since we already dealt with this stage in the previous part, and what OpenGL does behind the scenes isn't that relevant to us, we're just going to skip to the next step.

Vertex Shader


The vertex shader is the first programmable step of OpenGL. This stage takes a single vertex as input and outputs a single vertex. The job of the vertex shader is basically to give every vertex the position it should have on screen. In the previous part we were able to use the positions directly because the position we passed in was the final position of the object. But if we wanted to move the object, we could have used the vertex shader to do that.

Another point is that the screen is only 2D, so when we have a 3D object we need a way of representing it in 2D on the screen. This is quite complicated and involves several steps to put all the vertexes in the correct position. We will look at this in a later post; this post is just for getting an overview of all the shaders. The above image kind of shows this; the cube is a 2D drawing, but it looks 3D because of how the vertexes ( corners ) are positioned.

Tessellation


In the games we see today, a high level of detail is important. And in order to achieve a high level of detail, we need a high number of vertexes. Imagine you have a ball in your game. How do you draw that with a high level of detail? If you have too few vertexes, it'll look blocky and not round at all. You could just add millions of vertexes to make it look better. But a million vertexes would mean 4 bytes * 3 * 1 000 000 = 12 000 000 bytes, or 12 MB, of just vertexes. That's quite a lot, especially if your game has a lot of round objects. And more importantly, it takes time to render that much.

The purpose of the tessellation shader is basically to add more detail to your object when needed. When we see something from a distance, we don't need a lot of detail. But when we zoom in, we'll be able to see more, so we need to render the object with more detail so that it doesn't look blocky when viewed up close.

Geometry Shader


The next step in the rendering pipeline is the geometry shader. The geometry shader gets its input in the form of primitives. ( A primitive is basically a triangle, a line or a point. ) With the geometry shader we can create new primitives. This means we can use it for things like spawning particles in a particle system, or to make fur, hair, grass, etc.


Let's say we have a sphere. When the tessellation stage is done, we get the input in the geometry shader as tiny little triangles. Each one of these triangles is a tiny part of our sphere. With the geometry shader we can add fur to the sphere, and now we have a fuzzy little ball.


Using a geometry shader is one of the most efficient ways to make hair/fur/grass because it doesn't require uploading any additional vertexes; everything is being done on the graphics card. That makes it really quick.

The next three steps are fixed, so we can't implement them ourselves; I'll only describe them briefly.

Vertex Post-Processing


This step performs a lot of different operations on the vertexes. Many of these prepare them for the next two steps : primitive assembly and rasterization.

Primitive Assembly


This is, as the title suggests, the point where our primitives get assembled. It receives a bunch of vertexes and puts them together into shapes like triangles. It also does some checks to see if a primitive is outside of the screen ( or invisible in any other way ). If it is, the primitive won't get passed on to the next step.

Rasterization


Now we have our final primitives, but it's still just a bunch of shapes. This stage rasterizes the data. That means it takes the data and turns it into something that resembles pixels : fragments.

As noted above, we don't get actual pixels from the rasterizer but rather fragments. A fragment contains all the data OpenGL needs in order to render a pixel. There will be at least one fragment per pixel. There can be more, but not less.

Fragment shader


This is the final shader that we can implement ourselves. It receives its input in the form of fragments ( as described above ) and outputs a single fragment when it's done. At this stage, we basically just set the color of the fragment, though that can be rather complex. This is also the step where we'll put the texture on the object.

But setting the color and/or textures also means setting the lighting, and this can get quite complex, which means there will be another part for it. For now though, all you have to remember is that this stage is where we set the color ( including the alpha value ) of the fragment.

Per-Sample Processing


The final step before we get something on the screen is the per-sample processing step. In this step OpenGL looks at every fragment and sees if it, for any reason, should not be rendered. This is done by running several tests. If any of them fail, the fragment might not be rendered. Some of these tests aren't enabled by default, so you need to enable or set them up yourself.

Below is a short description of these tests; you can skip it if you want.

Per Sample Processing details

Ownership test


If there is another window over our OpenGL window, those pixels are not visible to us, so there is no need to draw them on the screen.

Scissor test


You can specify a special rectangle on the screen. If the fragment is outside of it, it'll fail the test.

Stencil test


A stencil test takes a stencil, which is basically a black and white image, and uses it to determine if the fragment should be rendered. It works just like a stencil in real life.

Imagine you take a sheet of paper and cut out a big 'H' in it. Then you put it over a different piece of paper and spray paint all over the 'H'. When you remove the top paper ( the one with the 'H' cut out, ) there will be an 'H' on the bottom paper, the exact same shape as you cut out. This is how this test works too. You can create a bitmap / image that acts as the top piece of paper. Everything this bitmap covers ( every black or every white pixel ) will then fail this test and not get rendered.

Depth test


This is the test that checks whether a fragment is actually visible or covered up by something else. So if you have an object like a dice and something in front of it like a wall, the depth test is what makes sure the wall is drawn and not the dice.

Finally, the blending happens. This is where the final color of the fragment gets determined, based on its alpha value and the color already in the framebuffer. OpenGL has several ways of blending colors, so this needs to be its own step. It also relies on the alpha value set by the fragment shader, so this step in particular needs to be done after the fragment shader.


And that's all the steps of the rendering pipeline. Now we'll take a look at how we set them up in OpenGL. We will also expand on the previous example and make something a little bit fancier by creating our own geometry shader and fragment shader.

Setting up the Shaders


There are a few calls needed for setting up the shaders, but it's actually a bit easier than the VBO and VAO. The shaders consist of one main object, called the program, that collects all the shaders into one, like a VAO. The individual shaders are like the VBOs : they're created separately and in the end they're added to the program. After they've been added, we won't be dealing with them unless we are going to update them.

First we'll look at setting up the individual shaders. These are the gray steps in the image at the top. The process for setting them up is more or less identical for all shaders ( except that we have to specify the type of shader in one of the steps. )

glCreateShader


This is very similar to the other glGen* functions like glGenBuffers and glGenVertexArrays. But this one returns the Id and only has one parameter, so we can only make one at a time. The parameter is used to specify the type, like in glGenBuffers. This function is used for all shader types.

Parameters :

  • GLenum shaderType - the type of shader to create ( see below )

The shader type can be any of the following :

  • GL_VERTEX_SHADER - for creating a vertex shader
  • GL_TESS_CONTROL_SHADER - the first step of the tessellation stage
  • GL_TESS_EVALUATION_SHADER - the last step of the tessellation stage
  • GL_GEOMETRY_SHADER - for creating a geometry shader
  • GL_FRAGMENT_SHADER - for creating a fragment shader
  • GL_COMPUTE_SHADER - a compute shader is not a "standard" shader; it's for running an arbitrary piece of code on the graphics card. It is not part of the rendering pipeline, so we won't be using it here

As you can see, there are six different types of shader we can create using this function. We will be using the first five, but the process for setting up each one of them is identical, so it's not a lot of work.

Loading the shader source code


The next step is to set the actual source code for the shaders. This is the .vert and .frag files from the previous part. The first step here is to load the actual shader. This simply involves reading a plain text file, but we need to write the function ourselves because OpenGL has no support for it :
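The original code block hasn't survived on this page; a minimal sketch of such a helper could look like this ( the name loadFile is my own; the post's actual helper may differ ) :

```cpp
#include <fstream>
#include <sstream>
#include <string>

// Sketch : read an entire text file and return its contents as a std::string.
std::string loadFile( const std::string& filename )
{
    std::ifstream file( filename );
    if ( !file.is_open() )
        return "";          // real code should handle this error properly

    std::stringstream buffer;
    buffer << file.rdbuf(); // read the whole file in one go
    return buffer.str();
}
```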

This function just takes a filename and returns the whole text file as a std::string.

glShaderSource


Now that we have our std::string, we need a way of sending it to OpenGL. This is what glShaderSource is for.

Parameters :

  • GLuint shader - the Id of the shader. We'll use the return value of glCreateShader
  • GLsizei count - the number of strings of data we want to use. We only have one file, so we'll use 1 here
  • const GLchar **string - the actual data we want to use
  • const GLint *length - the length of each individual string

This function might seem a little weird at first. The first argument is okay; it's just the Id of the shader. We dealt with similar things when we set up the VBO and VAO. But what about the others? I'll describe what the other parameters do and how to use them below. We won't be using most of this functionality, but I do recommend reading it, because then you'll know exactly what the arguments are for, which, in the end, will make you less likely to write bugs.

glShaderSource details

As noted above, glShaderSource is made to be able to take in several pieces of data. This allows you to have your shader spread across several different files. You could load all of them into different std::strings, one per file, and then finally add all the data to the shader with one call. This is where the different parameters come in.

count is just the number of different std::strings we have.

const GLchar** string is a bit more tricky to understand. A GLchar* ( note the single '*' ) is the same as char*, which is just a text string. But here we have two asterisks ('*')! In C++, a pointer is a lot like an array, so you can look at it as an array of char*. This is what allows us to send in several different strings at once.

The final argument, const GLint *length, works in the same way. Just think about it as an int array, where each value is the number of characters in the string with the same index.

Let's look at an example to illustrate this :

Note : hopefully this helps you understand this function and all its arguments. Having a good understanding of all the aspects of a function will make it a lot easier to debug.


glCompileShader


Now that we have loaded our shader, it's time for OpenGL to compile it. This is done using this simple function :

Parameters :

  • GLuint shader - the Id of the shader to compile, same as for glCreateShader and glShaderSource

As you can see, there isn't really much to this function. And after calling it, this shader is ready to go. But we first have to create our main shader program. The final create + compile shader looks something like this :
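The snippet itself is missing here, but based on the description below it probably looked roughly like this ( loadFile being the file-reading helper written earlier; error checking omitted ) :

```cpp
// Sketch : create a shader of the given type, load its source and compile it.
GLuint createShader( GLenum shaderType, const std::string& filename )
{
    GLuint shaderId = glCreateShader( shaderType );

    std::string str = loadFile( filename );
    char* src = const_cast<char*>( str.c_str() );
    GLint size = str.length();

    glShaderSource( shaderId, 1, &src, &size );
    glCompileShader( shaderId );

    return shaderId;
}
```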

The char* src = const_cast<char*>( str.c_str() ); part is just a way of converting the result of str.c_str() ( which is a const char*. ) OpenGL expects a non-const char*, so we need to cast it using const_cast.

In glShaderSource( shaderId, 1, &src, &size ); we use &src to create a pointer to the char* that holds our source. This turns it into a double pointer, or a "pointer to a pointer", if you will. Similarly, OpenGL expects a pointer to an int for the size argument, so we pass in &size. In both of these cases the pointers are used to get array functionality for setting multiple sources ( as explained above. )

The shader program


Now that we've created a shader, it's time to add it to our program. As mentioned above, the shader program is what combines all the shaders into one. Just like with VAOs, we can have several of them. So we could have one for particle effects, one for regular objects, one for reflective surfaces, one for the ground with grass, etc... Since the shader program combines all the individual shader objects, switching between them is easy. And setting them up is quite simple too!

glCreateProgram


This is very similar to the first function we looked at, glCreateShader. It simply creates an OpenGL shader program and returns the Id. We will use this program to connect our shaders and hook them up to the rendering pipeline.

That's all; now we have created a shader program and can use it in the next step.

glAttachShader


Now that the shader program has been created, we can attach our shaders to it. This is as simple as it can get :

Parameters :

  • GLuint program - the Id of the shader program ( the one we created with glCreateProgram )
  • GLuint shader - the Id of the shader ( the one we created with glCreateShader )

It doesn't really matter at which point in time you call this function, as long as both the program and the shader have been created with glCreate*. You can even do this before loading the source. All it does is attach the shader to the shader program using the Ids. Though I find it more logical to attach the shader after it has been fully created; that way we won't be adding any shaders that failed to compile.

glLinkProgram


The final step of creating a shader program is to link it. This will inspect the shaders and optimize them before creating an executable. And finally the executable will be sent to the GPU.

Parameters :

  • GLuint program - the Id of the shader program ( the one we created with glCreateProgram )

And we're done; the shader program has been created and uploaded to the GPU, so we can use it in our OpenGL application.

glUseProgram


Finally, now that our program has been created, we can start using it. This function is also very simple : it activates the program we pass in as the parameter. There can only be one active shader program at any time, so passing in a new Id disables the old one.

Parameters :

  • GLuint program - the Id of the shader program ( the one we created with glCreateProgram )

Putting it all together


Below is a simple, fully working example of how to set up a shader program.
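The full example embedded here hasn't survived, so here's a hedged sketch of what the setup could look like, assuming a createShader helper that creates, loads and compiles a single shader as described above :

```cpp
// Sketch : create a shader program with a vertex and a fragment shader.
GLuint setupShaderProgram()
{
    // Create the individual shaders ( createShader wraps glCreateShader,
    // glShaderSource and glCompileShader as described above )
    GLuint vertexShader   = createShader( GL_VERTEX_SHADER,   "vert.glsl" );
    GLuint fragmentShader = createShader( GL_FRAGMENT_SHADER, "frag.glsl" );

    // Create the program and attach the shaders to it
    GLuint program = glCreateProgram();
    glAttachShader( program, vertexShader );
    glAttachShader( program, fragmentShader );

    // Link the program and activate it
    glLinkProgram( program );
    glUseProgram( program );

    return program;
}
```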

And now, at last, we can have a look at setting up the individual shaders, starting with the language they are written in.

GLSL


Shaders are written in GLSL ( the OpenGL Shading Language ), which is very similar to standard C, but with a few extra things built in and some other things removed. The most important addition for us right now is the storage qualifiers. These specify whether a value is an input or an output, and where the value comes from or goes to. The storage qualifier is placed before the type of the value ( see example below. )

  • Attribute input values ( attribute )
    • Attribute values ( passed from a VBO )
    • Only for vertex shaders
  • Input values ( in )
    • Input values passed from the previous shader
  • Output values ( out )
    • Output values passed to the next shader
  • Custom input values ( uniform )
    • Input to the shader
    • Used for values that are not stored in a VBO
    • Can be any type ( float, int, bool, array )
We won't be using attribute, only in / out.

Vertex shader example


Let's look at a simple vertex shader :
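The shader source itself is missing from this page; a minimal vertex shader consistent with the variables discussed below could look like this :

```glsl
#version 330

// Input from the VBO / VAO ( note the declaration order )
in vec3 in_Position;
in vec4 in_Color;

// Output we pass on to the fragment shader
out vec4 ex_Color;

void main()
{
    // The position is used as-is, no transformation yet
    gl_Position = vec4( in_Position, 1.0 );

    // Pass the color along manually
    ex_Color = in_Color;
}
```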

The top two variables, in vec3 in_Position; and in vec4 in_Color;, are shader input variables which we got from the VBO / VAO ( see below. )

The third variable, out vec4 ex_Color;, is our out variable. This is the variable we send to the fragment shader. We have to do this manually by setting it in our main() like so : ex_Color = in_Color;

Fragment shader example


Now let's look at a simple fragment shader :
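Again the embedded source is missing; a matching fragment shader sketch :

```glsl
#version 330

// Input from the vertex shader ( the names must match )
in vec4 ex_Color;

// The final color of the fragment
out vec4 color;

void main()
{
    color = ex_Color;
}
```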

So, the way to pass in VBO data is through an in value. All attributes ( like positions and colors ) must be passed to the vertex shader and then from the vertex shader to the next shader, and so on. An attribute can only be passed from one shader to the next; you can't pass it directly to the last shader, for example. The output values will automatically be passed through the shaders we haven't written ourselves.

Geometry shader


The geometry shader is a little bit more complicated and more involved than the fragment shader and the vertex shader, so I won't explain it in this post. I will, however, show you a geometry shader example that you can experiment with. It's commented, so hopefully it should be easy to get an overview of what it does.

Ordering


It is very important to get the ordering of the attribute variables right. Remember this part :
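The quoted snippet is missing here; from the previous part, the relevant calls looked roughly like this ( the exact arguments may differ ) :

```cpp
// The attribute indexes used when setting up the VAO in the previous part
const GLuint positionAttributeIndex = 0;
const GLuint colorAttributeIndex = 1;

// Tell OpenGL which index each attribute uses
glVertexAttribPointer( positionAttributeIndex, 3, GL_FLOAT, GL_FALSE, 0, 0 );
glVertexAttribPointer( colorAttributeIndex,    4, GL_FLOAT, GL_FALSE, 0, 0 );

glEnableVertexAttribArray( positionAttributeIndex );
glEnableVertexAttribArray( colorAttributeIndex );
```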

The indexes we specify here ( positionAttributeIndex and colorAttributeIndex ) dictate the order in which you must declare the attributes in the vertex shader. In our case, this will be :
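With positionAttributeIndex = 0 and colorAttributeIndex = 1, the declarations in the vertex shader have to come in this order :

```glsl
in vec3 in_Position; // attribute index 0
in vec4 in_Color;    // attribute index 1
```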

Of course, we could change it so that positionAttributeIndex = 1 and colorAttributeIndex = 0. In that case we would have to declare in vec4 in_Color; first and then in vec3 in_Position;.

This is something that's very easy to miss and can be really frustrating to debug. Generally, OpenGL is quite low level, so mistakes like these are really easy to make and hard to debug if you don't know exactly what to look for.

Some code


The major new piece of code today is a reworked Shader.h that can load any type of shader. I also added a geometry shader that'll give you an idea of what the geometry shader does. I didn't add a tessellation shader because that would require OpenGL version 4.0, which would mean a lot of you would not be able to run it. Besides, I think there already is enough new stuff in this part. Well anyways, here's some code :

Shader.h


Shader.h has been rewritten. Most of it should be described in this blog post, except for getting variables out of the shader ( including the log. ) I'll get into that in another post.

Vertex shader


I renamed the vertex shader to vert.glsl.

Geometry shader


I added a geometry shader; it has a few bools you can change to show off what you can do. Keep in mind that when we render the object normally, all we get is a square. The extra triangles are created by the geometry shader itself.

Screenshot :
simple geometry shader

Fragment shader


I renamed the fragment shader to frag.glsl. I also added functionality for setting a random color :

Screenshot:
simple fragment shader

main.cpp


I also made a few changes to our main file. This time we only render the triangles, not the lines. I also changed the coordinates a little. It still forms a square, but it's separated into four equally large triangles ( instead of two. ) This makes working on it in the geometry shader a lot easier.

Screenshot:
exploded

Compilation


We'll compile it just like last time :

Using clang

clang++ main.cpp -lGL -lGLEW -lSDL2 -std=c++11 -o Test

Using gcc/g++

g++ main.cpp -lGL -lGLEW -lSDL2 -std=c++11 -o Test

Conclusion


In this part we looked at shaders, what they do, how to create them and how to set them up. I intentionally didn't dive deeply into the shaders themselves, but instead I showed how to set them up. I know there has been a lot of very basic setup stuff in these parts, but I find it important to know how to set up OpenGL properly.

The end result we get on screen in this part is quite simple, but feel free to play around with the geometry shader. There are a few bool values you can toggle to get different output. Or you could just modify the code yourself and see what you end up with.

But in the next part we'll finally look at getting something 3D on the screen. When we do have something 3D on the screen, we can manipulate ( rotate, move, stretch, etc.. ) it in various ways quite easily. See you then!


Feel free to comment if you have anything to say or ask questions if anything is unclear. I always appreciate getting comments.

You can also email me : olevegard@headerphile.com
