
Introduction to WebGL Shaders

6:20 JavaScript lesson

In this lesson we learn about the two types of shaders required in WebGL - vertex shaders and fragment shaders. We create one of each and see how those shader programs affect what is drawn on the screen.


Let's draw some content on this WebGL-enabled canvas. WebGL is based around two vital concepts: vertices, which are points in 3D space, and shaders, which define how those vertices are interpreted and rendered on the screen. A shader is actually a small program written in a C-like language called OpenGL Shading Language, or GLSL for short. We'll be creating the shader programs as JavaScript strings and then using WebGL to compile those strings into functional shader code. Crazy, right?
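Before the shaders come in, the lesson assumes a WebGL context has already been obtained from the canvas (the initGL step mentioned below). A minimal sketch of what that setup might look like — the function name and error message are my own assumptions, not the lesson's actual code:

```javascript
// Hypothetical sketch of the setup this lesson assumes: grab the canvas's
// WebGL rendering context. "webgl" is the standard context name.
function initGL(canvas) {
  const gl = canvas.getContext("webgl");
  if (!gl) {
    throw new Error("WebGL is not supported in this browser");
  }
  return gl;
}
```

Everything that follows calls methods on this `gl` object rather than on the shaders or programs themselves.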

I'll create and call a function named createShaders after initGL and before draw. We're going to need to make two shaders: a vertex shader that will define how the vertices are translated, scaled, or otherwise transformed, and a fragment shader that will determine the color of the pixels drawn by those vertices. First let's create the vertex shader. Again, this will be a string; I'll just paste it in here. Now let's look at its program. Like a common C program, there's a main function that contains the code of the shader.
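The pasted-in vertex shader string described here would look something like this (the variable name `vs` is an assumption; the GLSL body matches the hard-coded position the lesson describes):

```javascript
// The vertex shader source: GLSL code held in a plain JavaScript string.
// gl_Position is the built-in output; it's hard-coded to the origin for now.
const vs = `
  void main(void) {
    gl_Position = vec4(0.0, 0.0, 0.0, 1.0);
  }
`;
```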

Normally we'd be passing in an array of vertices, in which case this main function would be executed one time for each and every vertex in that array. It would probably transform the vertex in some way — translating, rotating, and scaling it — and assign the result of that transformation to a special GLSL property called gl_Position. At this point we won't actually be passing in any vertices, so I've just hard-coded this to set a position of 0, 0, 0. You'll note that this is being assigned with a call to vec4.

A vector is a sort of typed array. A vec4 contains four elements. We'll use these for the x, y, and z coordinates of a vertex; the fourth value is known as w, and is necessary for 3D transformation matrix operations. For now, it should be left as 1. Now, this program is not yet a program; it's just a JavaScript string. We need to create a shader object, use this string as its source code, and then compile that into an actual functioning shader. Those are the exact steps taken in the next three lines of code.

First we create the shader with gl.createShader, passing in the type of shader we want: gl.VERTEX_SHADER. Next we need to assign our string as the source of our shader. You might expect to call a method on the shader itself to pass in the source, but that's not how WebGL works most of the time. Instead we call gl.shaderSource(vertexShader, vs). This takes the shader you're assigning the source to, and the source that you're assigning. You'll see this pattern many times in WebGL.

Rather than doing something to an object directly, you pass it into a special WebGL method that does something to it internally. Finally, you need to compile that source into a real shader. Same pattern here: gl.compileShader(vertexShader). See, that wasn't really so complex, right? Now we need to create the fragment shader. I'll paste the code in here, and here again we have a main function that assigns a vec4 to a gl_FragColor property. You can consider a fragment as being a single pixel being rendered to the canvas.
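The three-step create/source/compile sequence walked through above can be sketched as one small helper — the helper itself is my own packaging of the pattern, not something the lesson defines, but the three gl calls inside it are exactly the ones described:

```javascript
// The create → source → compile pattern, wrapped in a helper so it works
// for either shader type (gl.VERTEX_SHADER or gl.FRAGMENT_SHADER).
function compileShader(gl, type, source) {
  const shader = gl.createShader(type); // create an empty shader of the given type
  gl.shaderSource(shader, source);      // assign the GLSL string as its source
  gl.compileShader(shader);             // compile the source into a usable shader
  return shader;
}
```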

This fragment shader code will be run a single time for every single fragment that's rendered. The value assigned to gl_FragColor is the color that pixel will get. The values here represent the red, green, blue, and alpha channels for that pixel. We're passing in 0, 0, 0, 1, so that's going to be fully opaque black. So again, we create a shader, this time making it a gl.FRAGMENT_SHADER. Assign the source, and compile it. Getting the hang of it? Now we have our two shaders, but they're still not usable in this state.
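The fragment shader string described above would look something like this (again, the variable name `fs` is an assumption):

```javascript
// The fragment shader source: gl_FragColor receives an RGBA color.
// 0, 0, 0, 1 is fully opaque black.
const fs = `
  void main(void) {
    gl_FragColor = vec4(0.0, 0.0, 0.0, 1.0);
  }
`;
```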

We need to create a program that links the two together and then tell WebGL to use this program. Since we need to access this shader program outside of the function later, I'll create a top-level variable here called shaderProgram. We'll create the shader program with gl.createProgram. Then we need to attach our shaders to this program. This is done with gl.attachShader, passing in the program and the shader. Do that for both shaders. Then we need to link the two shaders into a program: gl.linkProgram.
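The create/attach/link steps just described can be sketched like this — wrapping them in a function is my own choice for illustration; the lesson assigns to a top-level shaderProgram variable instead:

```javascript
// Create a program, attach both compiled shaders, and link them together.
function createProgram(gl, vertexShader, fragmentShader) {
  const program = gl.createProgram();
  gl.attachShader(program, vertexShader);   // attach the vertex shader
  gl.attachShader(program, fragmentShader); // attach the fragment shader
  gl.linkProgram(program);                  // link the two into one program
  return program;
}
```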

OK, now our program is ready for WebGL to use, so we say gl.useProgram, passing in the program. We've created our two shaders and they're ready to go. Now we can draw something. In the draw function I'll add a call to gl.drawArrays. This will draw our array of vertices. Of course, we haven't actually created such an array, but WebGL will play along anyway. I'll pass in gl.POINTS, which tells WebGL how we want to render the vertices. We'll see a lot more of this later.
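The draw call described here and in the next paragraph is a single line; sketched as a function (the function wrapper is an assumption, the arguments are the ones the lesson names):

```javascript
// gl.POINTS renders each vertex as a point rather than as part of a line
// or triangle; the arguments are mode, offset, and number of vertices.
function draw(gl) {
  gl.drawArrays(gl.POINTS, 0, 1);
}
```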

The next value is the offset, which we'll leave as 0, and then the number of vertices to draw, which we'll set as 1. When we run this, you probably can't see much, because by default a point is drawn too small to really see. We can change that by assigning another optional value in the vertex shader. We'll say gl_PointSize = 10.0. This tells WebGL to render points as 10-pixel squares. Run it again, and yes, we have a point at 0, 0, 0, the center of the 3D world.
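With the point size added, the vertex shader string would now look like this (variable name `vs` again assumed):

```javascript
// The vertex shader with the optional gl_PointSize output added.
// 10.0 renders the point as a 10-pixel square; GLSL floats need the decimal.
const vs = `
  void main(void) {
    gl_Position = vec4(0.0, 0.0, 0.0, 1.0);
    gl_PointSize = 10.0;
  }
`;
```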

Going back to the vertex shader, we can change the position on the x-axis or the y-axis. We'll discuss the WebGL coordinate system in another lesson. We can change the point size, and jumping down to the fragment shader, we can change the color that point is rendered with. Go through this and experiment with it until you understand the basic flow.
