WebGL setup and context

You can set up a WebGL app by getting a rendering context, creating a vertex array, assigning texture graphics, and loading shaders to do the work on the GPU. To explain how to set up and use WebGL, we'll walk you through a modified Warp demo from ietestdrive (some parts have been removed to simplify the code). To review the full version, open Warp in Internet Explorer 11, and press F12 to open the F12 developer tools. For more info about F12 tools, see What's new in F12 Tools.

Setting up WebGL

The version of the Warp demo we're using consists of an HTML file and a JavaScript file. The full code listing for both files is in Resources. The HTML part of the Warp example is simple: a pair of canvas elements, one for WebGL and the other for the 2D canvas.
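The markup might look something like this (a minimal sketch; only the element ids, which the JavaScript later looks up, come from the example, and the width and height values are illustrative):

```html
<!-- Canvas for the WebGL rendering context. -->
<canvas id="webglcanvas" width="600" height="600"></canvas>
<!-- Canvas for the 2D context, used to draw reference points over the WebGL image. -->
<canvas id="2dcanvas" width="600" height="600"></canvas>
```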

The JavaScript code is contained in script tags inside the <body> element, below the <canvas> elements. By putting the JavaScript below the HTML, we can be sure that the HTML elements have been created. Alternatively, we could put the script tags in the <head> section and use the load or DOMContentLoaded events to ensure the HTML elements have been created.

The Warp example uses a renderer function written in JavaScript to contain the initialization code, as well as reset, undo, and mouse move functions. The init function initializes the associated shader programs, buffers, and data for both the grid and photo portions of the example.

What's a rendering context and how do you get one?

A canvas rendering context is an object that provides a drawing surface and, through its methods, properties, events, and constants, lets you create and manipulate graphics on the screen. Canvas elements offer two rendering contexts: CanvasRenderingContext2D for 2D and WebGLRenderingContext for 3D. The majority of drawing APIs are provided by these contexts.

A canvas element can only have one rendering context at a time, so this example uses two canvas elements. The first canvas is for WebGL and uses a WebGLRenderingContext object. The second canvas element uses a CanvasRenderingContext2D to draw reference points over the WebGL image. If you don't care about reference points, you can skip the second canvas.

The WebGL spec is still being developed, so the context is still in its "experimental" phase. Currently most vendors use the syntax canvas.getContext("experimental-webgl") to get a WebGLRenderingContext. This may change once the WebGL spec is final.

Here are the context setups for the Warp example:

This example returns a CanvasRenderingContext2D object:


// Convert the image to a square image via the temporary 2d canvas. 
var canvas = document.getElementById("2dcanvas");
var ctx = canvas.getContext("2d");


This example returns a WebGLRenderingContext object:


// Get a context from our canvas object with id = "webglcanvas".
var canvas = document.getElementById("webglcanvas");
try {
  // Get the context into a local gl and a public gl.
  // Use preserveDrawingBuffer: true to keep the drawing buffer after presentation.
  var gl = this.gl = canvas.getContext("experimental-webgl", { preserveDrawingBuffer: true });
}
catch (e) {
  // Fail quietly.
}


In this example, the canvas element is called webglcanvas and the canvas object is returned by the getElementById method. The WebGLRenderingContext is returned by var gl = canvas.getContext("experimental-webgl"). You can call the variable anything you want, but gl is the generally accepted naming convention for a WebGLRenderingContext object.

The getContext method can accept an optional second parameter, a WebGLContextAttributes object, which offers a variety of settings. For example, antialias: false tells the browser not to apply anti-aliasing to the images. This helps keep lines clean, since anti-aliasing blurs edges as it tries to smooth them. The preserveDrawingBuffer: true attribute used here tells the browser to retain the drawing buffer's contents until they're explicitly cleared; otherwise the buffer is cleared once it's been displayed.
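The attributes are passed as a plain object; here's a hedged sketch (the attribute names come from the WebGLContextAttributes dictionary, but which ones you set depends on your app):

```javascript
// Context attributes for getContext. Any attribute you omit keeps its default
// (for example, antialias defaults to true and preserveDrawingBuffer to false).
var contextAttributes = {
  antialias: false,            // don't smooth edges; keeps wireframe lines crisp
  preserveDrawingBuffer: true  // keep the buffer contents after presentation
};
// Usage: canvas.getContext("experimental-webgl", contextAttributes);
```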

Because many browsers support the canvas 2D context but not all of them support WebGL, we need to trap errors when getting a WebGLRenderingContext. Using a try/catch block, if the request for a WebGLRenderingContext fails, a message is displayed in HTML and the init function returns without going any further. If a WebGLRenderingContext is returned, WebGL is supported and the program continues.
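That feature check can be factored into a small helper. Here's a minimal sketch (not the demo's exact code): some browsers throw from getContext while others return null, so it handles both and returns null when WebGL isn't available.

```javascript
// Try to get a WebGLRenderingContext; return null if WebGL isn't supported.
function getWebGLContext(canvas) {
  var gl = null;
  try {
    gl = canvas.getContext("experimental-webgl", { preserveDrawingBuffer: true });
  } catch (e) {
    gl = null;
  }
  return gl;
}
// Usage: var gl = getWebGLContext(document.getElementById("webglcanvas"));
// if (!gl) { /* display a "WebGL not supported" message and return */ }
```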

Creating shaders

Once WebGL support is confirmed, the example loads two shader programs: the lineprogram to display the grid, and the pictureprogram to display the photo. A shader program combines a vertex and a fragment shader and their data. The program is then enabled and linked to the GPU. Depending on the choices made in the Warp example, one or the other shader program is used.

The shader code is contained in <script> tags in the HTML portion. You can write your GLSL code in <script> tags, or you can build a string in JavaScript by concatenating lines and line-feed characters. The shader code is put in <script> tags here for better readability. In this example, a single function called getShader retrieves the code for either shader. getShader returns a shader object, and a shader program is created once both the vertex and fragment shader objects exist.
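A getShader helper typically reads the source out of a <script> tag and compiles it. Here's a hedged sketch of that pattern (not the demo's exact code; the parameter names and error handling are illustrative):

```javascript
// Compile a shader from the text content of a <script> tag.
// "type" is gl.VERTEX_SHADER or gl.FRAGMENT_SHADER.
function getShader(gl, scriptId, type) {
  var source = document.getElementById(scriptId).text;
  var shader = gl.createShader(type);
  gl.shaderSource(shader, source);
  gl.compileShader(shader);
  // Surface compile errors instead of failing silently.
  if (!gl.getShaderParameter(shader, gl.COMPILE_STATUS)) {
    throw new Error(gl.getShaderInfoLog(shader));
  }
  return shader;
}
```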

The shader code that Warp uses is discussed in more detail in Shaders.

Creating a vertex array

Shapes are known as meshes in WebGL and are built by describing a set of triangles in an array of vertices. The next part of setup is to create the vertex arrays that describe the shapes we're using. Most WebGL apps create vertex arrays programmatically: you can import a set of coordinates created in a 3D modeling tool, or build an array in a loop. The vertex array is then bound to a WebGLBuffer to be used by the shaders.

Vertex arrays are typed arrays of type Float32Array. Typed arrays are a recent addition to JavaScript, introduced for use with WebGL. JavaScript is traditionally a loosely typed language: a single var might hold a string, an integer, or a floating-point number. In WebGL, however, you need to declare the type of the data you're using.
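A quick sketch of how a Float32Array differs from an ordinary JavaScript array (values are coerced and stored as 32-bit floats):

```javascript
// A Float32Array has a fixed length and a fixed element type.
var a = new Float32Array(4);   // initialized to [0, 0, 0, 0]
a[0] = 1.5;                    // stored exactly; 1.5 fits in 32 bits
a[1] = "2";                    // coerced to the number 2
a[2] = 0.1;                    // rounded to the nearest 32-bit float
// a[2] is now Math.fround(0.1), which is close to, but not exactly, 0.1.
```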

Warp uses two vertex arrays: one describing lines to draw the grid itself, and another describing triangles used with the photo. In the Warp example, the first array creates a line grid mesh as a new Float32Array with a size of resolution x resolution x 20, or 8000 elements. The resolution parameter is set to 20 and represents the number of squares across (or down). The last number, 20, is the number of values it takes to describe each square: each square is drawn as five line segments (four sides plus the diagonal), and each segment takes four numbers (x,y to x1,y1).
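The arithmetic behind the two buffer sizes can be sketched as follows (resolution = 20, as in the example; the variable names are illustrative):

```javascript
var resolution = 20;
// Line grid: 5 segments per square, 4 numbers (x, y, x1, y1) per segment.
var lineGridSize = resolution * resolution * 5 * 4;      // 8000 elements
// Photo mesh: 2 triangles per square, 3 vertices each, 2 numbers per vertex.
var imageGridSize = resolution * resolution * 2 * 3 * 2; // 4800 elements
```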

Using createBuffer, we create a buffer to store the vertex array and then bind the buffer to the shader program using bindBuffer. bindBuffer takes the type of buffer (ARRAY_BUFFER), and the buffer as parameters. Binding the buffer tells WebGL to make this buffer active, which means the buffer can be filled by calls to bufferData. The gl.STATIC_DRAW flag tells WebGL that this buffer data will be written to one time, but used many times.

The app then turns on the vertex attribute with the enableVertexAttribArray method, passing the attribute location we looked up with getAttribLocation. This provides the interface between the array and the shaders. Lastly, we use the vertexAttribPointer method to describe the buffer's data layout for that attribute. The call to vertexAttribPointer specifies the attribute location, the number of values per element (2), the type (gl.FLOAT), whether to normalize the numbers (false), and the stride and offset to start at (0 and 0). This gives us our x/y coordinates for each line that describes the grid in a form the shader can understand. This vertex array is later linked to the shader code through the a_texCoord attribute of type vec2 (2-dimensional). A vertex shader can take a number of attribute types as input, such as vec2, vec3, vec4, bool, and int. We're using a vec2 attribute to represent a point made up of an x and y coordinate in one variable.

The lineprogram array has a built-in fudge factor that moves the grid slightly (0.001 units) to the right to ensure that the wireframe lines are rendered inside the canvas boundary. This value is folded into the coordinates. The loop builds the mesh as rectangles made of pairs of triangles, five line segments at a time, top to bottom, left to right. WebGL coordinates are expressed from -1 to 1 units for the whole canvas, so each square is 0.1 x 0.1 units. Here's the example that creates the mesh code for lineprogram:

This section creates the buffers and binds them to the shader program:


try {
  // Load the GLSL source written in the HTML file.
  // Create a program with the two shaders.
  this.lineprogram = loadProgram(gl, getShader(gl, "2d-vertex-shader"), getShader(gl, "red"));

  // Tell WebGL to use this program.
  gl.useProgram(this.lineprogram);

  // Look up where the vertex data needs to go.
  this.texCoordLocation2 = gl.getAttribLocation(this.lineprogram, "a_texCoord");

  // Provide texture coordinates for the rectangle.
  this.texCoordBuffer2 = gl.createBuffer();
  gl.bindBuffer(gl.ARRAY_BUFFER, this.texCoordBuffer2);

  // Create a buffer and set it to use the array set up above.
  // Set it to be modified once, used many times.
  // createRedGrid sets up the vertex array itself.
  gl.bufferData(gl.ARRAY_BUFFER, createRedGrid(), gl.STATIC_DRAW); // Fill buffer data.

  // Turn on the vertex attribute in the GPU program.
  gl.enableVertexAttribArray(this.texCoordLocation2);

  // Set up the data format for the vertex array - points (x/y) as floats.
  gl.vertexAttribPointer(this.texCoordLocation2, 2, gl.FLOAT, false, 0, 0);
}
catch (e) {
  // Display the failure on the screen if the shaders/program fail.
  log('shader fail');
  return;
}



This is the grid building function passed to bufferData:


function createRedGrid() {
// Make a 0,0 to 1,1 triangle mesh, using n = resolution steps.
var q = 0.001; // A fudge factor to ensure that the wireframe lines are rendered inside the canvas boundary.
var r = (1 - q * 2) / resolution;
// 5 line segments per square, 4 numbers per segment; resolution * resolution squares.
var c = new Float32Array(resolution * resolution * 20);
// Array index.
var i = 0;

// Build the mesh top to bottom, left to right.             
for (var xs = 0; xs < resolution; xs++) {
  for (var ys = 0; ys < resolution; ys++) {
    var x = r * xs + q;
    var y = r * ys + q;
    // Top of square - first triangle.
    c[i++] = x;
    c[i++] = y;
    c[i++] = x + r;
    c[i++] = y;

    // Center line - hypotenuse of the triangles.
    c[i++] = x;
    c[i++] = y + r;
    c[i++] = x + r;
    c[i++] = y;

    // Bottom line of 2nd triangle.
    c[i++] = x;
    c[i++] = y + r;
    c[i++] = x + r;
    c[i++] = y + r;

    // First triangle, left side.
    c[i++] = x;
    c[i++] = y;
    c[i++] = x;
    c[i++] = y + r;

    // Right side of 2nd triangle. 
    c[i++] = x + r;
    c[i++] = y;
    c[i++] = x + r;
    c[i++] = y + r;
  }
}
return c;
}


The next example creates the mesh for the image. It's essentially the same setup code as for the lineprogram array. However, rather than describing discrete lines, the pictureprogram array describes triangles. Describing triangles reduces the data per square to only 12 numbers (6 coordinate pairs) rather than 20 (10 pairs).

This section creates the buffers and binds them to the shader program:


try {
  var vertexshader = getShader(gl, "2d-vertex-shader");
  var fragmentshader = getShader(gl, "2d-fragment-shader");

  this.pictureprogram = loadProgram(gl, vertexshader, fragmentshader);
  gl.useProgram(this.pictureprogram);

  // Put the shader source into the <div> elements.
  document.getElementById("vertexshadersource").innerText = gl.getShaderSource(vertexshader);
  document.getElementById("fragmentshadersource").innerText = gl.getShaderSource(fragmentshader);

  // Look up where the vertex data needs to go.
  this.texCoordLocation = gl.getAttribLocation(this.pictureprogram, "a_texCoord");

  // Provide texture coordinates for the rectangle.
  this.texCoordBuffer = gl.createBuffer();
  gl.bindBuffer(gl.ARRAY_BUFFER, this.texCoordBuffer);

  // createImageGrid sets up the vertex array itself.
  gl.bufferData(gl.ARRAY_BUFFER, createImageGrid(), gl.STATIC_DRAW); // Fill buffer data.
  gl.vertexAttribPointer(this.texCoordLocation, 2, gl.FLOAT, false, 0, 0);
  gl.enableVertexAttribArray(this.texCoordLocation);

  // Set up the uniform variable (image).
  this.pictureprogram.u_image = gl.getUniformLocation(this.pictureprogram, "u_image");

  // Set the texture to use.
  gl.uniform1i(this.pictureprogram.u_image, 0);
}
catch (e) {
  log('shader fail');
  return;
}

this.loadImage();


This is the grid building function that's passed to bufferData:


function createImageGrid() {
  var q = 0.001;

  var r = (1 - q * 2) / resolution;
  var c = new Float32Array(resolution * resolution * 12); // 2 numbers per coord; 3 coords per triangle; 2 triangles per square; resolution * resolution squares.

  var i = 0;

  for (var xs = 0; xs < resolution; xs++) {
    for (var ys = 0; ys < resolution; ys++) {

     var x = r * xs + q;
     var y = r * ys + q;

     c[i++] = x;
     c[i++] = y;

     c[i++] = x + r;
     c[i++] = y;

     c[i++] = x;
     c[i++] = y + r;

     c[i++] = x + r;
     c[i++] = y;

     c[i++] = x;
     c[i++] = y + r;

     c[i++] = x + r;
     c[i++] = y + r;

    }
  }
  return c;
}



Set up the texture, or get ready to load the photo

The last step in the init function is to specify the uniform variable that will contain the photo. Uniform variables are essentially global variables that don't change often. The photo can be changed in this example, so the code that loads an image lives outside the init function. However, the shader needs to know how to get to the image to display and manipulate it.

To use a variable in WebGL, you must get its location in the shader program. The location of u_image, a uniform variable in the shader program, is returned by gl.getUniformLocation(this.pictureprogram, "u_image"). The fragment shader declares u_image as a uniform sampler2D, and we can use it by getting its location from JavaScript. A sampler2D is WebGL speak for a texture image. You can also set up a samplerCube to specify a cube map as a uniform; a cube map is sampled with a vector cast into 3D space, and WebGL selects the appropriate face to sample from.
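On the shader side, the declaration looks something like this (a minimal GLSL sketch; only the u_image name comes from the example, and the varying name is illustrative):

```glsl
precision mediump float;

uniform sampler2D u_image;  // the photo, bound to texture unit 0 from JavaScript
varying vec2 v_texCoord;    // interpolated texture coordinate from the vertex shader

void main() {
  // Sample the photo at this fragment's texture coordinate.
  gl_FragColor = texture2D(u_image, v_texCoord);
}
```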

In Shader programs we'll talk about how the texture is used by the shaders.

Related topics

Use GLSL to write WebGL shaders
Create a WebGL texture from a photo
UI support
WebGL demos, references, and resources

 

 

© 2014 Microsoft. All rights reserved.