
Summary

Let’s summarize what we’ve learned in this chapter:

  • The WebGL API itself is just a rasterizer and, conceptually, is fairly simple.
  • WebGL's rendering pipeline describes how WebGL buffers are used and passed, in the form of attributes, to the vertex shader, which processes vertices in parallel on the GPU. Vertices define the surface of the geometry that is going to be rendered; every element on this surface is known as a fragment, and these fragments are processed by the fragment shader (see the first sketch after this list).
  • Fragment processing also occurs in parallel in the GPU. When all fragments have been processed, the framebuffer, a two-dimensional array, contains the image that is then displayed on your screen.
  • WebGL is actually a pretty simple API. Its job is to execute two user-supplied functions, a vertex shader and fragment shader, and draw triangles, lines, or points. While it can get more complicated to do 3D, that complication is added by you, the programmer, in the form of more complex shaders.
  • We also covered the fine details of how WebGL renders geometry. Remember that there are two kinds of WebGL buffers that deal with geometry rendering: VBOs (Vertex Buffer Objects) and IBOs (Index Buffer Objects), shown in the second sketch after this list.
  • WebGL works as a state machine. As such, properties referring to buffers are available, and their values depend on the currently-bound buffer (see the third sketch after this list).
  • JSON and AJAX are two JavaScript technologies that integrate well with WebGL, enabling us to load large and complex assets (see the final sketch after this list).
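To make the pipeline bullets concrete, here is a minimal sketch of the two user-supplied programs WebGL runs. The attribute name aVertexPosition and the output fragColor are illustrative placeholders, not names mandated by WebGL:

```js
// A minimal WebGL 2 shader pair (GLSL ES 3.00).
// aVertexPosition and fragColor are placeholder names for this sketch.
const vertexShaderSource = `#version 300 es
  // Each vertex from the bound VBO arrives here as an attribute;
  // one invocation runs per vertex, in parallel on the GPU
  in vec3 aVertexPosition;

  void main(void) {
    gl_Position = vec4(aVertexPosition, 1.0);
  }
`;

const fragmentShaderSource = `#version 300 es
  precision mediump float;

  // One invocation runs per fragment covering the rasterized surface
  out vec4 fragColor;

  void main(void) {
    fragColor = vec4(1.0, 0.0, 0.0, 1.0); // solid red
  }
`;
```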
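The following sketch shows the two buffer kinds at work. It assumes gl is a WebGL2RenderingContext with a compiled program already in use, and that the position attribute lives at location 0; the data arrays are placeholders:

```js
// Sketch: rendering geometry with a VBO (vertex data) and an IBO (indices).
const vertices = new Float32Array([/* x, y, z triples */]);
const indices = new Uint16Array([/* triangle indices into the VBO */]);

// VBO: raw per-vertex data
const vbo = gl.createBuffer();
gl.bindBuffer(gl.ARRAY_BUFFER, vbo);
gl.bufferData(gl.ARRAY_BUFFER, vertices, gl.STATIC_DRAW);

// Feed the currently-bound VBO to the attribute at location 0
gl.enableVertexAttribArray(0);
gl.vertexAttribPointer(0, 3, gl.FLOAT, false, 0, 0);

// IBO: indices describing how vertices form triangles
const ibo = gl.createBuffer();
gl.bindBuffer(gl.ELEMENT_ARRAY_BUFFER, ibo);
gl.bufferData(gl.ELEMENT_ARRAY_BUFFER, indices, gl.STATIC_DRAW);

// Draws using whichever IBO is currently bound
gl.drawElements(gl.TRIANGLES, indices.length, gl.UNSIGNED_SHORT, 0);
```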
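The state-machine behavior can be observed directly. This sketch reuses the hypothetical vbo and ibo handles from the previous snippet:

```js
// WebGL is a state machine: queries reflect whatever is currently bound.
gl.bindBuffer(gl.ARRAY_BUFFER, vbo);
console.log(gl.getParameter(gl.ARRAY_BUFFER_BINDING) === vbo); // true

gl.bindBuffer(gl.ARRAY_BUFFER, null);
console.log(gl.getParameter(gl.ARRAY_BUFFER_BINDING)); // null: nothing bound

// Buffer properties are likewise read through the current binding
gl.bindBuffer(gl.ELEMENT_ARRAY_BUFFER, ibo);
const byteSize = gl.getBufferParameter(gl.ELEMENT_ARRAY_BUFFER, gl.BUFFER_SIZE);
```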
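Finally, a sketch of loading a JSON-encoded model over AJAX, here using the fetch API; the URL and the vertices/indices property names are placeholders for whatever your asset format defines:

```js
// Sketch: loading a JSON model asynchronously and preparing typed arrays
// that can be uploaded into a VBO and an IBO as shown above.
fetch('/models/model.json')
  .then(response => response.json())
  .then(model => {
    const vertices = new Float32Array(model.vertices);
    const indices = new Uint16Array(model.indices);
    // ...create and populate the WebGL buffers here
  })
  .catch(error => console.error('Could not load model:', error));
```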

In the next chapter, we will learn more about shaders and use them to implement light sources in our WebGL scene, passing information back and forth between the WebGL JavaScript API and the shaders through attributes, uniforms, and varyings.