Page 8: Using Textures

CS559 Spring 2023 Sample Solution

Image-Based Textures and Shaders

Shaders can look up values (usually colors) in images. This is how we implement traditional texture mapping.

First, we need to get the image into a uniform variable. More specifically, we need not just the image itself, but also all the machinery that reads values from it - for example, a mip-map. The GLSL data type for this is called a sampler2D.

Setting up a sampler2D from the host side is complicated. Fortunately, THREE does it for us. All we need to do is create a Texture and assign it to a uniform variable.
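To give a sense of what this looks like, here is a minimal host-side sketch. The file name, uniform name, and the vertexSource/fragmentSource variables are made up for illustration - see 10-08-01.js for the actual code:

```js
// assumes THREE has been imported, as in the other workbook examples
// TextureLoader loads the image file and wraps it in a THREE.Texture
const texture = new THREE.TextureLoader().load("../images/some-image.png");

const material = new THREE.ShaderMaterial({
    uniforms: {
        tex: { value: texture }      // becomes "uniform sampler2D tex;" in the shader
    },
    vertexShader: vertexSource,      // shader text loaded from shaders/10-08-01.vs
    fragmentShader: fragmentSource   // shader text loaded from shaders/10-08-01.fs
});
```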

Then, in our shader program, we refer to this sampler2D uniform using the texture2D function, which looks up the value at a given position (texture coordinate) in the texture.
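A minimal fragment shader that does this might look roughly like the following; the variable names are illustrative, and the actual code is in shaders/10-08-01.fs:

```glsl
// fragment shader sketch: look up a color in the texture
uniform sampler2D tex;   // the Texture assigned from the JavaScript side
varying vec2 v_uv;       // texture coordinates passed along by the vertex shader

void main() {
    // texture2D takes the sampler and a (u,v) position and returns a vec4 color
    gl_FragColor = texture2D(tex, v_uv);
}
```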

Here’s a simple example (image © UW-Madison):

You can look at 10-08-01.js (10-08-01.html) and the shaders shaders/10-08-01.vs and shaders/10-08-01.fs to see how it works. Notice how the uniform is set up in the JS file, and then accessed in the fragment shader.

A More Complex Example

In this example, we’ll make a planet. We’ll have a texture with blue (ocean) and green (land) and we’ll put it on a sphere.

Then, we’ll add mountains - we’ll use the green brightness as a height, and use that to displace the vertices of the sphere. We’ll have to do this in the vertex shader.

The interesting part of this is the vertex shader shaders/10-08-02.vs. Note how I look up values in the texture, and use the amount of green to move the vertices in the normal direction.
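Here is a rough sketch of the idea with made-up names - the real shader is in shaders/10-08-02.vs:

```glsl
// vertex shader sketch: displacement mapping from the green channel
uniform sampler2D planetTexture;   // the blue/green ocean-and-land texture
uniform float dispAmount;          // how far the mountains stick out
varying vec2 v_uv;

void main() {
    v_uv = uv;   // THREE's ShaderMaterial supplies uv, position, and normal as attributes

    // read the texture in the vertex shader and use the green channel as a height
    float height = texture2D(planetTexture, uv).g;

    // move the vertex along its normal by that height
    vec3 displaced = position + normal * (height * dispAmount);

    gl_Position = projectionMatrix * modelViewMatrix * vec4(displaced, 1.0);
}
```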

This is the opposite of a normal map: a normal map changes the lighting without moving the geometry, while here there is no lighting but we really move the geometry. It is a displacement map.

You can also look at 10-08-02.js (10-08-02.html) and shaders/10-08-02.fs - but these are pretty much the same as prior examples.

Procedural Displacement Map

Displacement maps don’t have to come from a texture. Just like a texture, the displacement itself can be computed procedurally. 10-08-03.js (10-08-03.html) combines a procedural texture like the one you saw on page 5 with a displacement map. Notice that there’s no reason we have to display the texture as a color on the object - we could have one texture for the object color, and another for the displacement.

The “disp” parameter controls how far the dots push the vertices away from the sphere. If you look at the shader (recommended), notice how we move the position in the direction of the normal.
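As a rough sketch of the idea, here is a vertex shader that displaces with a simplified procedural dot pattern - a stand-in for the page 5 texture, with names that are illustrative rather than taken from the sample code:

```glsl
// vertex shader sketch: procedural displacement (no texture lookup needed)
uniform float disp;   // the "disp" slider value
varying vec2 v_uv;

void main() {
    v_uv = uv;

    // a simplified stand-in for the dot pattern from page 5:
    // 1.0 near the center of each grid cell, falling to 0.0 outside it,
    // with smoothstep blurring the edge to reduce aliasing
    vec2 cell = fract(uv * 10.0) - 0.5;
    float dotValue = 1.0 - smoothstep(0.2, 0.3, length(cell));

    // displace along the normal, just like the texture-based version
    vec3 displaced = position + normal * (dotValue * disp);

    gl_Position = projectionMatrix * modelViewMatrix * vec4(displaced, 1.0);
}
```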

Because we are moving vertices, the quality of the displacement map depends on the number of vertices. The “segs” slider changes the number of segments on the sphere (basically, the sphere is a segs x segs grid of triangles). Notice how the shape becomes better defined with more segments.

Aliasing is a problem with the vertex shader: because we only compute the displacement at the vertices, we might miss something if the value changes quickly. We might miss an entire dot, which then looks flat because no vertices land inside it. Or, if there are lots of vertices, we might see unusual patterns around sharp edges. Blurring the edges of the dots makes this less of a problem.

Note: when you change the segs slider, we change the triangles of the sphere. In general, we try to avoid this in real programs, because it requires creating new attribute buffers that must be allocated, computed, and sent to the graphics hardware. For this special case, it’s OK - it’s a simple program with one sphere. But in a “real program” we want to avoid changing buffers within the draw loop.
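For a sense of what that rebuild involves, here is an illustrative sketch; the function name is made up, and the real slider handling is in 10-08-03.js:

```js
// sketch: rebuilding the sphere when the segment count changes
function setSegments(mesh, segs) {
    // a new geometry means new attribute buffers that must be allocated,
    // filled in, and sent to the graphics hardware - the cost we normally avoid
    mesh.geometry.dispose();   // free the old GPU buffers
    mesh.geometry = new THREE.SphereGeometry(1, segs, segs);
}
```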

Summary: Textures in Shaders

Now that we’ve seen how to use textures in shaders, we have all the main pieces. It’s time to make some shaders on Next: All Together.

There are no points associated with this page.