
By default, txmake creates 8-bit, unsigned textures, regardless of the input file. Therefore, if your input file is 16 bits per channel and you want to retain full precision, you must use the -short flag. Similarly, the -float flag will create a 32-bit floating-point texture if you have a floating-point input file.
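For example, a typical invocation (the file names here are illustrative) to convert a 16-bit TIFF while preserving its full precision might look like this:

txmake -short skin16.tif skin16.tex

with -float substituted for -short when the input is a floating-point file.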

Several other options exist for txmake, and you are encouraged to read the PRMan User Manual for more information.

BMRT, on the other hand, allows you to use TIFF files directly as textures, though this is not terribly efficient or flexible for ordinary scanline-oriented TIFF. However, BMRT comes with a utility called mkmip, which operates much like txmake, preprocessing your textures so that the renderer's memory use and running time are greatly reduced when it accesses them. The texture files created by mkmip are still legal TIFF files; they just happen to be multiresolution, tile-oriented TIFF, which can be accessed very efficiently by BMRT. The mkmip program also takes the -mode/-smode/-tmode parameters to determine wrapping. There are no -short or -float options; mkmip will always create a texture file with the same bit depth and format as your original TIFF file. Consult the BMRT User Manual for details and other options.
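As a sketch (the file names are illustrative, and the exact flags should be verified against the BMRT User Manual), preprocessing a texture with the standard "periodic" wrap mode might look like this:

mkmip -mode periodic grid.tif grid.tx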

8.2 Displacement and Bump Mapping

You now know that you can write shaders to modulate the surface color by a texture or function. Using RenderMan Shading Language, you can also add small geometric surface details, such as bumps, grooves, general crust, and roughness. Also, some details or shapes that are difficult to build into the model because they are too tedious, too fine, or need information only available at render time can be better implemented through displacements.

8.2.1 Basic Displacement

RenderMan actually allows you to modify the surface position from inside a Displacement shader,*1 using the following idiom:

P += offsetvector;

N = calculatenormal(P);

*1 Current releases of PRMan can do true displacement inside Surface shaders as well as inside Displacement shaders. However, we still recommend using Displacement shaders whenever possible. Some renderers, such as BMRT, can do true displacement only when in a Displacement shader, whereas displacing inside Surface shaders will result in bump mapping only. Additionally, it is possible that PRMan will someday have additional capabilities or efficiencies that are possible only if displacements happen in their own shader.


Figure 8.3 Displacements move surface positions.

Typically, offsetvector is along the normal vector, like this:

P += bumpheight * normalize(N);

N = calculatenormal(P);
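To see the idiom in context, here is a minimal sketch of a complete displacement shader; the noise-based pattern, the parameter names, and the default values are illustrative rather than taken from the book's listings:

displacement lumpy (float Km = 0.1, freq = 8;)
{
    /* Illustrative bump amplitude: signed 2D noise over the s,t parameters */
    float amp = Km * (noise(freq*s, freq*t) - 0.5);
    P += amp * normalize(N);    /* move the surface point */
    N = calculatenormal(P);     /* recompute the normal afterwards */
}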

Displacements work roughly in the following manner. First, surface primitives are turned into grids. The displacement (or surface) shaders are then run at the grid vertices. The shader may move the actual grid points by altering P, shifting the vertices' positions as shown schematically in Figure 8.3.

Once you have moved P to its new location, you must recalculate a new N based on the displaced surface position. This doesn't happen automatically, but you may do it quite simply with the function calculatenormal. The following line sets N to the normal of the surface given its new P values:

N = calculatenormal(P);

More generally, calculatenormal() will compute the normal to the surface "defined" by any point expression. How can calculatenormal() work? We have an apparent paradox: normals are calculated by a cross product of the tangents (derivatives) of surface position, yet we've only specified how P moves at a single point. How can calculatenormal() know about the other points on the surface and where they displace to? The answer is that Shading Language requires that it work somehow. In the case of PRMan, you're really shading grids in a SIMD fashion, so the P += foo statement is executed everywhere at once, then the N = calculatenormal(P) statement is executed everywhere. So by the time you call calculatenormal(), its arguments are already known everywhere. Different renderers may use different strategies to make this work, but you can be assured that it will work somehow on any RenderMan-compliant renderer.

8.2.2 Bump Mapping

Consider the following code fragment:

N = calculatenormal(P + amp*normalize(N));


Figure 8.4 Lumpy teapot. Top row: plain (left), bump mapping (center), displacements (right). Bottom row: close-up. Note the difference between bumping and displacing apparent on the silhouette.

This sets N to the normal of the surface that you would have gotten if you had offset P, but it doesn't actually move P! This is basic bump mapping: moving the normals without actually displacing the surface.
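Continuing the illustrative lumpy sketch from above, the bump-mapped variant simply folds the offset into the calculatenormal() call and leaves P untouched:

displacement lumpy_bump (float Km = 0.1, freq = 8;)
{
    float amp = Km * (noise(freq*s, freq*t) - 0.5);
    /* Perturb N as if P had moved, but do not actually move P */
    N = calculatenormal (P + amp * normalize(N));
}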

Figure 8.4 shows the difference between bump mapping and true displacements. An object with displacements will have a ragged silhouette, and its dents and protuberances can self-shadow. Bumps may look just like displacements when viewed from the front, but a bump-mapped surface will still have a smooth silhouette. At certain scales bump mapping and displacing are virtually indistinguishable, and the self-shadowing and ragged silhouette are not significant visual features. Therefore, it is an empirical question whether bumping is sufficient or displacements are necessary. It largely depends on the size of the bumps and the scale at which you are viewing the object.

Bump mapping is less likely to suffer from artifacts or renderer errors such as cracking and may be faster to render. In addition, with bump mapping, you need not worry about displacement bounds, which are explained in Section 8.2.4. For these reasons, we suggest that you use bump mapping whenever possible and save true displacements for those times when they are really needed. In reality, there is little cost associated with starting out using bumps and only going to displacements when the need becomes clear.

The reader with a photographic memory may wonder about the RenderMan Interface's direct bump map support through the RIB MakeBump and SL bump() calls. These were never implemented in any RenderMan renderer (that we know of) because they turned out to be less powerful than the faked-displacement method just described. So now this functionality has been completely subsumed by the displacement mechanism, and with 20/20 hindsight we just ignore MakeBump/bump.


8.2.3 Scaling Displacements

The idiom P += amp * normalize(N) displaces the surface amp units in the direction of N, but what coordinate space are these units measured in? Remember that N, like all other variables passed to the shader, is represented in "current" space. So the statement above pushes the surface amp units in the direction of N as measured in "current" space. This is almost certainly not what you want, if for no other reason than that the definition of "current" space is implementation dependent and may vary from renderer to renderer.

Instead, you probably want your displacements measured in "shader" space units. That way, if you scale the overall size of your model, the bumps will scale just as the rest of the geometry does. We could do this by transforming everything to "shader" space, displacing, then transforming back:

vector Nshad = vtransform("shader", N);
point Pshad = transform("shader", P);
Pshad += amp * normalize(Nshad);
P = transform("shader", "current", Pshad);
N = calculatenormal(P);

This will work, and this idiom was commonly used in shaders for many years. But there is another way to measure displacements in an arbitrary space that requires fewer transforms and temporary variables (and is thus more compact and efficient). Consider the following alternate idiom for displacing amp units in "shader" space:

vector Nn = normalize(N);

P += Nn * (amp / length(vtransform("shader",Nn)));

That statement may be tricky to decipher, so let's dissect it carefully.

1. By definition, normalize(N) has unit length in "current" space, but if any scaling has happened, it may have some other length in another space (such as "shader" space in this example).

2. Thus, vtransform("shader", normalize(N)) is the "shader" space representation of a vector that had unit length in "current" space.

3. So the length of that vector, let's call it e, is a scaling factor such that a vector of length m in "current" space will have length m*e in "shader" space.

4. Conversely, a vector of length n in "shader" space will have length n/e in "current" space.

5. Putting all this together, we see that the statement above will push P in the direction of N by an amount that is amp units in "shader" space. For example, if the model has been scaled up by a factor of 2, a unit "current"-space vector measures e = 0.5 units in "shader" space, so the code displaces amp/0.5 = 2*amp "current" units, which is exactly amp units at "shader" scale.

In summary, the length() expression is a correction factor between "current" space units and "shader" space units. Obviously, if you wanted to displace in another space, you could simply substitute another space name where we have used "shader" in the previous example. You might want to displace in "world" space units if you knew that you wanted the bumps to have a particular amplitude in absolute units, regardless of how you might want to scale your object model. Also, you don't have to displace along N, but you almost always want to.

Figure 8.5 Example use of the emboss shader.

Now we can construct a function that combines displacement, bump mapping, and the idioms for displacing relative to a particular coordinate space. This function is shown in Listing 8.3. Listing 8.3 also shows a simple displacement shader that allows a single-channel texture map to determine the amount of displacement on a surface, as if embossing an image into the surface. Notice how most of the smarts are inside the Displace function, so the emboss shader itself is implemented in very few lines of code. An example use of the shader is shown in Figure 8.5.

8.2.4 Displacement Bounds

Most renderers need to calculate spatial bounds for all primitives. The renderer knows how to bound each primitive, but the problem is that displacements can move the surface arbitrary amounts. Thus, displaced geometry may "poke out" of the bounding box that the renderer calculates for the primitive. This can cause the geometry to be incorrectly "clipped." These missing slivers of geometry tend to be aligned with scanlines (see Figure 8.6).

To avoid this artifact, you must tell the renderer, in advance, the maximum distance that any part of the primitive might displace so that the renderer can grow the bounds to accommodate the displacements. This can be done in RIB with the following line:

Attribute "displacementbound" "sphere" [radius]

..coordinatesystem" ["space"]


Listing 8.3 Displace function that combines displacement and bumping, relative to a given coordinate system, and the emboss displacement shader.

/* Combine displacement and bump mapping, with units relative to a particular
 * space. When truedisp != 0, this function modifies P as a side effect.
 *
 * Inputs:
 *    dir       direction in which to push the surface, assumed to already
 *              be in "current" space and normalized.
 *    amp       amplitude of the actual displacement or bumping.
 *    space     the name of the coordinate system against which the
 *              amplitude is measured.
 *    truedisp  when 1, truly displace; when 0, just bump.
 * Return value: the normal of the displaced and bumped surface, in
 *    "current" space, normalized.
 */
normal Displace (vector dir; string space; float amp; float truedisp;)
{
    extern point P;
    float spacescale = length (vtransform (space, dir));
    vector Ndisp = dir * (amp / spacescale);
    P += truedisp * Ndisp;
    return normalize (calculatenormal (P + (1-truedisp)*Ndisp));
}

 

 

displacement
emboss ( string texturename = "";       /* Name of image */
         float Km = 1;                  /* Max displacement amt */
         string dispspace = "shader";   /* Space to measure in */
         float truedisp = 1;            /* Displace or bump? */
         float sstart = 0, sscale = 1;
         float tstart = 0, tscale = 1;
         float blur = 0; )
{
    /* Only displace if a filename is provided */
    if (texturename != "") {
        /* Simple scaled and offset s-t mapping */
        float ss = (s - sstart) / sscale;
        float tt = (t - tstart) / tscale;
        /* Amplitude is channel 0 of the texture, indexed by s,t. */
        float amp = float texture (texturename[0], ss, tt, "blur", blur);
        /* Displace inward parallel to the surface normal,
         * Km*amp units measured in dispspace coordinates. */
        N = Displace (normalize(N), dispspace, -Km*amp, truedisp);
    }
}


Figure 8.6 Missing slivers of geometry resulting from not supplying the correct displacementbound.

This line indicates that the shaders on subsequent objects may displace the surface up to (but no more than) radius units as measured in the coordinate system space. The parameter name "sphere" indicates that it could grow in any direction. It's currently the only option, but we imagine that someday it may be possible to indicate that the displacements will be only in a particular direction. Typical values for space might be "shader", "object", or "world".

Specifying a displacement bound that is too small (or not specifying it when it is needed) results in a characteristic artifact where the top or left of the object is clipped along a vertical or horizontal boundary (see Figure 8.6). It's as if the object were stamped with a rectangular cookie cutter that wasn't quite large enough to miss the border of the object. On the other hand, if you give a displacement bound that is much larger than required by the shader, your rendering speed may suffer.

Unfortunately, it's up to you to make sure that the displacement bounds value in the RIB stream matches the behavior of the shader. You may recognize that it's sometimes tricky to correctly guess, in the model, what the biggest possible displacement in the shader may be. This task is much easier if your displacement shader is written with an explicit parameter giving the maximum amplitude for displacement and the space to measure against.
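For instance, here is a hypothetical RIB fragment using the emboss shader of Listing 8.3, whose maximum displacement is Km units measured in dispspace (the texture file name and the value 0.5 are illustrative). The bound is taken directly from those shader parameters:

Attribute "displacementbound" "sphere" [0.5] "coordinatesystem" ["shader"]
Displacement "emboss" "texturename" ["logo.tx"] "Km" [0.5] "dispspace" ["shader"]

If Km changes in the model, the displacement bound must change with it.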

8.3 Texture Projections

8.3.1 Alternatives to s,t Mappings

The examples we have seen of texture and displacement mapping have all assumed that textures were somehow indexed based on the (s,t) coordinates of the surface. For example, both the simpletexmap shader in Listing 8.2 and the emboss shader in Listing 8.3 align the texture lookup with the surface geometry's s,t coordinates, allowing only for simple offsetting and scaling. However, it's likely that for many models and textures, this is not at all appropriate, for a variety of reasons:

1. s, t may not be evenly distributed over a surface, and thus texture that is indexed by s, t will appear warped.

2. It is difficult for separate, but abutting, geometric primitives to have a continuous s, t mapping across the seams between them. Nevertheless, it is an entirely valid goal to have a seamless texture spanning several primitives.

3. Some primitives (such as subdivision surfaces) don't have a global s, t parameterization at all.

Luckily, the advantage of specifying surface appearances with a programming language is that you are not limited to such a simple texture coordinate mapping scheme, or even to one that was considered by the renderer authors. A typical method of dealing with these limitations is to eschew (s, t) mappings in favor of projective texture mappings. Such mappings may include spherical (like latitude and longitude on a globe), cylindrical, planar (simply using x, y of some coordinate system), or perspective projections. Listing 8.4 contains Shading Language function code for spherical and cylindrical projections. Both take points, assumed to be in an appropriate shading space, and produce ss and tt texture indices as well as derivative estimates ds and dt. Note that the derivative computations try to take into account the seams that inevitably result from these types of "wrapped" projections. (The background for and uses of derivatives will be discussed more thoroughly in Chapter 11.)

We may wish to keep our shader sufficiently flexible to allow any of these, including (s, t). This gives rise to the function ProjectTo2D in Listing 8.4, which takes a 3D point and the name of a projection type and computes the 2D texture coordinates and their estimated derivatives. For added flexibility, it also takes the name of a coordinate system in which to compute the mapping, as well as an additional 4 x 4 matrix for an added level of control over the mapping. The ProjectTo2D function supports the following projection types (a brief usage sketch follows the list):

"st" ordinary (s, t) mapping

"planar", which maps the (x, y) coordinates of a named coordinate system to (ss, tt) values

"perspective", which maps the (x, y, z) coordinates of a named coordinate system as ss = x/z, tt = y/z

"spherical", which maps spherical coordinates to (ss, tt) values

"cylindrical", which maps cylindrical coordinates to (ss, tt) values

8.3.2 Texture Application and supertexmap.sl

Now that we have a general mapping routine, we can complete our library of useful texture functions with routines that call ProjectTo2D, as in Listing 8.5. GetColorTextureAndAlpha() projects the input point, yielding 2D texture coordinates and derivatives, then performs an antialiased texture lookup to yield color and alpha (assuming full opacity if no alpha channel is present).


Listing 8.4 Shading Language functions for projecting 3D to 2D coordinates using one of several named projection types.

/* Project 3D points onto a unit sphere centered at the origin */
void spherical_projection (point p; output float ss, tt, ds, dt;)
{
    extern float du, dv;    /* Used by the filterwidth macro */
    vector V = normalize(vector p);
    ss = (-atan (ycomp(V), xcomp(V)) + PI) / (2*PI);
    tt = 0.5 - acos(zcomp(V)) / PI;
    ds = filterwidth (ss);
    if (ds > 0.5)
        ds = max (1-ds, MINFILTWIDTH);
    dt = filterwidth (tt);
    if (dt > 0.5)
        dt = max (1-dt, MINFILTWIDTH);
}

/* Project 3D points onto a cylinder about the z-axis between z=0 and z=1 */
void cylindrical_projection (point p; output float ss, tt, ds, dt;)
{
    extern float du, dv;    /* Used by the filterwidth macro */
    vector V = normalize(vector p);
    ss = (-atan (ycomp(V), xcomp(V)) + PI) / (2*PI);
    tt = zcomp(p);
    ds = filterwidth (ss);
    if (ds > 0.5)
        ds = max (1-ds, MINFILTWIDTH);
    dt = filterwidth (tt);
}

void ProjectTo2D (string projection; point P; string whichspace;
                  matrix xform;
                  output float ss, tt, ds, dt;)
{
    point Pproj;
    extern float du, dv;    /* Used by the filterwidth macro */
    if (projection == "st") {
        extern float s, t;
        Pproj = point (s, t, 0);
    } else {
        Pproj = transform (whichspace, P);
    }
    Pproj = transform (xform, Pproj);
    if (projection == "planar" || projection == "st") {
        ss = xcomp(Pproj);
        tt = ycomp(Pproj);
        ds = filterwidth (ss);
        dt = filterwidth (tt);
    }


Listing 8.4 (Continued)

else if (projection == "perspective") {

float z = max (zcomp(Pproj), 1.Oe-6); /* avoid zero division */

ss= xcomp(Pproj) / z;

tt= ycomp(Pproj) / z; ds = filterwidth (ss); dt = filterwidth (tt);

}

/* Special cases for the projections that may wrap */ else if (projection == "spherical")

spherical-projection (Pproj, ss, tt, ds, dt); else if (projection == "cylindrical")

cylindrical-projection (Pproj, ss, tt, ds, dt);

ApplyColorTextureOver() calls GetColorTextureAndAlpha() and then applies the texture over the existing base color using the usual alpha-compositing rule. Similar routines for single-channel textures can be easily derived.
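Listing 8.5 itself is not reproduced in this excerpt; as a minimal sketch of the "over" composition it describes (the exact argument list of GetColorTextureAndAlpha() is assumed here, not quoted from the book), the routine might read:

color ApplyColorTextureOver (color basecolor; string texturename, projection;
                             point P; string whichspace; matrix xform;)
{
    float alpha;
    color Ct = GetColorTextureAndAlpha (texturename, projection, P,
                                        whichspace, xform, alpha);
    /* Standard "over": the texture color is premultiplied by its alpha */
    return Ct + (1 - alpha) * basecolor;
}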

Finally, with these functions we can write yet another refinement of a basic texture-mapping shader. The standard paintedplastic.sl scales the surface color by a simple (s, t) lookup of a color texture. The simpletexmap.sl shader (Listing 8.2 in Section 8.1) supports textures with alpha channels, applying paint over a surface, and also allows simple positioning and scaling of the texture pattern. Now we present supertexmap.sl (Listing 8.6), which uses the flexible texture routines of this section. This shader allows the use of texture maps for color, opacity, specularity, and displacement. Furthermore, each one may use a different projection type, projection space, and transformation.

When using supertexmap.sl, you position the textures on the surface by manipulating the named texture coordinate space and/or providing 4 x 4 transformation matrices. Note that the 4 x 4 matrices are passed to the shader as arrays of 16 floats rather than as a matrix type, in order to avoid the transformation of the matrix relative to "current" space.
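To illustrate the reasoning (the shader name and parameter name here are hypothetical, not from supertexmap.sl), a shader can declare the 16 floats as an array parameter and rebuild the matrix internally with the 16-float matrix constructor; because the parameter is an array of floats rather than a matrix type, the renderer performs no automatic transformation of it relative to "current" space:

surface matrixdemo (float colormx[16] = {1,0,0,0, 0,1,0,0, 0,0,1,0, 0,0,0,1};)
{
    /* Rebuild the matrix from the raw floats; no space transformation occurs */
    matrix m = matrix (colormx[0],  colormx[1],  colormx[2],  colormx[3],
                       colormx[4],  colormx[5],  colormx[6],  colormx[7],
                       colormx[8],  colormx[9],  colormx[10], colormx[11],
                       colormx[12], colormx[13], colormx[14], colormx[15]);
    Ci = Cs;  Oi = Os;    /* placeholder body */
}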

Further Reading

Texture mapping was first developed in Ed Catmull's doctoral dissertation (Catmull, 1974). While countless papers have been written on texture mapping, an excellent overall survey of texture-mapping techniques can be found in Heckbert (1986).

Bump mapping was proposed by Blinn (1978), displacements by Cook (1984) and Cook, Carpenter, and Catmull (1987), and opacity maps by Gardner (1984).
