
2.1 - Vertex Transformation Deep Dive


What We're Learning

Welcome to Phase 2! In Phase 1, we built our foundation: we mastered WGSL fundamentals, mapped the graphics pipeline, and solidified the essential math concepts that underpin all 3D graphics. We learned the rules of the rendering pipeline and how to correctly pass data through it.

Now, it's time to start bending those rules. We are ready to take direct control of our geometry by mastering the vertex shader.

Think of the vertex shader as your personal geometry engine. It's the stage in the pipeline where you, the developer, get to control the shape, position, and orientation of every single vertex in your scene. It's the engine that drives all 3D motion and deformation.

In our previous articles, we learned the theory behind transformations and relied on Bevy's standard functions to apply them. Now, we're going to peel back those layers of abstraction. We will move beyond simply using Bevy's built-in matrices and learn to implement the full transformation pipeline ourselves. This knowledge is the key that unlocks the ability to create powerful, custom vertex effects that bring your scenes to life in unique and dynamic ways.

By the end of this article, you will be able to:

  • Manually implement the complete transformation pipeline from a model's local space all the way to the screen's clip space.

  • Apply powerful vertex displacement techniques to create dynamic effects like procedural waves, noise-based deformations, and twists.

  • Master camera-facing techniques to create billboards, essential for particles, 2D sprites, and nameplates in a 3D world.

  • Calculate and transform normal vectors correctly, ensuring your custom geometry reacts realistically to light.

  • Employ crucial optimization strategies to keep your vertex shaders fast and efficient, even when deforming complex meshes.

  • Compose multiple displacement functions to build sophisticated, layered visual effects.

The Vertex Shader: Your Geometry Engine

Think of the vertex shader as your geometry engine. It takes the raw, static vertex data from your 3D models and forges it into its final form and position within the 3D world.

At its heart, this engine has one primary, non-negotiable job: to calculate the final position of each vertex in a special coordinate system called clip space. This final output, which you'll always see decorated with @builtin(position), is the vec4<f32> value that the GPU's rasterizer needs to figure out where on your 2D screen that vertex belongs.

Everything else a vertex shader might do - deforming a mesh into a waving flag, calculating normals for beautiful lighting, or passing UV coordinates for texturing - is ultimately in service of this one critical task. Get the clip space position right, and your object appears on screen. Get it wrong, and it vanishes.

The Entry Point Signature

Let's re-examine the anatomy of a typical vertex shader entry point. This function is the main gate to our geometry engine; it takes raw mesh data in and pushes the final, transformed data out.

// Input: Data for a single vertex, as read from the mesh buffer on the GPU.
struct VertexInput {
    @location(0) position: vec3<f32>,
    // Other attributes like normals (@location(1)) or UVs (@location(2)) go here.
}

// Output: Data for the rasterizer and, optionally, the fragment shader.
struct VertexOutput {
    // This is the one REQUIRED output for the rasterizer!
    @builtin(position) clip_position: vec4<f32>,

    // Other data (e.g., world position, normals, UVs) can be passed "downstream"
    // to the fragment shader for coloring and texturing.
    @location(0) world_position: vec3<f32>,
}

@vertex
fn vertex(in: VertexInput) -> VertexOutput {
    var out: VertexOutput;

    // --- The core transformation logic happens here ---

    out.clip_position = /* the final vec4<f32> result */;
    return out;
}
  • The VertexInput struct defines the "raw materials" for our engine. This data - position, normal, UVs - is read directly from the mesh asset's buffers on the GPU. The @location(N) attribute tells the GPU which vertex attribute slot that field is read from.

  • The VertexOutput struct is the "finished product." It's a package of data we're sending to the next stages of the pipeline.

    • The special @builtin(position) field is for the rasterizer.

    • Any other fields, marked with @location(N), are passed downstream to the fragment shader, where they will be interpolated across the surface of the triangle.

Our entire focus in this article is on what happens inside that transformation logic. How do we get from a simple vec3<f32> in a model file to the final vec4<f32> that the hardware needs? The journey begins with a step-by-step tour through the coordinate spaces of the 3D world.

The Transformation Journey, Step-by-Step

Before we can start creating custom effects, we must fully understand and manually rebuild the standard transformation pipeline. Every vertex in a 3D scene embarks on a journey through multiple coordinate systems, moving from a static model file to a dynamic position on your screen. Our job in the vertex shader is to act as its guide.

The Standard Pipeline

This is the path every vertex must travel. Each step is a change in its frame of reference, managed by a specific matrix multiplication.

Local Space (The model's private blueprint coordinates)
    ↓ [Model Matrix]
World Space (The shared scene coordinates where all objects coexist)
    ↓ [View Matrix]
View Space (The world from the camera's unique perspective)
    ↓ [Projection Matrix]
Clip Space (A normalized 3D cube ready for the GPU to process)
    ↓ [Perspective Divide & Viewport Transform - automatic GPU steps]
Screen Space (The final 2D pixel coordinates on your monitor)

In previous articles, we relied on Bevy's high-level functions to handle this entire journey in a single step. Now, we're going to peel back the layers of abstraction and build this pipeline ourselves.

Step 1: From Local Space to World Space (The Model Matrix)

Every 3D model begins in its own private universe called Local Space (or Model Space). In this space, the model is the center of everything, typically with its pivot point at the origin (0,0,0). The position attribute we receive in our vertex shader is defined in these local coordinates.

To build a scene, we must place all these individual models into a single, shared universe called World Space. This is the common coordinate system where all objects, lights, and the camera coexist. The magic that moves, rotates, and scales a model from its local origin to its final place in the scene is the Model Matrix.

As we learned in our math deep dive, this matrix is the result of applying Scale, then Rotation, then Translation:

Model Matrix = Translation Matrix * Rotation Matrix * Scale Matrix

When you set a Transform on an entity in Bevy, you are defining the components that Bevy's engine will use to construct this exact matrix on the CPU. It does this once per object, per frame, which is far more efficient than us trying to build it for every single vertex. This pre-calculated matrix is then made available to our shader as a uniform.
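To make this composition concrete, here is a minimal CPU-side sketch in Rust using hand-rolled row-major 4x4 matrices. The helper names (`mat_mul`, `mat_vec`, `model_matrix`) are illustrative only; in real Bevy code, `Transform::compute_matrix` and the glam crate do all of this for you.

```rust
// Model = Translation * Rotation * Scale, written out by hand.
pub type Mat4 = [[f32; 4]; 4];

pub fn mat_mul(a: &Mat4, b: &Mat4) -> Mat4 {
    let mut out = [[0.0; 4]; 4];
    for i in 0..4 {
        for j in 0..4 {
            for k in 0..4 {
                out[i][j] += a[i][k] * b[k][j];
            }
        }
    }
    out
}

pub fn mat_vec(m: &Mat4, v: [f32; 4]) -> [f32; 4] {
    let mut out = [0.0; 4];
    for i in 0..4 {
        for k in 0..4 {
            out[i] += m[i][k] * v[k];
        }
    }
    out
}

pub fn translation(t: [f32; 3]) -> Mat4 {
    [
        [1.0, 0.0, 0.0, t[0]],
        [0.0, 1.0, 0.0, t[1]],
        [0.0, 0.0, 1.0, t[2]],
        [0.0, 0.0, 0.0, 1.0],
    ]
}

pub fn scale(s: [f32; 3]) -> Mat4 {
    [
        [s[0], 0.0, 0.0, 0.0],
        [0.0, s[1], 0.0, 0.0],
        [0.0, 0.0, s[2], 0.0],
        [0.0, 0.0, 0.0, 1.0],
    ]
}

pub fn rotation_y(angle: f32) -> Mat4 {
    let (s, c) = angle.sin_cos();
    [
        [c, 0.0, s, 0.0],
        [0.0, 1.0, 0.0, 0.0],
        [-s, 0.0, c, 0.0],
        [0.0, 0.0, 0.0, 1.0],
    ]
}

pub fn model_matrix(t: [f32; 3], angle_y: f32, s: [f32; 3]) -> Mat4 {
    // Translation * Rotation * Scale: because the scale matrix sits rightmost,
    // it touches the vertex first; translation, leftmost, is applied last.
    mat_mul(&translation(t), &mat_mul(&rotation_y(angle_y), &scale(s)))
}
```

With scale 2 and translation (3, 0, 0), the local point (1, 0, 0) first scales to (2, 0, 0) and then translates to (5, 0, 0).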

Accessing Bevy's Model Matrix in WGSL

By importing bevy_pbr::mesh_functions, we gain access to a helper that retrieves the pre-built model matrix for the object currently being rendered.

#import bevy_pbr::mesh_functions

@vertex
fn vertex(
    // We need the instance_index to get the correct model matrix for this object.
    @builtin(instance_index) instance_index: u32,
    @location(0) position: vec3<f32>,
    // ... other inputs
) -> ... {
    // Fetch the 4x4 model matrix for this specific mesh instance.
    let model_matrix = mesh_functions::get_world_from_local(instance_index);

    // ... now we can use this matrix to transform our vertex position ...
}

Now we have the two key ingredients: the vertex position in local space (in.position) and the model_matrix. To perform the transformation, we multiply them:

  1. Promote to vec4: A mat4x4 matrix multiplication requires a four-component vector. We convert our vec3 position to a vec4, setting the w component to 1.0. This 1.0 is crucial; it acts as the "on-switch" for translation, signifying that this is a point in space that should be moved.

  2. Multiply: We apply the model matrix. Remember, in WGSL (and most graphics math), the order is always matrix * vector.

// Inside the vertex function:

// 1. Fetch the model matrix.
let model_matrix = mesh_functions::get_world_from_local(in.instance_index);

// 2. Promote local position to vec4 with w=1.0.
let local_position_vec4 = vec4<f32>(in.position, 1.0);

// 3. Transform from local space to world space.
let world_position = model_matrix * local_position_vec4;

That's it! The world_position variable now holds the vertex's precise coordinates within the global scene.

Step 2 & 3: From World Space to Clip Space (The View and Projection Matrices)

With our vertex now in world space, our next goal is to get it into the final Clip Space required by the rasterizer. This involves two more transformations:

  1. The View Matrix reorients the entire world so that it is relative to the camera's position and orientation. It's like moving the whole world so the camera is at (0,0,0) looking down a specific axis.

  2. The Projection Matrix simulates the camera's lens (e.g., perspective or orthographic). It squashes the 3D scene into a normalized cube (typically from -1 to +1 on all axes) and, for perspective cameras, makes distant objects appear smaller.
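These two steps can be sketched numerically for the simplest possible case: an un-rotated camera looking down -Z, with a right-handed perspective projection and the 0-to-1 depth range that wgpu uses. The function names below are illustrative, not Bevy API:

```rust
/// View step: re-express a world-space point relative to the camera.
/// For an un-rotated camera this is just a translation by -eye.
pub fn world_to_view(world: [f32; 3], eye: [f32; 3]) -> [f32; 3] {
    [world[0] - eye[0], world[1] - eye[1], world[2] - eye[2]]
}

/// Projection step: right-handed perspective with 0..1 depth, written
/// out component by component instead of as a matrix.
pub fn view_to_clip(view: [f32; 3], fov_y: f32, aspect: f32, near: f32, far: f32) -> [f32; 4] {
    let f = 1.0 / (fov_y * 0.5).tan();
    let x = f / aspect * view[0];
    let y = f * view[1];
    let z = far / (near - far) * view[2] + near * far / (near - far);
    let w = -view[2]; // the rasterizer later divides x, y, z by this
    [x, y, z, w]
}
```

A point five units straight in front of this camera lands at clip-space x = y = 0 with w = 5; after the GPU's perspective divide, its depth falls inside the 0..1 range.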

For now, we will continue to use a single Bevy PBR helper function that conveniently combines both the View and Projection transformations for us. We will deconstruct this function in the next article, which is dedicated to camera and projection mathematics.

#import bevy_pbr::view_transformations::position_world_to_clip

// ... inside the vertex shader, after calculating world_position ...

// This single function handles both the View and Projection matrix multiplications.
// It takes a vec3 because it operates on the position part of the vector.
out.clip_position = position_world_to_clip(world_position.xyz);

Common Vertex Transformation Mistakes

This process is straightforward, but a few common mistakes can trip you up.

Mistake 1: Multiplying a mat4x4 by a vec3

This is the most frequent error and will cause your shader to fail compilation. The dimensions don't match.

// ✗ WRONG
// Error: cannot multiply mat4x4<f32> by vec3<f32>
let world_pos = model_matrix * in.position;

// ✓ CORRECT
let world_pos = model_matrix * vec4<f32>(in.position, 1.0);

Solution: The mat4x4 is specifically designed to handle 3D translation using a mathematical tool called homogeneous coordinates. For this to work, it requires a four-component vector as input. That fourth component, w, is what allows a single matrix multiplication to handle rotation, scale, and translation simultaneously.

Mistake 2: Transforming a Direction Vector as a Position

What if you're transforming something that isn't a point in space, like a surface normal or a tangent? These are directions, and they should not be affected by translation. If you move an object, its surface normals should rotate with it, but they shouldn't shift their origin. To achieve this, you must set the w component of the vector to 0.0. This acts as an "off-switch" for translation.

// Transforming a normal vector (a direction)
let local_normal = vec4<f32>(in.normal, 0.0); // w = 0.0 for directions!
let world_normal = model_matrix * local_normal;

Solution: Before transforming any vec3, ask yourself: "What does this vector represent?"

  • If it's a position (a vertex's location), use w = 1.0 to ensure translation is correctly applied.

  • If it's a direction (a surface normal, a tangent, a light vector), use w = 0.0 to ensure only rotation and scale are applied, ignoring translation.

This mental check is crucial for achieving correct lighting and other advanced effects.
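The effect of w is easy to verify numerically. A minimal sketch (hand-rolled math, illustrative only) applying a pure translation to both kinds of vector:

```rust
// The w component as a translation "switch": a translation matrix applied to
// a column vector adds t * w to each component, so w = 1.0 moves the vector
// and w = 0.0 leaves it untouched.
pub fn translate_vec4(t: [f32; 3], v: [f32; 4]) -> [f32; 4] {
    [
        v[0] + t[0] * v[3],
        v[1] + t[1] * v[3],
        v[2] + t[2] * v[3],
        v[3],
    ]
}
```

Translating by (10, 0, 0) moves the point (1, 2, 3, 1) to (11, 2, 3, 1), while the direction (1, 2, 3, 0) comes back unchanged.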

Note: As we learned in article 1.8, this is still not the fully correct way to transform normals if non-uniform scaling is involved. For that, we need the inverse transpose of the model matrix. But the w=0.0 principle is the essential first step.

Mistake 3: Incorrect Multiplication Order

Matrix multiplication is not commutative. A * B is not the same as B * A. The transformation matrix always comes first.

// ✗ WRONG (and won't compile in WGSL, but is a common logic error in math)
let world_pos = vec4<f32>(in.position, 1.0) * model_matrix;

// ✓ CORRECT
let world_pos = model_matrix * vec4<f32>(in.position, 1.0);

Solution: The standard order is matrix * vector. Stick to it.
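You can see the consequence of ordering with a tiny 2D sketch: rotating a point and then translating it gives a different result than translating and then rotating.

```rust
// Matrix composition is not commutative. These two plain functions stand in
// for a rotation matrix and a translation matrix.
pub fn rotate90(p: [f32; 2]) -> [f32; 2] {
    // 90-degree counter-clockwise rotation: (x, y) -> (-y, x).
    [-p[1], p[0]]
}

pub fn translate2(p: [f32; 2], t: [f32; 2]) -> [f32; 2] {
    [p[0] + t[0], p[1] + t[1]]
}
```

Starting from (1, 0): rotate-then-translate by (1, 0) yields (1, 1), while translate-then-rotate yields (0, 2).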

Taking Control: Our First Custom Displacement

We have now successfully rebuilt Bevy's standard transformation pipeline. We take a local position, apply the model matrix to get to world space, and then use a helper function to get to the final clip space.

// The standard pipeline we just built:
let model_matrix = mesh_functions::get_world_from_local(in.instance_index);
let world_position = model_matrix * vec4<f32>(in.position, 1.0);
out.clip_position = position_world_to_clip(world_position.xyz);

This is powerful, but so far, we've only recreated what Bevy already does for us. The real magic begins when we intervene in this process. Vertex displacement is the technique of modifying a vertex's position after reading it from the mesh but before the final transformation. This allows us to create dynamic and custom geometric effects directly on the GPU.

The ideal moment to apply a custom displacement is in local space. Why? Because any effect applied in local space becomes an intrinsic part of the model. When the model rotates, the effect rotates with it. When it scales, the effect scales too. If we were to apply a wave effect in world space, the waves would remain stationary while the object moved through them, which is a completely different (and usually undesirable) result.

Let's create our first, simple effect: making an object gently bob up and down. We can achieve this with a single line of code by hijacking the pipeline, using a sine wave driven by time.

// Assume we have a material uniform providing the elapsed time.
@group(2) @binding(0)
var<uniform> material: MyMaterial; // This struct contains a 'time: f32' field.

// --- Inside the vertex shader ---

// 1. Start with a mutable copy of the original local position.
var local_position = in.position;

// 2. THE DISPLACEMENT: Modify the Y-coordinate.
//    sin() naturally oscillates between -1.0 and 1.0. We scale it down
//    to create a gentle up-and-down motion.
let bob_amount = sin(material.time * 3.0) * 0.2;
local_position.y += bob_amount;

// 3. Continue with the standard pipeline, but use our MODIFIED position.
let model_matrix = mesh_functions::get_world_from_local(in.instance_index);
let world_position = model_matrix * vec4<f32>(local_position, 1.0);
out.clip_position = position_world_to_clip(world_position.xyz);

By inserting that one line, we've fundamentally changed the behavior of our shader. We are no longer just relaying data; we are generating motion. This is the core principle behind every advanced vertex effect you will ever create. The position that comes from the mesh asset is not the final word - it's just a starting point for our imagination.

Now that we understand this fundamental concept, let's build a library of more sophisticated displacement techniques.

Vertex Displacement: A Library of Effects

Vertex displacement is the foundation of countless visual effects, from rippling water to procedural terrain. The key is to displace vertices in a way that looks natural or achieves a specific artistic goal. Most techniques involve moving a vertex along its normal vector - the vector that points directly "out" from the surface at that vertex's location. This creates the effect of inflating, deflating, or waving from the surface.

Let's explore some common and powerful displacement patterns.

Wave Displacement

This is the classic technique for creating flowing, organic motion like water surfaces or waving flags. We use sine and cosine functions, which naturally produce smooth, oscillating values that repeat over space and time.

fn apply_wave_displacement(
    position: vec3<f32>,
    normal: vec3<f32>,
    time: f32
) -> vec3<f32> {
    // Create waves based on the vertex's X and Z position and the current time.
    // Using different frequencies and directions for each wave adds complexity.
    let wave_x = sin(position.x * 4.0 + time * 2.0);
    let wave_z = cos(position.z * 4.0 + time * 1.5);

    // Combine the waves. We multiply by 0.5 to keep the final result
    // in a predictable -1.0 to 1.0 range.
    let combined_waves = (wave_x + wave_z) * 0.5;

    // Define how much the wave should push the vertex out.
    let wave_amplitude = 0.1;

    // Displace the position along its normal by the final wave amount.
    return position + normal * combined_waves * wave_amplitude;
}
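As a sanity check, here is a direct Rust port of the same function, with the vector math written out by hand. Because the combined waves stay within -1 to 1, the displacement along the normal can never exceed the amplitude:

```rust
// CPU-side port of apply_wave_displacement, for testing outside the GPU.
pub fn apply_wave_displacement(position: [f32; 3], normal: [f32; 3], time: f32) -> [f32; 3] {
    let wave_x = (position[0] * 4.0 + time * 2.0).sin();
    let wave_z = (position[2] * 4.0 + time * 1.5).cos();
    let combined_waves = (wave_x + wave_z) * 0.5;
    let wave_amplitude = 0.1;
    let offset = combined_waves * wave_amplitude;
    [
        position[0] + normal[0] * offset,
        position[1] + normal[1] * offset,
        position[2] + normal[2] * offset,
    ]
}
```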

Noise-Based Displacement

While waves are periodic and regular, noise functions create more complex, natural-looking deformations. This is perfect for effects like bubbling lava, flickering energy shields, or randomized terrain. A good noise function is an essential tool in any graphics programmer's toolkit.

// A simple hash function to generate pseudo-random numbers.
fn hash(n: f32) -> f32 {
    return fract(sin(n) * 43758.5453123);
}

// A simple 3D value noise function. It's not as high-quality as Perlin or 
// Simplex noise, but it's very fast and effective for many real-time effects.
fn noise3d(p: vec3<f32>) -> f32 {
    let ip = floor(p);
    var fp = fract(p);
    // Use smoothstep for smoother interpolation between grid points.
    fp = fp * fp * (3.0 - 2.0 * fp);

    let n = ip.x + ip.y * 57.0 + ip.z * 113.0;
    let res = mix(
        mix(
            mix(hash(n + 0.0), hash(n + 1.0), fp.x),
            mix(hash(n + 57.0), hash(n + 58.0), fp.x),
            fp.y
        ),
        mix(
            mix(hash(n + 113.0), hash(n + 114.0), fp.x),
            mix(hash(n + 170.0), hash(n + 171.0), fp.x),
            fp.y
        ),
        fp.z
    );
    // Remap the result from a 0.0 to 1.0 range to a -1.0 to 1.0 range.
    return res * 2.0 - 1.0;
}

fn apply_noise_displacement(
    position: vec3<f32>,
    normal: vec3<f32>,
    time: f32,
    strength: f32
) -> vec3<f32> {
    // Sample the noise function. Scaling the position makes the noise pattern
    // larger or smaller, and adding time makes it evolve.
    let noise_val = noise3d(position * 2.0 + time);

    // Displace the position along the normal by the noise value.
    return position + normal * noise_val * strength;
}
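If you want to sanity-check this noise on the CPU, here is a direct Rust port of the hash and noise3d pair. One porting caveat: WGSL's fract(x) is x - floor(x), while Rust's f32::fract truncates toward zero for negative inputs, so we define our own:

```rust
// WGSL-style fract: always in [0, 1), even for negative inputs.
fn fract_gl(x: f32) -> f32 {
    x - x.floor()
}

pub fn hash(n: f32) -> f32 {
    fract_gl(n.sin() * 43758.5453123)
}

pub fn noise3d(p: [f32; 3]) -> f32 {
    let ip = [p[0].floor(), p[1].floor(), p[2].floor()];
    let mut fp = [fract_gl(p[0]), fract_gl(p[1]), fract_gl(p[2])];
    for f in fp.iter_mut() {
        *f = *f * *f * (3.0 - 2.0 * *f); // smoothstep-style fade
    }
    let mix = |a: f32, b: f32, t: f32| a + (b - a) * t;

    let n = ip[0] + ip[1] * 57.0 + ip[2] * 113.0;
    let res = mix(
        mix(
            mix(hash(n + 0.0), hash(n + 1.0), fp[0]),
            mix(hash(n + 57.0), hash(n + 58.0), fp[0]),
            fp[1],
        ),
        mix(
            mix(hash(n + 113.0), hash(n + 114.0), fp[0]),
            mix(hash(n + 170.0), hash(n + 171.0), fp[0]),
            fp[1],
        ),
        fp[2],
    );
    // Remap from 0..1 to -1..1, matching the WGSL version.
    res * 2.0 - 1.0
}
```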

Twist Displacement

Twisting is a great example of a deformation that isn't based on normals. Instead, it rotates vertices around an axis, with the amount of rotation increasing along that axis. This is perfect for effects like a tornado, a drill, or wringing out a wet cloth.

This function is more complex than the others because it must be robust against a common mathematical edge case.

fn apply_twist(
    position: vec3<f32>,
    twist_amount: f32,
    axis: vec3<f32>
) -> vec3<f32> {
    // 1. Find the vertex's height along the twist axis.
    let height = dot(position, axis);

    // 2. Calculate the rotation angle. It increases with height.
    let angle = height * twist_amount;
    let c = cos(angle);
    let s = sin(angle);

    // 3. Separate the position into a part parallel to the axis
    //    and a part perpendicular to it. We only want to rotate the perpendicular part.
    let parallel = axis * height;
    let perpendicular = position - parallel;

    // 4. Create a robust, temporary 2D coordinate system on the perpendicular plane.
    //    This is the key to avoiding a singularity.
    let arbitrary_vec = vec3<f32>(0.0, 0.0, 1.0);
    //    If our axis is too close to arbitrary_vec, use a different one!
    let safe_arbitrary = mix(
        arbitrary_vec, 
        vec3<f32>(1.0, 0.0, 0.0), 
        step(0.999, abs(dot(axis, arbitrary_vec)))
    );
    let right = normalize(cross(axis, safe_arbitrary));
    let forward = cross(axis, right);

    // 5. Project the perpendicular part onto our new 2D basis.
    let x = dot(perpendicular, right);
    let y = dot(perpendicular, forward);

    // 6. Perform a standard 2D rotation on the projected coordinates.
    let x_rotated = x * c - y * s;
    let y_rotated = x * s + y * c;

    // 7. Reconstruct the full 3D position from the parallel and rotated perpendicular parts.
    return parallel + right * x_rotated + forward * y_rotated;
}

A Note on Robustness: Avoiding the "Singularity"

The code in Step 4 looks more complex than you might expect. This complexity is crucial for making our function robust by avoiding a mathematical problem known as a singularity.

  • The Problem: To rotate the perpendicular component, we need to define a 2D coordinate system on the plane of rotation. We do this by taking the cross product between our axis and some other arbitrary vector. In our code, we initially chose the world Z-axis (0,0,1). But what happens if the user wants to twist the mesh around the Z-axis itself? The cross product of two parallel vectors is the zero vector (vec3(0.0, 0.0, 0.0)). When we then try to normalize() this zero vector, we are performing a division by zero, which results in NaN (Not a Number) values. These NaNs propagate through the rest of the calculation, corrupting the mesh and likely causing it to disappear. This specific edge case is a singularity.

  • The Solution: Our safeguard in Step 4 prevents this. It uses the dot product to check if our axis is almost parallel to our chosen arbitrary_vec.

    • If they are not parallel, abs(dot(...)) is less than 0.999, so step returns 0.0. The mix function then selects our default arbitrary_vec.

    • If they are parallel, step returns 1.0, and mix intelligently switches to a different arbitrary vector (the world X-axis, 1,0,0), which is guaranteed not to be parallel.

This ensures we never try to take the cross product of two parallel vectors, making our function safe to use with any axis. This is a common and important practice in graphics programming: always think about the edge cases!
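Here is a direct Rust port of apply_twist that makes the edge case testable on the CPU. The only difference from the WGSL is that the branchless step/mix guard is written as a plain if (GPUs prefer the branchless form; on the CPU a branch is clearer):

```rust
fn dot(a: [f32; 3], b: [f32; 3]) -> f32 {
    a[0] * b[0] + a[1] * b[1] + a[2] * b[2]
}
fn cross(a: [f32; 3], b: [f32; 3]) -> [f32; 3] {
    [
        a[1] * b[2] - a[2] * b[1],
        a[2] * b[0] - a[0] * b[2],
        a[0] * b[1] - a[1] * b[0],
    ]
}
fn scale3(v: [f32; 3], s: f32) -> [f32; 3] {
    [v[0] * s, v[1] * s, v[2] * s]
}
fn normalize(v: [f32; 3]) -> [f32; 3] {
    scale3(v, 1.0 / dot(v, v).sqrt())
}

pub fn apply_twist(position: [f32; 3], twist_amount: f32, axis: [f32; 3]) -> [f32; 3] {
    let height = dot(position, axis);
    let angle = height * twist_amount;
    let (s, c) = angle.sin_cos();

    let parallel = scale3(axis, height);
    let perpendicular = [
        position[0] - parallel[0],
        position[1] - parallel[1],
        position[2] - parallel[2],
    ];

    // The singularity guard: fall back to the X axis when `axis` is (nearly)
    // parallel to the default arbitrary vector (the Z axis).
    let arbitrary = [0.0, 0.0, 1.0];
    let safe_arbitrary = if dot(axis, arbitrary).abs() >= 0.999 {
        [1.0, 0.0, 0.0]
    } else {
        arbitrary
    };
    let right = normalize(cross(axis, safe_arbitrary));
    let forward = cross(axis, right);

    let x = dot(perpendicular, right);
    let y = dot(perpendicular, forward);
    let x_rotated = x * c - y * s;
    let y_rotated = x * s + y * c;

    [
        parallel[0] + right[0] * x_rotated + forward[0] * y_rotated,
        parallel[1] + right[1] * x_rotated + forward[1] * y_rotated,
        parallel[2] + right[2] * x_rotated + forward[2] * y_rotated,
    ]
}
```

With this port you can confirm that a quarter-turn twist around Y rotates a point at height 1 by exactly 90 degrees, and that twisting around the Z axis itself hits the guard and produces finite values instead of NaNs.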

Inflation (or "Breathing")

Sometimes, the simplest effects are the most useful. Inflation moves every vertex outward along its normal by a uniform amount. By animating this amount over time, you can create a "breathing" or pulsing effect, useful for highlighting objects or indicating damage.

fn apply_inflation(
    position: vec3<f32>,
    normal: vec3<f32>,
    amount: f32
) -> vec3<f32> {
    // Simply push the vertex along its normal vector.
    return position + normal * amount;
}

Combining Multiple Displacements

The true power of this approach comes from composition. You can layer multiple displacement effects to create much more sophisticated and detailed results.

fn apply_combined_displacement(
    position: vec3<f32>,
    normal: vec3<f32>,
    time: f32
) -> vec3<f32> {
    var displaced_pos = position;

    // First, apply a twist to the base geometry.
    let twist_axis = vec3<f32>(0.0, 1.0, 0.0);
    displaced_pos = apply_twist(displaced_pos, sin(time) * 1.0, twist_axis);

    // Next, layer some wave motion on top of the twisted shape.
    // NOTE: We pass the *original normal* here. Calculating a new normal
    // after the twist is complex and often not necessary for visual effects.
    displaced_pos = apply_wave_displacement(displaced_pos, normal, time * 0.5);

    // Finally, add a subtle "breathing" pulse to the whole thing.
    let pulse_amount = (sin(time * 2.0) + 1.0) * 0.05; // from 0.0 to 0.1
    displaced_pos = apply_inflation(displaced_pos, normal, pulse_amount);

    return displaced_pos;
}

Camera-Facing Techniques: Billboards

Sometimes you don't want an object to behave like a normal 3D model. Instead, you want a 2D image or quad to always face the camera, no matter how the camera moves. This technique is called billboarding, and it's essential for effects like particles, nameplates, lens flares, and 2D sprites in a 3D world.

The core idea is to completely override the model's local rotation. Instead of transforming the vertex by a model matrix, we manually construct its world position using the camera's orientation vectors as our guide.

Full Billboarding: Always Face the Camera

A full billboard is a flat quad that perfectly mimics the camera's orientation. As the camera rotates, the quad rotates with it, always presenting its flat face to the viewer.

To build this, we need three pieces of information from the CPU, passed in as uniforms:

  1. The desired center of our billboard in world space (world_center).

  2. The camera's "right" vector (camera_right).

  3. The camera's "up" vector (camera_up).

The vertex shader then uses the mesh's local XY coordinates not as positions, but as 2D offsets from the center along the camera's axes.

fn billboard_transform(
    local_position_xy: vec2<f32>, // The 2D position on the quad's surface.
    world_center: vec3<f32>,      // The 3D point where the billboard is anchored.
    camera_right: vec3<f32>,      // The camera's right-pointing vector.
    camera_up: vec3<f32>         // The camera's up-pointing vector.
) -> vec3<f32> {
    // 1. Calculate the offset along the camera's right axis.
    let right_offset = camera_right * local_position_xy.x;

    // 2. Calculate the offset along the camera's up axis.
    let up_offset = camera_up * local_position_xy.y;

    // 3. Add these offsets to the world center to get the final position.
    return world_center + right_offset + up_offset;
}

Crucial Implementation Details

  1. Use the Right Mesh: For this technique to work intuitively, you must use a mesh that lies flat on the XY plane in local space, like Bevy's Rectangle primitive. This ensures that the mesh's local x coordinate correctly maps to the camera's right direction, and its y maps to up. If you were to use a Plane3d mesh, which lies on the XZ plane, its z coordinate would map to the camera's up direction, resulting in a sideways orientation.

  2. Passing Camera Data: You might be tempted to access Bevy's global View uniform to get camera data. Do not do this in a custom Material. Bevy's material system manages its own bindings, and trying to add another global View binding will lead to conflicts and unpredictable behavior. The correct and robust approach is to add the camera's position, right, and up vectors directly to your material's uniform struct. Then, you can use a Rust system to update these values on your material every frame.

Cylindrical Billboarding

Full billboarding is perfect for things that have no connection to the ground, like particles or text floating in space. But what happens when you use it for an object that should feel grounded, like a 2D tree sprite in a 3D world?

If you use a full billboard for a tree, it will always face the camera perfectly. This sounds good, but watch what happens when your camera flies up high and looks down: the tree sprite will tilt backwards, pointing up at the camera. It will no longer look like it's growing out of the ground. This breaks the illusion.

The solution is cylindrical billboarding.

The Core Concept: Turn, Don't Tilt

Imagine a person standing still. As you walk around them, they can turn their body to keep facing you. This is rotation around a vertical axis (their spine). However, if you climb a ladder, they don't lean their whole body backwards to look up at you; they just tilt their head.

Cylindrical billboarding is like that person's body. It allows an object to rotate around a fixed axis (usually the world's "up" vector) to face the camera's horizontal position, but it forbids the object from tilting along any other axis.

How It Works: A Step-by-Step Breakdown

To achieve this, we need to effectively ignore the camera's vertical position and only consider its position on a flat, horizontal plane.

fn cylindrical_billboard(
    local_position_xy: vec2<f32>,
    world_center: vec3<f32>,
    camera_position: vec3<f32>,
    up_axis: vec3<f32> // The fixed axis of rotation, e.g., (0, 1, 0).
) -> vec3<f32> {
    // 1. Get the true direction from the object's center to the camera.
    let to_camera = camera_position - world_center;

    // 2. Project this vector onto the horizontal plane by removing its vertical component.
    //    Think of this as finding the "shadow" of the to_camera vector on the ground.
    let vertical_part = dot(to_camera, up_axis) * up_axis;
    let horizontal_dir = to_camera - vertical_part;

    // 3. From this purely horizontal direction, create a new forward-facing vector.
    //    This is the direction the billboard should face.
    let forward = normalize(horizontal_dir);

    // 4. Create a right-facing vector perpendicular to both our new forward and the fixed up axis.
    let right = normalize(cross(forward, up_axis));

    // 5. Build the final position.
    //    Notice the key difference from full billboarding: we use our new `right`
    //    vector, but the original, fixed `up_axis` instead of the camera's up vector.
    return world_center + right * local_position_xy.x + up_axis * local_position_xy.y;
}

Let's focus on Step 2, which is the clever part:

  • dot(to_camera, up_axis) calculates how much of the to_camera vector points in the up direction (its vertical magnitude).

  • Multiplying this scalar value by the up_axis vector reconstructs that vertical component as a vector (e.g., vec3(0.0, 5.2, 0.0)).

  • to_camera - vertical_part subtracts only the vertical part of the direction, leaving a vector that is perfectly flat on the horizontal plane.

The rest of the function is similar to the full billboard, but with one critical difference in Step 5: we construct the final position using our calculated right vector but the original up_axis. This is what forces the billboard to remain upright, achieving the "turn, don't tilt" behavior we wanted.
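Step 2 is easy to verify with concrete numbers. A minimal sketch (the function name is illustrative):

```rust
// The "shadow on the ground" projection: subtracting the vertical component
// of a direction leaves a vector perpendicular to the up axis.
pub fn project_to_horizontal(to_camera: [f32; 3], up: [f32; 3]) -> [f32; 3] {
    let vertical_mag = to_camera[0] * up[0] + to_camera[1] * up[1] + to_camera[2] * up[2];
    [
        to_camera[0] - vertical_mag * up[0],
        to_camera[1] - vertical_mag * up[1],
        to_camera[2] - vertical_mag * up[2],
    ]
}
```

For to_camera = (3, 5.2, 4) and up = (0, 1, 0), the result is (3, 0, 4): the vertical part is gone, and the horizontal direction is preserved.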

Keeping the Lights On: Transforming Normals Correctly

A normal vector is a unit vector that points directly "out" from a mesh's surface at a vertex's location. The lighting engine uses this vector to determine how much light from various sources should reflect off that point.

  • If the normal points towards a light, the surface is bright.

  • If the normal points away, the surface is dark.

The Problem: When we displace a vertex with a wave or noise function, we are changing the slope of the surface. However, the original normal vector stored in the mesh data still points in the old direction. If we use this outdated normal, our wavy surface will be lit as if it were still perfectly flat, completely destroying the illusion of depth.

So, how do we update the normals? The correct method depends on what kind of transformation we're performing.

Case 1: Standard Transformations (No Custom Displacement)

If you are only using the standard model_matrix to scale, rotate, and translate your mesh, there is a mathematically pure solution. Due to the complexities of non-uniform scaling (e.g., scale = vec3(2.0, 1.0, 1.0)), you cannot simply multiply the normal by the model matrix.

The correct tool is the Normal Matrix, which is the inverse transpose of the upper-left 3x3 portion of the model matrix. While the math is fascinating, you don't need to implement it yourself. Bevy provides a function that handles this perfectly.

// Bevy handles the Normal Matrix calculation internally on the CPU.
let world_normal = mesh_functions::mesh_normal_local_to_world(
    in.normal, // The original vec3<f32> normal
    in.instance_index
);

Rule: If you are not applying custom displacement, always use mesh_normal_local_to_world to get the correct world-space normal.
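To see why the plain model matrix fails, here is a minimal numeric sketch with a diagonal scale matrix, whose inverse transpose is simply the reciprocal scale. Tangents lie in the surface, so they transform with the model matrix; a valid normal must stay perpendicular to the transformed tangent:

```rust
fn dot3(a: [f32; 3], b: [f32; 3]) -> f32 {
    a[0] * b[0] + a[1] * b[1] + a[2] * b[2]
}

fn apply_scale(s: [f32; 3], v: [f32; 3]) -> [f32; 3] {
    [s[0] * v[0], s[1] * v[1], s[2] * v[2]]
}

/// Returns (naive dot product, inverse-transpose dot product) of each
/// candidate normal against the transformed tangent.
pub fn compare_normal_transforms() -> (f32, f32) {
    let scale = [2.0, 1.0, 1.0]; // non-uniform scale along X
    // A 45-degree slope: surface tangent (1, -1, 0), surface normal (1, 1, 0).
    let tangent = [1.0, -1.0, 0.0];
    let normal = [1.0, 1.0, 0.0];

    // Tangents transform with the model matrix itself.
    let new_tangent = apply_scale(scale, tangent);

    // Naive: transform the normal with the model matrix too.
    let naive_normal = apply_scale(scale, normal);
    // Correct: use the inverse transpose (for a diagonal matrix, 1/scale).
    let inv_transpose = [1.0 / scale[0], 1.0 / scale[1], 1.0 / scale[2]];
    let correct_normal = apply_scale(inv_transpose, normal);

    (dot3(naive_normal, new_tangent), dot3(correct_normal, new_tangent))
}
```

The naive normal ends up with a dot product of 3 against the transformed tangent (visibly skewed off the surface), while the inverse-transpose normal stays exactly perpendicular.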

Case 2: Custom Displacement

This is our situation. When we apply a wave, twist, or noise effect, mesh_normal_local_to_world is no longer enough. It can correctly transform the original normal, but it has no knowledge of how our custom displacement function has altered the surface's slope.

We must calculate a new normal. There are two main approaches.

The Practical Approach: Normal Perturbation

For most real-time effects, the most efficient solution is to perturb (or "nudge") the original normal based on the logic of our displacement function. We don't need a perfectly recalculated normal; we just need one that's a good enough approximation to look correct.

For a wave, we can use the mathematical derivative of our wave function to find its gradient (the direction of steepest ascent). This gradient gives us a vector that lies on the surface's new slope, which we can use to adjust the original normal.

// Perturb a normal to account for our specific wave displacement function.
fn perturb_normal_for_waves(
    original_normal: vec3<f32>,
    position: vec3<f32>, // The original, pre-displacement position
    time: f32,
    frequency: f32,
    amplitude: f32
) -> vec3<f32> {
    // Our wave combines amplitude * sin(position.x * frequency + time * 2.0)
    // with amplitude * cos(position.z * frequency + time * 1.5).
    // By the chain rule: d/dx sin(a*x) = a*cos(a*x), and d/dz cos(a*z) = -a*sin(a*z).

    // Calculate the gradient (slope) of the wave in the X and Z directions.
    let dx = amplitude * frequency * cos(position.x * frequency + time * 2.0);
    let dz = -amplitude * frequency * sin(position.z * frequency + time * 1.5);

    // This gives us a vector tangent to the new surface slope.
    // We construct a new surface normal from the partial derivatives.
    let new_normal = vec3<f32>(-dx, 1.0, -dz);

    // Combine with the original and re-normalize to get the new direction.
    // This is a simplified approach but effective for many cases.
    return normalize(original_normal + new_normal);
}

This method is fast and produces visually convincing results, making it the preferred choice for performance-critical applications.
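If you're ever unsure that you took the derivative correctly, you can sanity-check the analytic gradient against a numerical (central-difference) derivative of the same wave function. A standalone Rust sketch, using the sine component of the wave above:

```rust
// Verify the chain-rule gradient against a numerical derivative.
fn wave_height(x: f32, time: f32, frequency: f32, amplitude: f32) -> f32 {
    amplitude * (x * frequency + time * 2.0).sin()
}

fn main() {
    let (time, frequency, amplitude) = (1.3_f32, 3.0, 0.2);
    let x = 0.7_f32;
    let eps = 1e-3;

    // Analytic: d/dx [A * sin(x*f + 2t)] = A * f * cos(x*f + 2t)
    let analytic = amplitude * frequency * (x * frequency + time * 2.0).cos();

    // Numerical: central difference around x.
    let numerical = (wave_height(x + eps, time, frequency, amplitude)
        - wave_height(x - eps, time, frequency, amplitude))
        / (2.0 * eps);

    // The two slopes agree to within the finite-difference error.
    assert!((analytic - numerical).abs() < 1e-3);
}
```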

The Brute-Force Approach: Numerical Derivatives

What if your displacement function is too complex to calculate the derivative analytically? You can fall back on a brute-force method that approximates the normal by sampling the displacement function at nearby points.

  1. Calculate the final displaced position for the current vertex (P).

  2. Pick a tiny distance (epsilon).

  3. Calculate the displaced position for a virtual vertex slightly offset on the X-axis (Px).

  4. Calculate the displaced position for a virtual vertex slightly offset on the Z-axis (Pz).

  5. Create two vectors: one from P to Px and one from P to Pz. These two vectors lie on the new surface.

  6. The cross product of these two vectors gives us a vector perpendicular to the new surface - our new normal!

While this works for any displacement function, it has a significant performance cost: you must run your entire displacement logic three times for every single vertex. For this reason, it's generally avoided in real-time shaders unless absolutely necessary. Perturbing the normal is almost always the better option.
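The six steps above can be sketched on the CPU in plain Rust. The `displace` function here is a stand-in heightfield for whatever your shader's displacement logic does; the point is the pattern of sampling three points and crossing the two surface vectors.

```rust
// Brute-force normal recovery: displace three nearby points, build two
// surface vectors, and take their cross product.
fn displace(p: [f32; 3]) -> [f32; 3] {
    // Example heightfield displacement: y += 0.2 * sin(3x) * cos(3z)
    [p[0], p[1] + 0.2 * (3.0 * p[0]).sin() * (3.0 * p[2]).cos(), p[2]]
}

fn sub(a: [f32; 3], b: [f32; 3]) -> [f32; 3] {
    [a[0] - b[0], a[1] - b[1], a[2] - b[2]]
}

fn cross(a: [f32; 3], b: [f32; 3]) -> [f32; 3] {
    [
        a[1] * b[2] - a[2] * b[1],
        a[2] * b[0] - a[0] * b[2],
        a[0] * b[1] - a[1] * b[0],
    ]
}

fn normalize(v: [f32; 3]) -> [f32; 3] {
    let len = (v[0] * v[0] + v[1] * v[1] + v[2] * v[2]).sqrt();
    [v[0] / len, v[1] / len, v[2] / len]
}

fn main() {
    let eps = 0.01_f32;
    let p0 = [0.5_f32, 0.0, 0.5];

    // Steps 1-4: displace the vertex and two offset neighbors.
    let p = displace(p0);
    let px = displace([p0[0] + eps, p0[1], p0[2]]);
    let pz = displace([p0[0], p0[1], p0[2] + eps]);

    // Steps 5-6: two surface vectors, then their cross product.
    // Order matters: (P->Pz) x (P->Px) yields an upward-facing normal here.
    let normal = normalize(cross(sub(pz, p), sub(px, p)));
    assert!(normal[1] > 0.0); // points "up" out of the heightfield
}
```

Note that `displace` runs three times per vertex, which is exactly the cost the text warns about.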

A Quick Clarification: Geometric vs. Shading Normals

The techniques we are discussing here are for updating the geometric normal - the true orientation of the mesh after we've deformed it in the vertex shader.

This is distinct from a popular technique called normal mapping (or bump mapping), which happens in the fragment shader. Normal mapping uses a texture to fake fine-grained surface detail on a low-poly mesh, creating the illusion of bumps and dents without actually moving any vertices.

We will cover normal mapping in depth when we move on to Phase 4, which is focused on advanced texturing techniques. For now, know that we are dealing with the actual, physical slope of our geometry.

Performance and Optimization

Vertex shaders are a performance-critical part of the rendering pipeline. Your shader code will be executed for every single vertex of a mesh, potentially millions of times per frame. A small inefficiency in your code can quickly multiply into a major performance bottleneck.

Here are the most important strategies for writing fast, optimized vertex shaders.

1. Do the Work Once

This is the golden rule of optimization. If you need the result of a calculation in more than one place, store it in a local variable and reuse it. Avoid calling the same function or performing the same complex math multiple times.

// ✗ Inefficient: The same expensive noise function is called twice.
@vertex
fn vertex(in: VertexInput) -> VertexOutput {
    var out: VertexOutput;
    let displacement = some_expensive_noise_calculation(in.position, material.time);
    out.position = in.position + in.normal * displacement;

    // The same value is wastefully re-calculated for coloring.
    let color_intensity = some_expensive_noise_calculation(in.position, material.time);
    out.color = vec3(color_intensity);
    // ...
}

// ✓ Efficient: Calculate once, store the result, and reuse it.
@vertex
fn vertex(in: VertexInput) -> VertexOutput {
    var out: VertexOutput;
    let noise_result = some_expensive_noise_calculation(in.position, material.time);
    out.position = in.position + in.normal * noise_result;
    out.color = vec3(noise_result);
    // ...
}

2. Use Built-in Functions

The built-in WGSL functions (normalize, length, dot, sin, mix, smoothstep, etc.) are your best friends. They are implemented at a low level, often directly in the GPU hardware, and are guaranteed to be significantly faster than any manual implementation you could write yourself.

// ✓ Good: Uses the highly optimized, built-in normalize() function.
let normalized_vec = normalize(my_vector);

// ✗ Bad: Manually implementing the function is much slower and less precise.
let len = sqrt(dot(my_vector, my_vector));
let normalized_vec = my_vector / len; // Also risks division by zero!

3. Avoid Divergent Branching

GPUs achieve their incredible speed by executing the same instruction on large groups of threads (vertices) in parallel. This is called SIMD (Single Instruction, Multiple Data). A conditional if/else statement can break this parallel execution if the condition depends on per-vertex data, a problem known as thread divergence.

If one vertex in a group takes the if path and another takes the else path, the GPU has to run both branches sequentially for the whole group, with threads disabling themselves for the path they didn't take. This can effectively kill your parallelism.

// ✗ Less efficient: This `if` statement depends on vertex position,
// causing threads in the same group to diverge, hurting performance.
var displacement: f32;
if (in.position.x > 0.0) {
    displacement = expensive_calculation_A(in.position);
} else {
    displacement = expensive_calculation_B(in.position);
}

// ✓ More efficient: Calculate both outcomes and use a branchless function like
//   mix() or select() to blend between them. All threads execute the same instructions.
let outcome_A = expensive_calculation_A(in.position);
let outcome_B = expensive_calculation_B(in.position);
// The third argument to mix() acts as the selector. step() returns 0 or 1.
let t = step(0.0, in.position.x);
let displacement = mix(outcome_B, outcome_A, t);

However, there is a crucial exception: uniform-based branching is perfectly fine. If the if condition is based on a uniform value (which is the same for all vertices being rendered in a draw call), then all threads will take the same path. There is no divergence, and the shader compiler can optimize the unused branch away completely. This is the technique we will use in our final example.

// ✓ OK: This branch is based on a uniform. All vertices will take the
// same path, so there is no performance penalty.
if (material.effect_mode == 1u) {
    // All vertices do this...
} else {
    // ...or all vertices do this.
}

4. Offload Work to the CPU via Vertex Attributes

The GPU is a massively parallel processor, but the CPU is often better for complex, sequential, or one-off tasks. If you need a value that is unique to each vertex but doesn't change every frame, pre-calculate it on the CPU and pass it to the shader as a custom vertex attribute.

Scenario: Imagine you want each blade of grass in a field to sway with a slightly different timing and direction.

  • The Slow Way: Try to invent a "random" number inside the vertex shader based on in.position or @builtin(vertex_index). This is computationally awkward and often produces repetitive, low-quality patterns.

  • The Fast Way (in Rust): When you build the grass Mesh on the CPU, add a new custom attribute. For each vertex, generate a single random f32 and store it in that attribute's buffer.

// In Rust, when creating your mesh:
use bevy::render::mesh::MeshVertexAttribute;
use bevy::render::render_resource::VertexFormat;

// Define a custom attribute with a matching format. The numeric ID just
// needs to be unique and stable within your app; your material must also
// declare this attribute in its vertex layout. (Reusing an existing slot
// like the skinning attributes won't work here, since their vertex
// formats don't match a plain f32.)
const ATTRIBUTE_RANDOM_SEED: MeshVertexAttribute =
    MeshVertexAttribute::new("RandomSeed", 988540917, VertexFormat::Float32);

let mut random_seeds = Vec::with_capacity(vertex_count);
for _ in 0..vertex_count {
    random_seeds.push(rand::random::<f32>());
}
// Store this data in the mesh asset itself.
my_mesh.insert_attribute(ATTRIBUTE_RANDOM_SEED, random_seeds);

Then, in your shader, you simply read this pre-calculated value.

// In your shader's VertexInput struct:
@location(4) random_seed: f32, // Or whichever location you chose

// In the vertex function:
// The random offset is instantly available, no calculation needed!
let sway_offset = sin(material.time + in.random_seed * 6.28); // 6.28 is 2*PI

This is a powerful optimization principle: do the work once on the CPU during setup, not every frame for every vertex on the GPU.

5. Scale Work with Distance (LOD)

It is wasteful to execute a complex, detailed vertex displacement effect on an object so far away that it only covers a few pixels on screen. Smartly reducing or eliminating this work for distant objects is a core optimization strategy known as LOD (Level of Detail).

The first step is always to calculate a lod_factor based on the object's distance from the camera. The smoothstep function is perfect for creating a smooth falloff range.

// Assume camera_position is passed in via a uniform.
// We get the world position of the vertex *before* any displacement.
let model = mesh_functions::get_world_from_local(in.instance_index);
let world_pos_vec4 = model * vec4<f32>(in.position, 1.0);
let distance_from_camera = length(world_pos_vec4.xyz - material.camera_position);

// Create a blend factor. It will be 0 for objects closer than 20 units,
// 1 for objects farther than 100 units, and smoothly transition in between.
let lod_factor = smoothstep(20.0, 100.0, distance_from_camera);

Simply multiplying your final displacement by (1.0 - lod_factor) will scale the visual intensity of the effect, but it does not reduce the amount of calculation and therefore provides no performance benefit on its own. The real performance gain comes from using this lod_factor to avoid doing work.
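For reference, `smoothstep` is simple enough to verify on the CPU. A standalone Rust version of the same falloff, checked at the three interesting distances:

```rust
// smoothstep as used for the LOD factor, in plain Rust for verification.
fn smoothstep(edge0: f32, edge1: f32, x: f32) -> f32 {
    let t = ((x - edge0) / (edge1 - edge0)).clamp(0.0, 1.0);
    t * t * (3.0 - 2.0 * t) // Hermite interpolation, matching WGSL
}

fn main() {
    // Closer than 20 units: factor 0, full effect.
    assert_eq!(smoothstep(20.0, 100.0, 10.0), 0.0);
    // Farther than 100 units: factor 1, effect fully faded.
    assert_eq!(smoothstep(20.0, 100.0, 150.0), 1.0);
    // Halfway through the falloff range: factor 0.5.
    assert!((smoothstep(20.0, 100.0, 60.0) - 0.5).abs() < 1e-6);
}
```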

Technique 1: Conditional Execution

We can use the lod_factor to create a branch that completely skips the expensive calculations for distant objects.

var final_position = in.position;

// Only run the expensive displacement functions if the object is close enough.
if (lod_factor < 0.999) {
    // Perform all expensive calculations inside this block.
    var displaced_position = apply_wave_displacement(in.position, ...);
    displaced_position = apply_noise_displacement(displaced_position, ...);

    // We can still use the lod_factor to smoothly fade the effect out as it
    // approaches the cutoff distance, preventing a sudden "pop".
    let fade = 1.0 - lod_factor;
    final_position = mix(in.position, displaced_position, fade);
}
// If lod_factor is >= 0.999, the code inside the 'if' is never executed.
// The shader becomes extremely cheap, only performing the basic model transformation.

This is a safe and effective use of branching. Because the lod_factor will be very similar for all vertices in a distant object, the entire group of threads will likely take the same "do nothing" path, avoiding the thread divergence problem while saving a massive amount of computation.

Technique 2: Swapping Materials (The Best Method)

The most robust and performant LOD systems are implemented on the CPU. Instead of a single complex shader, you create multiple versions:

  1. effect_lod0.wgsl: The full-quality shader with all effects enabled.

  2. effect_lod1.wgsl: A cheaper version with only the most prominent effect (e.g., just the wave).

  3. effect_lod2.wgsl: A "fall-back" shader that does no displacement at all, only the standard transformations.

Then, a system in your Rust application is responsible for checking the entity's distance from the camera and swapping the Handle<CustomMaterial> to the appropriate, cheaper version. This architectural approach ensures the GPU is only ever running the absolute minimum code necessary for the required level of detail, providing the best possible performance.
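The selection logic at the heart of such a system can be sketched as a plain function (the thresholds and level names here are illustrative, not Bevy API): a Bevy system would call something like this each frame and swap in the corresponding material handle when the level changes.

```rust
// CPU-side LOD selection: map camera distance to a shader variant.
#[derive(Debug, PartialEq, Clone, Copy)]
enum LodLevel {
    Lod0Full,     // effect_lod0.wgsl: all effects
    Lod1Cheap,    // effect_lod1.wgsl: wave only
    Lod2Fallback, // effect_lod2.wgsl: no displacement
}

fn pick_lod(distance: f32) -> LodLevel {
    match distance {
        d if d < 20.0 => LodLevel::Lod0Full,
        d if d < 100.0 => LodLevel::Lod1Cheap,
        _ => LodLevel::Lod2Fallback,
    }
}

fn main() {
    assert_eq!(pick_lod(5.0), LodLevel::Lod0Full);
    assert_eq!(pick_lod(50.0), LodLevel::Lod1Cheap);
    assert_eq!(pick_lod(500.0), LodLevel::Lod2Fallback);
}
```

In practice you would also add hysteresis around the thresholds so an object hovering near a boundary doesn't flicker between materials every frame.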


Complete Example: Advanced Vertex Effect System

It's time to combine everything we've learned into a single, powerful shader controlled by a Bevy application. This project will allow you to cycle through different vertex effects in real-time and adjust their parameters to see the immediate impact of your changes.

Our Goal

We will create a custom material that can apply several distinct vertex shader effects to a mesh: a classic sine-wave deformation, organic noise, a "breathing" pulse, a rotational twist, and a camera-facing billboard. This single, versatile shader will serve as a playground for experimenting with the core techniques of vertex manipulation.

What This Project Demonstrates

  • Manual Transformation Pipeline: Implementing the Model -> World -> View -> Clip journey in our shader for all standard deformation effects.

  • Uniform-Based Branching: Using a material.effect_mode uniform to safely and efficiently switch between different effects without performance loss from thread divergence.

  • Effect Composition: Demonstrating how to layer multiple displacement functions for more complex and interesting results.

  • Normal Perturbation: Calculating new normals for our wave effect to ensure lighting reacts correctly to the deformed surface.

  • Passing Camera Data: Correctly sending the camera's transform vectors to our material via uniforms to build a robust billboard, avoiding GPU binding conflicts.

  • CPU-Side Logic: Using a Bevy system to dynamically swap a mesh between a sphere and a plane to best suit the selected effect (e.g., a plane for the billboard).

The Shader (assets/shaders/d02_01_advanced_vertex_effects.wgsl)

This is the heart of our project. The WGSL code contains our library of displacement functions (apply_wave_displacement, noise3d, apply_twist) and the main @vertex function. The vertex shader's logic is controlled by a large if/else if chain that checks the material.effect_mode uniform. This is a performant way to manage multiple effects in one shader, as all vertices in a draw call will take the same code path.

For most effects, it modifies the vertex's local_position before applying the standard model-to-world transformation. For the billboard mode (mode 5), it uses a completely different logic path, ignoring the model matrix entirely and constructing the world position from the camera's right and up vectors. The fragment shader provides simple lighting and visualizes the intensity of each effect with color.

#import bevy_pbr::mesh_functions
#import bevy_pbr::view_transformations::{position_world_to_clip, position_view_to_world}

// Advanced vertex effect parameters
struct VertexEffectMaterial {
    effect_mode: u32,
    time: f32,
    wave_frequency: f32,
    wave_amplitude: f32,
    noise_strength: f32,
    inflation: f32,
    twist_amount: f32,
    billboard_size: f32,
    camera_position: vec3<f32>,
    _padding: f32,
    camera_right: vec3<f32>,
    _padding2: f32,
    camera_up: vec3<f32>,
    _padding3: f32,
}

@group(2) @binding(0)
var<uniform> material: VertexEffectMaterial;

struct VertexInput {
    @builtin(instance_index) instance_index: u32,
    @location(0) position: vec3<f32>,
    @location(1) normal: vec3<f32>,
    @location(2) uv: vec2<f32>,
}

struct VertexOutput {
    @builtin(position) clip_position: vec4<f32>,
    @location(0) world_position: vec3<f32>,
    @location(1) world_normal: vec3<f32>,
    @location(2) uv: vec2<f32>,
    @location(3) effect_intensity: f32,
}

// ===== Displacement Functions =====

fn apply_wave_displacement(
    position: vec3<f32>,
    normal: vec3<f32>,
    time: f32,
    frequency: f32,
    amplitude: f32
) -> vec3<f32> {
    let wave1 = sin(position.x * frequency + time * 2.0);
    let wave2 = cos(position.z * frequency + time * 1.5);
    let combined_wave = (wave1 + wave2) * 0.5;

    return position + normal * combined_wave * amplitude;
}

// Simple hash function for noise
fn hash(n: f32) -> f32 {
    return fract(sin(n) * 43758.5453123);
}

fn noise3d(p: vec3<f32>) -> f32 {
    let ip = floor(p);
    var fp = fract(p);
    fp = fp * fp * (3.0 - 2.0 * fp);

    let n = ip.x + ip.y * 57.0 + ip.z * 113.0;
    return mix(
        mix(
            mix(hash(n + 0.0), hash(n + 1.0), fp.x),
            mix(hash(n + 57.0), hash(n + 58.0), fp.x),
            fp.y
        ),
        mix(
            mix(hash(n + 113.0), hash(n + 114.0), fp.x),
            mix(hash(n + 170.0), hash(n + 171.0), fp.x),
            fp.y
        ),
        fp.z
    );
}

fn apply_noise_displacement(
    position: vec3<f32>,
    normal: vec3<f32>,
    time: f32,
    strength: f32
) -> vec3<f32> {
    let noise_pos = position * 2.0 + vec3<f32>(time * 0.3);
    let noise = noise3d(noise_pos) * 2.0 - 1.0; // -1 to 1

    return position + normal * noise * strength;
}

fn apply_twist(
    position: vec3<f32>,
    amount: f32,
    axis: vec3<f32>
) -> vec3<f32> {
    // Calculate height along axis
    let height = dot(position, axis);

    // Rotation amount increases with height
    let angle = height * amount;
    let c = cos(angle);
    let s = sin(angle);

    // Get perpendicular components
    let parallel = height * axis;
    let perpendicular = position - parallel;

    // Rotate in the perpendicular plane
    // We need a consistent coordinate system perpendicular to the axis
    let right = normalize(cross(axis, vec3<f32>(0.0, 0.0, 1.0)));
    let forward = cross(axis, right);

    let perp_x = dot(perpendicular, right);
    let perp_y = dot(perpendicular, forward);

    let rotated_x = perp_x * c - perp_y * s;
    let rotated_y = perp_x * s + perp_y * c;

    return parallel + right * rotated_x + forward * rotated_y;
}

// ===== Normal Perturbation =====

fn perturb_normal_for_waves(
    normal: vec3<f32>,
    position: vec3<f32>,
    time: f32,
    frequency: f32,
    amplitude: f32
) -> vec3<f32> {
    // Gradients (partial derivatives) of the wave used in
    // apply_wave_displacement: sin(x*f + t*2.0) and cos(z*f + t*1.5).
    let grad_x = cos(position.x * frequency + time * 2.0) * frequency;
    let grad_z = -sin(position.z * frequency + time * 1.5) * frequency;

    let gradient_offset = vec3<f32>(grad_x, 0.0, grad_z) * amplitude;

    // Tilt the normal against the slope (heightfield convention: (-dx, 1, -dz)).
    return normalize(normal - gradient_offset * 0.5);
}

// ===== Camera Utilities =====

fn get_camera_right() -> vec3<f32> {
    return normalize(material.camera_right);
}

fn get_camera_up() -> vec3<f32> {
    return normalize(material.camera_up);
}

// ===== Main Vertex Shader =====

@vertex
fn vertex(in: VertexInput) -> VertexOutput {
    var out: VertexOutput;

    let model = mesh_functions::get_world_from_local(in.instance_index);

    var local_position = in.position;
    var local_normal = in.normal;
    var effect_intensity = 0.0;
    var world_position: vec4<f32>;

    // Mode 5: Billboard (Camera-facing quad) - handle differently!
    if material.effect_mode == 5u {
        // For billboard with Rectangle mesh (XY plane), rebuild position in world space
        let center = vec3<f32>(0.0, 0.0, 0.0);

        let camera_right = get_camera_right();
        let camera_up = get_camera_up();

        // Rectangle mesh has vertices in XY plane, use them directly as 2D offsets
        let billboard_pos = center
                          + camera_right * in.position.x * material.billboard_size
                          + camera_up * in.position.y * material.billboard_size;

        // Use billboard position directly as world position (no model transform!)
        world_position = vec4<f32>(billboard_pos, 1.0);

        // Billboard normal faces camera
        local_normal = normalize(material.camera_position - billboard_pos);

        effect_intensity = 1.0;
    }
    // All other modes: normal transformation pipeline
    else {
        // Mode 0: Wave Displacement
        if material.effect_mode == 0u {
            local_position = apply_wave_displacement(
                local_position,
                local_normal,
                material.time,
                material.wave_frequency,
                material.wave_amplitude
            );

            local_normal = perturb_normal_for_waves(
                local_normal,
                in.position,
                material.time,
                material.wave_frequency,
                material.wave_amplitude
            );

            effect_intensity = (sin(in.position.x * material.wave_frequency + material.time * 2.0) + 1.0) * 0.5;
        }
        // Mode 1: Noise Displacement
        else if material.effect_mode == 1u {
            local_position = apply_noise_displacement(
                local_position,
                local_normal,
                material.time,
                material.noise_strength
            );

            effect_intensity = noise3d(in.position * 2.0 + vec3<f32>(material.time * 0.3));
        }
        // Mode 2: Inflation (Breathing effect)
        else if material.effect_mode == 2u {
            let pulse = (sin(material.time * 2.0) + 1.0) * 0.5;
            let inflate_amount = material.inflation * pulse;

            local_position = local_position + local_normal * inflate_amount;
            effect_intensity = pulse;
        }
        // Mode 3: Twist
        else if material.effect_mode == 3u {
            let axis = vec3<f32>(0.0, 1.0, 0.0);
            local_position = apply_twist(
                local_position,
                material.twist_amount * sin(material.time),
                axis
            );

            effect_intensity = abs(sin(material.time));
        }
        // Mode 4: Combined Effects
        else if material.effect_mode == 4u {
            // Layer multiple effects
            local_position = apply_wave_displacement(
                local_position,
                local_normal,
                material.time,
                material.wave_frequency * 0.5,
                material.wave_amplitude * 0.5
            );

            local_position = apply_noise_displacement(
                local_position,
                local_normal,
                material.time,
                material.noise_strength * 0.3
            );

            let pulse = (sin(material.time * 2.0) + 1.0) * 0.5;
            local_position = local_position + local_normal * material.inflation * pulse * 0.3;

            local_normal = perturb_normal_for_waves(
                local_normal,
                in.position,
                material.time,
                material.wave_frequency * 0.5,
                material.wave_amplitude * 0.5
            );

            effect_intensity = pulse;
        }

        // Transform to world space for non-billboard modes
        world_position = mesh_functions::mesh_position_local_to_world(
            model,
            vec4<f32>(local_position, 1.0)
        );
    }

    // Transform normal to world space (for all modes). Note: in billboard
    // mode, local_normal is already a world-space direction, so this only
    // stays correct because our demo entity uses an identity transform.
    let world_normal = mesh_functions::mesh_normal_local_to_world(
        local_normal,
        in.instance_index
    );

    // Transform to clip space
    out.clip_position = position_world_to_clip(world_position.xyz);
    out.world_position = world_position.xyz;
    out.world_normal = normalize(world_normal);
    out.uv = in.uv;
    out.effect_intensity = effect_intensity;

    return out;
}

@fragment
fn fragment(in: VertexOutput) -> @location(0) vec4<f32> {
    let normal = normalize(in.world_normal);

    // Simple lighting
    let light_dir = normalize(vec3<f32>(1.0, 1.0, 1.0));
    let diffuse = max(0.3, dot(normal, light_dir));

    // Base color with effect intensity
    var base_color = vec3<f32>(0.3, 0.6, 0.9);

    // Modulate color based on effect intensity
    let effect_color = vec3<f32>(1.0, 0.7, 0.3);
    base_color = mix(base_color, effect_color, in.effect_intensity * 0.5);

    // Special rendering for billboard mode - add a clear directional pattern
    if material.effect_mode == 5u {
        // Create an arrow pattern using UV coordinates to show orientation
        // Rectangle mesh has UVs from 0 to 1 in both X and Y
        let uv = in.uv;

        // Vertical stripe in center (arrow shaft)
        let center_stripe = step(abs(uv.x - 0.5), 0.08);
        let shaft = center_stripe * step(uv.y, 0.6);

        // Arrow head at top (two diagonal lines forming V shape)
        let arrow_region = step(0.55, uv.y) * step(uv.y, 0.8);

        // Left diagonal of arrow head
        let left_dist = abs((uv.x - 0.3) - (uv.y - 0.55) * 0.5);
        let left_arm = step(left_dist, 0.04) * arrow_region;

        // Right diagonal of arrow head
        let right_dist = abs((uv.x - 0.7) + (uv.y - 0.55) * 0.5);
        let right_arm = step(right_dist, 0.04) * arrow_region;

        // Combine all parts of arrow
        let arrow = max(shaft, max(left_arm, right_arm));

        // Yellow arrow on blue background
        base_color = mix(vec3<f32>(0.2, 0.5, 1.0), vec3<f32>(1.0, 1.0, 0.0), arrow);
    }

    let final_color = base_color * diffuse;

    return vec4<f32>(final_color, 1.0);
}

The Rust Material (src/materials/d02_01_advanced_vertex_effects.rs)

The Rust Material definition is the bridge to our shader. The AdvancedVertexEffectsMaterial struct mirrors the uniform block in WGSL, including the necessary padding to satisfy memory alignment rules. This allows Bevy to correctly format our data for the GPU.
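The arithmetic behind those `_padding` fields can be checked with a standalone sketch (plain Rust, no Bevy). Under WGSL's uniform layout rules, a `vec3<f32>` aligns to 16 bytes but occupies only 12, so each `vec3` is followed by one explicit `f32` of padding to keep the next field on a 16-byte boundary:

```rust
// Walk the uniform layout of VertexEffectMaterial by hand.
fn align_to(offset: usize, align: usize) -> usize {
    (offset + align - 1) / align * align
}

fn main() {
    // Eight leading u32/f32 scalars, 4 bytes each.
    let mut offset = 8 * 4; // = 32

    // camera_position: vec3<f32> -> 16-byte alignment, 12-byte size.
    offset = align_to(offset, 16);
    assert_eq!(offset, 32); // already aligned, no implicit gap
    offset += 12 + 4; // vec3 + _padding -> 48

    // camera_right: the explicit padding kept us on a 16-byte boundary.
    assert_eq!(align_to(offset, 16), offset);
    offset += 12 + 4; // + _padding2 -> 64

    // camera_up: same story.
    assert_eq!(align_to(offset, 16), offset);
    offset += 12 + 4; // + _padding3 -> 80

    assert_eq!(offset, 80); // total struct size, a multiple of 16
}
```

Without the padding fields, the Rust-side layout would drift out of sync with what the shader expects and the camera vectors would arrive scrambled.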

use bevy::prelude::*;
use bevy::render::render_resource::{AsBindGroup, ShaderRef};

#[derive(Asset, TypePath, AsBindGroup, Debug, Clone)]
pub struct AdvancedVertexEffectsMaterial {
    #[uniform(0)]
    pub effect_mode: u32,
    #[uniform(0)]
    pub time: f32,
    #[uniform(0)]
    pub wave_frequency: f32,
    #[uniform(0)]
    pub wave_amplitude: f32,
    #[uniform(0)]
    pub noise_strength: f32,
    #[uniform(0)]
    pub inflation: f32,
    #[uniform(0)]
    pub twist_amount: f32,
    #[uniform(0)]
    pub billboard_size: f32,
    #[uniform(0)]
    pub camera_position: Vec3,
    #[uniform(0)]
    pub _padding: f32,
    #[uniform(0)]
    pub camera_right: Vec3,
    #[uniform(0)]
    pub _padding2: f32,
    #[uniform(0)]
    pub camera_up: Vec3,
    #[uniform(0)]
    pub _padding3: f32,
}

impl Material for AdvancedVertexEffectsMaterial {
    fn vertex_shader() -> ShaderRef {
        "shaders/d02_01_advanced_vertex_effects.wgsl".into()
    }

    fn fragment_shader() -> ShaderRef {
        "shaders/d02_01_advanced_vertex_effects.wgsl".into()
    }
}

Don't forget to add it to src/materials/mod.rs:

// ... other materials
pub mod d02_01_advanced_vertex_effects;

The Demo Module (src/demos/d02_01_advanced_vertex_effects.rs)

The Rust code sets up our scene and contains the logic for interactivity. Key systems include:

  • update_time: Updates the time uniform and, crucially, queries the Camera3d's Transform to pass its current position, right, and up vectors to the material every frame.

  • cycle_effect_mode & adjust_parameters: Listen for keyboard input to change the effect_mode and tweak the various effect parameters in real-time.

  • swap_mesh_for_billboard: An intelligent system that watches for the effect_mode changing. When the billboard mode is selected, it swaps the entity's Mesh3d handle to a Rectangle; for all other modes, it ensures a Sphere is used.

use crate::materials::d02_01_advanced_vertex_effects::AdvancedVertexEffectsMaterial;
use bevy::prelude::*;

// Marker component for our demo object
#[derive(Component)]
struct EffectDemoObject;

pub fn run() {
    App::new()
        .add_plugins(DefaultPlugins.set(AssetPlugin {
            watch_for_changes_override: Some(true),
            ..default()
        }))
        .add_plugins(MaterialPlugin::<AdvancedVertexEffectsMaterial>::default())
        .add_systems(Startup, setup)
        .add_systems(
            Update,
            (
                rotate_camera,
                update_time,
                cycle_effect_mode,
                adjust_parameters,
                display_controls,
                swap_mesh_for_billboard,
            ),
        )
        .run();
}

fn setup(
    mut commands: Commands,
    mut meshes: ResMut<Assets<Mesh>>,
    mut materials: ResMut<Assets<AdvancedVertexEffectsMaterial>>,
    mut standard_materials: ResMut<Assets<StandardMaterial>>,
) {
    // Spawn a sphere with vertex effects
    commands.spawn((
        Mesh3d(meshes.add(Sphere::new(1.0).mesh().uv(64, 32))),
        MeshMaterial3d(materials.add(AdvancedVertexEffectsMaterial {
            effect_mode: 0,
            time: 0.0,
            wave_frequency: 3.0,
            wave_amplitude: 0.2,
            noise_strength: 0.3,
            inflation: 0.3,
            twist_amount: 2.0,
            billboard_size: 2.0,
            camera_position: Vec3::ZERO,
            _padding: 0.0,
            camera_right: Vec3::X,
            _padding2: 0.0,
            camera_up: Vec3::Y,
            _padding3: 0.0,
        })),
        EffectDemoObject,
    ));

    // Add a reference cube to show the camera is actually moving
    // This cube will rotate from the camera's perspective
    commands.spawn((
        Mesh3d(meshes.add(Cuboid::new(0.5, 0.5, 0.5))),
        MeshMaterial3d(standard_materials.add(StandardMaterial {
            base_color: Color::srgb(1.0, 0.3, 0.3),
            ..default()
        })),
        Transform::from_xyz(2.5, 0.0, 0.0),
    ));

    // Light
    commands.spawn((
        PointLight {
            shadows_enabled: true,
            intensity: 2000.0,
            ..default()
        },
        Transform::from_xyz(4.0, 8.0, 4.0),
    ));

    // Camera
    commands.spawn((
        Camera3d::default(),
        Transform::from_xyz(-3.0, 2.5, 6.0).looking_at(Vec3::ZERO, Vec3::Y),
    ));
}

fn rotate_camera(time: Res<Time>, mut camera_query: Query<&mut Transform, With<Camera3d>>) {
    for mut transform in camera_query.iter_mut() {
        let radius = 6.0;
        let angle = time.elapsed_secs() * 0.3;
        transform.translation.x = angle.cos() * radius;
        transform.translation.z = angle.sin() * radius;
        transform.look_at(Vec3::ZERO, Vec3::Y);
    }
}

fn update_time(
    time: Res<Time>,
    camera_query: Query<&Transform, With<Camera3d>>,
    mut materials: ResMut<Assets<AdvancedVertexEffectsMaterial>>,
) {
    // Get camera transform - unwrap because we expect exactly one camera
    let camera_transform = camera_query.single().unwrap();

    // Calculate camera axes from transform
    let camera_right = camera_transform.right().as_vec3();
    let camera_up = camera_transform.up().as_vec3();
    let camera_position = camera_transform.translation;

    for (_, material) in materials.iter_mut() {
        material.time = time.elapsed_secs();
        material.camera_position = camera_position;
        material.camera_right = camera_right;
        material.camera_up = camera_up;
    }
}

fn cycle_effect_mode(
    keyboard: Res<ButtonInput<KeyCode>>,
    mut materials: ResMut<Assets<AdvancedVertexEffectsMaterial>>,
) {
    if keyboard.just_pressed(KeyCode::Space) {
        for (_, material) in materials.iter_mut() {
            material.effect_mode = (material.effect_mode + 1) % 6;
        }
    }
}

fn adjust_parameters(
    keyboard: Res<ButtonInput<KeyCode>>,
    mut materials: ResMut<Assets<AdvancedVertexEffectsMaterial>>,
) {
    for (_, material) in materials.iter_mut() {
        // Wave frequency
        if keyboard.pressed(KeyCode::KeyQ) {
            material.wave_frequency = (material.wave_frequency - 0.1).max(0.5);
        }
        if keyboard.pressed(KeyCode::KeyW) {
            material.wave_frequency = (material.wave_frequency + 0.1).min(10.0);
        }

        // Wave amplitude
        if keyboard.pressed(KeyCode::KeyA) {
            material.wave_amplitude = (material.wave_amplitude - 0.01).max(0.0);
        }
        if keyboard.pressed(KeyCode::KeyS) {
            material.wave_amplitude = (material.wave_amplitude + 0.01).min(1.0);
        }

        // Noise strength
        if keyboard.pressed(KeyCode::KeyZ) {
            material.noise_strength = (material.noise_strength - 0.01).max(0.0);
        }
        if keyboard.pressed(KeyCode::KeyX) {
            material.noise_strength = (material.noise_strength + 0.01).min(1.0);
        }

        // Inflation
        if keyboard.pressed(KeyCode::KeyE) {
            material.inflation = (material.inflation - 0.01).max(0.0);
        }
        if keyboard.pressed(KeyCode::KeyR) {
            material.inflation = (material.inflation + 0.01).min(1.0);
        }

        // Twist amount
        if keyboard.pressed(KeyCode::KeyC) {
            material.twist_amount = (material.twist_amount - 0.1).max(0.0);
        }
        if keyboard.pressed(KeyCode::KeyV) {
            material.twist_amount = (material.twist_amount + 0.1).min(5.0);
        }
    }
}

fn display_controls(
    materials: Res<Assets<AdvancedVertexEffectsMaterial>>,
    mut commands: Commands,
    text_query: Query<Entity, With<Text>>,
) {
    // Remove old text
    for entity in text_query.iter() {
        commands.entity(entity).despawn();
    }

    // Get current material state
    if let Some((_, material)) = materials.iter().next() {
        let mode_text = match material.effect_mode {
            0 => "Wave Displacement",
            1 => "Noise Displacement",
            2 => "Inflation (Breathing)",
            3 => "Twist",
            4 => "Combined Effects",
            5 => "Billboard (switches to plane mesh)",
            _ => "Unknown",
        };

        let controls = format!(
            "SPACE: Cycle Effect Mode (Current: {})\n\
             Q/W: Wave Frequency ({:.1}) | A/S: Wave Amplitude ({:.2})\n\
             Z/X: Noise Strength ({:.2}) | E/R: Inflation ({:.2})\n\
             C/V: Twist Amount ({:.1})",
            mode_text,
            material.wave_frequency,
            material.wave_amplitude,
            material.noise_strength,
            material.inflation,
            material.twist_amount
        );

        commands.spawn((
            Text::new(controls),
            Node {
                position_type: PositionType::Absolute,
                top: Val::Px(10.0),
                left: Val::Px(10.0),
                ..default()
            },
        ));
    }
}

// System to swap mesh between sphere and plane for billboard mode
fn swap_mesh_for_billboard(
    materials: Res<Assets<AdvancedVertexEffectsMaterial>>,
    mut meshes: ResMut<Assets<Mesh>>,
    mut demo_query: Query<&mut Mesh3d, With<EffectDemoObject>>,
    mut last_mode: Local<Option<u32>>,
) {
    if let Some((_, material)) = materials.iter().next() {
        // Only swap if mode changed
        if *last_mode != Some(material.effect_mode) {
            let mut mesh = demo_query.single_mut().unwrap();

            if material.effect_mode == 5 {
                // Create a simple quad mesh for billboard
                // Use Rectangle which is a 2D shape in the XY plane
                mesh.0 = meshes.add(Rectangle::new(2.0, 2.0));
            } else {
                // Use sphere for all other modes
                mesh.0 = meshes.add(Sphere::new(1.0).mesh().uv(64, 32));
            }

            *last_mode = Some(material.effect_mode);
        }
    }
}

Don't forget to add it to src/demos/mod.rs:

// ... other demos
pub mod d02_01_advanced_vertex_effects;

And register it in src/main.rs:

Demo {
    number: "2.1",
    title: "Vertex Transformation Deep Dive",
    run: demos::d02_01_advanced_vertex_effects::run,
},

Running the Demo

When you run this application, you will see a blue sphere at the center of the scene, with a small red reference cube beside it. As the camera orbits, the lighting on the sphere will change. Use the keyboard to interact with the demo.

Controls

| Key | Action |
| --- | --- |
| Space | Cycle through the different effect modes. |
| Q / W | Decrease / increase wave frequency. |
| A / S | Decrease / increase wave amplitude. |
| Z / X | Decrease / increase noise strength. |
| E / R | Decrease / increase inflation amount. |
| C / V | Decrease / increase twist amount. |

What You're Seeing

| Mode | Concept Visualized | What to Do & Look For |
| --- | --- | --- |
| 0 | Wave Displacement | The sphere's surface ripples with smooth waves. Notice that the lighting correctly follows the new contours, because our shader is perturbing the normals to match the wave. |
| 1 | Noise Displacement | The sphere deforms with an organic, lumpy motion. This effect uses the original normals for lighting, which is a common approximation that works well for low-frequency noise. |
| 2 | Inflation | See the sphere pulse in and out rhythmically. This is a simple but powerful effect that displaces every vertex along its original normal. |
| 3 | Twist | The sphere twists around its vertical (Y) axis as if it's being wrung out. This effect manipulates vertex positions directly without using normals for displacement. |
| 4 | Combined Effects | This mode layers waves, noise, and inflation to create a more complex and dynamic result, demonstrating the power of composition. |
| 5 | Billboard | The mesh automatically swaps from a sphere to a flat rectangle. Notice how this rectangle always rotates to perfectly face the orbiting camera. The yellow arrow pattern helps visualize its orientation, always pointing "up" relative to the camera's view. |

Key Takeaways

You have now worked through one of the most fundamental and creatively empowering topics in shader development. Before moving on, take a moment to solidify your understanding of these core concepts:

  1. The Full Pipeline: You know how to manually transform a vertex from its local, model-space coordinates all the way to the final clip-space position required by the GPU, following the path: Local → World → View → Clip.

  2. The Power of Displacement: You understand that vertex effects are created by intervening in the transformation pipeline and modifying a vertex's position before applying the model matrix. You can create waves, noise, twists, and other deformations.

  3. Correct Normal Transformation: You know that when geometry deforms, its normals must be updated for lighting to work correctly. For standard transforms, you use Bevy's mesh_normal_local_to_world, and for custom displacements, you can perturb the normal based on the effect's logic.

  4. Camera-Relative Logic: You can create camera-facing billboards by abandoning the model matrix and instead constructing a vertex's world position from the camera's up and right vectors.

  5. Performance is Paramount: You are aware of the key optimization strategies: calculate once and reuse, prefer built-in functions, avoid divergent branching, and use Level of Detail (LOD) techniques to reduce work on distant objects.

  6. Composition is Key: The most interesting effects are often born from layering several simpler displacement functions together.

  7. Safe Shader Bindings: You understand the importance of passing camera data and other global state through your Material's uniforms rather than trying to bind Bevy's global View uniform, which prevents binding conflicts.
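
As a hedged sketch of the LOD strategy from the performance takeaway (assuming the `camera_position` uniform this demo's material already provides; `displacement` is a placeholder for whatever offset your effect computed), you can fade a displacement effect out with distance, which pairs well with swapping in lower-detail meshes for far-away objects:

```
// Fade the displacement with distance from the camera.
let world_center = model[3].xyz;
let dist = distance(world_center, material.camera_position);
// Full strength within 5 units, zero beyond 20 (thresholds are arbitrary).
let lod_factor = 1.0 - smoothstep(5.0, 20.0, dist);
local_pos += in.normal * displacement * lod_factor;
```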

What's Next?

We have now manually rebuilt most of the standard transformation pipeline, but one crucial piece remains a "black box": the position_world_to_clip function. We know that it takes our world-space position and magically transforms it for the screen, but how does it work? What are the View and Projection matrices that live inside it, and how do they give us camera perspective, depth, and field-of-view?

In the next article, we will complete our mastery of the transformation pipeline by deconstructing that final step. You will learn to build the Model-View-Projection (MVP) matrix from scratch, giving you ultimate control over the virtual camera. This knowledge is the key to unlocking advanced effects like custom camera projections, fisheye lenses, and precision control over how your 3D world is mapped onto your 2D screen.

Next up: 2.2 - Camera and Projection Matrices


Quick Reference

The Full Transformation Pipeline

The goal is always to calculate clip space position. The complete formula is:

let clip_pos = projection_matrix * view_matrix * model_matrix * vec4(local_pos, 1.0);

Positions vs. Directions (w component)

  • Use w = 1.0 for positions so they are affected by translation.
    vec4(in.position, 1.0)

  • Use w = 0.0 for directions (like normals) to ignore translation.
    vec4(in.normal, 0.0)
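
This is easy to verify by hand. A minimal check in plain Rust (no external crates; `mat4_mul_vec4` is a throwaway helper written for this example, not a Bevy API) shows that a translation-only matrix moves a point with w = 1.0 but leaves a direction with w = 0.0 untouched:

```rust
// Multiply a row-major 4x4 matrix by a 4-component vector.
fn mat4_mul_vec4(m: [[f32; 4]; 4], v: [f32; 4]) -> [f32; 4] {
    let mut out = [0.0; 4];
    for row in 0..4 {
        for col in 0..4 {
            out[row] += m[row][col] * v[col];
        }
    }
    out
}

fn main() {
    // Row-major translation by (3, 0, 0).
    let translate = [
        [1.0, 0.0, 0.0, 3.0],
        [0.0, 1.0, 0.0, 0.0],
        [0.0, 0.0, 1.0, 0.0],
        [0.0, 0.0, 0.0, 1.0],
    ];
    let position = [1.0, 2.0, 0.0, 1.0]; // w = 1.0: picks up the translation
    let direction = [0.0, 1.0, 0.0, 0.0]; // w = 0.0: translation is ignored
    assert_eq!(mat4_mul_vec4(translate, position), [4.0, 2.0, 0.0, 1.0]);
    assert_eq!(mat4_mul_vec4(translate, direction), [0.0, 1.0, 0.0, 0.0]);
    println!("translation moves positions, not directions");
}
```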

The Standard Displacement Pattern

Always modify the vertex position in local space, before applying the model matrix. The most common pattern is displacing along the normal.

var local_pos = in.position;
local_pos += in.normal * sin(time) * amplitude;
// Now apply model matrix to the modified local_pos...
let world_pos = model * vec4(local_pos, 1.0);

The Billboard Pattern

Ignore the model matrix completely for rotation. Construct the world position directly from the camera's right and up vectors, anchored at the object's world-space center.

// Get world center from model matrix's translation column
let center = model[3].xyz;
let world_pos = center + camera_right * local_pos.x + camera_up * local_pos.y;

The Normal Correction Rule

If you displace vertices, the lighting will be wrong unless you also update the normals to match the new surface slope. Use analytical derivatives or approximations (perturbation) for this.
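
As an illustrative sketch of the analytical approach (for the simplest case: a flat surface displaced along Y by a sine wave; `amplitude`, `freq`, and `time` are assumed uniforms), the corrected normal follows directly from the wave's slope:

```
// Displacement applied earlier: local_pos.y += amplitude * sin(freq * local_pos.x + time);
// Slope of the displaced surface along X (the derivative of the sine):
let slope = amplitude * freq * cos(freq * local_pos.x + time);
// A flat surface's normal is (0, 1, 0); tilt it against the slope.
let perturbed_normal = normalize(vec3<f32>(-slope, 1.0, 0.0));
```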

The Performance Golden Rule

Avoid if/else statements that depend on per-vertex data (like in.position). This causes threads to diverge and kills performance. Use mix(), step(), or select() for conditional logic instead. Branching on uniforms is safe and fast.
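
For example (a sketch; `wave` stands in for any per-vertex offset you have already computed), a divergent branch on vertex position can be rewritten with `step()` and `mix()`:

```
// Divergent: threads in the same wave may take different paths.
// if (in.position.y > 0.0) { offset = wave; } else { offset = 0.0; }

// Branchless: every thread runs the same instructions.
let mask = step(0.0, in.position.y);  // 1.0 when y >= 0.0, else 0.0
let offset = mix(0.0, wave, mask);    // equivalent blend, no divergence
```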