4.1 - Texture Sampling Basics

What We're Learning
Up until now, we've been generating colors procedurally - calculating them with math equations directly in our shaders. But most real-world 3D graphics rely on textures: images that wrap around 3D geometry to provide color, detail, and visual richness. A character's skin, the rough bark of a tree, or the rust on a metal barrel - these are all typically defined by textures.
Textures are fundamental to modern graphics. They allow us to decouple surface detail from geometric complexity. Instead of modeling every pore on a face or every brick in a wall, we simply "paint" them onto a simpler shape.
While the concept seems straightforward - "just paste an image onto a model" - there is surprising depth here. How do 2D flat images map seamlessly onto complex 3D curves? What happens when a texture is viewed from a sharp angle or from miles away? How does the GPU interpret colors between pixels?
By the end of this article, you'll understand:
Texture vs. Sampler: Why GPUs separate the image data from the instructions on how to read it.
UV Coordinates: The coordinate system that bridges 3D geometry and 2D images.
The textureSample() Function: How to retrieve color data in a fragment shader.
Texture Filtering: How to control the look of your textures (pixelated retro vs. smooth modern).
Address Modes: What happens at the edge of a texture (repeat, clamp, mirror).
Bevy Integration: How to load images and bind them to your custom materials.
Understanding Textures in WGSL
Before we write any code, we need to shift our mental model. To a CPU, an image might just be a file on a disk. To a GPU, a Texture and a Sampler are two distinct, specialized resources.
What Is a Texture?
A texture is a structured grid of data stored in high-speed GPU memory. While we usually use them for color images, they are just data containers.
- 1D Texture: A single row of pixels. Useful for gradients or lookup curves.
[R G B A | R G B A | R G B A | ...]
- 2D Texture: A grid of pixels (width × height). This is the standard "image" format.
[row 0: R G B A | R G B A | ...]
[row 1: R G B A | R G B A | ...]
[row 2: R G B A | R G B A | ...]
- 3D Texture: A volume of pixels (width × height × depth). Used for fog, smoke simulations, or MRI data.
[layer 0][layer 1][layer 2]...
- Cube Texture: A collection of 6 faces forming a cube. Used for skyboxes and reflections.
[+X face][-X face][+Y][-Y][+Z][-Z]
In this article, we will focus exclusively on 2D Textures.
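Since a 2D texture is just a row-major grid of texels, locating a texel inside the raw byte buffer is simple index math. Here is a small Rust sketch of that layout (illustrative only - real GPU uploads may add per-row padding):

```rust
/// Byte offset of texel (x, y) in a tightly packed RGBA8 buffer
/// (4 bytes per texel, row-major, no row padding).
fn texel_offset(x: u32, y: u32, width: u32) -> usize {
    ((y * width + x) * 4) as usize
}

fn main() {
    let width = 256;
    // Texel (0, 0) starts at byte 0; its neighbor (1, 0) at byte 4.
    assert_eq!(texel_offset(0, 0, width), 0);
    assert_eq!(texel_offset(1, 0, width), 4);
    // The first texel of row 1 starts one full row later: 256 * 4 bytes.
    assert_eq!(texel_offset(0, 1, width), 1024);
    println!("row stride = {} bytes", width * 4);
}
```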
Texture Formats
Textures can be stored in various formats to balance precision against memory usage. When you define a texture in WGSL, the type you choose generally corresponds to the return value, not necessarily the storage format on disk.
Common formats you'll encounter:
- Rgba8Unorm (Standard): 4 channels (Red, Green, Blue, Alpha), each 8 bits (0-255). Shader view: the GPU automatically converts the 0-255 integer range to a 0.0 to 1.0 floating-point range when you sample it.
- R16Float / R32Float: Single-channel floating point. Usage: heightmaps, physics data, or other non-color information.
- Rg8Unorm: Two channels. Usage: often used for normal maps (storing the X and Y directions), allowing the Z component to be reconstructed mathematically.
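The Unorm ("unsigned normalized") conversion is easy to model on the CPU. A minimal sketch of what the hardware does per channel when you sample an Rgba8Unorm texture:

```rust
/// Convert an 8-bit Unorm channel to the normalized float the shader sees.
fn unorm8_to_f32(byte: u8) -> f32 {
    byte as f32 / 255.0
}

fn main() {
    assert_eq!(unorm8_to_f32(0), 0.0);   // black channel -> 0.0
    assert_eq!(unorm8_to_f32(255), 1.0); // full channel  -> 1.0
    // Mid-gray: 128 / 255 is roughly 0.502, not exactly 0.5.
    assert!((unorm8_to_f32(128) - 0.502).abs() < 0.001);
}
```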
Declaring Textures in WGSL
In WGSL, we declare a texture as a global variable. Note the specific type syntax:
// A standard 2D texture that returns floating point values (0.0 - 1.0)
@group(2) @binding(0)
var my_texture: texture_2d<f32>;
- texture_2d: Specifies the dimensionality.
- <f32>: Specifies the return type. Even if the image on disk is 8-bit integers (PNG), the shader reads them as normalized floats. You can also use <i32> for integer textures or <u32> for unsigned integer textures, but these are for specialized use cases (like grid data), not standard images.
The Sampler: The "How"
Here is the most important concept to grasp: A texture does not know how to be read.
A texture is just a blob of data. If you ask for the color at coordinate (0.5, 0.5), the texture has the data, but it doesn't know:
- Should I blend nearby pixels smoothly?
- Should I just return the exact nearest pixel (pixel art style)?
- If you ask for coordinate (1.5, 0.5), should I wrap around to the start or stop at the edge?
These instructions are provided by a Sampler.
By separating the Texture (data) from the Sampler (logic), modern graphics APIs allow you to reuse the same image in different ways. You could sample a noise texture smoothly for clouds, and then sample the exact same texture with "nearest neighbor" filtering for a glitch effect, without reloading the image.
Declaring Samplers in WGSL
Samplers are declared as their own resource type:
// A sampler configuration
@group(2) @binding(1)
var my_sampler: sampler;
There is also a specialized type called sampler_comparison used for shadow mapping, but for standard materials, sampler is what you need.
Pairing Textures and Samplers in Bevy
Because they are separate resources, you need to define both in your Rust material struct. Bevy's AsBindGroup macro handles the boilerplate of binding them to the specific slots.
#[derive(Asset, TypePath, AsBindGroup, Clone)]
pub struct MyMaterial {
// 1. The Texture Data
// We bind it to group 2, binding 0
#[texture(0)]
// 2. The Sampler Configuration
// We bind it to group 2, binding 1
#[sampler(1)]
pub color_image: Handle<Image>,
}
Notice a convenience here: In Rust, we use a single Handle<Image>. The Image asset in Bevy bundles the pixel data and the sampler configuration together for convenience. However, AsBindGroup splits them apart behind the scenes so the shader receives them as two distinct variables:
@group(2) @binding(0) var color_texture: texture_2d<f32>;
@group(2) @binding(1) var color_sampler: sampler;
UV Coordinates: The Bridge to 3D
To paint a 2D image onto a 3D object, we need a translation map. We can't say "put this pixel on that vertex" because vertices move, rotate, and scale. Instead, we use a normalized coordinate system called UV Coordinates.
What Are UV Coordinates?
UVs act as anchors. They map a specific point on the 3D mesh surface to a specific point on the 2D texture.
- U (Horizontal): Corresponds to the X-axis of the image (0.0 is Left, 1.0 is Right).
- V (Vertical): Corresponds to the Y-axis of the image (0.0 is Top, 1.0 is Bottom).
Note: While the coordinate system mathematically goes from 0.0 to 1.0, different engines and modeling software treat the vertical origin differently. In WGPU (and thus Bevy), (0, 0) is the top-left corner of the texture data.
Think of it like gift wrapping. The wrapping paper (texture) is flat. You wrap it around a box (mesh). The UV coordinates tell the engine exactly which part of the paper touches each corner of the box.
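In CPU terms, mapping a UV coordinate to a concrete texel looks roughly like this. This is a sketch of the nearest-neighbor convention only - real hardware also applies the sampler's address mode and filtering on top:

```rust
/// Map a normalized UV component (0.0..1.0) to the nearest texel index.
/// Sketch only: assumes coordinates are already inside the valid range.
fn uv_to_texel(u: f32, size: u32) -> u32 {
    ((u * size as f32).floor() as i64).clamp(0, size as i64 - 1) as u32
}

fn main() {
    let width = 256;
    assert_eq!(uv_to_texel(0.0, width), 0);   // left edge
    assert_eq!(uv_to_texel(0.5, width), 128); // center
    assert_eq!(uv_to_texel(1.0, width), 255); // right edge (clamped to last texel)
}
```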
UVs in the Graphics Pipeline
UVs differ from textures in one major way: UVs are vertex data. They live on the mesh, not in the material.
Vertex Input: Each vertex in your mesh has a position (x, y, z) and a UV coordinate (u, v).
Interpolation: This is the magic step. The vertex shader passes the UVs to the rasterizer. When the GPU draws a triangle, it automatically interpolates the UVs between the three vertices for every single pixel inside that triangle.
Fragment Input: By the time the code reaches your fragment shader, the uv variable represents the precise coordinate for that specific pixel on the screen.
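The interpolation step can be sketched on the CPU with barycentric weights - the same weighted average the rasterizer computes for every covered pixel (simplified: real hardware also applies perspective correction):

```rust
/// Interpolate per-vertex UVs at a point inside a triangle using
/// barycentric weights (w[0] + w[1] + w[2] == 1.0).
fn interpolate_uv(uvs: [[f32; 2]; 3], w: [f32; 3]) -> [f32; 2] {
    [
        uvs[0][0] * w[0] + uvs[1][0] * w[1] + uvs[2][0] * w[2],
        uvs[0][1] * w[0] + uvs[1][1] * w[1] + uvs[2][1] * w[2],
    ]
}

fn main() {
    // A triangle whose vertices carry UVs (0,0), (1,0), and (0,1).
    let uvs = [[0.0, 0.0], [1.0, 0.0], [0.0, 1.0]];
    // A pixel at the centroid gets the average of the three vertex UVs.
    let uv = interpolate_uv(uvs, [1.0 / 3.0; 3]);
    assert!((uv[0] - 1.0 / 3.0).abs() < 1e-6);
    assert!((uv[1] - 1.0 / 3.0).abs() < 1e-6);
}
```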
Passing UVs in WGSL
To use UVs, we must explicitly pass them from the Vertex Shader to the Fragment Shader.
struct VertexInput {
@location(0) position: vec3<f32>,
@location(1) normal: vec3<f32>,
@location(2) uv: vec2<f32>, // Input from Mesh
}
struct VertexOutput {
@builtin(position) clip_position: vec4<f32>,
// We create a location to pass UVs to the fragment stage
@location(0) uv: vec2<f32>,
}
@vertex
fn vertex(in: VertexInput) -> VertexOutput {
var out: VertexOutput;
// ... calculate clip_position ...
// Pass the UVs through unchanged
out.uv = in.uv;
return out;
}
The textureSample() Function
Now that we have the Texture (data), the Sampler (rules), and the UVs (coordinates), we can finally retrieve a color.
Basic Usage
The primary function for this is textureSample().
@fragment
fn fragment(in: VertexOutput) -> @location(0) vec4<f32> {
// Syntax: textureSample(texture_variable, sampler_variable, coordinates)
let color = textureSample(my_texture, my_sampler, in.uv);
return color;
}
Return Type: textureSample always returns a vec4<f32>.
Even if your texture is a JPEG with no transparency, the alpha channel (.a) will be set to 1.0.
Even if your texture is black and white, you get a vec4 (usually with R=G=B).
Constraint: Fragment Shader Only
There is a critical rule you must remember: textureSample() can only be used in the Fragment Shader.
If you try to use it in a Vertex Shader, your code will fail to compile.
Why? To avoid "shimmering" or "aliasing" on distant objects, GPUs use Mipmaps (smaller versions of the texture). The GPU decides which mipmap level to use based on how fast the UV coordinates are changing from pixel to pixel (derivatives).
In the Fragment Shader, the GPU knows about neighboring pixels and can calculate this.
In the Vertex Shader, vertices are processed in isolation. The GPU has no concept of "neighbors" or "screen density," so it cannot automatically choose a mipmap level.
Note: If you absolutely must read a texture in the vertex shader - e.g., for heightmap displacement - you must use textureSampleLevel(), which forces you to manually specify the mipmap level, usually 0.0.
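The "derivatives" idea above can be sketched on the CPU. Given how far the UVs move between adjacent screen pixels, the mip level is roughly the log2 of the texel footprint. This is a simplified model - real hardware considers both X and Y derivatives and anisotropy:

```rust
/// Simplified mip selection: log2 of how many texels one screen pixel spans.
/// `duv` is the UV change between adjacent screen pixels.
fn mip_level(duv: f32, texture_size: f32) -> f32 {
    (duv * texture_size).log2().max(0.0)
}

fn main() {
    // One texel per screen pixel -> level 0 (full resolution).
    assert!((mip_level(1.0 / 256.0, 256.0) - 0.0).abs() < 1e-6);
    // Four texels per screen pixel -> level 2 (texture is 4x too detailed).
    assert!((mip_level(4.0 / 256.0, 256.0) - 2.0).abs() < 1e-6);
}
```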
Swizzling and Channels
Often, you don't need the full RGBA color. You can use standard vector swizzling to get what you need:
let raw_sample = textureSample(my_texture, my_sampler, in.uv);
// Just the Red channel (useful for grayscale masks)
let roughness = raw_sample.r;
// Just the RGB color (dropping alpha)
let base_color = raw_sample.rgb;
// Reordering channels (Swap Red and Blue)
let bgr_color = raw_sample.bgr;
Texture Filtering: The "Zoom" Problem
Textures are made of discrete pixels (texels). Screens are made of discrete pixels. Rarely do they align perfectly 1:1.
When a texture is displayed larger than its original size (Magnification) or smaller than its original size (Minification), the GPU has to make a decision about how to fill the gaps.
This behavior is controlled by the Filter Mode setting on the Sampler.
1. Nearest Neighbor (Nearest)
This is the simplest method. The GPU simply picks the single texel closest to the UV coordinate.
Look: Blocky, pixelated.
Best For: Pixel art games, Minecraft-style aesthetics, or debugging.
Performance: Extremely fast.
2. Linear Interpolation (Linear)
The GPU takes the 4 closest texels surrounding the UV coordinate and blends them together based on distance (bilinear interpolation).
Look: Smooth, slightly blurry at close range.
Best For: Realistic textures, photos, most standard 3D objects.
Performance: Standard (hardware optimized).
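Bilinear blending is just two nested linear interpolations. A single-channel Rust sketch of what the sampler computes from the 4 surrounding texels:

```rust
/// Bilinear blend of the 4 texels surrounding a sample point.
/// `tx`, `ty` are the fractional distances (0..1) inside the texel cell.
fn bilinear(c00: f32, c10: f32, c01: f32, c11: f32, tx: f32, ty: f32) -> f32 {
    let top = c00 + (c10 - c00) * tx;    // blend along X (top row)
    let bottom = c01 + (c11 - c01) * tx; // blend along X (bottom row)
    top + (bottom - top) * ty            // blend the two rows along Y
}

fn main() {
    // Sampling exactly on a texel returns that texel.
    assert_eq!(bilinear(0.0, 1.0, 0.0, 1.0, 0.0, 0.0), 0.0);
    // Halfway between a black and a white texel: mid-gray.
    assert_eq!(bilinear(0.0, 1.0, 0.0, 1.0, 0.5, 0.0), 0.5);
    // Center of the 2x2 cell: average of all four texels.
    assert_eq!(bilinear(0.0, 1.0, 0.5, 0.5, 0.5, 0.5), 0.5);
}
```

Nearest-neighbor filtering, by contrast, skips the blending entirely and returns `c00` (the closest texel) unmodified - which is exactly why it looks blocky up close.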
Mipmaps and Minification
When a texture is far away (minification), sampling just one pixel causes "noise" or "shimmering" because you might hit a bright pixel on one frame and a dark pixel on the next, even if the camera moves slightly.
To solve this, we use Mipmaps: a chain of progressively smaller versions of the image (100%, 50%, 25%...).
- mipmap_filter: Nearest: Switches abruptly between detail levels. You can see a visible "line" where the quality drops.
- mipmap_filter: Linear: Blends between the two nearest mipmap levels (trilinear filtering). This is the gold standard for smooth rendering.
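The mip chain itself is easy to reason about: each level halves the previous one until you reach 1x1. A Rust sketch of the level math for a square texture:

```rust
/// Number of mip levels for a square texture: floor(log2(size)) + 1.
fn mip_count(size: u32) -> u32 {
    32 - size.leading_zeros()
}

/// Edge length of a given mip level (each level halves the previous one).
fn mip_size(size: u32, level: u32) -> u32 {
    (size >> level).max(1)
}

fn main() {
    // A 256x256 texture has 9 levels: 256, 128, 64, ..., 2, 1.
    assert_eq!(mip_count(256), 9);
    assert_eq!(mip_size(256, 0), 256);
    assert_eq!(mip_size(256, 4), 16);
    assert_eq!(mip_size(256, 8), 1);
}
```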
Address Modes: The "Edge" Problem
UV coordinates are typically 0.0 to 1.0. But what happens if we pass 2.5 or -0.1? The Address Mode (or Wrap Mode) determines this behavior.
1. Repeat
The texture tiles infinitely. 1.1 behaves exactly like 0.1.
Use Case: Brick walls, grass, tile floors.
2. ClampToEdge
Any value greater than 1.0 reads the last pixel on the edge. Any value less than 0.0 reads the first pixel.
Use Case: UI elements, skyboxes (to prevent seams), or any object that shouldn't tile.
3. MirrorRepeat
The texture tiles, but flips direction every time (0-1, then 1-0, then 0-1).
Use Case: Creating seamless patterns from non-seamless images, or weird psychedelic effects.
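All three address modes are pure coordinate transforms applied before the lookup. A Rust sketch of each (one axis; the GPU applies the same logic to U and V independently):

```rust
/// Repeat: keep only the fractional part. 1.1 -> 0.1, -0.1 -> 0.9.
fn repeat(u: f32) -> f32 {
    u - u.floor()
}

/// ClampToEdge: coordinates outside 0..1 stick to the edge texel.
fn clamp_to_edge(u: f32) -> f32 {
    u.clamp(0.0, 1.0)
}

/// MirrorRepeat: period of 2 - forward over 0..1, then backward over 1..2.
fn mirror_repeat(u: f32) -> f32 {
    let t = repeat(u * 0.5) * 2.0;
    if t > 1.0 { 2.0 - t } else { t }
}

fn main() {
    assert!((repeat(1.1) - 0.1).abs() < 1e-6);
    assert!((repeat(-0.1) - 0.9).abs() < 1e-6);
    assert_eq!(clamp_to_edge(2.5), 1.0);
    assert_eq!(clamp_to_edge(-0.1), 0.0);
    assert!((mirror_repeat(1.3) - 0.7).abs() < 1e-6); // flipped on the way back
    assert!((mirror_repeat(2.3) - 0.3).abs() < 1e-6); // forward again
}
```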
Bevy Integration
In Bevy, textures are loaded as Image assets. The Image struct holds both the pixel data and the sampler configuration.
Configuring Samplers in Rust
By default, Bevy loads images with Linear filtering and ClampToEdge address modes. If you want "pixel perfect" rendering or tiling, you need to modify the image asset.
use bevy::image::{ImageSampler, ImageSamplerDescriptor, ImageFilterMode, ImageAddressMode};
fn configure_texture(
mut images: ResMut<Assets<Image>>,
my_handle: Res<TextureHandle>, // Hypothetical newtype: struct TextureHandle(pub Handle<Image>)
) {
if let Some(image) = images.get_mut(&my_handle.0) {
image.sampler = ImageSampler::Descriptor(ImageSamplerDescriptor {
// "Pixel Art" settings:
mag_filter: ImageFilterMode::Nearest, // Sharp pixels when close
min_filter: ImageFilterMode::Linear, // Smooth when far away
// Tiling settings:
address_mode_u: ImageAddressMode::Repeat,
address_mode_v: ImageAddressMode::Repeat,
..default()
});
}
}
Anisotropic Filtering
There is one more advanced sampler setting: Anisotropy.
When you look at a floor texture at a sharp, grazing angle, standard linear filtering makes it look blurry because the "footprint" of the pixel is long and thin, but linear filtering assumes a square.
Anisotropic Filtering solves this by taking multiple samples along the slope.
image.sampler = ImageSampler::Descriptor(ImageSamplerDescriptor {
// 16x Anisotropy (High Quality)
anisotropy_clamp: Some(16),
..default()
});
This is significantly more expensive but essential for ground planes and roads in first-person games.
Complete Example: Textured Quad with Custom UVs
We are going to build a demo that demystifies UVs. Instead of using a standard cube, we will use a flat quad (a Plane3d mesh) so the UV layout is easy to see. We will then manipulate these UVs in the shader to scroll, zoom, and tile a texture.
Our Goal
Load an external texture (or generate one).
Display it on a custom mesh.
Use uniforms to control Tiling (Zoom) and Offset (Pan) in real-time.
The Shader (assets/shaders/d04_01_textured_quad.wgsl)
#import bevy_pbr::mesh_functions
#import bevy_pbr::view_transformations::position_world_to_clip
struct TexturedMaterial {
base_color: vec4<f32>,
uv_scale: vec2<f32>,
uv_offset: vec2<f32>,
}
@group(2) @binding(0)
var<uniform> material: TexturedMaterial;
@group(2) @binding(1)
var base_texture: texture_2d<f32>;
@group(2) @binding(2)
var base_sampler: sampler;
struct VertexInput {
@builtin(instance_index) instance_index: u32,
@location(0) position: vec3<f32>,
@location(2) uv: vec2<f32>,
}
struct VertexOutput {
@builtin(position) clip_position: vec4<f32>,
@location(0) uv: vec2<f32>,
}
@vertex
fn vertex(in: VertexInput) -> VertexOutput {
var out: VertexOutput;
let world_from_local = mesh_functions::get_world_from_local(in.instance_index);
let world_position = mesh_functions::mesh_position_local_to_world(
world_from_local,
vec4<f32>(in.position, 1.0)
);
out.clip_position = position_world_to_clip(world_position.xyz);
// UV Manipulation
// 1. Scale (Tiling)
var transformed_uv = in.uv * material.uv_scale;
// 2. Offset (Panning)
transformed_uv = transformed_uv + material.uv_offset;
out.uv = transformed_uv;
return out;
}
@fragment
fn fragment(in: VertexOutput) -> @location(0) vec4<f32> {
let texture_color = textureSample(base_texture, base_sampler, in.uv);
return texture_color * material.base_color;
}
The Rust Material (src/materials/d04_01_textured_quad.rs)
use bevy::prelude::*;
use bevy::render::render_resource::{AsBindGroup, ShaderRef};
#[derive(Asset, TypePath, AsBindGroup, Debug, Clone)]
pub struct TexturedMaterial {
#[uniform(0)]
pub base_color: LinearRgba,
#[uniform(0)]
pub uv_scale: Vec2,
#[uniform(0)]
pub uv_offset: Vec2,
#[texture(1)]
#[sampler(2)]
pub base_texture: Handle<Image>,
}
impl Default for TexturedMaterial {
fn default() -> Self {
Self {
base_color: LinearRgba::WHITE,
uv_scale: Vec2::ONE,
uv_offset: Vec2::ZERO,
base_texture: Handle::default(),
}
}
}
impl Material for TexturedMaterial {
fn vertex_shader() -> ShaderRef {
"shaders/d04_01_textured_quad.wgsl".into()
}
fn fragment_shader() -> ShaderRef {
"shaders/d04_01_textured_quad.wgsl".into()
}
}
Don't forget to register it in src/materials/mod.rs:
pub mod d04_01_textured_quad;
The Demo Module (src/demos/d04_01_textured_quad.rs)
We'll procedurally generate a UV-test texture - a red/green gradient with a blue checkerboard overlay - so the left quad needs no external assets, and load the Bevy icon for the right quad. We also configure the samplers to use Repeat mode so we can scroll the textures infinitely.
use bevy::image::{ImageAddressMode, ImageFilterMode, ImageSampler, ImageSamplerDescriptor};
use bevy::prelude::*;
use bevy::render::render_asset::RenderAssetUsages;
use bevy::render::render_resource::{Extent3d, TextureDimension, TextureFormat};
use crate::materials::d04_01_textured_quad::TexturedMaterial;
pub fn run() {
App::new()
.add_plugins(DefaultPlugins)
.add_plugins(MaterialPlugin::<TexturedMaterial>::default())
.add_systems(Startup, setup)
.add_systems(
Update,
(handle_input, rotate_quads, update_ui, sync_loaded_icon),
)
.run();
}
#[derive(Component)]
struct Rotator;
#[derive(Resource)]
struct DemoState {
procedural_handle: Handle<Image>,
icon_handle: Handle<Image>,
// Settings
filter_mode: ImageFilterMode,
address_mode: ImageAddressMode,
// Rotation
auto_rotate: bool,
manual_rotation: f32,
// State tracking
icon_configured: bool,
}
fn setup(
mut commands: Commands,
mut meshes: ResMut<Assets<Mesh>>,
mut materials: ResMut<Assets<TexturedMaterial>>,
mut images: ResMut<Assets<Image>>,
asset_server: Res<AssetServer>,
) {
// 1. Initial Sampler Settings
let initial_descriptor = ImageSamplerDescriptor {
mag_filter: ImageFilterMode::Nearest,
min_filter: ImageFilterMode::Nearest,
address_mode_u: ImageAddressMode::Repeat,
address_mode_v: ImageAddressMode::Repeat,
..default()
};
// 2. Create Procedural "UV Test" Texture
// We create a colored gradient so address modes are obvious.
// Red = X axis, Green = Y axis.
let size = 256;
let mut data = Vec::with_capacity((size * size * 4) as usize);
for y in 0..size {
for x in 0..size {
let r = x as u8; // 0 to 255 horizontally
let g = y as u8; // 0 to 255 vertically
// Blue checkerboard overlay for detail
let b = if ((x / 16) + (y / 16)) % 2 == 0 {
255
} else {
0
};
data.extend_from_slice(&[r, g, b, 255]);
}
}
let mut proc_image = Image::new(
Extent3d {
width: size,
height: size,
depth_or_array_layers: 1,
},
TextureDimension::D2,
data,
TextureFormat::Rgba8Unorm,
RenderAssetUsages::default(),
);
// Apply initial sampler
proc_image.sampler = ImageSampler::Descriptor(initial_descriptor.clone());
let proc_handle = images.add(proc_image);
// 3. Load Icon (Async)
let icon_handle = asset_server.load("textures/bevy_icon.png");
// 4. Save State
commands.insert_resource(DemoState {
procedural_handle: proc_handle.clone(),
icon_handle: icon_handle.clone(),
filter_mode: ImageFilterMode::Nearest,
address_mode: ImageAddressMode::Repeat,
auto_rotate: true,
manual_rotation: 0.0,
icon_configured: false,
});
// 5. Scene Setup
let quad_handle = meshes.add(Plane3d::default().mesh().size(3.0, 3.0));
// Left Quad: Procedural
commands.spawn((
Mesh3d(quad_handle.clone()),
MeshMaterial3d(materials.add(TexturedMaterial {
base_texture: proc_handle,
..default()
})),
Transform::from_xyz(-2.0, 0.0, 0.0),
Rotator,
));
// Right Quad: Loaded Icon
commands.spawn((
Mesh3d(quad_handle),
MeshMaterial3d(materials.add(TexturedMaterial {
base_texture: icon_handle,
..default()
})),
Transform::from_xyz(2.0, 0.0, 0.0),
Rotator,
));
// Camera
commands.spawn((
Camera3d::default(),
Transform::from_xyz(0.0, 4.0, 6.0).looking_at(Vec3::ZERO, Vec3::Y),
));
// UI
commands.spawn((
Text::new(""), // Updated in update_ui
Node {
position_type: PositionType::Absolute,
top: Val::Px(12.0),
left: Val::Px(12.0),
..default()
},
));
}
fn handle_input(
time: Res<Time>,
keyboard: Res<ButtonInput<KeyCode>>,
mut materials: ResMut<Assets<TexturedMaterial>>,
mut images: ResMut<Assets<Image>>,
mut state: ResMut<DemoState>,
) {
let dt = time.delta_secs();
// 1. UV Manipulation
for (_, mat) in materials.iter_mut() {
if keyboard.pressed(KeyCode::ArrowLeft) {
mat.uv_offset.x -= dt;
}
if keyboard.pressed(KeyCode::ArrowRight) {
mat.uv_offset.x += dt;
}
if keyboard.pressed(KeyCode::ArrowUp) {
mat.uv_offset.y -= dt;
}
if keyboard.pressed(KeyCode::ArrowDown) {
mat.uv_offset.y += dt;
}
if keyboard.pressed(KeyCode::KeyW) {
mat.uv_scale += dt;
}
if keyboard.pressed(KeyCode::KeyS) {
mat.uv_scale = (mat.uv_scale - dt).max(Vec2::splat(0.1));
}
}
// 2. Rotation Controls
if keyboard.just_pressed(KeyCode::KeyR) {
state.auto_rotate = !state.auto_rotate;
}
if !state.auto_rotate {
if keyboard.pressed(KeyCode::KeyA) {
state.manual_rotation -= dt;
}
if keyboard.pressed(KeyCode::KeyD) {
state.manual_rotation += dt;
}
}
// 3. Sampler Switching
let mut update_samplers = false;
if keyboard.just_pressed(KeyCode::Digit1) {
state.filter_mode = match state.filter_mode {
ImageFilterMode::Nearest => ImageFilterMode::Linear,
_ => ImageFilterMode::Nearest,
};
update_samplers = true;
}
if keyboard.just_pressed(KeyCode::Digit2) {
state.address_mode = match state.address_mode {
ImageAddressMode::Repeat => ImageAddressMode::MirrorRepeat,
ImageAddressMode::MirrorRepeat => ImageAddressMode::ClampToEdge,
ImageAddressMode::ClampToEdge => ImageAddressMode::Repeat,
_ => ImageAddressMode::Repeat,
};
update_samplers = true;
}
if update_samplers {
update_all_images(&mut images, &mut state);
}
}
// Helper to apply current settings to all valid images
fn update_all_images(images: &mut Assets<Image>, state: &mut DemoState) {
let descriptor = ImageSamplerDescriptor {
mag_filter: state.filter_mode,
min_filter: state.filter_mode,
address_mode_u: state.address_mode,
address_mode_v: state.address_mode,
..default()
};
// Update procedural
if let Some(image) = images.get_mut(&state.procedural_handle) {
image.sampler = ImageSampler::Descriptor(descriptor.clone());
}
// Update loaded icon (if loaded)
if let Some(image) = images.get_mut(&state.icon_handle) {
image.sampler = ImageSampler::Descriptor(descriptor);
state.icon_configured = true;
}
}
// Automatically configures the icon once it finishes loading
fn sync_loaded_icon(mut images: ResMut<Assets<Image>>, mut state: ResMut<DemoState>) {
// If the icon is loaded but hasn't been configured yet
if !state.icon_configured && images.contains(&state.icon_handle) {
update_all_images(&mut images, &mut state);
}
}
fn rotate_quads(
time: Res<Time>,
mut state: ResMut<DemoState>,
mut query: Query<&mut Transform, With<Rotator>>,
) {
if state.auto_rotate {
state.manual_rotation += time.delta_secs() * 0.2;
}
for mut transform in &mut query {
transform.rotation =
Quat::from_rotation_y(state.manual_rotation) * Quat::from_rotation_x(0.5);
}
}
fn update_ui(state: Res<DemoState>, mut query: Query<&mut Text>) {
let filter_text = match state.filter_mode {
ImageFilterMode::Nearest => "Nearest (Pixelated)",
_ => "Linear (Smooth)",
};
let address_text = match state.address_mode {
ImageAddressMode::Repeat => "Repeat (Tile)",
ImageAddressMode::MirrorRepeat => "MirrorRepeat (Flip)",
_ => "ClampToEdge (Stretch)",
};
let rotate_text = if state.auto_rotate {
"Auto"
} else {
"Manual (A/D)"
};
for mut text in &mut query {
**text = format!(
"CONTROLS:\n\
[Arrows] Pan UVs\n\
[W/S] Zoom UVs\n\
[1] Filter: {}\n\
[2] Address: {}\n\
[R] Rotation: {}",
filter_text, address_text, rotate_text
);
}
}
Don't forget to add it to src/demos/mod.rs:
pub mod d04_01_textured_quad;
And register it in src/main.rs:
Demo {
number: "4.1",
title: "Texture Sampling Basics",
run: demos::d04_01_textured_quad::run,
},
Running the Demo
When you run this demo, you will see two textured quads spinning slowly in the void: a procedural UV-test pattern on the left and the Bevy icon on the right.
Controls
| Key | Action | Description |
|---|---|---|
| Arrow Keys | Pan | Modifies uv_offset. The texture slides across the surface. Because we set AddressMode::Repeat, it never ends. |
| W / S | Zoom | Modifies uv_scale. Increasing the scale makes the tiles smaller (more repetitions); decreasing it zooms in. |
| 1 | Filter | Toggles the sampler between Nearest (pixelated) and Linear (smooth). |
| 2 | Address mode | Cycles Repeat → MirrorRepeat → ClampToEdge. |
| R | Rotation | Toggles auto-rotation. In manual mode, A/D rotate the quads. |
What You're Seeing
Sampler Power: Even though our procedural texture is tiny (256x256), it looks sharp because of mag_filter: Nearest. If we changed it to Linear, the edges of the checks would be blurry.
Addressing: As you scroll with the arrow keys, notice how the pattern repeats seamlessly. This is the ImageAddressMode::Repeat setting doing the heavy lifting.
Vertex Shader Efficiency: We are doing the scrolling math in the vertex shader. Since our plane has only 4 vertices, we only run those additions/multiplications 4 times per frame! The GPU interpolates the result for the thousands of pixels in between.
Key Takeaways
Textures ≠ Samplers: A texture is raw data. A sampler is the set of rules for reading it. They are separate resources in WGSL (texture_2d vs sampler).
Fragment Only: You cannot use textureSample() in a vertex shader. You must calculate UVs in the vertex shader and pass them to the fragment shader.
Address Modes Matter: If you want a texture to tile, you must configure the sampler to Repeat.
UV Math: Tiling is multiplication (uv * 2.0). Scrolling is addition (uv + offset).
What's Next?
We've mastered the single texture. But real materials are rarely just one image. They are composites - a blend of base colors, detail maps, dirt layers, and decals.
In the next article, we will learn how to combine multiple textures, use grayscale textures as masks, and layer effects together.
Next up: 4.2 - Texture Filtering and Mipmapping
Quick Reference
WGSL Texture Declaration:
@group(2) @binding(0) var my_tex: texture_2d<f32>;
@group(2) @binding(1) var my_samp: sampler;
Sampling:
let color = textureSample(my_tex, my_samp, uv);
Common UV Math:
let tiled_uv = uv * 5.0; // Repeat 5 times
let scrolled_uv = uv + time; // Move diagonally
let centered_uv = uv - 0.5; // Move origin to center





