Slide 1: Advanced Computer Graphics - Antialiasing
David Luebke
cs551dl@cs.virginia.edu
http://www.cs.virginia.edu/~cs551dl
Slide 2: Administrivia
- Assignment 1 sample scenes
Slide 3: Recap
- Prefiltering
  - Before sampling the image, use a low-pass filter to eliminate frequencies above the Nyquist limit
  - This blurs the image...
  - But it ensures that no high frequencies will be misrepresented as low frequencies (sketch below)
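A minimal 1D sketch of the idea, assuming the "continuous" signal is available as a densely sampled array standing in for the continuous image, and using a truncated Gaussian as the low-pass kernel. The function name and parameters are illustrative, not from the slides.

```cpp
// Low-pass filter a densely sampled signal with a truncated Gaussian, then
// sample it at the coarser target rate. For real prefiltering, sigma should be
// on the order of the dense-to-target downsampling ratio.
#include <cmath>
#include <vector>

std::vector<double> prefilterAndSample(const std::vector<double>& dense,
                                       int targetSamples, double sigma) {
    int n = static_cast<int>(dense.size());
    int radius = static_cast<int>(std::ceil(3.0 * sigma));   // truncate kernel at 3*sigma
    std::vector<double> out(targetSamples);
    for (int i = 0; i < targetSamples; ++i) {
        int center = i * n / targetSamples;    // position of this output sample in the dense signal
        double sum = 0.0, weight = 0.0;
        for (int k = -radius; k <= radius; ++k) {
            int j = center + k;
            if (j < 0 || j >= n) continue;
            double w = std::exp(-(k * k) / (2.0 * sigma * sigma));  // Gaussian weight
            sum += w * dense[j];
            weight += w;
        }
        out[i] = sum / weight;                 // normalized low-pass (blurred) value at the sample
    }
    return out;
}
```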
Slide 4: Recap
- Supersampling
  - Sample the image at a higher resolution than the final image, then "average down"
  - "Average down" means multiplying by a low-pass function in the frequency domain
  - Which means convolving by that function's FT in the space domain
  - Which equates to a weighted average of nearby samples at each pixel (see the sketch below)
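A minimal sketch of "averaging down" with a box filter, the simplest choice of kernel; a Gaussian or other weighted kernel would give the more general weighted average the slide describes. The buffer layout and names are illustrative assumptions.

```cpp
// Average an SxS block of subsamples down to one final pixel (box filter).
#include <vector>

std::vector<float> averageDown(const std::vector<float>& super, // grayscale, row-major supersampled buffer
                               int superW, int superH, int s) { // s = subsamples per pixel side
    int w = superW / s, h = superH / s;
    std::vector<float> image(w * h);
    for (int y = 0; y < h; ++y)
        for (int x = 0; x < w; ++x) {
            float sum = 0.0f;
            for (int sy = 0; sy < s; ++sy)        // accumulate the SxS subsample block
                for (int sx = 0; sx < s; ++sx)
                    sum += super[(y * s + sy) * superW + (x * s + sx)];
            image[y * w + x] = sum / (s * s);     // box-filter weight = 1/(s*s) per subsample
        }
    return image;
}
```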
Slide 5: Recap
- Supersampling cons
  - Doesn't eliminate aliasing, just shifts the Nyquist limit higher
    - Can't fix some scenes (e.g., the checkerboard)
  - Badly inflates storage requirements
- Supersampling pros
  - Relatively easy
  - Often works all right in practice
  - Can be added to a standard renderer
Slide 6: Antialiasing in the Continuous Domain
- Problem with prefiltering:
  - Sampling and image generation are inextricably linked in most renderers
    - Z-buffer algorithm
    - Ray tracing
  - Why?
- Still, some approaches try to approximate the effect of convolution in the continuous domain
Slide 7: Antialiasing in the Continuous Domain
- [Figure: pixel grid, polygons, and filter kernel]
Slide 8: Antialiasing in the Continuous Domain
- The good news
  - Exact polygon coverage of the filter kernel can be evaluated
  - What does this entail?
    - Clipping
    - Hidden surface determination
Slide 9: Antialiasing in the Continuous Domain
- The bad news
  - Evaluating coverage is very expensive
  - The intensity variation is too complex to integrate over the area of the filter
    - Q: Why does intensity make it harder?
    - A: Because polygons might not be flat-shaded
    - Q: How bad a problem is this?
    - A: Intensity varies slowly within a pixel, so shape changes are more important
Slide 10: Catmull's Algorithm
- [Figure: polygons A and B clipped to a pixel, yielding fragments with areas A1, A2, A3]
- Find fragment areas
- Multiply by fragment colors
- Sum for final pixel color (sketch below)
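A minimal sketch of the final summation step, assuming clipping and hidden-surface resolution have already produced the visible fragments and their areas (the expensive parts, not shown). The Fragment struct and field names are illustrative assumptions.

```cpp
// Each visible fragment contributes its color weighted by the fraction of the
// pixel it covers; the weighted sum is the final pixel color.
#include <vector>

struct Color { float r, g, b; };

struct Fragment {
    Color color;   // fragment's shade
    float area;    // visible area inside the pixel, as a fraction of the pixel area
};

Color shadePixel(const std::vector<Fragment>& fragments) {
    Color out{0.0f, 0.0f, 0.0f};
    for (const Fragment& f : fragments) {   // sum of area-weighted colors
        out.r += f.color.r * f.area;
        out.g += f.color.g * f.area;
        out.b += f.color.b * f.area;
    }
    return out;   // visible areas should sum to 1 for a fully covered pixel
}
```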
Slide 11: Catmull's Algorithm
- First real attempt to filter in the continuous domain
- Very expensive
  - Clipping polygons to fragments
  - Sorting polygon fragments by depth (what's wrong with this as a hidden-surface algorithm?)
- Equates to a box filter (is that good?)
Slide 12: The A-Buffer
- Idea: approximate continuous filtering by subpixel sampling
- Summing areas now becomes simple
Slide 13: The A-Buffer
- Advantages:
  - Incorporating it into a scanline renderer reduces storage costs dramatically
  - Processing per pixel depends only on the number of visible fragments
  - Can be implemented efficiently using bitwise logical ops on subpixel masks (see the sketch below)
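A minimal sketch of the subpixel-mask idea, assuming a 4x4 subpixel grid packed into 16 bits; the mask resolution and function names are illustrative and not taken from the A-buffer paper.

```cpp
// Coverage masks combined with bitwise ops; area = fraction of set bits.
#include <bitset>
#include <cstdint>

using SubpixelMask = std::uint16_t;   // 4x4 grid of subpixel samples, one bit each

// Bit for subpixel (sx, sy), with 0 <= sx, sy < 4.
inline SubpixelMask subpixelBit(int sx, int sy) {
    return static_cast<SubpixelMask>(1u << (sy * 4 + sx));
}

// Fraction of the pixel covered by a fragment's mask.
inline float coverage(SubpixelMask m) {
    return std::bitset<16>(m).count() / 16.0f;
}

// Visible part of a new fragment, given the mask already covered by nearer fragments.
inline SubpixelMask visiblePart(SubpixelMask fragment, SubpixelMask alreadyCovered) {
    return fragment & static_cast<SubpixelMask>(~alreadyCovered);
}
```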
Slide 14: The A-Buffer
- Disadvantages
  - Still basically a supersampling algorithm
  - Not a hardware-friendly algorithm
    - Lists of potentially visible polygons can grow without limit
    - Per-pixel work is non-deterministic
Slide 15: The A-Buffer
- Comments
  - The book claims this is the most common algorithm for high-quality rendering
  - I'm not so sure anymore
  - The book gives much gory detail
  - I won't test you on it
Slide 16: Stochastic Sampling
- Sampling theory tells us that with a regular sampling grid, frequencies higher than the Nyquist limit will alias
- Q: What about irregular sampling?
- A: High frequencies appear as noise, not aliases
- This turns out to bother our visual system less!
Slide 17: Stochastic Sampling
- An intuitive argument:
  - In stochastic sampling, every region of the image has a finite probability of being sampled
  - Thus small features that fall between uniform sample points tend to be detected by the non-uniform samples
Slide 18: Stochastic Sampling
- Integrating with different renderers:
  - Ray tracing: easy
    - It is just as easy to fire a ray in one direction as another (see the sketch below)
  - Z-buffer: hard, but possible
    - Notable example: the REYES system (?)
    - Image jittering is easier (more later)
  - A-buffer: nope
    - Totally built around the square pixel filter and primitive-to-sample coherence
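A minimal sketch of why this is easy for a ray tracer: a primary ray through a jittered point inside a pixel is built exactly like the ray through the pixel center, just with a different offset. The simple pinhole camera (origin at the eye, image plane at z = -1, 90-degree vertical field of view) is an assumption for illustration, not part of the slides.

```cpp
// Direction of the primary ray through pixel (px, py) at subpixel offset (jx, jy).
#include <cmath>

struct Vec3 { double x, y, z; };

Vec3 primaryRayDirection(int px, int py, int width, int height,
                         double jx, double jy) {            // jx, jy in [0,1)
    double aspect = static_cast<double>(width) / height;
    double u = ((px + jx) / width * 2.0 - 1.0) * aspect;    // image-plane x
    double v = 1.0 - (py + jy) / height * 2.0;              // image-plane y, top = +1
    double len = std::sqrt(u * u + v * v + 1.0);
    return {u / len, v / len, -1.0 / len};                  // normalized direction toward z = -1
}

// Usage: pass jx = jy = 0.5 for the pixel center, or uniform random offsets
// (e.g. from std::uniform_real_distribution) for stochastic sampling.
```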
Slide 19: Stochastic Sampling
- Idea: randomizing the distribution of samples scatters aliases into noise
- Problem: what type of random distribution to adopt?
- Reason: the type of randomness used affects the spectral characteristics of the noise into which high frequencies are converted
Slide 20: Stochastic Sampling
- Problem: given a pixel, how do we distribute points (samples) within it?
Slide 21: Stochastic Sampling
- Poisson distribution:
  - Completely random
  - Add points at random until the area is full
  - Uniform distribution: some neighboring samples close together, some distant
Slide 22: Stochastic Sampling
- Poisson disc distribution:
  - Poisson distribution, with a minimum-distance constraint between samples
  - Add points at random, removing them again if they are too close to any previous point (sketch below)
  - Very even-looking distribution
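A minimal sketch of the dart-throwing approach described above, over the unit square. The stopping rule (give up after a run of rejected candidates) and parameter names are illustrative assumptions; the naive O(n^2) distance test is kept for clarity.

```cpp
// Poisson disc sampling by dart throwing: accept random candidates that are at
// least minDist away from every previously accepted point.
#include <cmath>
#include <random>
#include <vector>

struct Point { double x, y; };

std::vector<Point> poissonDisc(double minDist, int maxAttempts, unsigned seed) {
    std::mt19937 rng(seed);
    std::uniform_real_distribution<double> uni(0.0, 1.0);
    std::vector<Point> points;
    int failures = 0;
    while (failures < maxAttempts) {
        Point p{uni(rng), uni(rng)};               // candidate point in the unit square
        bool tooClose = false;
        for (const Point& q : points) {            // minimum-distance test
            double dx = p.x - q.x, dy = p.y - q.y;
            if (std::sqrt(dx * dx + dy * dy) < minDist) { tooClose = true; break; }
        }
        if (tooClose) { ++failures; }              // reject and count the failure
        else { points.push_back(p); failures = 0; }
    }
    return points;
}
```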
Slide 23: Stochastic Sampling
- Jittered distribution
  - Start with a regular grid of samples
  - Perturb each sample slightly in a random direction (see the sketch below)
  - More "clumpy" or granular in appearance
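A minimal sketch of jittering, assuming an n x n grid of samples over a unit pixel, with each sample placed uniformly at random within its grid cell; names are illustrative.

```cpp
// One jittered sample per cell of an n x n grid over the unit pixel.
#include <random>
#include <vector>

struct Sample { double x, y; };

std::vector<Sample> jitteredSamples(int n, unsigned seed) {
    std::mt19937 rng(seed);
    std::uniform_real_distribution<double> uni(0.0, 1.0);
    std::vector<Sample> samples;
    samples.reserve(n * n);
    double cell = 1.0 / n;                          // width of one grid cell
    for (int j = 0; j < n; ++j)
        for (int i = 0; i < n; ++i)
            // Regular grid position plus a random offset within the cell
            samples.push_back({(i + uni(rng)) * cell, (j + uni(rng)) * cell});
    return samples;
}
```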
Slide 24: Stochastic Sampling
- Spectral characteristics of these distributions (see the sketch below):
  - Poisson: completely uniform (white noise); high and low frequencies equally present
  - Poisson disc: a pulse at the origin (the DC component of the image), surrounded by an empty ring (no low frequencies), surrounded by white noise
  - Jitter: approximates the Poisson disc spectrum, but with a smaller empty disc
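These spectra can be checked numerically with a periodogram, P(f) = |sum_k exp(-2*pi*i f.x_k)|^2 / N, averaged over many random point sets and plotted over frequency. A minimal sketch evaluating it at a single frequency follows; the names and unit-square domain are illustrative assumptions.

```cpp
// Periodogram estimate of the power spectrum of a point set at frequency (fx, fy).
#include <cmath>
#include <complex>
#include <vector>

struct Point2 { double x, y; };

double spectralPower(const std::vector<Point2>& pts, double fx, double fy) {
    const double twoPi = 6.283185307179586;
    std::complex<double> sum(0.0, 0.0);
    for (const Point2& p : pts) {
        double phase = -twoPi * (fx * p.x + fy * p.y);    // f . x_k
        sum += std::complex<double>(std::cos(phase), std::sin(phase));
    }
    return std::norm(sum) / pts.size();                   // |Fourier sum|^2 / N
}
```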
Slide 25: Stochastic Sampling
- Watt & Watt, p. 134