Shawn Hargreaves Blog
This is almost too obvious to be worth pointing out, but no amount of clever tricks like bilinear filtering and mipmapping will help if your input data is itself already aliased!
Yet the example texture used in my previous post had nasty pixelization along the diagonal black lines:
I can improve this just by editing the source image in a paint program:
Compare these two versions at original size:
When a sprite is drawn without any resampling (at an integer pixel location, with no scaling or rotation), the resulting image is an exact copy of the source data, so it is obviously important that this source data contain as little aliasing as possible. Perhaps less obvious is that manually antialiasing my source data also helps when the texture is resampled. In mathematical terms, by smoothing the diagonal lines I have reduced the amount of high frequency information in my source signal, thus gaining more headroom to resample this signal before I run into the Nyquist threshold and encounter aliasing.
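To make that Nyquist point concrete, here is a tiny 1D sketch (illustrative Python, not code from this post): a stripe pattern at the highest frequency the source can hold collapses into a solid color when point-sampled at half the rate, while prefiltering it first yields the correct average gray.

```python
# Hypothetical 1D illustration of aliasing vs. prefiltering.

def point_sample(signal, step):
    """Naive downsample: keep every `step`-th sample (like nearest-neighbor)."""
    return signal[::step]

def box_prefilter(signal, width):
    """Average `width` neighboring samples per output sample (a crude blur)."""
    return [sum(signal[i:i + width]) / width
            for i in range(0, len(signal) - width + 1, width)]

# A black/white stripe pattern at the source's maximum frequency.
stripes = [0, 255] * 16

# Point-sampling every 2nd texel happens to miss every white stripe:
aliased = point_sample(stripes, 2)    # -> [0, 0, 0, ..., 0] (solid black!)

# Averaging pairs first removes the high frequency, giving the right gray:
filtered = box_prefilter(stripes, 2)  # -> [127.5, 127.5, ...]
```

Smoothing the source shifts its energy to lower frequencies, which is exactly why the hand-antialiased texture survives resampling better.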
Simple moral of this story: high quality antialiased textures are better than crappy ones with aliasing problems :-)
Alternative moral for those who find themselves suspicious of oversimplification: if you find yourself with the occasional texture that just won't stop aliasing no matter what you do, consider applying a blur to the relevant source images. Blurring removes high frequency data, which shifts the Nyquist threshold, which may be enough to fix the aliasing. This can obviously be taken too far (it's no good if we fix aliasing by making everything blurry!) but the occasional subtle blur, judiciously applied, can be a valuable weapon in the antialiasing arsenal.
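As a sketch of the kind of blur being suggested (a simple 3-tap box filter written in Python for illustration; any paint-program blur does the same job, usually with a nicer Gaussian kernel):

```python
# Illustrative 3-tap box blur, not code from the post.

def box_blur_row(row):
    """Blur one row of grayscale pixels, clamping at the edges."""
    blurred = []
    for i in range(len(row)):
        left = row[max(i - 1, 0)]
        right = row[min(i + 1, len(row) - 1)]
        blurred.append((left + row[i] + right) / 3)
    return blurred

# A hard black-to-white edge...
edge = [0, 0, 0, 255, 255, 255]
# ...gains an intermediate ramp, i.e. its high frequencies are attenuated:
print(box_blur_row(edge))  # [0.0, 0.0, 85.0, 170.0, 255.0, 255.0]
```

The step becomes a ramp, which is precisely the "headroom" the previous paragraph talks about: a signal with less high frequency content can be resampled further before it aliases.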
Hmm very interesting. I ended up using this blurring approach years ago, but it was always a manual process - the user had to pick the right amount of blurring to suit the situation. It depended on the contents of the source texture, the size and angle of the object being texture mapped, and the resolution of the rendering. Blurring to make it look good on the screen meant it looked too blurry when rendering at a resolution suitable for print, and most users didn't understand or care about this - nor should they have to.
Do you think it'd be possible to automatically calculate a good level of blurring for a given texture and a particular usage of that texture? Perhaps based on some frequency analysis of the texture.
> Do you think it'd be possible to automatically calculate a good level of blurring for a given texture and a particular usage of that texture?
That's basically what mipmapping does.
Manual blurring like I talk about here is a not-so-often-needed solution for the rare cases where mipmapping doesn't entirely work right (usually because subsequent shader math introduces higher frequencies beyond those in the original texture signal).
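For reference, the automatic prefiltering that mipmapping provides amounts to building a chain of half-size images, conventionally by averaging each 2x2 texel block; a minimal sketch (illustrative Python, not renderer code):

```python
# Hypothetical sketch of mip chain generation by 2x2 box filtering.

def next_mip_level(image):
    """Halve a square grayscale image by averaging each 2x2 texel block."""
    size = len(image) // 2
    return [[(image[2*y][2*x] + image[2*y][2*x+1] +
              image[2*y+1][2*x] + image[2*y+1][2*x+1]) / 4
             for x in range(size)]
            for y in range(size)]

def build_mip_chain(image):
    """Full chain down to 1x1; the renderer picks a level per screen pixel."""
    chain = [image]
    while len(chain[-1]) > 1:
        chain.append(next_mip_level(chain[-1]))
    return chain

# A 4x4 checkerboard collapses toward its average gray as levels shrink:
checker = [[0, 255, 0, 255],
           [255, 0, 255, 0],
           [0, 255, 0, 255],
           [255, 0, 255, 0]]
chain = build_mip_chain(checker)
# chain[1] == [[127.5, 127.5], [127.5, 127.5]]; chain[2] == [[127.5]]
```

Note this prefiltering only knows about the texture itself, which is why shader math that adds new high frequencies afterward can still alias.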
Mipmapping certainly helps a lot (as does anisotropic filtering) but, as you said, it doesn't always fully eliminate aliasing. Unfortunately, given the kind of texture content (complex stripe patterns), the repetition, and the camera angles typical in my application, this problem comes up more often than you might expect. I suppose I should further study the weaknesses of mipmapping and try to deeply analyze some example cases where it clearly fails.
Thanks for this series of posts, by the way! It's been really helpful and relevant.
Tom - if you're able to break out one of your problem cases (texture + example geometry and problematic camera angle) into something that you are willing to share, I'd love to take a look and maybe use it as a case study for this series of articles?