Geek 101: Making Sense of Anti-Aliasing

06.04.2011
If you've played a PC game in the past five years, you've probably stumbled across an anti-aliasing toggle while mucking about with your graphics settings. Switching it on can make everything on your screen look smoother, but why does it also make your games run slower? And what's the difference between "2x Multisampling" and "4x Supersampling," and which is the best choice for your machine?

We did a bit of lab testing to work our way through the jargon, so you can turn these settings to your advantage.

Have you ever noticed that the edges of supposedly smooth objects in your favorite game look jagged or blurry? The problem usually shows up as a "stairstep" pattern along the edges of objects in a digital scene, and it happens because the hard contrast between dark and light pixels makes the edge of an onscreen object look jagged (hence the dreaded term "jaggies"). Anti-aliasing is simply an umbrella term for the algorithms your graphics card uses to make the pixels along the edges of an object appear smoother by blending their colors together.
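To get a feel for what "blending their colors together" means, here's a toy Python sketch--purely illustrative, not actual GPU code--that softens a hard dark-to-light edge by averaging each pixel with its immediate neighbors:

```python
# Toy illustration (not real GPU code): soften a hard dark-to-light edge
# by averaging each pixel's brightness with its immediate neighbors.

def smooth_edge(row):
    """row is a list of brightness values: 0 = dark, 255 = light."""
    out = []
    for i, px in enumerate(row):
        left = row[max(i - 1, 0)]
        right = row[min(i + 1, len(row) - 1)]
        out.append((left + px + right) // 3)  # blend with neighbors
    return out

hard_edge = [0, 0, 0, 255, 255, 255]   # abrupt dark-to-light transition
print(smooth_edge(hard_edge))          # [0, 0, 85, 170, 255, 255]
```

The abrupt jump from 0 to 255 becomes a gradual ramp, which is exactly why anti-aliased edges read as "smooth" (or, if overdone, slightly blurry).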

Of course, from a distance of 12 to 18 inches the human eye typically can't pick out individual pixels on a display with a pixel density greater than 300 pixels per inch (PPI). For reference, a 24-inch LCD monitor running at a 1920-by-1200 resolution displays roughly 94 pixels per inch. Until you own a monitor that can tackle 300 pixels per inch, you'll have to rely on your graphics card to tastefully blur the image.
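If you want to check that 94-PPI figure yourself, the math is straightforward: divide the screen's diagonal resolution in pixels by its diagonal size in inches. A quick sketch in Python (the panel sizes here are just examples):

```python
# Pixel density = diagonal resolution (in pixels) / diagonal size (in inches).
import math

def ppi(width_px, height_px, diagonal_inches):
    return math.hypot(width_px, height_px) / diagonal_inches

print(round(ppi(1920, 1200, 24)))    # ~94 PPI for a 24-inch, 1920-by-1200 monitor
print(round(ppi(1920, 1200, 7.5)))   # ~302 PPI -- roughly the panel size that
                                     # resolution would need to hit 300 PPI
```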

In order for a modern graphics card to eliminate jagged edges, it first has to know where the edges are in any given image. To do that, your graphics card does something called full-screen sampling, better known as supersampling: the action on screen is calculated at a much higher resolution before actually being displayed--essentially "faking" a higher resolution. The color data from all of those extra pixels is then gathered and averaged before being condensed into the smoother, prettier final image that actually appears on your monitor.
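Here's a bare-bones Python sketch of that "render big, then average down" step. It's nothing like real driver code--the grid of numbers simply stands in for a frame rendered at twice the display resolution, with each 2x2 block of samples averaged into one final pixel:

```python
# Rough sketch of supersampling's downsample step: a frame "rendered" at 2x
# the display resolution is averaged down, one 2x2 block per final pixel.

def downsample(hi_res, factor=2):
    out = []
    for y in range(0, len(hi_res), factor):
        row = []
        for x in range(0, len(hi_res[0]), factor):
            block = [hi_res[y + dy][x + dx]
                     for dy in range(factor) for dx in range(factor)]
            row.append(sum(block) // len(block))  # average the block's colors
        out.append(row)
    return out

# A jagged diagonal edge at 2x resolution (0 = dark, 255 = light)...
hi_res = [
    [255, 255, 255, 255],
    [  0, 255, 255, 255],
    [  0,   0, 255, 255],
    [  0,   0,   0, 255],
]
# ...becomes a 2x2 image with softened edge pixels.
print(downsample(hi_res))  # [[191, 255], [0, 191]]
```

All that extra rendering is also why anti-aliasing costs you frames per second: the card is effectively drawing the scene at several times your monitor's resolution and then throwing most of those pixels away.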