An article has appeared on X-bit labs devoted to the anti-aliasing techniques used in the latest video cards.
With the launch of every new generation of graphics chips, video card performance steps up to a new level, and the quality and quantity of special effects grow explosively. But however perfect today's game graphics may be, however realistic the characters and environments, one feature of the displayed image never changes: the picture consists of pixels - Picture Elements. As a result, polygon edges are displayed as "jaggies" (stairstep-like lines) made of these pixels, and ideal picture quality cannot be reached until the jaggies are done away with.
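To see where the jaggies come from, consider a minimal sketch (the edge `y = 0.4 * x` and the tiny 12x6 "screen" are assumed examples, not anything from the article): with one binary coverage test per pixel, a sloped polygon edge can only be approximated by whole pixels, so it comes out as a staircase.

```python
# Rasterize the half-plane below the edge y = 0.4 * x with one
# sample per pixel: a pixel is "covered" if its centre lies below
# the edge. The sloped edge becomes a stairstep of whole pixels.

WIDTH, HEIGHT = 12, 6

def inside(x, y):
    # Pixel centre is at (x + 0.5, y + 0.5).
    return (y + 0.5) < 0.4 * (x + 0.5)

rows = []
for y in range(HEIGHT - 1, -1, -1):  # top row first
    rows.append("".join("#" if inside(x, y) else "." for x in range(WIDTH)))
print("\n".join(rows))
```

Printed with `#` for covered pixels, the output shows the straight edge broken into horizontal runs that jump one row at a time, which is exactly the stairstep pattern the article describes.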
So the challenge is clear: the ideal picture must have no jaggies. You think that's a dream? Not at all. In today's cinematography, for example, computer rendering systems are used to create special effects or scenes that would be hard, costly, or simply impossible to shoot live. The result of the computation looks so true to life that we often fail to see through the "trick" and perceive the work of programmers and designers as ordinary footage. Of course, there is no trace of jaggies in pictures like that.
But those are movies, where even powerful multiprocessor systems may spend a minute, an hour, or a whole day on a single frame. We want our frames rendered in real time! That means the highest-quality anti-aliasing methods (anti-aliasing: removing aliasing artifacts, i.e. jaggies and the like) used in offline rendering software don't suit us. They are simply not fast enough.
But don't sink into despair: nothing is lost yet. First, quality can be sacrificed for speed; second, modern gaming graphics chips, besides all their other features, also support full-scene anti-aliasing, which means the dream of the ideal picture may come true.