Originally posted by Paul
Jaggies weren't an issue on the PC because it runs at high res and most cards can do 800x600 (or higher) no problem. Coupled with running this on a typical 15" monitor, there's really no issue, as you'll need to squint your eyes to see jaggies.
On the PS2, the jaggies (or shall I say "shimmeries") are blatantly obvious due to the low res it's running at (640x240 interlaced). I can live with the jaggies, but it's the shimmeries that are a real pain.
Anyway, it's really SONY's fault for overlooking such a common issue. Although AA can be implemented, it takes a severe performance hit, which is probably why very few games (except BG:DA, as far as I know) make use of it.
Seems to me that you haven't played enough PC games. Even at 800x600, jaggies can be quite apparent, and I didn't have to squint my eyes to see them so far. I actually play Jedi Knight 2 at that resolution, but I can always see jaggies. Only when I raise it to 1024x768 or higher can I say my screen is "spotless", but the jaggies are always there unless you turn antialiasing on. Then again, I can't do that, because my game starts to look like a slide show.

Only next-gen cards like the Radeon 9700 or the NV30 will be able to do antialiasing without sacrificing performance.
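
To put some rough numbers on why AA is so expensive (my own back-of-the-envelope, assuming a 16-bit colour buffer and plain 2x2 supersampling, so the figures are illustrative):

640 x 448 x 2 bytes   = ~0.55 MB   (normal frame buffer)
1280 x 896 x 2 bytes  = ~2.2 MB    (the same frame, 2x2 supersampled)

On the PS2 that's over half of the GS's 4 MB of embedded VRAM gone before you've stored a single texture, plus four times the pixels to fill every frame. Same multiplier on a PC card, which is why my framerate turns into a slide show.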
I don't think it's Sony's fault, btw. Antialiasing was hardly implemented in games back at the beginning of 1999, when the PS2 hardware specs were first shown to the public. Remember that we're talking about hardware that is about 3 years old, and 3 years are like 3 centuries of evolution in the computer industry. I do agree with the shimmeries part though, sometimes it gets on my nerves.
I still think the lack of antialiasing comes down to MANY lazy devs. There are plenty of PS2 games out there with very little or no jaggies at all. But it's just as I said: today's devs want everything served up and ready for them... always too lazy to figure out a solution on their own.
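
For what it's worth, the usual trick those games use isn't true antialiasing at all, it's a "flicker filter": blend each scanline with its vertical neighbours before the interlaced scan-out, so a one-pixel-high detail shows up (dimmed) in both fields instead of blinking at 30Hz. Here's a minimal sketch of the idea in C. This is just illustrative, not actual PS2/GS code; how games really did the blend (in software, via the GS blending, or in the display circuitry) varied from title to title.

/* Flicker filter sketch: a 1-2-1 vertical blend. Softens the image
 * slightly, but single-line interlace shimmer goes away because no
 * detail lives in only one field any more. */
#include <stdint.h>

#define WIDTH  640
#define HEIGHT 448

/* 8-bit greyscale for simplicity; a real frame buffer is RGB. */
void flicker_filter(const uint8_t src[HEIGHT][WIDTH],
                    uint8_t dst[HEIGHT][WIDTH])
{
    for (int y = 0; y < HEIGHT; y++) {
        int up   = (y > 0)          ? y - 1 : y;   /* clamp at edges */
        int down = (y < HEIGHT - 1) ? y + 1 : y;
        for (int x = 0; x < WIDTH; x++) {
            dst[y][x] = (uint8_t)((src[up][x] + 2 * src[y][x]
                                   + src[down][x]) / 4);
        }
    }
}

Cheap as anything, which is exactly my point: the lazy devs never even bothered with it.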