To be really honest, I didn't see anything in that article that makes me jump with excitement. There are still a lot of unanswered questions, such as fillrate and clock speed, and some other things I wonder about that would have heaps of relevance.
"10-bit Gigacolor Technology"? What is the use of being able to display 1 billion colors simultaneously if the human eye can only distinguish about 12 million? I don't see how going above 24-bit will make a difference, even for professionals.
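Just to show where those numbers come from, here's a quick back-of-the-envelope sketch (standard color-depth arithmetic, nothing from the article itself):

```python
# Rough arithmetic behind the color-depth comparison:
# total colors = 2 ^ (3 channels * bits per channel)
def color_count(bits_per_channel):
    """Total displayable colors for a given per-channel bit depth."""
    return 2 ** (3 * bits_per_channel)

print(color_count(8))   # 24-bit "truecolor": 16,777,216 colors
print(color_count(10))  # 30-bit (10 bits/channel): 1,073,741,824 colors
```

So the "1 billion colors" claim is really just 10 bits per channel instead of 8.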
512-bit does sound impressive, but that only tells you how much information passes through in one clock cycle. Whether those 512 bits are really so advanced remains to be seen once the card is tested against other graphics cards.
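That's easy to see with a little sketch: bus width only fixes the bytes moved per clock, so peak bandwidth still depends on the clock speed the article never gives. The clock figures below are purely hypothetical:

```python
# A wide bus fixes bytes-per-clock; actual bandwidth = width * clock.
def bandwidth_gb_s(bus_width_bits, clock_mhz):
    """Peak bandwidth in GB/s for a given bus width and clock speed."""
    bytes_per_clock = bus_width_bits / 8
    return bytes_per_clock * clock_mhz * 1e6 / 1e9

# Clock speeds here are guesses, since the article doesn't state one.
print(bandwidth_gb_s(512, 200))  # 12.8 GB/s at a hypothetical 200 MHz
print(bandwidth_gb_s(128, 300))  # 4.8 GB/s: narrower but faster bus
```

In other words, a 512-bit bus at a low clock can still lose to a narrower bus clocked high enough.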
Apart from some mistakes in the article, it's definitely a good read and shows how fast the PC industry is advancing under the same old limiting bottlenecks. I find it laughable, though, how much these graphics cards will end up costing. And for what? No one will see graphics like the screenshot above for at least a year anyway. 16x FAA, 1 billion colors and 4 vertex shaders won't make current games look much better. IMO it's better to wait a year and get the card at a discount, once the first games ship that take full advantage of the GPU's performance.
Thanks for the info btw. It's cool to be updated.