I really think core MHz on its own is a joke. It's really just a marketing gimmick if you think about it.
The most important factor in data throughput is how fast the memory can run. Sure, a high core frequency is a plus, but I'd rather have a card clocked at 275/600 (core/memory) than one at 300/550.
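Quick back-of-the-envelope math on why the memory clock dominates: peak bandwidth scales with the memory clock and the bus width, not the core clock. The sketch below assumes a 128-bit bus and that the quoted memory number is the effective (DDR) rate; both are illustrative assumptions, not the specs of any real card.

```python
BUS_WIDTH_BITS = 128  # assumed 128-bit memory bus (typical for the era)

def bandwidth_gbs(mem_mhz):
    """Peak memory bandwidth in GB/s for an effective memory clock in MHz."""
    return mem_mhz * 1e6 * BUS_WIDTH_BITS / 8 / 1e9

# Hypothetical cards from the example above, keyed by core/memory MHz.
for name, mem_mhz in {"275/600 card": 600, "300/550 card": 550}.items():
    print(f"{name}: {bandwidth_gbs(mem_mhz):.1f} GB/s peak")
```

That works out to 9.6 GB/s for the 275/600 card versus 8.8 GB/s for the 300/550 card, so the "slower" card actually has roughly 9% more raw bandwidth to feed its pipelines.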
With nVidia's Lightspeed Memory Architecture, its cards can perform at or above ATI's Radeon lineup, regardless of the Radeons' "potential" on paper. ATI thought they could steal the show with outrageous specs, figuring they would win over reviewers and customers, but when they released the product, the drivers that shipped with it were inadequate. This is where nVidia has established itself in the market: delivering quality, functional drivers and fresh optimizations "most" of the time. I'll grant that ATI has indeed improved, but playing catch-up in the PC graphics market is an uphill climb all the way.
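To make the "paper specs don't tell the whole story" point concrete: Lightspeed's crossbar memory controller improves how much of the peak bandwidth actually gets used, so a card with a lower peak can still move more data in practice. The efficiency figures below are made-up illustrations of that effect, not measured numbers for any real card.

```python
# Toy model: effective bandwidth = peak bandwidth * controller efficiency.
# Both efficiency percentages are hypothetical, chosen only to show how
# a smarter memory controller can beat bigger paper specs.
cards = {
    "crossbar controller":     (7.4, 0.85),  # (peak GB/s, assumed efficiency)
    "conventional controller": (8.8, 0.65),
}

for name, (peak_gbs, efficiency) in cards.items():
    print(f"{name}: {peak_gbs} GB/s peak -> {peak_gbs * efficiency:.1f} GB/s effective")
```

Under those assumptions the lower-spec card delivers 6.3 GB/s effective against 5.7 GB/s, which is the whole point: the architecture matters as much as the number on the box.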
Now back to the subject. I feel as though this war will never end. Just take a look and tell me how many games fully utilize the GeForce technology; I mean all of its nice features. I still respect the original GeForce to this day. If nVidia wants to be top dog in this situation, it has to let developers take advantage of its technology and fully implement it into their games. If nVidia is constantly releasing something new, and company X already chose technology Y when nVidia released technology Z, analysts then claim the title is "outdated."
I think the summer will be very interesting when ATI's and nVidia's new flagship cards are released. I can assure you that those flagships will come at a premium, but without a doubt, prices will come down with competition.
Also, how about them BitBoys! King of Vaporware!