Stop making up rubbish to suit your own argument. He NEVER said Xbox wasn't capable of the RS2 demos, but as I said, he indicated with his "eat a broom" comment that he would be very surprised if Xbox could do it. He says RS2 uses 50% of the hardware, so where's the fault in saying that? If you had done your homework properly you would know he also said he was waiting for the final Xbox hardware before he made any final judgements. On the German message board he was comparing the finalized Gamecube hardware to the current XDK (Pentium III 733, GeForce 2 GTS), which the Gamecube hardware understandably destroyed (i.e. 5 times the framerate at 50% of its full power capacity).
lol, this is getting quite annoying.
he said he would eat a broom if xbox could do something that only utilizes 50% of gc's power. That is the same as saying xbox is less than 50% as powerful as gc (unless he likes eating brooms...).
he did not say "I will eat a broom if a geforce 2 can do our demo", he said "I will eat a broom if xbox can do our demo"
"if you had done your homework properly you would know he also said he was waiting for the final XBox Hardware before he made any final judgements."
lol, he didn't even wait till the end of his post to make a judgement! he said he would eat a damn broom if xbox could do it! I'd call that a judgement.
and if he was judging xbox based on what a geforce 2 can do, then he is just being ignorant and shortsighted, which gives us a window into his character.
So you're assuming EA already had the source code for a benchmark program specifically designed to run on Gamecube hardware before they even received dev kits? I don't think so.
no, not a benchmark designed specifically for gc, a simple opengl benchmark.
What you're also saying is that EA Canada has received, learned, understood and maxed out Gamecube hardware in a week.
I love how complicated you make it sound.
as long as they know opengl, all they would have to learn are a few gamecube-specific opengl functions. Couldn't take more than a day to learn (assuming they are experienced programmers, as they should be if they work for EA), and a week to implement into a simple benchmark.
As much as you would like to think developers are actually going to code in assembly (does the term "spaghetti code" ring a bell?) to get the most power out of any platform, it just isn't going to happen, so it doesn't require eons to "max out" a system (realistically, anyway).
Let's look at this realistically. The source for, say, a benchmark program would not take nearly as long as a game to optimise for a specific hardware platform, but there is no way in hell it would take only a week to fully optimise benchmark source code for a very foreign and unknown platform. Not to mention that using a PC-based benchmark program, as you're suggesting, isn't exactly going to give top performance. It is crazy to suggest EA maxed out Gamecube in a week, absolutely crazy.
Not a very foreign and unknown platform. A new platform that uses opengl and has a few platform-specific functions to take advantage of its few unique features.
oh yeah, not a "pc benchmark", either. An opengl benchmark.
Also, OpenGL is the Gamecube API, yes, but that does not mean games must be coded in OpenGL to run. I'm sure Nintendo has included its own custom API tools with Gamecube for better performance and less overhead. OpenGL is simply a nice alternative that suits PC developers porting projects to Gamecube.
Unless developers are going to code in assembly (suuuuuurrre), there is no alternative to opengl. It is gamecube's API.
His posts were deleted because Nintendo caught wind of his activities. Clearly there are plenty of things Nintendo is trying to keep secret with Gamecube, and they weren't about to let their NDA-protected tech specs become public knowledge on a German message board.
that doesn't have any bearing on the argument.
in other words...that doesn't mean Nintendo would be pissed at him if he did an interview and said gc had better graphics than xbox.
Well how about we discuss how the NV2A is severely bandwidth limited, how it has no embedded RAM like PS2 and NGC
severely bandwidth limited?
that's what vertex, texture, and Z compression are for.
If you can handle the truth I suggest you check out this article:
http://www.ddj.com/articles/2000/0008/0008a/0008as1.htm
in which Abrash explains how xbox has sufficient bandwidth to handle 100 mpps...
...even while the cpu is using its full bandwidth allocation (1 gb/sec), bandwidth is reduced by 1.3 gb/sec to account for the natural inefficiencies of memory, and 4x anti-aliasing is used.
He also didn't take into account HSR, nor did he take into account vertex compression (which he explains at the end of the article could've freed up more than 1 gb/sec of bandwidth).
You can do your own calculations if you want to check his work....
I hope that clears up the whole bandwidth issue
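If you want to sanity-check the arithmetic yourself, here's a back-of-the-envelope sketch. The 6.4 gb/sec total and the ~16 bytes/triangle figure are my own assumptions for illustration, not numbers from Abrash's article; the 1 gb/sec CPU allocation and 1.3 gb/sec inefficiency write-off are the figures quoted above.

```python
# Back-of-the-envelope Xbox UMA bandwidth budget (all figures illustrative).
TOTAL_BW = 6.4e9      # assumed total unified-memory bandwidth, bytes/sec
CPU_BW = 1.0e9        # CPU's full allocation, per the post above
INEFFICIENCY = 1.3e9  # write-off for real-world memory inefficiency, per the post

gpu_bw = TOTAL_BW - CPU_BW - INEFFICIENCY  # what's left for the GPU: 4.1 GB/sec

# Hypothetical geometry cost: ~32 bytes per triangle in strips,
# halved to ~16 bytes by vertex compression.
bytes_per_tri = 32 / 2
geometry_ceiling = gpu_bw / bytes_per_tri  # ceiling if ALL GPU bandwidth fed geometry

print(f"GPU bandwidth left: {gpu_bw / 1e9:.1f} GB/sec")
print(f"Geometry-only ceiling: {geometry_ceiling / 1e6:.0f} M polys/sec")
```

Even with those conservative write-offs, geometry alone isn't close to eating the remaining bandwidth at 100 mpps, which is the point of the article.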
how its CPU has half the on-chip cache of Gekko
ahh, the cache issue. Well, first remember how little the CPUs in these consoles will actually do; then I'll remind you how little cache size means in dynamic applications...but I'll argue with you anyway.
Do you remember the original athlon processor? It had 512k of cache, but that cache ran at half the speed of the core.
Then AMD made the thunderbird, which had half the cache of the athlon, but that cache ran at full speed.
The thunderbird was much faster than the athlon, and the world rejoiced in its glory.
anyway...point is, size isn't always the biggest issue. In a static app like a word processor, where operations are performed on the same data repeatedly...possibly...but games are dynamic apps, and cache speed is much more of an issue.
bottom line:
Despite the fact that the p3 in xbox will have half the cache of gekko, if gekko's cache is running slower than the core, or is inferior in other ways (say 4-way associative as opposed to 8-way associative), it could still be slower! size doesn't matter!
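The athlon/thunderbird point can be put in numbers with the standard average-memory-access-time formula. All the hit rates and cycle counts below are hypothetical, picked only to show that a smaller, faster cache can beat a bigger, slower one:

```python
# Average memory access time (cycles): hits at cache speed, misses pay the memory penalty.
def amat(hit_rate, hit_cycles, miss_penalty):
    return hit_rate * hit_cycles + (1 - hit_rate) * miss_penalty

# Hypothetical numbers, chosen only to illustrate the tradeoff:
big_slow = amat(hit_rate=0.97, hit_cycles=6, miss_penalty=100)    # big cache at half core speed
small_fast = amat(hit_rate=0.95, hit_cycles=3, miss_penalty=100)  # half the size, full core speed

print(f"big/slow: {big_slow:.2f} cycles, small/fast: {small_fast:.2f} cycles")
```

With these made-up but plausible numbers the half-size, full-speed cache comes out ahead, exactly the athlon-vs-thunderbird story.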
and how its raw fillrate is less than half that of the Playstation 2's GS.
it's really more or less equal....
here's an unfair comparison. Keep in mind that the ps2's effective fillrate will decrease whenever polys are less than 16 pixels in size. I am assuming that xbox's memory bandwidth will limit it to 350 mp/sec (conservative), but that Z compression and HSR will bring it back up to 700 mp/sec (also conservative IMO). BTW, I know you will probably banter something like "now we see the problems with xbox's UMA", but if you know anything you will know better. Oh yes, you may have also seen a figure of 2.4 gp/sec for ps2, but that's with no texture.
             xbox        ps2
1 texture:   700 mp/sec  1.2 gp/sec
2 textures:  700 mp/sec  600 mp/sec
3 textures:  350 mp/sec  300 mp/sec
4 textures:  350 mp/sec  150 mp/sec
this really gives xbox the advantage. Yes, ps2 has a substantial lead when only one texture is used, but at 640x480 all of that extra fillrate just goes to waste. There would have to be more than 50x overdraw to actually put all of that extra fillrate to use. The main thing to look at is that xbox could handle 4 textures, while the GS would probably be fillrate limited.
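The "50x overdraw" claim is easy to check from the table above. Assuming 640x480 at 60 fps (my assumption; at 30 fps the number doubles):

```python
# How much overdraw it takes to actually use the GS's single-texture fillrate.
WIDTH, HEIGHT, FPS = 640, 480, 60
visible_pixels_per_sec = WIDTH * HEIGHT * FPS  # ~18.4 M pixels shown per second

GS_FILLRATE = 1.2e9  # single-texture figure from the table above, pixels/sec
overdraw_to_saturate = GS_FILLRATE / visible_pixels_per_sec

print(f"Overdraw needed to use all of the GS's fillrate: {overdraw_to_saturate:.0f}x")
```

That works out to roughly 65x at 60 fps, so "more than 50x" is if anything an understatement.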
That's ok, Michael Abrash has already proved him right
- XBox = 8 Million PPS with 8 local lights and 2 texture effect layers.
- Gamecube = 14 million PPS with four texture effect layers + all other effects on.
All other effects for Gamecube does include 8 local lights, notice - "all other effects".
hehe...
care to explain how EA got gc to use 8 local lights with only 4 light maps...that's kind of ummm....impossible...
and oh yeah, that xbox spec is with 8 texture effect layers...unless xbox can use 8 local lights with only 2 light maps...yes, that's impossible too.
Also, I've seen the Xbox 3-minute demo reel on TV. I couldn't help laughing at what looked like a bunch of Playstation 2 games with FSAA. The preceding realtime Gamecube demos seriously killed it. Oh, but of course you'll bring up the Raven, Butterfly and Ping Pong pre-rendered demos, I'm sure
lol...
not only were the ping pong and butterfly demos not pre-rendered, they were running on a geforce 2! hahahhhahahahah!!
about the raven demo, it depends on which you are talking about. There were pre-rendered and real-time versions. The real-time one of course looked worse, but it was also running on a geforce 2.
oh yeah, if you can find some high quality vids of the toys on desk demo I suggest you check them out. It looked amazingly real, and it too was only running on a geforce 2.
I would expect the gamecube demos to destroy the xbox ones at this point. They are running on a geforce 2...
get over it...
oh yeah, not to mention that they were made by pipeworks software, a small, new dev house made up of about 20 guys, in a very short amount of time.
Who knows how long Nintendo's massive, legendary dev house EAD had been perfecting those gc demos, and Julian recently confirmed that the rs2 demo took Factor 5 more like 6 months than two weeks to make.
oh well...
The wait won't be long now. The console and controller are going to be shown at CES on jan 6-9, and actual games are going to be shown at microsoft's gamestock, which will be sometime around feb/mar, then of course there's GDC and E3....
[Edited by drcrumble on 01-02-2001 at 09:48 PM]