
Author Topic: doom 3 ATi vs Nvidia  (Read 874 times)

Offline mm
  • clyde's boss
  • Legendary Member
  • ******
  • Posts: 15576
  • Karma: +10/-0
doom 3 ATi vs Nvidia
« on: May 13, 2003, 04:40:39 PM »
http://www.hardocp.com/article.html?art=NDc0

Quote
We were thrilled to see these scores at 1600x1200. I truly did not think this sort of frame rate at such a high resolution would even be within any system's grasp. At this crushing resolution we see the GeForceFX 5900 Ultra stay ahead of the 9800 Pro from 25% to 70% in frame rate.


Quote
The other thing that is surprising about our frame rates is the obvious gap between the 5900 Ultra and the 9800 Pro-256. You have to wonder if this is due to a simple driver issue at this point in the game.



/me opens the closet door for ATi
"Leave the gun. Take the cannoli." - Clemenza

Offline Lord Nicon
  • The Member
  • Legendary Member
  • ******
  • Posts: 4205
  • Karma: +10/-0
doom 3 ATi vs Nvidia
« Reply #1 on: May 13, 2003, 04:48:26 PM »
Damn, and I like ATI. This was obvious, though; GeForce has always been fast. When they first came out it was the same story, except you had to hunt down so many drivers that it sucked. That was back when it was mainly GeForce versus Voodoo. Man, those were the days *Voodoo, sigh*.
Does GeForce even have a 128-bit card out? Just wondering. What about graphical performance? Are all the effects maxed?
Originally posted by ##RaCeR##
I don't have comprehension issues, you just need to learn how to communicate.
Yessir massir ima f*** you up reeeeal nice and homely like. uh huh, yessum ; ).
Debra Lafave Is My Hero ;) lol

Offline mm
  • clyde's boss
  • Legendary Member
  • ******
  • Posts: 15576
  • Karma: +10/-0
doom 3 ATi vs Nvidia
« Reply #2 on: May 13, 2003, 04:51:23 PM »
Umm, click the link.

They run it at up to 1600x1200, 32-bit, with 4x FSAA and 8x AF.

The Nvidia card almost doubled what ATi could do under that load.
"Leave the gun. Take the cannoli." - Clemenza

Offline Lord Nicon
  • The Member
  • Legendary Member
  • ******
  • Posts: 4205
  • Karma: +10/-0
doom 3 ATi vs Nvidia
« Reply #3 on: May 13, 2003, 04:54:41 PM »
Damn. I just read the whole thing. That is crazy. What are the prices, though, for the top Radeon and GF4?
Originally posted by ##RaCeR##
I don't have comprehension issues, you just need to learn how to communicate.
Yessir massir ima f*** you up reeeeal nice and homely like. uh huh, yessum ; ).
Debra Lafave Is My Hero ;) lol

Offline mm
  • clyde's boss
  • Legendary Member
  • ******
  • Posts: 15576
  • Karma: +10/-0
doom 3 ATi vs Nvidia
« Reply #4 on: May 13, 2003, 05:39:06 PM »
technology doesn't come cheap
"Leave the gun. Take the cannoli." - Clemenza

Offline Tyrant
  • Hero Member
  • *****
  • Posts: 1877
  • Karma: +10/-0
    • http://www.bahrainicars.com
doom 3 ATi vs Nvidia
« Reply #5 on: May 13, 2003, 08:07:25 PM »
Pretty nice scores, but wasn't there an FX 5800 before? As I recall it did not perform very well compared to ATi's R9700 Pro.
Maybe it was just Nvidia's way of tricking ATi into thinking they had the upper hand.
[size=1.5]It is a mistake to try to look too far ahead. The chain of destiny can only be grasped one link at a time.~Sir Winston Churchill[/size]
Bahrain's ultimate vehicle showroom, CV8=ownage, Bahrain F1, Bahraini cars, GulfGt.

Offline Samwise
  • Moderator
  • Legendary Member
  • ******
  • Posts: 12129
  • Karma: +10/-0
    • http://151.200.3.8/~vze29k6v/you.html
doom 3 ATi vs Nvidia
« Reply #6 on: May 14, 2003, 12:54:02 AM »
Lol, too bad I can't use 1600x1200. Not that I would shell out $400 on a gfx card anyway. :)

/me hugs his Geforce 3, which still runs all his games beautifully
RRRRRRRRRRRRRRRRRRAPETIME!
(thanks Chizzy!)

Offline THX
  • nigstick
  • Legendary Member
  • ******
  • Posts: 8158
  • Karma: +10/-0
doom 3 ATi vs Nvidia
« Reply #7 on: May 14, 2003, 12:59:22 AM »
Quote
mm opens the closet door for ATi

This coming from the same guy who kept saying "another nail in Xbox's coffin." ;)  Three words: Don't be hatin'. *gag*

Competition is a good thing; no reason to be a fanboy of one brand or the other. Admittedly I had a bad experience with a BBA Radeon 7500, but after they released the 9700 Pro and I saw it in action compared to my GF4, I gotta give credit where it's due.

Ever since the 9700 Pro, ATi has been a serious competitor and a threat to nVidia. With these consumer-beneficial price/technology wars, I welcome their challenge.

Return of the king

"i thought america alreay had been in the usa??? i know it was in australia and stuff."
-koppy *MEMBER KOPKING FANCLUB*
"I thought japaneses where less idiot than americans...." -Adan
"When we can press a button to transport our poops from our colon to the toilet, I'll be impressed." -Gman

Offline JBean
  • Legendary Member
  • ******
  • Posts: 2535
  • Karma: +10/-0
doom 3 ATi vs Nvidia
« Reply #8 on: May 14, 2003, 10:22:07 AM »
From Tom's Hardware (link in the post above):

Quote
The DOOM III engine offers a selection of rendering modes to choose from. NVIDIA's FX and NV2x cards are explicitly supported with their own codepath and optimizations. ATi's cards are supported with the R200 codepath. Of course, there are general modes like ARB2 as well, which is what ATi's Radeon 9800/ 9700/ 9600/ 9500 cards use. The Radeon 9000 runs in "R200," while NVIDIA's FX 5900/ 5800/ 5600/ 5200 use "NV30." The GeForce4 Ti4200 runs NV20 code, although at reduced rendering quality.


I'm no expert at this stuff, but it seems to me the Nvidia cards are getting specialized drivers, while the ATi cards get the shaft with "general mode" drivers.
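(For anyone curious what that per-vendor codepath split might look like in practice, here is a rough C++ sketch. Only the mode names come from the quote above; the struct, field, and function names are hypothetical, not id's actual code.)

[code]
// Hypothetical sketch of per-vendor render path selection.
// Mode names are from the Tom's Hardware quote; everything else is invented
// for illustration and is NOT the Doom 3 engine's real code.
#include <iostream>
#include <string>

enum class RenderPath { ARB2, NV30, NV20, R200 };

struct GpuInfo {
    std::string vendor;        // "NVIDIA" or "ATI"
    bool dx9ClassShaders;      // supports DX9-level fragment shading
};

RenderPath pickRenderPath(const GpuInfo& gpu) {
    if (gpu.vendor == "NVIDIA")
        return gpu.dx9ClassShaders ? RenderPath::NV30   // FX 5900/5800/5600/5200
                                   : RenderPath::NV20;  // GeForce4 Ti, reduced quality
    if (gpu.vendor == "ATI")
        return gpu.dx9ClassShaders ? RenderPath::ARB2   // Radeon 9800/9700/9600/9500 (general path)
                                   : RenderPath::R200;  // Radeon 9000
    return RenderPath::ARB2;                            // generic fallback for everyone else
}

int main() {
    GpuInfo radeon9800{"ATI", true};
    GpuInfo geforceFx{"NVIDIA", true};
    std::cout << (pickRenderPath(radeon9800) == RenderPath::ARB2) << "\n";  // 1: general path
    std::cout << (pickRenderPath(geforceFx)  == RenderPath::NV30) << "\n";  // 1: vendor-specific path
}
[/code]

If that reading is right, the point is not that ATi lacks drivers, but that the DX9 Radeons run the generic ARB2 path while the FX line gets its own optimized NV30 path.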
« Last Edit: May 14, 2003, 10:24:35 AM by JBean »

Offline mm
  • clyde's boss
  • Legendary Member
  • ******
  • Posts: 15576
  • Karma: +10/-0
doom 3 ATi vs Nvidia
« Reply #9 on: May 14, 2003, 11:35:27 AM »
it's funny how you put "ATi", "drivers", and "shaft" all in the same sentence
"Leave the gun. Take the cannoli." - Clemenza

Offline NVIDIA256
  • Hero Member
  • *****
  • Posts: 1138
  • Karma: +10/-0
doom 3 ATi vs Nvidia
« Reply #10 on: May 14, 2003, 07:38:17 PM »
First off, here is some important info from http://www.ANANDTECH.com

At a $499 price tag, the GeForceFX 5900 Ultra is extremely hard to justify. However, a lot can change between now and the release of the game; it will be interesting to see how quickly ATI will incorporate Doom3 optimizations into their current drivers in the event that we get another such opportunity to benchmark Doom3 in the near future.


NVIDIA obviously wouldn't agree to this opportunity unless they knew they would come out ahead in the benchmarks, and as you are soon to see, they did. John Carmack did have a few words of warning, which we received after our benchmarking time with the system was up:

"We have been planning to put together a proper pre-release of Doom for benchmarking purposes, but we have just been too busy with actual game completion. The executable and data that is being shown was effectively lifted at a random point in the development process, and shows some obvious issues with playback, but we believe it to be a fair and unbiased data point. We would prefer to show something that carefully highlights the best visual aspects of our work, but we recognize the importance of providing a benchmark for comparison purposes at this time, so we are allowing it to be used for this particular set of tests. We were not happy with the demo that Nvidia prepared, so we recorded a new one while they were here today. This is an important point -- while I'm sure Nvidia did extensive testing, and knew that their card was going to come out comfortably ahead with the demo they prepared, right now, they don't actually know if the demo that we recorded for them puts them in the best light. Rather nervy of them, actually.

The Nvidia card will be fastest with "r_renderer nv30", while the ATI will be a tiny bit faster in the "r_renderer R200" mode instead of the "r_renderer ARB2" mode that it defaults to (which gives some minor quality improvements). The "gfxinfo" command will dump relevant information about the functioning renderer modes and optimizations. At some point, after we have documented all of the options and provided multiple datasets, Doom is going to be an excellent benchmarking tool, but for now you can still make some rough assessments with it."

Offline EmperorRob
  • Mr Sexual Harassment
  • Legendary Member
  • ******
  • Posts: 3932
  • Karma: +10/-0
doom 3 ATi vs Nvidia
« Reply #11 on: May 15, 2003, 08:16:57 AM »
And the plot thickens...

http://www.extremetech.com/article2/0,3973,1086025,00.asp

Not that it will make much difference, but I wasn't seeing the "70%" difference anyway. I just wanted to throw some gas on this fire.
This is America and I can still pay for sex with pennies

Offline JBean
  • Legendary Member
  • ******
  • Posts: 2535
  • Karma: +10/-0
doom 3 ATi vs Nvidia
« Reply #12 on: May 15, 2003, 11:37:46 AM »
Quote
What We Found

During our analysis of Game Test 4, we paused the benchmark, went into the free-look mode, and moved "off the rail" in the 3D scene. Once we did that, serious drawing errors were readily apparent. We interpreted these results to indicate that nVidia was adding static clip planes into the scene. These static clip planes reduce the amount of sky that the GPU has to draw, thus reducing the pixel shader workload and boosting performance.

If this were a "3D Guided Tour" and the goal was to make the scene render as quickly as possible, with no regard to any real-world performance correlation, then this type of optimization would be fine. But that's not the goal of 3DMark. It's a synthetic benchmark that measures performance, designed to indicate how well a 3D GPU can render DX8/DX9 game code. In a game where you have six degrees of free movement, with a user controlled (vs. fixed) camera path, static clip planes would not work at all.

At first, this almost sounds like a reasonable optimization. But our guess is that these custom clip-planes were manually designed and implemented for each frame of animation.

During our testing, when we stayed "on the rail," or the fixed path through the benchmark, things looked fine. But when we went "off rail" in the developers version, we saw problems, particularly with the sky in Game Test 4. We tried the same inspection using ATI's Radeon 9800 Pro, and saw no such drawing errors. We wanted to do the same thing with the Matrox Parhelia, but it doesn't support Pixel Shaders 2.0, and so cannot run Game Test 4.

http://common.ziffdavisinternet.com/util_get_image/2/0,3363,sz=1&i=24048,00.jpg
The above picture shows the scene drawn correctly, and everything appears to be in its right place.

http://common.ziffdavisinternet.com/util_get_image/2/0,3363,sz=1&i=24046,00.jpg
http://common.ziffdavisinternet.com/util_get_image/2/0,3363,sz=1&i=24047,00.jpg

However, if we come off the rail and move the camera slightly, you can see where the sky is incorrectly drawn, and where we believe nVidia added an additional "hard-wired" clip plane. If the scene were drawn correctly, normal clipping operations would occur, and this error would not exist. But because the clip planes appear to be "baked in," that is, not being tested for every frame of animation, this drawing error results.

Clipping/culling operations are called "view-dependent," because the outcome of clip/cull tests is dependent upon where the view camera is located relative to objects inside/outside of the frustum. Because nVidia appears to have hard-wired these additional clip plane operations into its driver, they are not view dependent, because the driver is assuming that the camera will remain on the rail, and not catch the errors these "optimizations" would otherwise create.

A reasonable question to ask here is: doesn't nVidia have early Z-tests, occlusion culling, and other features to help it discard non-visible polygons and pixels earlier in the pipeline? The answer is that they do, but those features don't buy them much here, because of Game Test 4's drawing order. According to FutureMark, "In Game Test 4, the sky is mostly drawn first, when the camera angle shows a lot of it. In some other cameras, it is drawn later or not at all, depending on the visibility."

The sky in Game Test 4 uses Pixel Shaders 2.0, and so having to first draw the entire sky, only to have scene objects occlude it later is an expensive way to render, almost like a back-to-front Painter's Algorithm. Using clip planes to ignore drawing non-visible parts of the sky can save considerable processing time and bandwidth -- although we don't know exactly how much performance benefit nVidia receives. Oddly enough, nVidia hasn't published a registry setting bit to enable/disable this "feature." And while one could take FutureMark to task for drawing the scene in this manner, it is nonetheless what the application does, and we think it should be executed correctly in the hardware.
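(Here is a rough C++ sketch of the difference between a proper view-dependent clip test and a "baked-in" plane like the article describes. All names and numbers are made up for illustration; nothing here is from nVidia's driver.)

[code]
// View-dependent culling vs. a hard-wired clip plane.
// Illustrative only; the plane values and types are invented.
#include <array>
#include <iostream>

struct Vec3  { float x, y, z; };
struct Plane { Vec3 n; float d; };   // points with dot(n, p) + d >= 0 are "inside"

float dot(const Vec3& a, const Vec3& b) { return a.x * b.x + a.y * b.y + a.z * b.z; }

// Correct approach: the six frustum planes are rebuilt from the current camera
// every frame, so the result changes when the camera moves off the rail.
bool insideFrustum(const std::array<Plane, 6>& frustumFromCamera, const Vec3& p) {
    for (const Plane& pl : frustumFromCamera)
        if (dot(pl.n, p) + pl.d < 0.0f) return false;   // outside one plane -> culled
    return true;
}

// "Baked-in" approach: a fixed plane chosen in advance for the known rail path.
// It ignores the camera entirely, so off the rail it culls sky that should be visible.
bool insideStaticClip(const Vec3& p) {
    static const Plane baked{{0.0f, 0.0f, 1.0f}, -10.0f};   // hard-wired guess
    return dot(baked.n, p) + baked.d >= 0.0f;
}

int main() {
    Vec3 skyPoint{0.0f, 0.0f, 5.0f};                 // a bit of sky an off-rail camera can see
    std::cout << insideStaticClip(skyPoint) << "\n"; // 0: culled anyway -> the drawing errors ExtremeTech saw
}
[/code]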


hmmmmmm.......
« Last Edit: May 15, 2003, 11:45:36 AM by JBean »

Offline JBean
  • Legendary Member
  • ******
  • Posts: 2535
  • Karma: +10/-0
doom 3 ATi vs Nvidia
« Reply #13 on: May 15, 2003, 11:44:22 AM »
Quote
The Second Odd Behavior

Interpretation of our test results also leads us to believe that nVidia is not clearing its buffers between certain animation frames in Game Test 2 - Battle of Proxycon and Game Test 3 - Troll's Lair.

Clearing the buffers (color and Z) involves writing zeros to all memory locations in the new back buffer and Z-buffer so that the GPU can begin drawing the next scene with a "clean slate." But since not all memory locations will contain visible pixels, parts of the buffers remain dirty -- based on the assumption that the camera's unvarying movement path won't catch the errors. The advantage to not clearing buffers is the bandwidth savings of avoiding those writes to memory. This frees up precious memory bandwidth to be working on other things, like texture/texel reads and writes of visible parts of the scenes.

Take a look at these two images:
[image: Good Buffer]

[image: Bad Buffer]

The top image is rendered correctly, and the buffer clear issue in the second image seems to crop up whenever the stars are not visible when traveling along the fixed camera path. In other words, when the soldiers are in an elevator going up, the camera cannot see outside the spaceship to see the stars. At this point, the buffers for those parts of the scene apparently are not getting cleared, which would result in the "smearing" you see in the second image above. However, when the view camera can see the stars outside of the ship while traveling along the camera's fixed path, the buffers are cleared, and the stars are rendered correctly, as in the case of the first of the two images.

Final Thoughts and Conclusions

We believe nVidia may be unfairly reducing the benchmark workload to increase its score on 3DMark2003. nVidia, as we've stated above, is attributing what we found to a bug in their driver.

Nearly 95% of the 3DMark03 overall score is a measure of DX8/DX9 performance on high-end 3D hardware. The Wings of Fury test, a DX7-era test, accounts for less than 6% of the total score.
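(To see why skipping the clear produces the "smearing" in the second shot, here is a tiny self-contained C++ toy framebuffer. It is only a software illustration of the idea described above, not how a real driver manages GPU memory.)

[code]
// Toy framebuffer: skipping the per-frame clear saves a full-buffer write,
// but any pixel not overdrawn this frame keeps the previous frame's contents.
#include <algorithm>
#include <cstddef>
#include <cstdint>
#include <iostream>
#include <vector>

struct FrameBuffers {
    std::vector<std::uint32_t> color;  // packed RGBA
    std::vector<float>         depth;  // Z-buffer
    explicit FrameBuffers(std::size_t pixels) : color(pixels, 0u), depth(pixels, 1.0f) {}

    void clear() {                                        // the write pass the article says is skipped
        std::fill(color.begin(), color.end(), 0u);
        std::fill(depth.begin(), depth.end(), 1.0f);
    }
};

void drawStarField(FrameBuffers& fb) {                    // frame where the stars are on screen
    std::fill(fb.color.begin(), fb.color.end(), 0xFFFFFFFFu);
}

void drawElevator(FrameBuffers& fb, bool skipClear) {     // frame where the camera is inside the ship
    if (!skipClear) fb.clear();                           // skipping this saves memory bandwidth...
    fb.color[0] = 0x00FF00FFu;                            // ...but only pixel 0 is actually redrawn here
}

int main() {
    FrameBuffers fb(4);
    drawStarField(fb);                     // frame 1
    drawElevator(fb, /*skipClear=*/true);  // frame 2 with the clear skipped
    std::cout << std::hex << fb.color[3] << "\n";  // ffffffff: frame 1's stars are still there -> "smearing"
}
[/code]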


Maybe it's just me, but I'm not convinced the FX 5900 Ultra is really all that much better than the 9800 Pro. Every other benchmark has the two neck and neck, with Doom 3 being the exception, and that can be attributed to the lack of ATi-specific optimizations in id's build...
« Last Edit: May 15, 2003, 11:52:46 AM by JBean »

Offline Heat
  • Pee Pee
  • Legendary Member
  • ******
  • Posts: 2134
  • Karma: +10/-0
    • http://www.tombgaming.co.uk
doom 3 ATi vs Nvidia
« Reply #14 on: May 15, 2003, 12:06:31 PM »
If I wasn't poor I'd get a new gfx card, but my GF3 will just have to do; at least I have the 128 MB version. I'll grab a demo when one comes out and see if it runs like a slide show.
"A delayed game is eventually good, a bad game is bad forever" - Shigeru Miyamoto

 
