Author Topic: More cookies for nVidia to eat for a snack  (Read 745 times)

Offline jm
  • TushyKushy
  • Hero Member
  • *****
  • Posts: 518
  • Karma: +10/-0
More cookies for nVidia to eat for a snack
« on: June 26, 2002, 10:04:17 AM »
Ok, so Matrox supplied nVidia with cookies after its Parhelia dud, and now I think ATi is going to add the milk. Read this blurb I found on FiringSquad:

Quote
ATI's R300, now being referred to as the Radeon 10,000, will have an astounding 128 bit color capability. Current cards are 32 bit, which provide 24 bits (16.7 million) for colors and 8 bits (256) for transparency. 128 bit color will provide for stunning realism and clarity never before seen on PC monitors. While 16.7 million simultaneous colors is said to be around the limit of the human eye, the combination of computational possibilities provided by 128 bit color will provide for much more accurate rendering so that when images are sampled down to 16.7 million actual colors, it will provide a fully optimized palette with a higher degree of accuracy across the visual spectrum.


128-bit color, huh? I wonder how long it will take to implement that feature. It's funny how ATi and Matrox want to be leaders, yet no one follows them. Everyone is following nVidia these days. I expect all the research going into 128-bit color will turn out to be worthless, since everyone is following nVidia right now. I mean, even 52-bit color isn't used in today's games yet.
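
For what it's worth, here's roughly how those bits break down per channel. This is just back-of-the-envelope arithmetic in Python, not anything specific to ATi or nVidia hardware:

Code:
# Rough per-channel breakdown of a classic 32-bit pixel vs. a 128-bit
# floating-point pixel. Purely illustrative, not tied to any card.

# 32-bit integer pixel: 8 bits each for R, G, B plus 8 bits of alpha.
bits_per_channel_int = 8
color_levels = (2 ** bits_per_channel_int) ** 3   # 16,777,216 distinct colors
alpha_levels = 2 ** bits_per_channel_int          # 256 transparency levels
print(f"32-bit pixel: {color_levels:,} colors, {alpha_levels} alpha levels")

# 128-bit pixel: a 32-bit float for each of R, G, B, A.
bits_per_channel_float = 32
total_bits = bits_per_channel_float * 4
print(f"128-bit pixel: {total_bits} bits total, "
      f"{bits_per_channel_float} bits of float precision per channel")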
You think positive, I\'ll think realistic.

Offline Peltopukki
  • Full Member
  • ***
  • Posts: 128
  • Karma: +10/-0
More cookies for nVidia to eat for a snack
« Reply #1 on: June 26, 2002, 10:53:50 PM »
Floating-point pixel pipelines are required for full DirectX 9, and it is likely that Doom III will support a 64- and/or 128-bit framebuffer when it comes out. (John Carmack is one of the big names who want more precision for calculations.)

The 128 bits are for the framebuffer, not for the monitor. The monitor will show those images dithered down to 8 or 10 bits per channel, for the simple reason that monitors cannot produce the additional dynamic range needed to display more.

BUT: the additional bits are needed quite badly in the framebuffer, especially now that games are using more texture passes to form the final pixel on screen. Once you have enough lights with per-pixel effects, or many transparent layers, you will see banding and color errors.

That is bad enough, but it is also impossible to do some subtle effects without enough accuracy in the calculations. (For example, the fog and some effects in the Final Fantasy movie used something like >4000 very transparent layers.)
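
If you want to see why the extra precision matters, here is a tiny toy sketch (plain Python, made-up layer counts and alpha values) that blends a few thousand nearly transparent layers, once with 8-bit rounding after every pass and once in floating point. The 8-bit version never gets off the ground:

Code:
# Toy illustration of banding/precision loss: blend ~4000 nearly
# transparent white layers, once in float and once rounding to
# 8 bits (256 levels) after every pass. Numbers are illustrative only.

layers = 4000
layer_color = 1.0      # each layer is "white"
layer_alpha = 0.0005   # each layer is almost fully transparent

# Floating-point framebuffer: keep full precision between passes.
fb_float = 0.0
for _ in range(layers):
    fb_float = fb_float * (1 - layer_alpha) + layer_color * layer_alpha

# 8-bit framebuffer: round to one of 256 levels after every pass.
fb_int = 0
for _ in range(layers):
    blended = (fb_int / 255) * (1 - layer_alpha) + layer_color * layer_alpha
    fb_int = round(blended * 255)

print(f"float result: {fb_float:.3f}")      # climbs smoothly toward 1.0
print(f"8-bit result: {fb_int / 255:.3f}")  # stuck at 0 -- the layers vanish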

To see the banding yourself, check this little demo and set the brightness of the stars low:
http://planet3d.demonews.com/PWGravity3D.htm
« Last Edit: June 26, 2002, 11:26:55 PM by Peltopukki »
Against stupidity ..gods themselves.. fight in vain
- Isaac Asimov

Offline Mr. Kennedy
  • Resident Libertarian
  • Legendary Member
  • ******
  • Posts: 9110
  • Karma: +10/-0
More cookies for nVidia to eat for a snack
« Reply #2 on: June 27, 2002, 09:41:10 AM »
w00t!!!!

Go me...

<---- Owns a Ti4600
\"In the last 12 months 100,000 private sector jobs have been lost and yet you\'ve created 30,000 public sector jobs. Prime Minister, you cannot carry on forever squeezing the productive bit of the economy in order to fund an unprecidented engorgement of the unproductive bit. You cannot spend your way out of recession or borrow your way out of debt.\" - Daniel Hannan

Follow Me on Twitter!

 
