PSX5Central
Playstation/Gaming Discussions => Gaming Discussion => Topic started by: TheSammer on November 05, 2001, 03:10:50 AM
-
Ok... ok... marketing: "it's not a PC"... "it's not a PC"... call it what you wish... but what I see is an Intel P3 700 with an Nvidia GPU and 64 MB of RAM shared with video memory.
PLUS:
- DirectX
- a sort of Windows OS.
So... if they ship DirectX N... or a new version of the video/audio drivers... or a patch for its Windows OS (ahem... remember the wrong DLL on the kiosks)... users must download/install it (like a PC).
The software houses can ship unfinished games because they can patch them later... buy today, play tomorrow... (like a PC).
bah... "it's not a PC" is only a marketing mantra.
The games are DirectX, so if they can run on a P3 700 with 64 MB of RAM (+ Nvidia GPU) + Windows, they will do the same (or better) on an Nvidia-GPU PC system.
I know the XBox is cheaper than a PC... but the XBox will have the same problems as a PC.
My 2 euro.
-
Originally posted by TheSammer
Ok... ok... marketing: "it's not a PC"... "it's not a PC"... call it what you wish... but what I see is an Intel P3 700 with an Nvidia GPU and 64 MB of RAM shared with video memory.
PLUS:
- DirectX
- a sort of Windows OS.
So... if they ship DirectX N... or a new version of the video/audio drivers... or a patch for its Windows OS (ahem... remember the wrong DLL on the kiosks)... users must download/install it (like a PC).
The software houses can ship unfinished games because they can patch them later... buy today, play tomorrow... (like a PC).
bah... "it's not a PC" is only a marketing mantra.
The games are DirectX, so if they can run on a P3 700 with 64 MB of RAM (+ Nvidia GPU) + Windows, they will do the same (or better) on an Nvidia-GPU PC system.
I know the XBox is cheaper than a PC... but the XBox will have the same problems as a PC.
My 2 euro.
First of all, welcome TheSammer :)
...only one thing... do you really care about the components of a console? ...when the hardware is so powerful and friendly to develop for? :)
...when a system has specs like these:
Xbox CPU (modified PIII Coppermine):
Clockspeed: 733MHz
Transistors: 28.1 million
Process: .18 micron
Die size: 10.29 mm x 10.29 mm (or 106 mm2)
Voltage: 1.1 - 1.7 Volts
L1 cache: 32KB (16KB for instructions + 16KB for data)
L2 cache: 128KB running at core clock (733MHz)
L2 cache bus width: 256-bit at 733MHz
L2 cache bandwidth: 11.7 GB/s
Front side bus: 64-bit at 133MHz
FSB bandwidth: 1.064 GB/s
Integer precision: 32-bit
Integer performance: 1985 MIPS
Floating-point precision: 64-bit
Floating-point performance: 2.9 GFLOPS
Registers: 128-bit
Xbox GPU (Nvidia NV2A):
Clockspeed: 233MHz
Transistors: 64 million
Process: .15 micron
Fully programmable vertex shaders: 2
Fully programmable pixel pipelines: 4
Max pixel fillrate: 932 Mpixels/s
Max texel fillrate: 1864 Mtexels/s
Max texture per pass: 4 textures
Memory: 128-bit, 200MHz DDR-SDRAM
Memory bandwidth: 6.4 GB/s
Supports:
S3TC (6:1)
Vertex compression
Z-compression (4:1)
Z occlusion culling (capable of rejecting 4 Gpixels/s)
Multi-sampling FSAA and Nvidia's HRAA (Quincunx)
Nvidia DASP (reduces memory latency)
Nvidia Crossbar Memory Controller (4 sub controllers each can process 64 bits of data per clock)
Modified DX8 (192 constants, vertex programs can be 136 instructions long)
Effects
Procedural deformation, geometry compression, morphing, fur rendering, skeletal animation, keyframe interpolation, per-vertex motion blur, layered/volumetric fog/smoke, environment bump-mapping, emboss bump-mapping, dot3 bump-mapping, anisotropic lighting, Phong shading, light and gloss maps, 8 hardware lights, reflection, refraction, shadows, spotlights, texture morphing, texture coordinate transformation, projected textures, etc.
Particle performance: 116.5 millions/s
Polygon performance:
1 infinite light, 1 texture: 79.2 Mpolys/s
1 infinite light, 2 textures: 65.3 Mpolys/s
1 infinite light, 2 spotlights, 2 textures: 18.7 Mpolys/s
Xbox MCPX:
Clockspeed: 200MHz
Transistors: 6 million
Process: .15 micron
Processing power: 6 billion operations per second
Max bus bandwidth: 800 MB/s
APU (audio processing unit): 1
DSP (digital signal processor): 2
Dolby Digital encoder: 1
Dolby Digital encoding in realtime: Yes
2D sound channels: 256
3D sound channels: 64
Positional 3D sound support: Yes (positional 3D sound is fake 3D sound using 2 stereo speakers)
Compression: ADPCM (4:1)
Fidelity: 16-bit, 48 KHz (CD quality)
DirectX Audio support: Yes
Effects: looping, enveloping, reflection, reverb, occlusion, obstruction, Doppler, etc. (to meet I3DL2 specifications)
Communication interfaces: USB, IEEE 1394, Ethernet, 56K, high-density I/F (hard drive, optical disc drive)
...and huge support from the best developers... who cares if the X-Box uses a modified P3 or an Nvidia GPU...
IMO, look at the X-Box as an ultra-powerful console... not as a mini PC!
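For what it's worth, several of those numbers can be re-derived from the others, which at least shows the spec sheet is internally consistent. A quick back-of-the-envelope check in Python (figures taken straight from the list above; this is just my own arithmetic, not an official source):
[code]
# Re-deriving some of the quoted Xbox figures from the spec list above

# FSB bandwidth: 64-bit bus at 133 MHz
fsb_bw = (64 / 8) * 133e6           # 8 bytes/transfer -> 1.064 GB/s, as listed

# GPU fill rates: 233 MHz core, 4 pixel pipelines, 2 textures per pipe
pixel_fill = 233e6 * 4              # 932 Mpixels/s, as listed
texel_fill = pixel_fill * 2         # 1864 Mtexels/s, as listed

# Memory bandwidth: 128-bit DDR at 200 MHz (DDR = 2 transfers per clock)
mem_bw = (128 / 8) * 200e6 * 2      # 6.4 GB/s, as listed

print(fsb_bw / 1e9, pixel_fill / 1e6, texel_fill / 1e6, mem_bw / 1e9)
[/code]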
...oops... another little thing...
The games are DirectX, so if they can run on a P3 700 with 64 MB of RAM (+ Nvidia GPU) + Windows, they will do the same (or better) on an Nvidia-GPU PC system.
well... maybe in 2 or 3 years... you forget that the X-Box is a fixed, balanced, optimized piece of hardware (unlike PCs) and developers will be able to fully push it to its limits (unlike PCs) :)
-
Here we go again... the same laughable "facts" Microsoft likes to throw around. Yeah sure, my ass.
"It's not a PC". It might not be a PC in terms of upgrading etc., but it sure is one in terms of components. Plus, an architecture that is over 20 years old and numbers that probably aren't even true. BTW: I heard the UMA is giving a lot of developers trouble... seems to be a big bottleneck.
-
Originally posted by seven
Here we go again... the same laughable "facts" Microsoft likes to throw around. Yeah sure, my ass.
"It's not a PC". It might not be a PC in terms of upgrading etc., but it sure is one in terms of components. Plus, an architecture that is over 20 years old and numbers that probably aren't even true. BTW: I heard the UMA is giving a lot of developers trouble... seems to be a big bottleneck.
Some of the X-Box games are self-explanatory ;)
(Have you seen Project Ego?)
BTW: I heard the UMA is giving a lot of developers trouble... seems to be a big bottleneck
Nope :) The Xbox has plenty of bandwidth to render even 50-60 mpps (million polygons per second) in game...
http://www.ddj.com/articles/2000/0008/0008a/0008as1.htm
-
Some of the X-Box games are self-explanatory
(Have you seen Project Ego?)
I never denied that games look good on X-Box. But it's a fact that the X-Box runs a very old architecture. The PS2 will show its true potential when developers get to use the parallel processing efficiently. (Have you seen BG: DA? Final Fantasy X?)
Nope :) The Xbox has plenty of bandwidth to render even 50-60 mpps in game...
Believe your articles, my friend. I am just quoting what developers have said. Separate theory from reality.
EDIT: okay, I didn't necessarily mean bottleneck, but that developers are having problems. I think it has more to do with memory management.
-
Originally posted by seven
I never denied that games look good on X-Box. But it's a fact that the X-Box runs a very old architecture.
...Power is what matters... not where it comes from...
The PS2 will show its true potential when developers get to use the parallel processing efficiently. (Have you seen BG: DA? Final Fantasy X?)
I know that, and I've seen FFX, J&D, BG: DA, etc... I'm a huge PS2 fan and I'm going to buy all these games ;)
Believe your articles, my friend. I am just quoting what developers have said. Separate theory from reality.
EDIT: okay, I didn't necessarily mean bottleneck, but that developers are having problems. I think it has more to do with memory management.
List all the developers (with links) who have problems with the UMA :D
...then I'll show you what most developers think about X-Box performance!
-
...Power is what matters... not where it comes from...
I don't call that power. If I were to believe Microsoft, then games should look 3 to 4 times better on X-Box now than on PS2. Besides, I think the X-Box is already pushing a lot of its power. I don't think we will see a big leap in graphics anytime soon. But whatever, I don't give a damn about the X-Box anyway.
List all the developers (with links) who have problems with the UMA
...then I'll show you what most developers think about X-Box performance!
I can't, because these opinions come from the developers themselves and not the marketing people above them. If you're interested, I was able to speak to some developers a little while back. One developer who works in some division of EA stated that they could not achieve the on-screen pixel fillrate that Microsoft is publishing. That also makes me wonder why there is no "grain" effect in SH2 for X-Box... BTW, I don't feel the need to back that up - I couldn't care less if they have problems or not. I am more than happy with my PS2 and no X-Box will change that. ;)
-
BizioEE, thank you for the welcome.
My opinion is that if they put an operating system on a "super console" that is the same as a PC's (maybe an embedded version), and it has the same hardware (yes, ultra-optimized) as a PC... well, I see 50% of an ultra-optimized PC.
The other 50%? Well... that depends on them.
If they manage software and patches like a PC system, the XBox will be 100% a little ultra-optimized PC.
If they DO NOT sell half-made games ("we will patch it tomorrow"), they will turn users into console gamers... but if they do... wow... the forums will be full of "argh!!! what version of this? what patch...?".
In my opinion, Microsoft and the PC software houses are not able to sell a complete product (or game) based on Windows/DirectX without patches. The fact is that in the PC world they sell beta-stage software. Why should it be different now?
We will see. But for now I don't trust them (I've already paid a lot in money and health).
-
hell, even the Dreamcast had a *shudder* micro$oft operating system
oh well
-
Oh yeah, welcome TheSammer!! Great to have you here.
Sorry, after BizioEE posted those tech-numbers, I just lost focus on the thread ;)
-
Hmmm... you're right. The Dreamcast also has some type of Windows operating system.
But it seems to me that there are 2 big differences:
1- developers can use it OR not use it to make games.
2- the DreamCast doesn't have a hard disk.
This makes me think: "why put a hard disk OBLIGATORILY in the XBox?". Hard disks have a price. If you can sell your console for $100 less without the hard drive...
Maybe it needs a hard disk to do some nice memory swapping. Or, as I said before: to patch and expand the software in the future.
This can be a good thing if used properly.
But it can be a very dangerous thing if used by "not so skilled" programmers. For example, you can have very detailed hi-resolution textures and environments, but at some point the system must swap memory to the HD and you see some "chop" in the FPS.
For example, try to run UT or Q3 on a 64 MB system with all the details on. (ahem... you should count on 32 MB of that being video memory).
Yes, yes, I know... the XBox is ultra-optimized compared to a PC, but when the system is out of memory, either you have designed the resource consumption very well or you swap (in a Windows system).
I don't know... but I think that if they could have spent less money on the XBox and sold it at a slightly lower price, they would have done it. Putting an HD obligatorily in the console makes me think they NEED the HD.
-
X-Box: just a PC in a box?
WHOLE FLAME WARS have been started over people saying less than that.
now let's see.
the Xbox
has a
Modified P3
GeForce 3
MS OS
and uses a newer version of DirectX
now where i have i seen that before..........
............
tu te tum tum
............
La la la la
............
*ooseven scratches his head and twiddles his thumbs*
............
oh come on, there must be something other than a Personal Computer
............
If I say PC I'll just get flamed
............
errrmmmm errmmmmmm please someone help me out, I am stuck, as the only thing I can think of is a PC...........
but then again I might be wrong.
oh ok, the DC... no no, that was just Windows CE.
oh I give in !
:(
-
now let's see.
the Xbox
has a
Modified P3
GeForce 3
MS OS
and uses a newer version of DirectX
...well... yes... when people in general look at these components they could think it's a mini PC... but... a gamer couldn't care less...
...when the hardware is so powerful and friendly to develop for, plus huge support from developers...
...I think the X-Box has the best launch line-up this generation, and the graphics of titles like Project Ego, DOA3, Halo, JSRF, Project Gotham and AFDS are impressive, considering the X-Box has still to be released... and don't reply (I'm speaking in general... not to you, ooseven :) ) with the usual statement: "oh... but X-Box is ultra easy to develop for and developers are already using almost all the power of the X-Box", because it's bull****... each console needs a lot of time to be fully pushed... even the simple PSX needed 4 years to be fully pushed... and the X-Box is no exception to the rule...
...yes... GC is the easiest... PS2 is the hardest to develop for... but all three consoles need time to be fully understood!
...I know... it's a little strange hearing all this from BizioEE's mouth... but that's what I really think now... I decided to set aside my dislike for Microsoft and consider the X-Box for what it is... a console... and BizioEE, as a gamer, can't skip the opportunity to play Great Games... because the X-Box will produce Great Games!
(The X-Box GPU is the NV2A... not a GeForce 3 :) )
-
BizioEE, or anyone else:
Since the X-Box uses a UMA (Unified Memory Architecture), where is all the data stored? We have 64 MB of RAM, I know that, but what data exactly does it need to display a picture on screen?
Just to point out: I won't answer this question myself, I'll let a more experienced X-Box fan answer it :) - because I think I know what the main problem of the X-Box architecture is and where developers may have problems. Hope this won't get too technical... :)
I am looking forward to your reply.
Phil
-
Originally posted by BizioEE
(The X-Box GPU is the NV2A... not a GeForce 3 :) )
Damn, there goes my plan to rip a GeForce 3 out of my mate's X-Box and put it in my PC :(
-
Originally posted by seven
BizioEE, or anyone else:
Since the X-Box uses a UMA (Unified Memory Architecture), where is all the data stored? We have 64 MB of RAM, I know that, but what data exactly does it need to display a picture on screen?
Effective Texturing Bandwidth
http://www.segatech.com/technical/consolecompare2/index.html
Just to point out: I won't answer this question myself, I'll let a more experienced X-Box fan answer it :) - because I think I know what the main problem of the X-Box architecture is and where developers may have problems. Hope this won't get too technical... :)
I am looking forward to your reply.
Texel Fill Rate, Rendering Bandwidth, Total Memory, Texture Space, Textures Per Frame...
http://www.segatech.com/technical/consolecompare/index.html
These links come from a reliable source... and they give you an idea of the X-Box's capabilities...
-
Originally posted by BizioEE
Effective Texturing Bandwidth
http://www.segatech.com/technical/consolecompare2/index.html
Texel Fill Rate, Rendering Bandwidth, Total Memory, Texture Space, Textures Per Frame...
http://www.segatech.com/technical/consolecompare/index.html
These links come from a reliable source... and they give you an idea of the X-Box's capabilities...
Damn and Blast!
I wish I could understand what all this techno-babble actually means!
I just play the games :(
-
Uhm...
------------------------------------------------------------
...
Xbox: 64 MB main memory, so 64 MB - 4 MB (code) - 4 MB (sound) - 3 MB (frame buffer and Z-buffer) - 8 MB (polygon data) leaves 45 MB for textures. S3TC texture compression at a ratio of 6:1 for 24-bit textures gives us a total of 270 MB of textures!
...
DC: 8 MB of graphics memory, so 8 MB - 1.2 MB (double-buffered 16-bit frame buffer) - ~2 MB (polygon data "infinite planes") leaves 5 MB for textures. The DC used VQ compression for texture data, so the 5 MB of graphics memory could hold 25 to 40 MB of textures depending on the compression ratio, which varied with the size of the textures being compressed. Let's assume 30 MB average for textures. Code and sound data were in separate memories on the DC, and did not impact the 8 MB of graphics memory.
...
GC: 24 MB of main memory, 1 MB of texture cache, and 2 MB for the frame buffer and Z-buffer, so 27 MB - 4 MB (code) - 3 MB (frame buffer and Z-buffer) - 8 MB (polygon data) leaves 12 MB for textures. S3TC texture compression at a ratio of 6:1 for 24-bit textures gives us a total of 72 MB of textures. The GC has 16 MB of sound memory to hold the 4 MB of sound in our example, and that memory's extra free space cannot hold textures.
.....
-------------------------------------------------------------
So in the end we have (texture memory):
DreamCast = 5 MB (not compressed) and 40 MB (compressed)
GameCube = 12 MB (not compressed) and 72 MB (compressed)
XBox = 45 MB (not compressed) and 270 MB (compressed).
If those numbers are right, on the PS2 we have:
32 MB main memory + 4 video + 2 sound + 2 I/O, for a total of something like:
PS2 = around 17 MB not compressed (the PS2 doesn't have HW texture compression).
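To make that budget arithmetic easy to check, here is a small sketch of the calculation the SegaTech quote is doing (all the per-category sizes are the quote's assumptions, not measurements of real games):
[code]
# Texture-budget arithmetic from the SegaTech quote (sizes in MB, assumed)
def texture_budget(total, code=4, sound=4, buffers=3, polys=8, ratio=1):
    free = total - code - sound - buffers - polys
    return free, free * ratio                # (raw MB, effective MB)

print(texture_budget(64, ratio=6))           # Xbox: (45, 270) with S3TC 6:1
print(texture_budget(27, sound=0, ratio=6))  # GC: (12, 72); sound RAM separate
[/code]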
Something doesn't work right here...
- this means that even games on the DreamCast (which can use texture compression) should have better, higher-res textures than the PS2.
- Games on the GC should destroy PS2 games in the texture department.
- Games on the XBox should be very very VERY highly detailed (in textures).
It seems to me that today things are not like that.
Another 2 things:
- how much memory does the XBox's Windows operating system eat to function properly?
- the XBox uses DDR RAM, the PS2 uses Rambus... now, as far as I know, Rambus has much more bandwidth than DDR... or not?
Who knows why they don't put 512 MB in a console? Today 512 MB of RAM doesn't cost much... maybe less than an HD.
-
BizioEE, thanks for the links. I'll analyze and reply to them later.
Who knows why they don't put 512 MB in a console? Today 512 MB of RAM doesn't cost much... maybe less than an HD.
Simple: when the PS2 was made, RAM was very expensive. Secondly, RAM with high bandwidth costs a lot. The PS2 is made for streaming to cut down on memory and cost, and anyone who works with graphics-intensive stuff will know that data is always on the move and should not be cached. 512 MB would be far too much and would be pure waste. Why? The bandwidth to get those textures over there is too small; again, data is changed very often, so a lot of that RAM would be "wasted space".
-
Just a few things, sorry, no links for them...
The Windows OS kernel is supposedly very small.
The PS2 can compress textures, but it is not a built-in (right term?) feature and requires at least some % loss in CPU power.
seven, wouldn't it be more correct to say it's best not to cache data in large chunks for graphically intensive stuff, rather than implying it should not be cached at all? If I might try to recoin a phrase, maybe it could be called a stack of needles vs. the old needles-in-the-haystack way of doing things. Perhaps this is what you are driving at with the hints about the shortcomings of the xbox. You may as well spit it out; while this forum has many fans looking forward to the xbox, there are hardly any here who are really tech savvy on the subject as far as I can tell. At least none that I've seen around for quite some time. Whatever you have to say that's negative about the xbox might be what it would take to draw them out.
I know next to nothing on the subject, just throwing out a little info I've managed to glean. Kick it around as you all see fit. Or not.
-
okay mates, let's get console-technical here:
I'll go through some of the stuff that I think is important and where the PS2 is treated unfairly or judged upon false assumptions. I'll take into consideration that this is a specific comparison between DC / GC / Xbox, but when someone has the guts to say "the PS2 is poor at texturing", then I just can't let that slide. :) So anyway, here it goes:
Why the Playstation 2 is (supposedly) so poor at texturing. Some facts to consider:
- The PS2 GPU's (GS) external bandwidth is 1.2 GB/s (64-bit @ 150 MHz bus to the CPU (EE))
- The PS2 GPU cannot deal with compressed textures over the above bus in real time, as the GPU has no hardware to deal with compressed textures. The CPU (EE) can support compressed textures, but cannot deliver those textures to the GPU compressed; it has to uncompress them, thus taking up lots of bandwidth over the GPU bus.
- We will assume 4 MB for code and 8 MB for polygon and lighting information, leaving 20 MB for textures out of the PS2's 32 MB of total main memory.
PS2: 1.2 GB/s / 20 MB = 60 FPS with 20 MB of textures per frame
A 60 FPS result is not good enough! Any polygon processing will affect that rate, so the PS2 has to either render even fewer textures per frame or have the game run at a lower frame rate. The more polygons the PS2 wants to render, the greater the impact on the amount of textures it can render each frame.
Note: the PS2's GPU has 4 MB of on-chip eDRAM, and after the frame buffers and Z-buffers take up 3 MB, that leaves 1 MB for textures. This extra 1 MB adds a little to the above result, but not a significant amount.
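The division being done there is simple, and worth sketching because the key point is that texture and polygon traffic share the same bus, so one eats into the other (a toy model using only the figures quoted above):
[code]
# Frame-rate ceiling implied by the quoted EE -> GS bus figure
GS_BUS = 1.2e9                               # bytes per second

def max_fps(texture_mb, polygon_mb=0.0):
    """Upper bound on FPS if this much data must cross the bus each frame."""
    return GS_BUS / ((texture_mb + polygon_mb) * 1e6)

print(max_fps(20))                           # 60 fps with 20 MB of textures
print(max_fps(20, polygon_mb=8))             # ~43 fps once polygons share the bus
[/code]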
I think one problem with a lot of technical overviews of the PS2 nowadays is, well, most people just don't understand the PlayStation 2 well enough or think too much in 'PC terms'. I have come across a lot of technical references on the PS2's architecture in the past from various sites that seem to know what they are talking about. Ars Technica is one of them, and I am basing a lot of my knowledge of the PS2 on their articles (on others as well).
One big difference is that the PlayStation 2 is a true dynamic media machine. That means thinking is done very differently: the 3D graphics the system has to process are very intense. Instead of being cached, data (textures and polygon information) is loaded, processed, sent to the GS in the form of display lists, and deleted. Since the data is not cached, it can be streamed at very high speeds and deleted as soon as it isn't needed anymore, only to be swapped with the next load of data.
That is why the above picture doesn't fit right. Assuming that polygons take up 8 MB and code 4 MB, leaving 20 MB for textures in main memory, is just not correct. As I already said, data is not being cached. As soon as the code is processed and not needed anymore, it is replaced by the next bit of data. Therefore, the conclusion of 1.2 GB/s (bandwidth) / 20 MB == 60 fps with 20 MB per frame assumes the PS2 works like the Xbox, GameCube or a PC. Thinking is done very differently on the PS2 board. I think the screens posted in the main forum (Stuntman and ICO) underline this very well and prove that the PS2 isn't weak in terms of texture quality.
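A toy model of the contrast being described (entirely illustrative, not real PS2 code): in a cache-style design the limit is how much fits in memory at once, while in a stream-style design the limit is how much fits through the bus every frame.
[code]
# "Cache once" vs "stream every frame" -- illustrative numbers only
CACHE_SIZE_MB = 32          # a caching design is limited by memory *size*
BUS_GBPS      = 3.2         # a streaming design is limited by bus *bandwidth*
                            # (3.2 GB/s is the PS2 main-RAM bus figure cited below)

def cache_style_ok(assets_mb):
    return sum(assets_mb) <= CACHE_SIZE_MB          # must all fit at once

def stream_style_ok(assets_mb, fps=60):
    return sum(assets_mb) / 1024 * fps <= BUS_GBPS  # must fit through per frame

level = [20, 15, 10]        # MB of per-frame data, made-up example
print(cache_style_ok(level))    # False: 45 MB doesn't fit in 32 MB
print(stream_style_ok(level))   # True: 45 MB * 60 fps ~ 2.6 GB/s < 3.2 GB/s
[/code]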
I'll go into the X-Box with the next reply... (I'm a bit too stressed out to reply to the whole article just right now... sorry)
-
seven, wouldn't it be more correct to say it's best not to cache data in large chunks for graphically intensive stuff, rather than implying it should not be cached at all? If I might try to recoin a phrase, maybe it could be called a stack of needles vs. the old needles-in-the-haystack way of doing things. Perhaps this is what you are driving at with the hints about the shortcomings of the xbox. You may as well spit it out; while this forum has many fans looking forward to the xbox, there are hardly any here who are really tech savvy on the subject as far as I can tell. At least none that I've seen around for quite some time. Whatever you have to say that's negative about the xbox might be what it would take to draw them out.
Correct. That's the main problem with the PC architecture in today's computers, and as it seems, both the Xbox and GameCube use a PC-like architecture. Of course, the good thing is that a lot of developers know their way around this architecture, but if you want to do it right, you will want to use something that is very dynamic in terms of processing (the PS2 sets a new standard in these terms).
If you look at the architecture of the PS2, it becomes pretty obvious that it wasn't built to store data for long (very small, but very fast caches and memory). Some developer pointed out that one group that had had PS2 development units for a while took the strategy of constantly downloading textures and models into the VUs and processors, instead of downloading them once, caching them, and working on them inside the cache (the PC way of thinking). This approach was running the 10-channel DMAC at 90% capacity! This kind of aggressive use of bandwidth resources is exactly the kind of thing PS2 developers will have to do if they want to get somewhere on the PS2 hardware. For your info, the DMAC (Direct Memory Access Controller) is in charge of managing the memory. It gets the data out of the RAM (over the 3.2 GB/s bus) and directs it to the right unit within the Emotion Engine. Since the EE does the processing and geometry calculations, the completed display lists can be sent over the 64-bit @ 147 MHz bus to the Graphics Synthesizer (explained above in my last post). Now, obviously, the PlayStation 2 has been designed very well for graphics-intense stuff - it's up to the developers to really get that power on screen. And for that, they really have to get away from the caching-data method that is used in today's PC architectures. Note that I am referring a lot to the PC architecture, but that's okay, since both the Xbox and GameCube use a similar architecture.
From Ars Technica:
The PS2 is the exact opposite, though. There's memory-to-processor bandwidth out the wazoo. The RIMMs are the cache, and the available bandwidth is such that you can get away with storing everything there and downloading it on the fly. So with the PS2, code and data have to be constantly streamed over the wide internal buses in order to arrive at the functional units right when they're needed. Of course, then the trick is scheduling the memory transfers so that you always have what you need on hand and latency doesn't kill you.
-
Originally posted by seven
okay mates, let's get console-technical here:
I'll go through some of the stuff that I think is important and where the PS2 is treated unfairly or judged upon false assumptions. I'll take into consideration that this is a specific comparison between DC / GC / Xbox, but when someone has the guts to say "the PS2 is poor at texturing", then I just can't let that slide. :) So anyway, here it goes:
I think one problem with a lot of technical overviews of the PS2 nowadays is, well, most people just don't understand the PlayStation 2 well enough or think too much in 'PC terms'. I have come across a lot of technical references on the PS2's architecture in the past from various sites that seem to know what they are talking about. Ars Technica is one of them, and I am basing a lot of my knowledge of the PS2 on their articles (on others as well).
One big difference is that the PlayStation 2 is a true dynamic media machine. That means thinking is done very differently: the 3D graphics the system has to process are very intense. Instead of being cached, data (textures and polygon information) is loaded, processed, sent to the GS in the form of display lists, and deleted. Since the data is not cached, it can be streamed at very high speeds and deleted as soon as it isn't needed anymore, only to be swapped with the next load of data.
That is why the above picture doesn't fit right. Assuming that polygons take up 8 MB and code 4 MB, leaving 20 MB for textures in main memory, is just not correct. As I already said, data is not being cached. As soon as the code is processed and not needed anymore, it is replaced by the next bit of data. Therefore, the conclusion of 1.2 GB/s (bandwidth) / 20 MB == 60 fps with 20 MB per frame assumes the PS2 works like the Xbox, GameCube or a PC. Thinking is done very differently on the PS2 board. I think the screens posted in the main forum (Stuntman and ICO) underline this very well and prove that the PS2 isn't weak in terms of texture quality.
Yes... there's something unclear about the texture capabilities of the PS2... I mean... how the hell could AM2 do VF4 on the PS2 (almost arcade perfect) if the console is only capable of 20 MB per frame at 60 frames/sec... the Naomi 2 uses two PowerVR2 (CLX2) GPUs with 32 MB of memory each, which is twice the amount that the PVR2 GPU had on the Naomi 1 board... and the DreamCast can do 30 MB of textures (compressed) per frame at 160 FPS...
Is there someone on this planet who can explain to me how competitive the PS2 is at texturing?
-
WOW!
seven, that was an answer, man.
I've only given it one read, but I understand the concept.
So, to sum up... the PS2 needs less memory space for texturing 'cause it doesn't store textures "forever" but uses them and replaces them when needed.
Ok... but another question:
Do we have textures compressed on DVD... then parts of them loaded when needed into some memory location... then decompressed and fired at the processing unit?
OR
Textures on DVD, decompressed directly when loaded and fired at the processing unit?
Uhm... maybe the first... 'cause DVD access would be too slow.
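Rough numbers back up that guess. Assuming the PS2 drive reads DVDs at roughly 4x (about 5.3 MB/s; an approximation on my part, not an official figure):
[code]
# Why textures can't be pulled straight from DVD every frame (rough figures)
dvd_mb_per_s = 4 * 1.32          # ~4x DVD speed, 1x being ~1.32 MB/s (assumed)

per_frame_mb = dvd_mb_per_s / 60 # ~0.09 MB per frame at 60 fps
print(per_frame_mb)              # vs. tens of MB of textures needed per frame,
                                 # so textures must be staged in RAM first
[/code]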
-
Good question :)
Since the Graphics Synthesizer can't deal with compressed textures, my guess is that they are only loaded into the main RAM right before they are needed. Therefore, you can always use most of the RAM to store the textures needed at the moment. Since the EE understands MPEG-2 compression (and the GS doesn't), my guess is that the images/textures are compressed on the DVD so that they don't take up much bandwidth when transferred to the RAM. Once a texture is in the RAM, the DMAC directs it to the right unit within the EE, which uncompresses the texture and sends it over the 64-bit bus to the GS, which displays it. Once it is sent over to the GS, my guess is that it isn't needed in main memory anymore -> which means the texture can now be deleted and replaced with the next load of textures. The result is simple: no data is cached for long (only until it isn't needed, which happens very quickly), so the buses connecting the units are always running at high speed.
Now, there is one big confusion going on at the moment, which I haven't figured out yet (due to the lack of references on this specific topic). Where are the images stored when being displayed? The VRAM is very small (only 4 MB) but it can be accessed at 48 GB/s (wow) over the 2560-bit bus. I guess that the data runs into the VRAM, is processed, and is then deleted. As soon as it's processed, it gets replaced with the new load of data. So, basically, the VRAM is a temporary memory that only holds the data that's being processed. Therefore, 4 MB is more than enough, since it's being accessed and processed very quickly.
-
Damn, this is turning into a real PlayStation 2 analysis - so sorry for the guys who aren't too interested. But since the PlayStation 2 is said to be weak in terms of textures, I want to have this said. (I'll get back to the Xbox and PC debate right after this one.)
PS2 = around 17 MB not compressed (the PS2 doesn't have HW texture compression).
Now I am not sure, but the above link is assuming that the PS2 doesn't use texture compression.
http://www.arstechnica.com/reviews/1q00/playstation2/figure2.gif
The above picture shows a rough view of the architecture of the EE. Note the DMAC and the IPU (Image Processing Unit) at the bottom. So my guess is (it's obviously speculation, since I am not a PS2 developer) that the RAM holds the MPEG-2-compressed textures. The DMAC loads them when needed into the IPU (the IPU is basically an MPEG-2 decoder with some other capabilities), where they are uncompressed. The DMAC then directs them over the main bus to the GIF (Graphics Interface), from where they are sent to the Graphics Synthesizer. This would mean that the RAM could hold compressed textures (anyone know what compression ratio MPEG-2 gets?), so the 20 MB stated above would be far too low. Compression at just 2:1 would already mean that the main RAM could hold 40 MB... and so on. Since the images are decompressed on the fly while on the way to the GS (and managed by a unit dedicated to this task), I don't see where the EE will take a hit. If you look at the above picture of the EE, you will see that the CPU core, FPU, VU0 and VU1 handle the geometry processing. I have also spoken to some developers, and the amazing thing about the Emotion Engine is that data can be sent to the GIF without the CPU core even knowing about it. The result: there is no hit on CPU power, since every unit handles its tasks independently. Another thing is that there is more than just one bus connecting the units. That means that textures travelling over the main internal bus (textures = big loads of data) are not held up by CPU code, since they have their own bus to get from unit to unit.
EDIT: MPEG-2 compression (Moving Pictures Experts Group) is actually more for compressing video etc. I am not quite sure how advanced it is and whether it can be used to compress textures as well. If I remember correctly, the Voodoo 3500 card has got some sort of MPEG-2 decoder that enables you to decompress data on the fly. If that's true, then I could imagine that the PS2 has this sort of feature as well. Anyway, it doesn't really make a big difference, since streaming remains the key to maximum performance (with or without texture compression). So on second thought: who needs texture compression if you can stream your data at such high speeds?
-
Uhm... I was thinking that the 4 MB of VRAM was used as video memory for triple buffering and such things. Maybe I was wrong and there's other memory where the 800*600*32-bit*3 pages are stored (I don't know where, perhaps in main memory).
I was thinking that the PS2 had a hardware MPEG decoder totally independent from the EE, but it seems to me that some type of software is needed to run films properly on the PS2. Some games have a strange MPEG "chop" in their introduction films that other games don't have. For example, the GT3 intro sequence is superb and fluid like any DVD film, but Bushido Blade "chops", and the Onimusha and Tekken intros also "chop" a little when the camera moves. I saw it; it's like they drop FPS. (argh! bad English, sorry).
The main memory of a PS2 is 32 MB. We can have something like 20 MB free to use as a "buffer" or "cache" or for MPEG-2-compressed textures and a decoder buffer, but the problem, in my opinion, is that there must be a space in memory where textures from the DVD are placed, maybe in little steps; I don't think it's possible to take MPEG-2-compressed textures randomly from the DVD, pass them to the decoder and then to the chip. That would all work except that DVD access is too slow.
If the textures are NOT compressed, things get very strange!
We can't count on some sort of DVD stream, for sure, and we can store only a very little part of it in main memory (20 MB).
So it's very strange here... we can't move textures from DVD very frequently (randomly), and we can't use only UNcompressed textures (with just 20 MB, it would seem that even a DreamCast game with compression would have bigger textures)... woooo... there should be some compression pass in the middle.
ARGH! How are they able to do Devil May Cry and GT3 using only 20 MB of uncompressed textures and some DVD reads?
-
seven, are you sure the GS can't handle compressed textures? I know it's not there in hardware, but I also know the GS is programmable, and its ability to be customized and its on-board memory have been underestimated or overlooked in some tech reviews, in a similar way to how the operation of the EE has been misunderstood.
For those like me who have trouble thoroughly understanding a lot of the technical info related to consoles, maybe the most important concept to get hold of is the fact that developers have so much room for exploring possibilities in the radical architecture the EE provides. The games we've seen of late are proof enough that the PS2 is showing itself to be more of a blessing than it has been a curse to developers.
TheSammer, your use of English is no worse than 90% of the Americans who use the internet, so don't worry about it.
-
Thanks for your info and thoughts, seven :)
...it would be interesting to have a forum dedicated to hardware and technical info, but... I'm not so sure it would be successful here :(
-
WOAH, I had a very interesting talk today with a colleague at work about image compression techniques (hardware and software), and something really interesting came out:
texture compression on the X-Box and GameCube might be done in hardware, but any compression of data takes time. If you check the link by SegaTech above about the texture amounts of the consoles, you will see something like:
GameCube: 12 MB (not compressed) == 72 MB (compressed) per frame (!)
X-Box: 45 MB (not compressed) == 270 MB (compressed) per frame (!)
Since the images have to be processed (a monitor can't display 3D; it has to be calculated into a 2D image), I figure that the images are being compressed on the fly (thanks to hardware compression). Now, who here believes that the X-Box can compress 270 megabytes (!) of data into 45 MB in 1/60 of a second (assuming the game runs at 60 fps)?
If I compress something on my PC (ZIP with WinAce), it takes several minutes for 100 MB. Yes, that's software compression, but nonetheless, whether it's done in hardware or software, it takes time to process big amounts of data like this. I mean, come on: 270 MB of data... into 45 MB in one sixtieth of a second. The above math is probably assuming that compressing and uncompressing textures does not take up any time at all... I would be very keen to know how they do it... :)
The other thing to note is: the RAM would hold image1 (uncompressed), compress it and save it again in the RAM (since it's a UMA). Image1 would then have to be deleted to free memory that isn't being used anymore... think about the time lost in this process... this might be the problem with the UMA on the X-Box.
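The throughput that per-frame compression would imply is easy to put a number on, which is really the point being made here (the scenario below is the one being doubted, not how S3TC is actually used):
[code]
# What compressing 270 MB of textures down to 45 MB *every frame* would imply
frame_time = 1 / 60              # seconds per frame at 60 fps
data_mb    = 270                 # uncompressed texture data, per the quote

print(data_mb / frame_time)      # 16200 MB/s of compression throughput --
                                 # implausible, which is why textures would
                                 # have to ship pre-compressed instead
[/code]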
BizioEE, TheSammer, Heretic (or anyone), any thoughts on this?
BTW: Yeah, a technical forum would be cool, but I think you're right... nobody here at these forums is really interested. But it would be better than the ongoing flaming or endless debates about how game 'X' looks better than game 'Y' on console 'Z'. :(
-
Hi seven... I understand your doubts about compression and decompression space & time... but I think they have the textures already compressed on the medium (DVD/hard disk).
Then they load something like 30 MB of these compressed textures into RAM. When they need to put a compressed texture on a surface, they send it to the chip, which will process it and put it on that surface.
If the chip doesn't put the compressed texture DIRECTLY on the poly surface (that means writing to video memory), they need an intermediate buffer for the uncompressed texture before the next stage.
But look at how a GeForce works in a PC.
They put compressed textures directly into the video card's memory. So I think they don't need an intermediate buffer... the chip works directly with the compressed textures and writes to video memory.
As for the problem of COMPRESSING the textures, well... they can store them already compressed on the medium, OR they can compress them when installing to the HD, OR they can compress them once at startup. But I don't think they compress them while the game is in the middle of running action.
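For a concrete sense of the 6:1 figure per texture: S3TC's common DXT1 layout stores a 4x4-pixel block in 8 bytes, i.e. 4 bits per pixel, which against 24-bit RGB gives exactly the quoted ratio (a sketch, with DXT1 as the assumed format):
[code]
# Footprint of one 512x512 texture, raw 24-bit RGB vs. S3TC/DXT1
w, h = 512, 512
raw_bytes  = w * h * 3           # 24-bit RGB: 786432 bytes (~768 KB)
dxt1_bytes = (w * h) // 2        # DXT1: 8 bytes per 4x4 block -> ~128 KB

print(raw_bytes / dxt1_bytes)    # 6.0 -- the 6:1 ratio quoted earlier
[/code]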
Well, I think THIS is very interesting, 'cause there is no sense in screaming mantras like "XBox is Better" or "PS2 will WIN" without knowing WHY something should be better or not.
My initial question was to understand why people say the XBox is not a PC... when I see very few differences (the UMA). My first warning is that Microsoft and the game software houses will use the XBox not only as a console but as a PC (patches, OS, unfinished products, different software versions in the field...).
But before saying one thing is better than another, I think we need to understand the "other".
We know in broad strokes how the XBox runs; do we know as well how the PS2 does the things some games do?
If we understand that there are HW limitations that mean things CANNOT be done, or we see that there is no problem because of a different approach, then it means something when someone says "the PS2 has a future" or "the PS2 can do better".
This forum is full of debating without know-how.
Maybe it's better to ASK than to scream pseudo-religious concepts.
IMO.
-
TheSammer, hope you don't mind a late reply from a person with no tech know-how to your earlier concern about the xbox being a PC. The reason for seeing little danger in there being patches for the xbox would be that programming for a fixed set of hardware and a single OS is supposed to eliminate many of the bugs that show up when trying to program a game to work on many different PC setups. You must have known that already though, right? While free added levels could be seen as a bonus, a patch/fix would kill any chance the xbox has left of not being thought of as a PC that only plays games.
Wouldn't ANY added time for compressed textures go a long way towards explaining why those listed texture specs don't add up to much net difference in texture looks between consoles? Anyway, it's become increasingly clear to me that the xbox specs have proven to be little more than a marketing ploy.
This may be wrong, but I remember the unified memory wasn't even listed when the first set of xbox specs was originally made public, which makes me think it wasn't considered until after they discovered the performance of the paper specs wasn't even coming close to being realized in the real world. The same goes for the fatter pipes. My guess as a tech ignoramus would be that the CPU and GPU are bumping heads trying to access data from the UMA at the same time, thereby filling those fat pipes with a hefty % of noise.
About the tech forum: I don't think it's so much a lack of interest on the part of the members here as it is a lack of expertise among the members. Also, the few times I have witnessed 'experts' elsewhere go back and forth, it isn't long before most of us onlookers get left in the dust with only a foggy sense of the points being argued and no real clue which speaker really knows their stuff. That's how it is for me, anyway. Unless one expert gets busted by the other and exposed as a poser on a point of fact that everyone else can go back and check, it's pretty easy to fool us plain old gamers. That's why most of us have come to rely on the games we can see being played as the only reliable gauge of performance. The screaming about system 'X' being so much better than system 'P' hardly gets whispered now compared to how it used to be.
-
Good point there... although I am pretty sure that the images have to go through some sort of calculation before they can be displayed on screen. Anyway, tech notes aside, I think the games now coming to the PS2 clearly prove that it's not weak in textures. I'll get back to the streaming issue if I find an article by some developer. I will also see if I can contact some of the developers at SCEE (Camden, the guys behind Dropship) and maybe they will have some impressions to share.
As for the X-Box being like a PC: I must agree, while it might not be used as a PC, I think the hardware and components are too PC-like to be ignored. I mean, it uses DirectX, it's got a slightly modified (and weak for graphics) Pentium III CPU, and a very powerful GPU (as in a PC too), with the only difference being that all this is shared over a UMA. Look, it's even got a built-in hard drive, which a lot of fanboys out there will jump for joy about. Hell, I even heard DoA3 copies a lot of data onto the hard drive when inserted for the first time. It will be very interesting to see what the hdd comes in more handy for: lazy programming (a fact on PCs), or really good use of the hardware. Time will show.
As for the launch games of the X-Box? Do they look good? Yes, I think so, but am I impressed? Definitely not. Many X-Box fanboys will try the argument that these are first-generation games and that the second generation will look much better, but this doesn't quite hold for the X-Box. Here's why:
1. It's very similar to a PC architecture.
2. Use of DirectX.
3. There is only one architecture. Not like with PCs.
Of course, the X-Box's graphics will get better with time, but it won't see such a big leap as we will with the PS2. I think the technical posts above about the PS2 prove how complex and different it is. As a result, first-gen games looked okay, with the second and third generations coming in fast and looking much, much better. Since the X-Box is very easy to develop for, I think it's safe to say that games like DoA3 are already pushing a lot out of its hardware. The PS2, on the other hand, is only at its beginning, and developers are slowly getting hold of it. It's getting better, and I think at this rate it won't take long until PS2 games are fully up to par with X-Box games. And maybe the PS2 will even surpass the X-Box one day... it's all up to the developers.
-
I was very busy these days, so I couldn't express my thoughts on these arguments...
texture compression on the X-Box and GameCube might be done in hardware, but any compression of data takes time. If you check the link by SegaTech above about the texture amounts of the consoles, you will see something like:
GameCube: 12 MB (not compressed) == 72 MB (compressed) per frame (!)
X-Box: 45 MB (not compressed) == 270 MB (compressed) per frame (!)
Since the images have to be processed (a monitor can't display 3D; it has to be calculated into a 2D image), I figure that the images are being compressed on the fly (thanks to hardware compression). Now, who here believes that the X-Box can compress 270 megabytes (!) of data into 45 MB in 1/60 of a second (assuming the game runs at 60 fps)?
If we assume that a typical level in a GC or Xbox game has these requirements: 4 MB for code, 4 MB for sound, 3 MB for frame buffer and Z-buffer, and 8 MB for polygons + lighting information... then "you have" 72 MB for the GC (at 183 FPS) and 270 MB for the X-Box (at 124 FPS)... but...
...there's something I can't understand :( ...are they assuming a rendering rate of 10 million polygons per second? ...if the answer is "yes"... what happens if you process, for example, 15 or... 20 mpps with more effects? Or with a lot more instructions per poly?
...you would need more than 8 MB for polys... and you could need more than 4 MB for sound...
...and... when they say you need 3 MB for the frame buffer and Z-buffer, are they considering the X-Box's Z-compression? ...if the answer is "no"... then the X-Box should have more RAM available for textures under the same conditions...
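That question can be made concrete with one assumption: if each polygon carries, say, 32 bytes of vertex data (a made-up figure, purely for illustration), the per-frame polygon load scales like this:
[code]
# How the polygon rate eats into the per-frame budget
BYTES_PER_POLY = 32                      # hypothetical figure, for illustration

def poly_mb_per_frame(mpps, fps=60):
    return mpps * 1e6 / fps * BYTES_PER_POLY / 1e6

print(poly_mb_per_frame(10))             # ~5.3 MB/frame at 10 mpps
print(poly_mb_per_frame(20))             # ~10.7 MB/frame -- past the quoted
                                         # 8 MB allowance, squeezing textures
[/code]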
Where's Dr Yassam when you need him? :(