
Author Topic: Xbox and PC  (Read 2107 times)

Offline BizioEE

  • Legendary Member
  • ******
  • Posts: 4530
  • Karma: +10/-0
Xbox and PC
« Reply #15 on: November 06, 2001, 01:32:26 PM »
Quote
Originally posted by seven
BizioEE or anyone else:

Since the X-Box uses a UMA (Unified Memory Architecture), where is all the data stored? We have 64 MB of RAM, I know that, but what data exactly does it need to display a picture on screen.


Effective Texturing Bandwidth

http://www.segatech.com/technical/consolecompare2/index.html

Quote

Just to point out: I won't answer this question myself, I'll let a more experienced X-Box fan answer this :) - because I think I know what the main problem of the X-Box architecture is and where developers may have problems. Hope this won't get too technical... :)

I am looking forward to your reply.


Texel Fill Rate, Rendering Bandwidth, Total Memory, Texture Space, Textures Per Frame...

http://www.segatech.com/technical/consolecompare/index.html

These links come from a reliable source... and they give you an idea of the X-Box's capabilities...
He has the power of both worlds
Girl: What power… beyond my expectations?
AND IT'S PERSONAL
Demon: No… the legendary Sparda!?
Dante: You're right, but I'm his son Dante!

Offline ooseven
  • The TRUE Scot'
  • Legendary Member
  • ******
  • Posts: 10105
  • Karma: +10/-0
Xbox and PC
« Reply #16 on: November 06, 2001, 02:06:23 PM »
Quote
Originally posted by BizioEE


Effective Texturing Bandwidth

http://www.segatech.com/technical/consolecompare2/index.html



Texel Fill Rate, Rendering Bandwidth, Total Memory, Texture Space, Textures Per Frame...

http://www.segatech.com/technical/consolecompare/index.html

These links come from a reliable source... and they give you an idea of the X-Box's capabilities...


Damn and blast!

I wish I could understand what all this techno-babble actually means!

I just play the games :(
“If you’re talking about sheep or goats, there could be some issues,”

Offline TheSammer
  • Newbie
  • *
  • Posts: 15
  • Karma: +10/-0
Xbox and PC
« Reply #17 on: November 07, 2001, 05:45:58 AM »
Uhm...

------------------------------------------------------------
...

Xbox: 64 MB main memory, so 64 MB - 4 MB (code) - 4 MB (sound) - 3 MB (frame buffer and Z-buffer) - 8 MB (polygon data) leaves 45 MB for textures. S3TC texture compression at a ratio of 6:1 for 24-bit textures gives us a total of 270 MB of textures!

...

DC: 8 MB of graphics memory, so 8 MB - 1.2 MB (double-buffered 16-bit frame buffer) - ~2 MB (polygon data "infinite planes") leaves 5 MB for textures. The DC used VQ compression for texture data, so the 5 MB of graphics memory could hold 25 to 40 MB of textures depending on the compression ratio, which varied with the size of the textures being compressed. Let's assume 30 MB average for textures. Code and sound data were in separate memories on the DC, and did not impact the 8 MB of graphics memory.

...

GC: 24 MB of main memory, 1 MB of texture cache, and 2 MB for the frame buffer and Z-buffer, so 27 MB - 4 MB (code) - 3 MB (frame buffer and Z-buffer) - 8 MB (polygon data) leaves 12 MB for textures. S3TC texture compression at a ratio of 6:1 for 24-bit textures gives us a total of 72 MB of textures. GC has 16 MB of sound memory to hold the 4 MB of sound in our example, and that memory's extra free space cannot hold textures.
 .....

-------------------------------------------------------------

So in the end we have (texture memory):

Dreamcast = 5 MB (uncompressed) and ~40 MB (compressed)
GameCube = 12 MB (uncompressed) and 72 MB (compressed)
Xbox = 45 MB (uncompressed) and 270 MB (compressed)

If those numbers are right, then on the PS2 we have:

32 MB main memory + 4 MB video + 2 MB sound + 2 MB I/O, for a total of something like:

PS2 = around 17 MB uncompressed (the PS2 doesn't have HW texture compression).
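
(To make the arithmetic above easy to check, here is a minimal Python sketch using the figures quoted in this thread; the memory reservations and compression ratios are the thread's assumptions, not measured values.)

```python
# Texture-budget arithmetic from the posts above, using the thread's own figures.
def texture_budget(total_mb, reserved_mb, compression_ratio=1.0):
    """Free texture memory after the listed reservations, plus its
    effective size at the assumed compression ratio."""
    free = total_mb - sum(reserved_mb)
    return free, free * compression_ratio

# Xbox: 64 MB UMA minus code, sound, frame/Z-buffer, polygon data; S3TC assumed 6:1.
print("Xbox:", texture_budget(64, [4, 4, 3, 8], 6))   # -> (45, 270)

# GC: 24 MB main + 1 MB texture cache + 2 MB frame/Z = 27 MB pool; S3TC assumed 6:1.
print("GC:", texture_budget(27, [4, 3, 8], 6))        # -> (12, 72)

# DC: 8 MB VRAM minus frame buffers and polygon data; VQ assumed ~6:1 on average.
print("DC:", texture_budget(8, [1.2, 2], 6))          # -> (4.8, ~29), the "30 MB"

# PS2: no hardware texture compression, so ratio 1:1; the 15 MB reservation is
# just whatever makes TheSammer's "around 17 MB" estimate come out.
print("PS2:", texture_budget(32, [15], 1))            # -> (17, 17)
```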

Something doesn't work right here...

- this means that even games on the Dreamcast (which can use texture compression) should have better, higher-res textures than the PS2.

- games on the GC should destroy PS2 games in the texture department.

- games on the Xbox should be very, very, VERY highly detailed (texture-wise).

It seems to me that things are not like that today.

Another two things:

- how much memory does the Xbox's Windows-based operating system eat to function properly?

- the Xbox uses DDR RAM, the PS2 uses Rambus... from what I know, Rambus has much more bandwidth than DDR... or not?

Does anyone know why they don't put 512 MB in a console? Today 512 MB of RAM doesn't cost much... maybe less than a hard drive.

Offline seven
  • conceptics Elitist
  • Hero Member
  • *****
  • Posts: 1743
  • Karma: +10/-0
    • http://www.conceptics.ch
Xbox and PC
« Reply #18 on: November 07, 2001, 09:25:59 AM »
BizioEE, thanks for the links. I'll analyze and reply to them later.

Quote
Does anyone know why they don't put 512 MB in a console? Today 512 MB of RAM doesn't cost much... maybe less than a hard drive.


Simple: when the PS2 was made, RAM was very expensive. Secondly, RAM with high bandwidth costs a lot. The PS2 is made for streaming to cut down on memory and cost, and anyone who works with graphics-intense stuff will know that data is always on the move and should not be cached. 512 MB would be far too much and a pure waste. Why? The bandwidth to get those textures over there is too small, and again, the data changes so often that a lot of it would just be wasted space.

Offline Heretic
  • Hero Member
  • *****
  • Posts: 641
  • Karma: +10/-0
Xbox and PC
« Reply #19 on: November 07, 2001, 12:21:58 PM »
Just a few things, sorry no links for them...

The Windows OS kernel is supposedly very small.

The PS2 can compress textures, but it is not a built-in (right term?) feature and costs at least some percentage of CPU power.

seven, wouldn't it be more correct to say it's best not to cache data in large chunks for graphically intensive stuff, rather than implying it should not be cached at all? If I might try to recoin a phrase, maybe it could be called a stack of needles versus the old needle-in-the-haystack way of doing things. Perhaps this is what you are driving at with the hints about the shortcomings of the Xbox. You may as well spit it out; while this forum has many fans looking forward to the Xbox, there are hardly any here who are really tech-savvy on the subject, as far as I can tell. At least none that I've seen around for quite some time. Whatever you have to say that's negative about the Xbox might be what it takes to draw them out.

I know next to nothing on the subject, just throwing out a little info I've managed to glean. Kick it around as you all see fit. Or not.

Offline seven
  • conceptics Elitist
  • Hero Member
  • *****
  • Posts: 1743
  • Karma: +10/-0
    • http://www.conceptics.ch
Xbox and PC
« Reply #20 on: November 07, 2001, 01:25:44 PM »
Okay mates, let's get console-technical here:

I'll go through some of the points that I think are important and where the PS2 is treated unfairly or judged on false assumptions. I'll take into consideration that this is a specific comparison between DC / GC / Xbox, but when someone has the guts to say "the PS2 is poor at texturing", I just can't let that get away. :) So anyway, here it goes:

Quote
Why the Playstation 2 is so poor at texturing. Some facts to consider:
- The PS2 GPU's (GS) external bandwidth is 1.2 GB/s (64-bit @ 150 MHz bus to the CPU (EE)).
- The PS2 GPU cannot deal with compressed textures over the above bus in real time, as it has no hardware for compressed textures. The CPU (EE) can support compressed textures, but cannot deliver them to the GPU compressed; it has to uncompress them first, taking up lots of bandwidth on the GPU bus.
- We will assume 4 MB for code and 8 MB for polygon and lighting information, leaving 20 MB for textures out of the PS2's 32 MB of total main memory.

PS2: 1.2 GB/s / 20 MB = 60 FPS with 20 MB of textures per frame

A 60 FPS result is not good enough! Any polygon processing will affect that rate, so the PS2 has to either render fewer textures per frame or run the game at a lower frame rate. The more polygons the PS2 wants to render, the greater the impact on the amount of textures it can render each frame.

Note: the PS2's GPU has 4 MB of on-chip eDRAM, and after the frame buffers and z-buffers take up 3 MB, that leaves 1 MB for textures. This extra 1 MB adds a little to the above result, but not a significant amount.
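
(For reference, the quoted 60 FPS figure is straight division; a minimal sketch of the claim's arithmetic, assuming every byte of the 20 MB texture set crosses the 1.2 GB/s EE-to-GS bus exactly once per frame.)

```python
# Upper bound on frame rate if the whole texture set is re-sent every frame.
BUS_BYTES_PER_SEC = 1.2e9    # 64-bit bus @ ~150 MHz between EE and GS, per the quote
TEXTURES_PER_FRAME = 20e6    # 20 MB of textures sent per frame, per the quote

max_fps = BUS_BYTES_PER_SEC / TEXTURES_PER_FRAME
print(f"Bus-limited ceiling: {max_fps:.0f} fps")   # 60 fps, the figure quoted above
```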


I think one problem with a lot of technical overviews of the PS2 nowadays is, well, most people just don't understand the PlayStation 2 well enough, or they think too much in 'PC terms'. I have come across a lot of technical references on the PS2's architecture in the past from various sites that seem to know what they are talking about. Ars Technica is one of them, and I am basing a lot of my knowledge of the PS2 on their articles (and on others as well).

One big difference is that the PlayStation 2 is a true dynamic media machine. That means the thinking is done very differently: the 3D graphics the system has to process are very intense. Instead of being cached, textures and polygon information are loaded, processed, sent to the GS in the form of display lists, and deleted. Since the data is not cached, it can be streamed at very high speeds and deleted as soon as it isn't needed anymore, only to be swapped for the next load of data.

That is why the above picture doesn't fit. Assuming that polygons take up 8 MB and code 4 MB, leaving 20 MB for textures in main memory, is just not correct. As I already said, data is not cached: as soon as a piece of code or data is processed and no longer needed, it is replaced by the next bit. Therefore, the conclusion that 1.2 GB/s (bandwidth) / 20 MB == 60 fps with 20 MB per frame assumes the PS2 works like the Xbox, GameCube or a PC. The thinking is done very differently on the PS2 board. I think the screens posted in the main forum (Stuntman and ICO) underline this very well and prove that the PS2 isn't weak in terms of texture quality.

I'll go into the X-Box in my next reply... (I'm a bit too stressed out to reply to the whole article right now... sorry)

Offline seven
  • conceptics Elitist
  • Hero Member
  • *****
  • Posts: 1743
  • Karma: +10/-0
    • http://www.conceptics.ch
Xbox and PC
« Reply #21 on: November 07, 2001, 01:43:52 PM »
Quote
seven, wouldn't it be more correct to say it's best not to cache data in large chunks for graphically intensive stuff, rather than implying it should not be cached at all? If I might try to recoin a phrase, maybe it could be called a stack of needles versus the old needle-in-the-haystack way of doing things. Perhaps this is what you are driving at with the hints about the shortcomings of the Xbox. You may as well spit it out; while this forum has many fans looking forward to the Xbox, there are hardly any here who are really tech-savvy on the subject, as far as I can tell. At least none that I've seen around for quite some time. Whatever you have to say that's negative about the Xbox might be what it takes to draw them out.


Correct. That's the main problem with the PC architecture in today's computers, and as it seems, both the Xbox and the GameCube use a PC-like architecture. Of course, the good thing is that a lot of developers know their way around this architecture, but if you want to do it right, you will want something very dynamic in terms of processing (the PS2 sets a new standard in these terms).

If you look at the architecture of the PS2, it becomes pretty obvious that it wasn't built to store data for long (very small, but very fast caches and memory). Some developer pointed out that one group that had had PS2 development units for a while took the strategy of constantly downloading textures and models into the VUs and processors, instead of downloading them once, caching them, and working on them inside the cache (the PC way of thinking). This approach was running the 10-channel DMAC at 90% capacity! This kind of aggressive use of bandwidth resources is exactly what PS2 developers will have to do if they want to get somewhere on the PS2 hardware. For your info, the DMAC (Direct Memory Access Controller) is in charge of managing memory traffic: it gets data out of the RAM (over the 3.2 GB/s bus) and directs it to the right unit within the Emotion Engine. Since the EE does the processing and geometry calculations, the completed display lists can be sent over the 64-bit @ 147 MHz bus to the Graphics Synthesizer (explained in my last post). Now, obviously, the PlayStation 2 has been designed very well for graphics-intense stuff - it's up to the developers to really get that power on screen. And for that, they really have to get away from the data-caching method used on today's PC architectures. Note that I am referring a lot to the PC architecture, but that's okay, since both the Xbox and the GameCube use a similar architecture.
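
(A toy sketch of the cache-versus-stream contrast described above, with hypothetical helper names; this is a pattern illustration, not actual PS2 code. The streaming style keeps only a tiny working set alive, so resident memory stays small while the transfer channels stay busy.)

```python
from collections import deque

def load(name):
    """Stand-in for a DMA transfer of an asset out of main RAM."""
    return f"<data:{name}>"

def send_to_gs(display_list):
    """Stand-in for kicking a finished display list to the Graphics Synthesizer."""
    pass

# "PC way": load each asset once, keep it resident, reuse it from the cache.
def render_cached(assets, cache):
    for name in assets:
        if name not in cache:
            cache[name] = load(name)
        send_to_gs(cache[name])

# "PS2 way" per the post: a two-slot working buffer, constantly refilled;
# each chunk is loaded, turned into a display list, sent, then overwritten.
def render_streamed(assets, slots=2):
    ring = deque(maxlen=slots)          # old entries fall out automatically
    for name in assets:
        ring.append(load(name))         # DMA the next chunk in
        send_to_gs(ring[-1])            # use it immediately; nothing is kept around

render_cached(["floor", "wall", "sky"], cache={})
render_streamed(["floor", "wall", "sky"])
```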

Quote
by Ars Technica:
The PS2 is the exact opposite, though. There's memory-to-processor bandwidth out the wazoo. The RIMMs are the cache, and the available bandwidth is such that you can get away with storing everything there and downloading it on the fly. So with the PS2, code and data have to be constantly streamed over the wide internal buses in order to arrive at the functional units right when they're needed. Of course, the trick then is scheduling the memory transfers so that you always have what you need on hand and latency doesn't kill you.

Offline BizioEE

  • Legendary Member
  • ******
  • Posts: 4530
  • Karma: +10/-0
Xbox and PC
« Reply #22 on: November 07, 2001, 01:55:21 PM »
Quote
Originally posted by seven
Okay mates, let's get console-technical here:

I'll go through some of the points that I think are important and where the PS2 is treated unfairly or judged on false assumptions. I'll take into consideration that this is a specific comparison between DC / GC / Xbox, but when someone has the guts to say "the PS2 is poor at texturing", I just can't let that get away. :) So anyway, here it goes:

I think one problem with a lot of technical overviews of the PS2 nowadays is, well, most people just don't understand the PlayStation 2 well enough, or they think too much in 'PC terms'. I have come across a lot of technical references on the PS2's architecture in the past from various sites that seem to know what they are talking about. Ars Technica is one of them, and I am basing a lot of my knowledge of the PS2 on their articles (and on others as well).

One big difference is that the PlayStation 2 is a true dynamic media machine. That means the thinking is done very differently: the 3D graphics the system has to process are very intense. Instead of being cached, textures and polygon information are loaded, processed, sent to the GS in the form of display lists, and deleted. Since the data is not cached, it can be streamed at very high speeds and deleted as soon as it isn't needed anymore, only to be swapped for the next load of data.

That is why the above picture doesn't fit. Assuming that polygons take up 8 MB and code 4 MB, leaving 20 MB for textures in main memory, is just not correct. As I already said, data is not cached: as soon as a piece of code or data is processed and no longer needed, it is replaced by the next bit. Therefore, the conclusion that 1.2 GB/s (bandwidth) / 20 MB == 60 fps with 20 MB per frame assumes the PS2 works like the Xbox, GameCube or a PC. The thinking is done very differently on the PS2 board. I think the screens posted in the main forum (Stuntman and ICO) underline this very well and prove that the PS2 isn't weak in terms of texture quality.



Yes... there's something unclear about the texture capabilities of the PS2... I mean... how the hell could AM2 do VF4 on the PS2 (almost arcade-perfect) if the console is only capable of 20 MB per frame at 60 frames/sec... the Naomi 2 uses two PowerVR2 (CLX2) GPUs with 32 MB of memory each, which is twice the amount the PVR2 GPU had on the Naomi 1 board... and the Dreamcast can do 30 MB of textures (compressed) per frame at 160 FPS...

Is there someone on this planet who can explain to me how competitive the PS2 really is at texturing?
He has the power of both worlds
Girl: What power… beyond my expectations?
AND IT'S PERSONAL
Demon: No… the legendary Sparda!?
Dante: You're right, but I'm his son Dante!

Offline TheSammer
  • Newbie
  • *
  • Posts: 15
  • Karma: +10/-0
Xbox and PC
« Reply #23 on: November 07, 2001, 03:38:02 PM »
WOW!
seven, that was an answer, man.

I've only given it one read, but I understand the concept.

So, to sum up... the PS2 needs less memory space for texturing 'cause it doesn't store textures "forever" but uses them and replaces them when needed.

Ok... but another question:

So do we have textures compressed on DVD... with parts loaded into the same memory location when needed... then decompressed and fired off to the processing unit?
OR
Textures on DVD, decompressed directly as they are loaded and fired off to the processing unit?

Uhm... probably the first... 'cause DVD access would be too slow.

Offline seven
  • conceptics Elitist
  • Hero Member
  • *****
  • Posts: 1743
  • Karma: +10/-0
    • http://www.conceptics.ch
Xbox and PC
« Reply #24 on: November 07, 2001, 11:48:20 PM »
Good question :)
Since the Graphics Synthesizer can't deal with compressed textures, my guess is that they are only loaded into main RAM right before they are needed. Therefore, you can always use most of the RAM to store the textures needed at the moment. Since the EE understands MPEG-2 compression (and the GS doesn't), my guess is that the images/textures are compressed on DVD so that they don't take up much bandwidth when transferred to the RAM. Once a texture is in the RAM, the DMAC directs it to the right unit within the EE, which uncompresses it and sends it over the 64-bit bus to the GS, which displays it. Once it is sent over to the GS, my guess is that it isn't needed in main memory anymore -> which means the texture can be deleted and replaced with the next load of textures. The result is simple: no data is cached for long (only until it isn't needed, which is very quick), so the buses connecting each unit are always running at high speed.

Now, there is one big confusion going on at the moment, which I haven't figured out yet (due to the lack of references on this specific topic): where are the images stored while being displayed? The VRAM is very small (only 4 MB), but it can be accessed at 48 GB/s (wow) over the 2560-bit bus. I guess the data runs into the VRAM, is processed, and is then deleted. As soon as it's processed, it gets replaced with the next load of data. So, basically, the VRAM is a temporary memory that only holds the data being processed at that moment. Therefore, 4 MB is more than enough, since it is accessed and processed very quickly.
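
(As a quick sanity check on that 48 GB/s figure; a sketch in which the clock is an assumption, taken from the "147 MHz" bus speed mentioned earlier in this thread.)

```python
# eDRAM bandwidth = bus width in bytes * clock rate.
BUS_WIDTH_BITS = 2560   # the 2560-bit bus mentioned above
CLOCK_HZ = 147e6        # assumed GS clock, per the "147 MHz" earlier in the thread

bandwidth_gb_s = BUS_WIDTH_BITS / 8 * CLOCK_HZ / 1e9
print(f"{bandwidth_gb_s:.0f} GB/s")   # ~47 GB/s, i.e. the quoted "48 GB/s (wow)"
```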

Offline seven
  • conceptics Elitist
  • Hero Member
  • *****
  • Posts: 1743
  • Karma: +10/-0
    • http://www.conceptics.ch
Xbox and PC
« Reply #25 on: November 08, 2001, 12:22:39 AM »
Damn, this is turning into a real PlayStation 2 analysis - sorry to the guys who aren't too interested. But since the PlayStation 2 is said to be weak in terms of textures, I want to have this said. (I'll get back to the Xbox and PC debate right after this.)

Quote
PS2 = around 17 MB uncompressed (the PS2 doesn't have HW texture compression).


Now I am not sure, but the above link is assuming that the PS2 doesn't use texture compression.

[image: rough block diagram of the Emotion Engine]

The picture above shows a rough view of the architecture of the EE. Note the DMAC and the IPU (Image Processing Unit) at the bottom. So my guess (it's obviously speculation, since I am not a PS2 developer) is that the RAM holds the MPEG-2 compressed textures. The DMAC loads them, when needed, into the IPU (basically an MPEG-2 decoder with some other capabilities), where they are uncompressed. The DMAC then directs them over the main bus to the GIF (Graphics Interface), from where they are sent to the Graphics Synthesizer. This would mean that the RAM could hold compressed textures (anyone know what compression ratio MPEG-2 gets?), so the 20 MB stated above would be far too little. A compression ratio of 2:1 would already mean that the main RAM could hold 40 MB... and so on. Since the images are decompressed on the fly on the way to the GS (managed by a unit dedicated to this task), I don't see where the EE would take a hit. If you look at the picture of the EE above, you will see that the CPU core, FPU, VU0 and VU1 handle the geometry processing. I have also spoken to some developers, and the amazing thing about the Emotion Engine is that data can be sent to the GIF without the CPU core even knowing about it. The result: there is no hit on CPU power, since every unit handles its tasks independently. Another thing is that there is more than just one bus connecting the units. That means textures travelling over the main internal bus (textures = big loads of data) are not held up by CPU code, since they have their own bus to get from unit to unit.

EDIT: MPEG-2 compression (Moving Picture Experts Group) is actually meant more for compressing video etc. I am not quite sure how advanced it is and whether it can be used to compress textures as well. If I remember correctly, the Voodoo 3500 card has some sort of MPEG-2 decoder that lets you decompress data on the fly. If that's true, then I could imagine the PS2 having this sort of feature as well. Anyway, it doesn't really make a big difference, since streaming remains the key to maximum performance (with or without texture compression). So, on second thought: who needs texture compression if you can stream your data at such high speeds?
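
(For what it's worth, the capacity speculation above is linear in whatever ratio such a hypothetical on-the-fly decoder would achieve; a quick sweep over the 20 MB budget assumed earlier in the thread.)

```python
# Effective texture capacity of the assumed 20 MB main-RAM budget at various ratios.
FREE_RAM_MB = 20

for ratio in (1, 2, 3, 4, 6):
    print(f"{ratio}:1 compression -> {FREE_RAM_MB * ratio} MB effective")
# 1:1 -> 20 MB (no compression), 2:1 -> 40 MB (the figure above), ..., 6:1 -> 120 MB
```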

Offline TheSammer
  • Newbie
  • *
  • Posts: 15
  • Karma: +10/-0
Xbox and PC
« Reply #26 on: November 08, 2001, 04:55:18 AM »
Uhm... I was thinking that the 4 MB of VRAM was used as video memory for triple buffering and such things. Maybe I was wrong and there's other memory where the 800*600*32-bit*3 pages are stored (I don't know where, perhaps in main memory).

I was thinking that the PS2 had a hardware MPEG decoder totally independent from the EE, but it seems to me that some type of software is needed to run films properly on the PS2. Some games have a strange MPEG 'chop' in their introduction films that other games don't have. For example, the GT3 intro sequence is superb and fluid like any DVD film, but Bushido Blade 'chops', and the Onimusha and Tekken intros also 'chop' a little when the camera moves. I've seen it, like they drop frames. (argh! bad English, sorry)

The main memory of the PS2 is 32 MB. We can have something like 20 MB free to use as a "buffer" or "cache" for MPEG-2 compressed textures and the decoder, but the problem is that, in my opinion, you need a space in memory to stage textures from the DVD disc, maybe in little steps; I don't think it's possible to take MPEG-2 compressed textures randomly from the DVD, pass them to the decoder and then to the chip. Everything else works, but the DVD access is too slow.

If the textures are NOT compressed, things get very strange!
We can't rely on some sort of DVD streaming for sure, and we can store only a very small part of them in main memory (20 MB).

So, very strange here... we can't move textures from DVD very frequently (randomly), and we can't use only uncompressed textures (with 20 MB, it seems a Dreamcast game with big compressed textures would keep up)... woooo... there must be some compression pass in the middle.

ARGH! How are they able to do Devil May Cry and GT3 using only 20 MB of uncompressed textures and some DVD reads?

Offline Heretic
  • Hero Member
  • *****
  • Posts: 641
  • Karma: +10/-0
Xbox and PC
« Reply #27 on: November 08, 2001, 09:49:03 AM »
seven, are you sure the GS can't handle compressed textures? I know it's not there in hardware, but I also know the GS is programmable, and its ability to be customized and its on-board memory have been underestimated or overlooked in some tech reviews, in a similar way to how the operation of the EE has been misunderstood.

For those like me who have trouble thoroughly understanding a lot of the technical info related to consoles, maybe the most important concept to grasp is that developers have so much room for exploring possibilities in the radical architecture the EE provides. The games we've seen of late are proof enough that the PS2 is showing itself to be more of a blessing than a curse to developers.

TheSammer, your use of English is no worse than 90% of Americans who use the internet, so don't worry about it.

Offline BizioEE

  • Legendary Member
  • ******
  • Posts: 4530
  • Karma: +10/-0
Xbox and PC
« Reply #28 on: November 09, 2001, 03:16:50 AM »
Thanks for your info and thoughts, seven :)

...it would be interesting to have a forum dedicated to hardware and technical info, but... I'm not so sure it would be successful here :(
He has the power of both worlds
Girl: What power… beyond my expectations?
AND IT'S PERSONAL
Demon: No… the legendary Sparda!?
Dante: You're right, but I'm his son Dante!

Offline seven
  • conceptics Elitist
  • Hero Member
  • *****
  • Posts: 1743
  • Karma: +10/-0
    • http://www.conceptics.ch
Xbox and PC
« Reply #29 on: November 09, 2001, 08:32:05 AM »
WOAH, I had a very interesting talk today with a colleague at work about image compression techniques (hardware and software), and something really interesting came out of it:

Texture compression on the X-Box and GameCube might be done in hardware, but any compression of data takes time. If you check the SegaTech link above about the texture amounts of the consoles, you will see something like:

GameCube: 12 MB (uncompressed) == 72 MB (compressed) per frame (!)
X-Box: 45 MB (uncompressed) == 270 MB (compressed) per frame (!)

Since the images have to be processed (a monitor can't display 3D; the scene has to be calculated down to a 2D image), I figure the images would have to be compressed on the fly (thanks to hardware compression). Now, who here believes that the X-Box can compress 270 megabytes (!) of data into 45 MB in 1/60 of a second (assuming the game runs at 60 fps)?

If I compress something on my PC (ZIP with WinAce), 100 MB takes me several minutes. Yes, that's software compression, but nonetheless, whether it's done in hardware or software, it takes time to process big amounts of data like this. I mean, come on: 270 MB of data... into 45 MB in one sixtieth of a second. The math above probably assumes that compressing and uncompressing textures takes no time at all... I would be very keen to know how they do it... :)
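
(Taking the scenario above at face value, a 270 MB texture set squeezed down to 45 MB once per frame at 60 fps, the implied compressor throughput is easy to put a number on; a minimal sketch.)

```python
# Input rate a compressor would need to chew through 270 MB each 1/60 s frame.
UNCOMPRESSED_MB_PER_FRAME = 270
FPS = 60

required_gb_s = UNCOMPRESSED_MB_PER_FRAME * FPS / 1000   # MB/s -> GB/s
print(f"Implied input rate: {required_gb_s:.1f} GB/s")   # 16.2 GB/s, sustained
```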

The other thing to note: the RAM would hold image1 (uncompressed), compress it and save it again in the RAM (since it's a UMA). Image1 would then have to be deleted to free memory that isn't being used anymore... think about the time lost in this process... this might be the problem with the UMA on the X-Box.

BizioEE, TheSammer, Heretic (or anyone), any thoughts on this?

BTW: Yeah, a technical forum would be cool, but I think you're right... nobody here at these forums is really interested. Still, it would be better than the ongoing flaming or the endless debates about how game 'X' looks better than game 'Y' on console 'Z'. :(

 
