Originally posted by Black Samurai
Think about what you are saying. Do you really think it is better for game development when a company has to waste months of development time and millions of dollars to "experiment"? If there were a standard, we could be playing fourth- or fifth-generation games when a console launches. Why? Because with no need for experimenting, developers would already know the system: it would be the same architecture and development environment as all the previous consoles they worked on.
No, I don't think it's better. But DirectX isn't letting developers program directly to the metal; developers who want to stand out have the chance to do just that. Many tools are being created and can be distributed among developers, or even by Sony. Something similar to DirectX can happen with an architecture like the PS2's: developers can choose from a variety of tools, or, if they have the money, program directly to the hardware (big developers that can stand out). There is already a variety of tools on PS2. It's just that this was the first time developers programmed on such an architecture, so for the PS2 these problems were unavoidable.
PC developers have to design their games to work on as many rigs as possible. If they could say "f*ck it, let's target only the best PCs out there", PC games would look even better than they do now. That is why this is so good for console developers.
Ahm... comparing PCs with consoles when it comes to a developer's choice of hardware is kind of awkward.
Standard development architecture WITH a standard hardware layout is incredible. Do you know how many good games we'll see in a shorter timespan, and from more varied developers? Something like this would mean more, and maybe even cheaper, games.
As I said, PS2 introduced a new method. It needs time to evolve. If developers continue to support this kind of architecture and programming, you'll start seeing the difference I am talking about.
The only reason you see such a difference on PCs is because they evolve fast when it comes to hardware, faster than what PC games seem to show you. You think it's DirectX that makes the biggest difference? No. It's the hardware, and most of it is left untapped.
Originally posted by Paul
Err....was there any valid argument at all in this thread????
Making standard tools is good for everyone. Do you want a 5-year development cycle for each game?? By the time the game came out, it would be time for the next-gen console!!! (Not to mention the rising development cost, which consumers will have to bear eventually... do you want to pay 99 dollars or 49 dollars for a game??)
That's just the risk you pay when you introduce something new.
And yes, indeed, the DirectX "shit" is > PS2.
The PS2 is like a GF2 (DX7) with a higher poly count and much lower texture quality.
DX really rocks. It provides a backward-compatible platform for developers to work with, adding new features and functions each generation. What's wrong with that!!!!!
And usually by the 3rd year of a console's life span, the graphics improvement will have gone stale... like the PS2 now (GT4 is nice, but it is hardly any different from GT3).
Do you like Burnout 3, sunshine? Thank middleware. How many racing games on XBOX look as good as Burnout 3 or GT4? And XBOX is using DirectX tools, on a "GF2"? Now what's wrong with that?
It's funny that developing on PS2 helped bring a variety of visuals to XBOX once its titles went multiplatform.
And that's just the beginning. A new start is always hard. Developers have been accustomed to DirectX for more than a decade. And yet, comparing XBOX, the DirectX "GF2" super console (which appeared in late 2001), with what PS2 (hardware released in 1999) has offered, the gap isn't as huge as you make it seem.