Why are PC games slower than console games? I heard the PS2 "Emotion Engine" processor is only able to transfer 400 MB/s, so why is my 1.7 GHz Intel slow when playing a game?
Stupid programmers? Missing CPU instructions, or non-optimized code?
Posted on 2005-08-05 07:13:18 by realvampire
Specialized processors (the PlayStation 2 has a *very* funky design, and is hard to program), as well as optimized code. MHz count isn't everything, and neither is bandwidth (if your calculations are so heavy that you don't utilize the full bandwidth, the full bandwidth obviously isn't necessary).

I thought the PS2 had relatively high memory bandwidth btw, since it's Rambus RAM.
Posted on 2005-08-05 07:58:32 by f0dder
There aren't the levels of indirection on a console.
DirectX on the XBox can be written knowing it's using the nVidia chipset, so the level of abstraction in the Hardware Abstraction Layer is thinner.
They never need to worry about whether or not method A is faster on chipset X, vs. method B on chipset Y, or a compromise that works OK on both. There's a definitive answer.
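What having "a definitive answer" buys you can be sketched in a few lines of C. The two vendor IDs below are the real PCI IDs for nVidia and ATI, but the render paths and the selection function are hypothetical, just to illustrate the kind of runtime decision a PC engine has to make:

```c
enum { VENDOR_NVIDIA = 0x10DE, VENDOR_ATI = 0x1002 };

typedef enum { PATH_A, PATH_B, PATH_COMPROMISE } render_path;

/* A PC engine has to pick a path per chipset at runtime;
   a console engine just hard-codes one, because the hardware
   never varies between machines. */
static render_path pick_render_path(unsigned vendor_id)
{
    switch (vendor_id) {
    case VENDOR_NVIDIA: return PATH_A;          /* fastest on chipset X */
    case VENDOR_ATI:    return PATH_B;          /* fastest on chipset Y */
    default:            return PATH_COMPROMISE; /* works OK on both */
    }
}
```

On a console the `switch` disappears entirely, along with the testing burden of every branch.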

Also, in terms of the traditional frame buffer: on a PC this is just an address range on your card, and has to be, because you could run at 320x200 or at 1600x1200. On a console (both PS3 and XBox360 do this, not so sure about PS2) they put in embedded DRAM which is dedicated and only just big enough for the frame buffer.
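To put rough numbers on that resolution range (plain arithmetic, assuming 8-bit pixels at the low end and 32-bit at the high end):

```c
/* Bytes needed for one frame buffer at a given resolution and depth. */
static unsigned long fb_bytes(unsigned w, unsigned h, unsigned bytes_per_pixel)
{
    return (unsigned long)w * h * bytes_per_pixel;
}

/* 320x200 at 8 bpp    ->    64,000 bytes (~62 KB)
   1600x1200 at 32 bpp -> 7,680,000 bytes (~7.3 MB)
   A >100x spread - which is why a PC card can't dedicate a small
   fixed block of embedded DRAM the way a console can. */
```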

It's the constantly shifting target of the PC vs. the console, which stays static for several years.

Mirno
Posted on 2005-08-05 08:46:54 by Mirno
On PS1/PS2, there is one framebuffer, where you store the frontbuf, backbuf, textures AND texture palettes. Accessing the framebuffer directly is not possible (at least on PSX). Resolutions get changed rather often, especially if there's an FMV to be shown. The coder chooses where exactly the frontbuf, backbuf, textures and palettes are to be stored (x/y offset). iirc you could only upload texture data, palettes, and requests to decode an FMV frame to the video memory.
Anyway, this is for the PSX/PS2.
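A sketch of that single shared VRAM: the PSX really does have one 1024x512 region of 16-bit pixels (1 MB), but the particular placements below are made-up examples, not a real game's layout:

```c
/* One shared video memory region; frontbuf, backbuf, textures and
   palettes all live here at coder-chosen x/y offsets. */
enum { VRAM_W = 1024, VRAM_H = 512, VRAM_BPP = 2 }; /* 1 MB total */

typedef struct { unsigned x, y, w, h; } vram_rect;

/* Hypothetical layout: two 320x240 display buffers stacked
   vertically, with the rest left for textures and palettes. */
static const vram_rect frontbuf = {   0,   0, 320, 240 };
static const vram_rect backbuf  = {   0, 240, 320, 240 };
static const vram_rect textures = { 320,   0, 704, 512 };

/* Every placement must stay inside the one shared region. */
static int fits_in_vram(vram_rect r)
{
    return r.x + r.w <= VRAM_W && r.y + r.h <= VRAM_H;
}
```

Changing display resolution (say, for an FMV) means re-planning these rectangles by hand - there's no driver doing it for you.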

Mirno's answer is the definitive one on this topic. The whole HAL gets as thin as a simple lib. The game console coders get no pressure from compatibility issues, and thus they have time to put even more features into their games. And as a plus they get fewer bugs. But they can't do Valve's trick of "hoping for faster cpus/gpus" so that their games run smooth :)

Ah, and also - yes, most PC game coders are not that good at optimization. The HAL slows things down a bit, too. And the wide range of graphics cards brings more confusion about how to solve problems - for instance, ATI's newest cards can run slower with games that frequently lock textures and such.
Posted on 2005-08-05 10:29:00 by Ultrano

But they can't do Valve's trick of "hoping for faster cpus/gpus" so that their games run smooth

That's more of an id Software thing to do - scali successfully ran HL2 on, what was it, an underpowered PII with an R8500 card? Low detail etc., but it ran ;)
Posted on 2005-08-05 12:48:15 by f0dder
Don't forget that PC Operating Systems are usually running a whole bunch of other processes and services that eat up processing time, while consoles process almost only the game (little OS overhead).
Posted on 2005-08-05 15:47:36 by SpooK

Don't forget that PC Operating Systems are usually running a whole bunch of other processes and services that eat up processing time

True - while the majority of those threads are suspended, taking no processor time whatsoever, there's still quite a number that run at least periodically.
Posted on 2005-08-05 18:09:24 by f0dder
wow this right after I finished playing Goldeneye: Rogue Agent on my Gamecube lol. I was just wondering why the gfx were so good, man I kicked butt :D

Also, the console h/w engineers aren't bound by the standardization that the PC has. They can make radical changes to the architecture of the machine with each generation. Also, as f0dder pointed out, consoles have special processors. The PS2 has dedicated RISC vector units on top of the EE core.
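The kind of work those vector units exist for is 4-wide float math. Sketched here in plain scalar C for illustration only (real VU code is written in its own assembly; the function name is made up):

```c
typedef struct { float v[4]; } vec4;

/* Per-component multiply-add: r = a*b + c. On the PS2's vector
   units this is essentially one instruction over a whole
   4-float register, instead of four scalar operations. */
static vec4 vec4_madd(vec4 a, vec4 b, vec4 c)
{
    vec4 r;
    for (int i = 0; i < 4; i++)
        r.v[i] = a.v[i] * b.v[i] + c.v[i];
    return r;
}
```

Transforming vertices is wall-to-wall operations like this, which is why dedicated vector hardware pays off so well in games.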
Posted on 2005-08-07 18:49:34 by x86asm
Also, the console h/w engineers aren't bound by the standardization that the PC has. They can make radical changes to the architecture of the machine with each generation. Also, as f0dder pointed out, consoles have special processors. The PS2 has dedicated RISC vector units on top of the EE core.


Special processors are not always advantageous. PS2 is a perfect example. By all reports I've come across, the 'Emotion Engine' is absolute hell to code for.
Posted on 2005-08-08 17:16:23 by invalid
What about the XBox 360 - what processor is it using? And are the PS3 and XBox 360 using the x86 ISA, or do they have their own?
Posted on 2005-08-26 22:26:16 by realvampire

What about the XBox 360 - what processor is it using? And are the PS3 and XBox 360 using the x86 ISA, or do they have their own?


http://www.xbox.com/en-US/xbox360/factsheet.htm... The XBox 360 is based on the IBM PowerPC.

As far as I know, the PS3 will have backwards compatibility... nothing has been mentioned about the XBox 360 running XBox games (emulation or native).
Posted on 2005-08-26 22:44:21 by SpooK

nothing has been mentioned about the XBox 360 running XBox games (emulation or native).

Dunno if it was an official statement or not, but I heard that the xbox360 will have backwards compatibility with "most major games".

I'm assuming that this will be done through emulation (wonder why MS bought VirtualPC?), with the addition of some game-specific patches (thus the "most major games"). Those are the gripes you get from changing the hardware architecture around entirely :)
Posted on 2005-08-27 00:39:01 by f0dder


What about the XBox 360 - what processor is it using? And are the PS3 and XBox 360 using the x86 ISA, or do they have their own?


http://www.xbox.com/en-US/xbox360/factsheet.htm... The XBox 360 is based on the IBM PowerPC.

As far as I know, the PS3 will have backwards compatibility... nothing has been mentioned about the XBox 360 running XBox games (emulation or native).



22.4 GB/s memory interface bus bandwidth
256 GB/s memory bandwidth to EDRAM
21.6 GB/s front-side bus


Wow, that amazes me. Is the IBM PowerPC equal to a PC like the one at my home?
Posted on 2005-08-28 00:23:16 by realvampire
Yes - they used the IBM PowerPC in Macs.
Posted on 2005-08-28 09:29:28 by comrade
Before getting too hyped up about the xbox specs, read
http://groups.google.com/group/comp.arch/browse_frm/thread/c034498c16592a34/e33579dfd1757d19?q=xbox&rnum=1#e33579dfd1757d19
Posted on 2005-08-28 11:21:17 by f0dder
f0dder is right, there is no point looking at numbers and saying one system will be better than the other because of it.
Look at PowerVR's tile processing architecture (GigaPixel also had a very good tile rendering engine, which nVidia now owns all the IP rights for, BTW). The architecture never took off in the marketplace (it lacked hardware T&L), and on paper the PowerVR chipset got beaten on pretty much every number you'd care to measure, but it won plenty of benchmarks against the cards nVidia and ATi put out at the same price point.

The PS3 with its vector units can be VERY powerful, but it's dependent on them actually being put to use. I'm guessing that the PS3 will get better and better over time, as libraries mature and programmers get to grips with the architecture.

The big factor that could go in MS's favour is that the R520 at least has some h264 hardware decoding. nVidia in theory also does h264 with its 7800 cards, but no drivers have actually shown this capability yet... That could be a big plus for the XBox.
If Sony puts a separate h264 decoder on the boards, then it's extra cost for them. If they don't, then the CPU has to do all the grunt work, and h264 is NOT suited to vector processing.

Mirno
Posted on 2005-08-28 19:42:26 by Mirno
I think Scali owns/owned a PowerVR card - apparently the drivers were a bit buggy etc., but the performance *was* good (work smarter, not harder). One of my real-life friends has one lying around. It's one of those "add-on 3D cards", but interestingly it doesn't use a pass-through cable like the original Voodoos; it uses PCI bus mastering to transfer data between itself and the primary card's framebuffer. Pretty funky stuff.


The PS3 with its vector units can be VERY powerful, but it's dependent on them actually being put to use. I'm guessing that the PS3 will get better and better over time, as libraries mature and programmers get to grips with the architecture.

That's my guess too - it'll take a bit to get used to, but then you'll see some very impressive stuff (like what happened with the PSX and PS2). I saw a video of one of the PS3 launch titles (well, I think it's a launch title anyway - but of course I've forgotten its name). It already looked VERY promising. And considering that first-generation games tend not to use nearly all the potential of the hardware, the PS3 is certainly drool material (not that the XBox360 isn't).


nVidia in theory also does h264 with its 7800 cards, but no drivers have actually shown this capability yet... That could be a big plus for the XBox.

Well, what kind of chip does the PS3 have? My guess would be a custom version...
Posted on 2005-08-28 21:56:49 by f0dder
By all accounts I've seen, the PS3 has something very similar to the nVidia 7800, but with some eDRAM for fast framebuffer access.
Both consoles have got tweaked architectures of current consumer parts...

Mirno
Posted on 2005-08-30 08:55:37 by Mirno
In all honesty, after seeing the demo running, I would give my hand to the PS3. The graphics simply blew the XBox360 out of the water, IMO. I remember the PowerVR chip - damn, it didn't use the pass-through? Holy, that's pretty neat. I had a Matrox Millennium 2MB at the time; the thing couldn't do texturing in hardware lol, but it could rasterize flat and Gouraud shaded polygons fine. It's amazing how hardware has advanced in only a decade!
Posted on 2005-09-08 08:39:37 by x86asm
Mirno:
Where did you get that information? I see you all always have complete information.
Are XBox 360 or PS3 programming tools available?
Posted on 2005-09-16 01:15:17 by realvampire