Re: Re: Huh?
Originally posted by dguisinger
I would have to bet your G4 processor is faster than this P3 in my XBox, so again, CPU IS NOT an issue on modern day consoles as much as GPU power is.
I'm just adding to this discussion, not taking a side. I agree with your statement. I do think there are a lot of factors to consider when a game is brought to a computer.
To be fair, the XBox only has to pump out graphics at roughly 720 x 480 (max) for NTSC video (effectively 640 x 480 once you account for overscan). I'm not sure if it does HDTV for games, but even 720p HDTV is only 1280 x 720. (Correct me if I'm wrong.)
There is a good reason why consoles can produce relatively great-looking graphics: they only have to pump out roughly a quarter of the pixels per second that your average computer does. 1280 x 1024 is about 4.3 times as many pixels as 640 x 480. That, in turn, is why a good computer can really outshine a console, too. At 1280 x 1024 with FSAA on, I'd need a great machine, but a game would look much better than on a console, because the geometry density _can_ be higher when a computer's CPU is feeding the GPU. Doesn't mean it is, but it can be.
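Just to put numbers on that "quarter of the pixels" claim, here's the arithmetic spelled out (the resolutions are the ones from the post; nothing here is measured from real hardware):

```python
# Rough pixel-count comparison using the figures discussed above.
console_px = 640 * 480    # typical NTSC game resolution
pc_px = 1280 * 1024       # a common desktop resolution

ratio = pc_px / console_px
print(console_px, pc_px, round(ratio, 2))  # 307200 1310720 4.27
```

So the console is filling a bit under a quarter of the pixels per frame, before you even get to frame rate or FSAA.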
The computer's CPU determines what you see, then the GPU renders it. So, in a way, the two are tied. Depending on how modern the graphics card is, and how the routines are written in the game, the load can be greater on one than the other. For example, Unreal is heavily dependent on the CPU compared to some other FPS games. This is straight from the developer's mouth.
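To illustrate what I mean by "the CPU determines what you see," here's a toy sketch of that split. This isn't any real engine's code; the object list, the 1-D visibility test, and the function names are all made up for illustration:

```python
# Toy sketch of the CPU/GPU split. The CPU culls the scene and decides
# what is visible; the GPU only ever sees the resulting draw list.

def in_view(obj, camera_x, view_range):
    # Crude 1-D "frustum" test: is the object near the camera?
    return abs(obj["x"] - camera_x) <= view_range

def cpu_build_draw_list(objects, camera_x, view_range=100):
    # CPU side: game logic and culling -- deciding what can be seen.
    return [o for o in objects if in_view(o, camera_x, view_range)]

def gpu_render(draw_list):
    # Stand-in for the GPU, which would transform and rasterize
    # the triangles of each object handed to it.
    return ["drew " + o["name"] for o in draw_list]

objects = [{"name": "crate", "x": 10}, {"name": "tower", "x": 500}]
frame = gpu_render(cpu_build_draw_list(objects, camera_x=0))
print(frame)  # only the nearby crate gets drawn
```

If the CPU can afford to pass a bigger, more detailed draw list, the GPU has more to render, which is the tie between the two I'm getting at.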
And graphics are not the only thing to consider. AI and physics are large computational tasks that must run alongside the visual subsystem. So not only is the CPU determining what the player can see, it's also calculating the moves of any bot AIs, any physics, and the general game mechanics. When games are ported to a computer, they literally lose a bit in the translation. They also gain new features most of the time. All of this adds up.
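One way to picture how AI and physics eat into the CPU is a per-frame time budget. This is purely a back-of-the-envelope sketch; the millisecond costs are invented, not profiled from any real game:

```python
# Toy frame-budget model: at 60 fps the CPU has roughly 16.7 ms per
# frame for everything it does (AI, physics, game logic). The costs
# below are made-up numbers just to show the idea.

def run_frame(ai_ms, physics_ms, game_logic_ms, budget_ms=16.7):
    cpu_ms = ai_ms + physics_ms + game_logic_ms
    return cpu_ms, cpu_ms <= budget_ms  # (time used, did we hit 60 fps?)

used, on_time = run_frame(ai_ms=5.0, physics_ms=6.0, game_logic_ms=4.0)
print(used, on_time)  # 15.0 True -- this frame fits

used, on_time = run_frame(ai_ms=10.0, physics_ms=6.0, game_logic_ms=4.0)
print(used, on_time)  # 20.0 False -- add more bots and the frame rate drops
```

That's why a port that adds features (more bots, better physics) can feel slower even on a faster CPU.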
Again, I'm just thinking out loud to spur a little discussion on the point.
One last thing I'd like to mention: there are now programming languages being ported to run on GPUs. The GPUs in many computers can be FAR more effective at crunching numbers, vectors, etc. than the CPU. I find that very interesting. Why not use the GPU to process some vector math or even general-purpose computing?
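The idea behind those GPU languages is basically stream computing: write one small "kernel" and run it uniformly over every element of a big array, the way a fragment shader runs once per pixel. Here's the model in miniature, in plain Python just for illustration; on a GPU each element would run in parallel across the pixel pipelines instead of in a loop:

```python
# The GPGPU idea in miniature: one small "kernel" applied uniformly to
# a whole stream of data. A GPU would run each element in parallel.

def scale_kernel(v, s):
    # Per-element work: scale a 3-component vector -- the kind of
    # uniform math a shader does once per pixel or vertex.
    return (v[0] * s, v[1] * s, v[2] * s)

vectors = [(1.0, 2.0, 3.0), (4.0, 5.0, 6.0)]  # the input "stream"
scaled = [scale_kernel(v, 2.0) for v in vectors]
print(scaled)  # [(2.0, 4.0, 6.0), (8.0, 10.0, 12.0)]
```

Any math that fits this same-operation-on-every-element shape is a candidate for offloading to the GPU, which is why vector math is the obvious first target.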