Valve on the Graphics/Processor Debate

Evo
In a sneak preview of an interview to appear later this week, CVG have received word from Doug Lombardi concerning Valve's thoughts on the debate over whether the graphics card or the CPU is the most important component in a gaming PC.
"For a long time the GPU side has been leading the charge towards brighter, shinier games, and it usually ends up that whoever has the best looking game at shows like E3 usually gets game of the show. We've always looked at that shaking our heads thinking it's not always about the graphics. We've all seen games that looked really pretty and got all these awards but then it comes out and there's not much of a game there."
Check it out here.
 
He probably should have said that graphics and GPUs have topped out in relative importance compared to CPUs. People might get confused.
 
He probably should have said that graphics and GPUs have topped out in relative importance compared to CPUs. People might get confused.

Knowing the way some websites jump on headlines, he most likely did.
 
I love it when Valve talks tech, can't wait for the full interview!
 
Doesn't matter either way what most people think. It's what most people want and humans are shallow. They want pretty things now.
 
The games that cripple graphics cards and receive oodles of hype are always going to exist, but I think there is room for both sides in the industry. The problem now is that the budget required for games such as Crysis makes them an obviously risky prospect for most devs to consider, which is why innovating in areas such as AI, physics, interactivity and unique gameplay features is more interesting.
 
Games cripple CPUs, CPUs are not fast enough. If you have a high end GPU you know all about CPU bottlenecks.

I should have worded that better. I was referring to something like Crysis, in that its main selling point was that it pushed the high end, to the point that even the best rig struggled on max settings. You're always gonna have the games that decide to do that, but it doesn't mean all games have to.
 
Valve has always put a lot into their game environment and art. No need for extra fancy pixel creation. Plus the gameplay is the meat of the game.
 
No.

Games cripple CPUs, CPUs are not fast enough. If you have a high end GPU you know all about CPU bottlenecks.

If you have a low end graphics card, you'll know all about GPU bottlenecks. The statement is redundant. Games can cripple both.
 
No, the statement is not redundant, as high-end GPUs are being held back by weak CPUs. You can't make the same argument with outdated technology.
 
I think GPUs are far more of a bottleneck if you're anywhere but at the very cutting edge (and most people aren't). For the latest games, a CPU upgrade will give most people a negligible performance increase compared to a GPU upgrade. I think Doug's right on the money by saying there's CPU potential in games that isn't exploited. Which is nice, because usually everything Doug says scares or depresses me.
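A back-of-the-envelope way to see it: CPU and GPU work overlap from frame to frame, so whichever side is slower sets the frame time. A toy model of that (all numbers invented, nothing measured):

Code:
// Toy frame-pipeline model: CPU and GPU work overlap, so the slower
// of the two sets the frame time. The millisecond costs are made up.
#include <algorithm>
#include <cstdio>

int main() {
    double cpu_ms = 8.0;   // per-frame CPU cost (game logic, AI, driver)
    double gpu_ms = 22.0;  // per-frame GPU cost (rendering)

    std::printf("fps now: %.0f\n", 1000.0 / std::max(cpu_ms, gpu_ms));  // ~45, GPU-bound

    // Halving CPU time changes nothing while the GPU is the bottleneck:
    std::printf("after CPU upgrade: %.0f\n", 1000.0 / std::max(cpu_ms / 2, gpu_ms));  // still ~45

    // Halving GPU time is what actually moves the needle here:
    std::printf("after GPU upgrade: %.0f\n", 1000.0 / std::max(cpu_ms, gpu_ms / 2));  // ~91
    return 0;
}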
 
GPUs are technically the more important component when it comes to gaming. It doesn't seem like what Doug was saying applies to right now; it's more about Valve's plans for the future: make their games rely more on the CPU so they can raise the bar in AI and gameplay, not just graphics like Crysis. But definitely right NOW, your money should go more into the GPU if you're a gamer.

Speaking of Valve specifically, they always do a great job of releasing their products with not just improved graphics but better AI, and while no Valve game so far has really been a hardware hog like Crysis, when HL2 came out it WAS the Crysis of 2004; it was just well optimized...

Another thing that I'm a little flustered about is the pricing of GPU technology today... I'd say that considering how much it costs to manufacture a card, and its significance relative to the other parts of a computer, it's quite overpriced... Hopefully in the near future the average price of top-line GPUs will come down from $600 to, not to provide specifics, around $450, especially since you may also need a new mobo, or PSU, or CPU, or case, which can cost a lot too... This could be why Valve wants to get away from making games that solely depend on GPUs...
 
Pretty hypocritical of Valve, considering their immense requirements for Left 4 Dead.
 
saratos said:
immense requirements for Left 4 Dead

Uh, dude, L4D is pretty low in requirements. It uses Shader Model 2.0... while plenty of new games (BioShock, cough cough) require 3.0 or higher. The ATI X800 (which should be able to play the game just fine) was released in 2004! Mind you, the Radeon 9800 was killer tech when Half-Life 2 was released, and that was just a generation behind the X800... we're up to the HD 4800 series right now, so get a grip.
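For the curious, here's roughly what the capability check looks like on a D3D9-era engine; a minimal sketch of the caps query, not anything out of Valve's actual code:

Code:
// Minimal Direct3D 9 check for Shader Model 2.0 support (Windows only).
// Sketch: device creation and error handling omitted.
#include <d3d9.h>
#include <cstdio>

int main() {
    IDirect3D9* d3d = Direct3DCreate9(D3D_SDK_VERSION);
    if (!d3d) return 1;

    D3DCAPS9 caps;
    d3d->GetDeviceCaps(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, &caps);

    // An X800-class card reports a 2.x pixel shader version here;
    // BioShock-class games insist on 3.0.
    if (caps.PixelShaderVersion >= D3DPS_VERSION(2, 0))
        std::printf("SM 2.0 or better: fine for L4D.\n");
    else
        std::printf("Pre-SM2.0 card: below the bar.\n");

    d3d->Release();
    return 0;
}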
 
lol, I have an X800. And yet, according to the requirements I can still run it. But I have noticed a lot of Steam people have old computers. (I'm upgrading specifically for L4D and Far Cry 2)
 
I think I used to run HL2 on an X300, but now I don't have a graphics card :P
 
lol, I have an X800. And yet, according to the requirements I can still run it. But I have noticed a lot of Steam people have old computers. (I'm upgrading specifically for L4D and Far Cry 2)

Not everyone who has Steam or answers the hardware survey does so on a machine they play their games on, nor does everyone who answers the survey want to play top-end games.
 
Crap... when I first glanced at this headline quickly I thought it was going to be about the graphics update for HL2/EP1 :-\
 
Uh, dude, L4D is pretty low in requirements. It uses Shader Model 2.0... while plenty of new games (BioShock, cough cough) require 3.0 or higher. The ATI X800 (which should be able to play the game just fine) was released in 2004! Mind you, the Radeon 9800 was killer tech when Half-Life 2 was released, and that was just a generation behind the X800... we're up to the HD 4800 series right now, so get a grip.
Not the GFX card, the processor power needed.
 
I'm sorry, but you never specified. The most common complaint is graphics horsepower... but I can make a similar argument for processors. The first 3 GHz Pentium 4 was released in late 2002. The first Intel Core 2 Duo was released in summer 2006. The Athlon 64 X2 was released in 2005. This isn't exactly cutting-edge stuff. Valve is good to us when it comes to requirements.
 
Eye candy graphics (colors, lights, shadows, reflections, textures): mostly GPU
Intelligence (AI, NPC behaviour, the director): CPU

L4D, as Valve themselves have remarked, seems to be heavy in the second department, which makes for a smart game.
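In frame-loop terms the split looks something like this (placeholder stubs, not real engine calls):

Code:
// Which bucket runs where, as a stubbed-out frame. Every name here is a
// placeholder for illustration only.
struct World { int dummy = 0; };

void updateAI(World&, double)        { /* CPU: NPC decisions, the director */ }
void simulatePhysics(World&, double) { /* CPU: physics, game logic */ }
void submitDrawCalls(const World&)   { /* CPU builds draw calls; the GPU then
                                          handles lights, shadows, reflections
                                          and textures on its own */ }

int main() {
    World w;
    const double dt = 1.0 / 60.0;
    updateAI(w, dt);          // the "intelligence" bucket: CPU
    simulatePhysics(w, dt);   // still CPU
    submitDrawCalls(w);       // the "eye candy" bucket: mostly GPU from here
    return 0;
}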
 
All I know is, in this world of shallow graphical masterpieces, I find myself playing my Nintendo DS more than any current-generation game on PC or any Wii/Xbox 360/PS3 game (except maybe Disgaea 3, but that falls right into the gameplay > graphics scenario).
 
Being of the male race we know that great looks can easily turn heads
the male race we know that great looks
the male race
race

Seriously? No one noticed this yet? What halfwit wrote this?
 
Being of the male race we know that great looks can easily turn heads
the male race we know that great looks
the male race
race

Seriously? No one noticed this yet? What halfwit wrote this?

What.
 
I think the main point Doug is trying to make here is that a redefinition of the GPU (and possibly even the CPU) is coming in the near future. A future where GPU stands for 'General Processing Unit' instead of 'Graphics Processing Unit'. CPUs and GPUs are both moving towards a multi-core, heavily parallel architecture in which the processor itself has its own "OS" that you then program against. For more details in a not-too-terribly-hard-to-understand way, read http://arstechnica.com/articles/paedia/gpu-sweeney-interview.ars
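You can already get a feel for that programming model on today's CPUs, e.g. with OpenMP; the sketch below has the same shape as a GPU kernel: one independent operation per element, spread across however many cores you have.

Code:
// Same contract a GPU kernel gives you: independent per-element work,
// no ordering between elements. Compile with OpenMP (e.g. g++ -fopenmp)
// to run it across all cores; without it, it still runs serially.
#include <cstdio>
#include <vector>

int main() {
    const int n = 1000000;
    std::vector<float> data(n, 2.0f);

    #pragma omp parallel for
    for (int i = 0; i < n; ++i)
        data[i] = data[i] * data[i] + 1.0f;  // embarrassingly parallel

    std::printf("%f\n", data[0]);  // 5.0
    return 0;
}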
 
I hope we'll get close to raytracing, whether it's going to be by CPU or GPU, or a combination.

It's not going to give prettier graphics, because the latest games effectively fake all the features of raytracing (plus radiosity, of course). But when raytracing becomes feasible, it's going to be easier to create the graphics, because the reflections/refractions, shadows and so forth are part of the algorithm. That'll remove the burden of creating all these hacks and illusions to make it look like there's real global illumination, and focus can be put elsewhere than the graphics.
(What I mean is that raytracing won't cause a jump in visual quality, but a decrease in production time.)

Oh, my teacher has an example of a raytracer & photon mapper here:
http://www.cc.gatech.edu/~phlosoft/photon/

The above should be easier to write (source code included) than the rasterizers we use today, but it's way, way, waaaay more computing intensive.
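To make that concrete: in a raytracer a shadow is literally the same intersection test asked a second time, from the hit point towards the light. A single-sphere toy sketch (all geometry invented, no image output):

Code:
// Why shadows come "for free" in a raytracer: trace a primary ray, then
// re-use the exact same hit test from the surface point toward the light.
#include <cmath>
#include <cstdio>

struct Vec { double x, y, z; };
Vec operator-(Vec a, Vec b) { return {a.x - b.x, a.y - b.y, a.z - b.z}; }
Vec operator+(Vec a, Vec b) { return {a.x + b.x, a.y + b.y, a.z + b.z}; }
Vec operator*(Vec a, double s) { return {a.x * s, a.y * s, a.z * s}; }
double dot(Vec a, Vec b) { return a.x * b.x + a.y * b.y + a.z * b.z; }

// Distance along a normalized ray to a sphere, or -1 on a miss.
double hitSphere(Vec orig, Vec dir, Vec center, double radius) {
    Vec oc = orig - center;
    double b = 2.0 * dot(oc, dir);
    double c = dot(oc, oc) - radius * radius;
    double disc = b * b - 4.0 * c;        // dir is unit length, so a == 1
    if (disc < 0.0) return -1.0;
    double t = (-b - std::sqrt(disc)) / 2.0;
    return t > 0.0 ? t : -1.0;
}

int main() {
    Vec eye{0, 0, 0}, dir{0, 0, 1};          // primary ray straight ahead
    Vec sphere{0, 0, 5};  double r = 1.0;
    Vec blocker{0, 3, 4}; double br = 0.5;   // sits between sphere and light
    Vec light{0, 10, 5};

    double t = hitSphere(eye, dir, sphere, r);
    if (t > 0.0) {
        Vec p = eye + dir * t;               // hit point on the sphere
        Vec toLight = light - p;
        double len = std::sqrt(dot(toLight, toLight));
        Vec l = toLight * (1.0 / len);
        // The shadow test: same query, new origin and direction.
        bool shadowed = hitSphere(p + l * 1e-4, l, blocker, br) > 0.0;
        std::printf(shadowed ? "in shadow\n" : "lit\n");   // prints "in shadow"
    }
    return 0;
}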
 