vissione (Guest)
So, in all honesty, can I experience DirectX 9 the way it's meant to be with a Radeon 9600 Pro?
I recently downloaded the HDR demonstration video now that I had my bandwidth free and nothing better to do.
Well, this video, even without any action, was just soooo amazing. Doom III is really going to have to do something special to beat Half-Life 2's lighting effects... I'm already drooling over what they can come up with given the extra time they've got.
Anyway, I'll be frank: I'm on a tight budget. I'm getting a 2500+ so I can overclock to at least 2800+ speeds, hopefully reach 3000+. I'm sticking with 512MB of DDR400, though I'll get another stick once I can afford it. The last piece would be the video card, and with the benchmarks out I feel the Radeon 9600 Pro really IS the best bang for your buck.
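For what it's worth, the overclock math is just multiplier times FSB. A rough sketch below; the 11x multiplier and 166MHz FSB are the standard Barton 2500+ specs as far as I remember them, so double-check your own chip before dialing anything in:

```python
# Rough sketch: Athlon XP core clock = multiplier x FSB.
# Going by memory, the 2500+ Barton runs 11 x 166 MHz, and the 3200+ is the
# same 11x multiplier at a 200 MHz FSB -- so an FSB bump alone can get close.
def core_clock_mhz(multiplier: float, fsb_mhz: float) -> float:
    return multiplier * fsb_mhz

stock = core_clock_mhz(11, 166)        # ~1826 MHz, i.e. a stock 2500+
overclocked = core_clock_mhz(11, 200)  # 2200 MHz, 3200+ territory
print(stock, overclocked)
```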
What I hate most about benchmarks like 3DMark is that they're just numbers, so I try to avoid those. UT2k3 has 2 forms of benchmark, flyby and botmatch, and people tend WAY TOO DAMN MUCH to use flyby. So flyby stresses the GPU instead of the CPU, who cares? I look at some benchmarks and I see the 9500 non-Pro, 9500 Pro, 9600 non-Pro, 9600 Pro, 9700 non-Pro, 9700 Pro, 9800 non-Pro, and 9800 Pro ALL run UT2k3's botmatch at over 60fps. Right there I know it doesn't matter which card I get (for UT2k3), I can get acceptable framerates. (This is of course at 1024x768 with no AA/AF... you could counter that you pay $200 more for a card so you can use higher res and AA/AF, but in my case, after watching the 20min demo AND the HDR video, which both run without AA/AF, I couldn't care less... even the 9800 Pro is lousy at HL2 @ 1600x1200 with full AA/AF, so I don't care.)
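That whole "good enough" argument boils down to a simple cutoff check. The fps figures below are made-up placeholders, not the actual review numbers; the point is only the filtering logic:

```python
# Hypothetical UT2k3 botmatch results at 1024x768, no AA/AF.
# These are placeholder values, NOT real benchmark data.
botmatch_fps = {
    "9500": 62, "9500 Pro": 70, "9600": 64, "9600 Pro": 68,
    "9700": 75, "9700 Pro": 82, "9800": 85, "9800 Pro": 90,
}

# If every card clears 60fps, the cheapest one that does is the rational buy.
acceptable = [card for card, fps in botmatch_fps.items() if fps >= 60]
print(acceptable)
```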
My point is this: the HL2 benchmarks show real-world performance. The first one, Techdemo, isn't really real-world, it's more of an eye-candy showroom... and it's fair for it to favor the 9800 Pro, which can do more effects faster. Even so, I see only a ~20fps difference between the 9600 Pro and the 9800 Pro in that test. It's not very CPU intensive, it's GPU intensive like UT2k3's flybys, so I see it as a pure GPU test.
Bugbait is a real-world situation, which will be a good indicator of how you'll actually be playing. In that test the Radeon 9600 Pro and 9800 Pro differ by about 10fps. This is performance with AI, physics, and effects all going at it. That's the performance that matters. Not theoretical, not one-sided (GPU only), but real numbers. They'll also be a bit higher if the optimizations hold true, and new drivers will undoubtedly raise performance a bit for every card in the Radeon line.
Last but not least, the City 17 benchmark again shows about a ~10fps (more like ~6fps) difference... once again a real situation in HL2, and the one that proves most demanding.
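To put those gaps in perspective, an absolute fps difference reads differently as a relative one. A quick sketch; the (9600 Pro, 9800 Pro) pairs below are placeholders, not the published HL2 results:

```python
def relative_gap_pct(slower_fps: float, faster_fps: float) -> float:
    """Percentage speedup of the faster card over the slower one."""
    return 100 * (faster_fps - slower_fps) / slower_fps

# Placeholder (9600 Pro fps, 9800 Pro fps) pairs for the three test scenes.
scenes = {"Techdemo": (50, 70), "Bugbait": (55, 65), "City 17": (44, 50)}
for name, (r9600, r9800) in scenes.items():
    print(f"{name}: +{r9800 - r9600} fps ({relative_gap_pct(r9600, r9800):.0f}%)")
```

The same ~10fps gap shrinks in relative terms as framerates rise, which is part of why the real-world maps make the cards look so much closer than the synthetic ones do.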
Now back to my topic title: would you consider the closeness of the Radeon 9600 Pro to the 9800 Pro in HL2 similar to the closeness of those same cards in UT2k3's botmatch? Are some benchmarks designed to show off GPU power alone, forgetting that what matters is running all those effects alongside everything else the CPU has to do?
I'm pretty much going to have to use the Radeon 9600 Pro to play Half-Life 2 AND Doom III... so I hope my logic holds. I know the more expensive cards are better, but the numbers show it's not by much, not in a real game. I don't need to show off 5k more points in 3DMark, or post my fps running nothing but DirectX 9 effects. I'll never buy a card for applications that just overload it with effects. I want applications (games) that use effects in moderation and work together with the CPU to produce entertaining software.