AMD good enough for HL2's graphics?

From Tom's Hardware:
[Tom's Hardware UT2003 benchmark chart]


Another one of the latest games supporting DirectX 8 is Unreal Tournament 2003. With just about 207 frames, the Athlon XP 2800+ takes first place ahead of the P4/2800 with 199 frames.
 
We aren't talking about the 533MHz FSB P4s. We are talking about the 800MHz FSB P4s. There is a big difference in performance between them...
 
If you get a 2500+, get some PC3200 memory and overclock the CPU to 200×11, and you get a 3200+ for much less money :D
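For anyone checking the math, a quick sketch of that overclock, assuming a Barton 2500+ at its stock 166 MHz x 11:

# Rough overclock math for a Barton 2500+ (stock 166.67 MHz FSB x 11 multiplier).
# Core clock = FSB clock * multiplier.

def core_clock_mhz(fsb_mhz, multiplier):
    return fsb_mhz * multiplier

print(core_clock_mhz(166.67, 11))  # ~1833 MHz: Athlon XP 2500+ at stock
print(core_clock_mhz(200.0, 11))   # 2200 MHz: the Athlon XP 3200+'s stock speed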
 
Well, if you have anything over 2.0 GHz you're lucky, so stop freaking bitching about which one is a nanosecond faster.
 
Hey Hunter, is that XP 2800+ a Barton?
If it is, I've got the same proc as you =P
although I've overclocked it to XP 3200+ speeds :afro:
 
?!?! Someone actually says AMD isn't good?! ... Bah, I spit on your Intel... *now locks door, to prepare for the flaming*
 
Originally posted by OCybrManO
We aren't talking about the 533MHz FSB P4s. We are talking about the 800MHz FSB P4s. There is a big difference in performance between them...
*looks at the competition*
AMD 2800+ for $180, then Intel 2.8 GHz 533 FSB for $260 and 800 FSB for $270
*notices the small FPS difference and wonders if he would even notice 225 FPS vs 212 FPS while playing the game*
Think $270 ($100 extra) would be a wise use of money on that CPU?
 
Originally posted by Asus
*looks at the competition*
AMD 2800+ for $180, then Intel 2.8 GHz 533 FSB for $260 and 800 FSB for $270
*notices the small FPS difference and wonders if he would even notice 225 FPS vs 212 FPS while playing the game*
Think $270 ($100 extra) would be a wise use of money on that CPU?

You will gain a lot more FPS spending that extra $100 on a better gfx card.
 
^^^

Not really. There's barely any FPS difference between an ATI 9800 Pro 128 MB and a 9800 Pro 256 MB that costs $100 more.

However, a 2800+ and a 2.8C have a huge difference in FPS.
 
Originally posted by 3DDuL
From Tom's Hardware:
[Tom's Hardware UT2003 benchmark chart]


Another one of the latest games supporting DirectX 8 is Unreal Tournament 2003. With just about 207 frames, the Athlon XP 2800+ takes first place ahead of the P4/2800 with 199 frames.

How many times are the AMD fanboys gonna post this... AMD always wins this bench and like two others... the other 47 benches go to Intel... maybe you should mention that.
 
Originally posted by subs
^^^

Not really. There's barely any FPS difference between an ATI 9800 Pro 128 MB and a 9800 Pro 256 MB that costs $100 more.

However, a 2800+ and a 2.8C have a huge difference in FPS.
Yeah, but that's a poor example.
What about the 9600 Pro vs the 9800 Pro...
and any differences when they are all over 200 FPS... I'm sure you will notice. :rolleyes:
 
Good god, stop using Tom's Hardware benchmarks, that site is HORRIBLE.
 
Originally posted by crabcakes66
OK... lesson 1... AMDs are not better for overclocking.

... it depends a lot on the exact individual CPU you have (not even the speed... I mean the one that you have in your possession).

... and from what I read and heard, AMD has a lot more variation in the overclockability (is that a word?) of its CPUs...

From my personal experience... when they are operating at stock rated speed, they are both equally stable. My only experience being with the slightly older cores for the two brands:

<AMD = Tbred A, Tbred B>
<Intel = P4 400 FSB, 533 FSB> (can't remember the core names)

I'll just put my two pennies in! I bought an XP 2000+ chip a couple of days ago...

I have it running at XP 2600+ speeds.

That's from about 1667 MHz up to around 2020 MHz on the basic fan and heatsink that came with it, and my system is stable as hell!

Now if that isn't good overclocking, I don't know what is!

To be honest, in my opinion the only thing you "really" need top of the line for, if you're going to spend the cash, is a GREAT gfx card! After all, if you're running games... you don't want to play in 800x600 with half the visual beauty turned off...

You want to see it in at least 1280x960 with everything on full, 8xAA and 16xAF, with all the beauty the game can offer!!

TheRook

(Yes, the game should run fine on that processor)
as long as your gfx card is good :)
 
One thing most people forget when comparing AMDs and Intels is that they do a different number of ops per cycle.

AMDs do nine while Pentiums do six, meaning an AMD can be clocked slower and still get the same performance as a faster Pentium because it's doing more at one time. This was why, a few years ago, AMDs just slaughtered Intels, but now it's starting to slow them down.
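Treating that claim as a back-of-the-envelope model (the 9 and 6 ops-per-cycle figures are just the numbers asserted above, not measured values), the idea is that effective throughput is roughly clock speed times work per clock; a quick Python sketch:

# Toy model: effective throughput ~ clock * operations per cycle.
# The ops/cycle numbers come from the claim above and are illustrative only;
# real per-clock throughput varies a lot by workload.

def gops(clock_ghz, ops_per_cycle):
    return clock_ghz * ops_per_cycle  # billions of operations per second, very roughly

print(gops(2.2, 9))  # Athlon XP 3200+ at 2.2 GHz with the claimed 9 ops/cycle -> 19.8
print(gops(3.2, 6))  # Pentium 4 3.2C at 3.2 GHz with the claimed 6 ops/cycle  -> 19.2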
 
Oh Dear Lord...

Don't take benchmarks at face value. You can't say that one CPU is better than another simply because it won 90% of the benchmarks.

As far as real-life gaming goes, the CPU isn't very important at all. If you run with high-quality settings turned on, most of the stress is on the video card. Thus, a CPU that is simply "fast enough" will give you the exact same FPS as a top-of-the-line processor, up to a certain point. That point is when the video card needs to extend its frame buffer into system memory. Most people never let themselves reach this point because the game plays too slowly. On modern systems this usually happens around 10-30 FPS. That is just a rule of thumb and can vary greatly depending on both hardware and software. AMD systems are faster when this happens. Intel systems are faster when the detail settings are so low that the FPS are very high.

Most of us play with an average FPS of about 30-60. Within this range, all modern CPUs from AMD and Intel perform the same, with a few exceptions that favor AMD's method (UT2K3, for example) and a few that favor Intel's method (Comanche 4, for example).

The Intel P4 3.2C and the AMD Athlon XP 3200+ are two different processors, but you can't say that one is always better than the other. The way I see it, both CPUs are about equal overall. The 3200+ is better for everyday general-usage tasks, some games, some professional 3D apps, and Adobe Premiere. The 3.2C is better for audio encoding, video encoding, some games, some professional 3D apps, and Photoshop. It all depends on your needs. I have a 2500+ overclocked as a 3200+ for my programs that favor AMD and a 2.4C overclocked to 3.3 GHz for my programs that favor Intel.
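A rough way to picture that "fast enough" point: frame time is set by whichever of the CPU or the video card takes longer per frame, so once the card is the bottleneck a faster CPU buys nothing. A toy sketch in Python (every millisecond figure is invented for illustration, and real pipelines overlap the two stages):

# Toy bottleneck model: FPS is limited by the slower of the CPU work and the
# GPU work per frame. All numbers below are made up for illustration.

def fps(cpu_ms, gpu_ms):
    return 1000.0 / max(cpu_ms, gpu_ms)

# Low detail: the GPU is quick, so CPU differences show up in benchmarks.
print(fps(cpu_ms=5.0, gpu_ms=3.0))   # 200 FPS
print(fps(cpu_ms=4.5, gpu_ms=3.0))   # ~222 FPS, the faster CPU "wins"

# High detail: the GPU needs 20 ms per frame, so both CPUs land at 50 FPS.
print(fps(cpu_ms=5.0, gpu_ms=20.0))  # 50 FPS
print(fps(cpu_ms=4.5, gpu_ms=20.0))  # 50 FPS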
 
Re: Oh Dear Lord...

Originally posted by TheOtherDude
Don't take benchmarks at face value. You can't say that one CPU is better than another simply because it won 90% of the benchmarks.

As far as real-life gaming goes, the CPU isn't very important at all. If you run with high-quality settings turned on, most of the stress is on the video card. Thus, a CPU that is simply "fast enough" will give you the exact same FPS as a top-of-the-line processor, up to a certain point. That point is when the video card needs to extend its frame buffer into system memory. Most people never let themselves reach this point because the game plays too slowly. On modern systems this usually happens around 10-30 FPS. That is just a rule of thumb and can vary greatly depending on both hardware and software. AMD systems are faster when this happens. Intel systems are faster when the detail settings are so low that the FPS are very high.

Most of us play with an average FPS of about 30-60. Within this range, all modern CPUs from AMD and Intel perform the same, with a few exceptions that favor AMD's method (UT2K3, for example) and a few that favor Intel's method (Comanche 4, for example).

The Intel P4 3.2C and the AMD Athlon XP 3200+ are two different processors, but you can't say that one is always better than the other. The way I see it, both CPUs are about equal overall. The 3200+ is better for everyday general-usage tasks, some games, some professional 3D apps, and Adobe Premiere. The 3.2C is better for audio encoding, video encoding, some games, some professional 3D apps, and Photoshop. It all depends on your needs. I have a 2500+ overclocked as a 3200+ for my programs that favor AMD and a 2.4C overclocked to 3.3 GHz for my programs that favor Intel.

Somewhat true...

I'd disagree that the 3200+ is on average as fast as a 3.2C... just because it does three things faster...

And your argument about different detail settings doesn't make sense.
 
I'd disagree that the 3200+ is on average as fast as a 3.2C... just because it does three things faster...

I'll word it a bit differently then:

3200+ Advantages:

General Usage
Adobe Premiere

3.2C Advantages:

Media Encoding
Photoshop

They essentially tie in the rest of the categories I can think of, depending on the software. If you prefer the 3.2C, then that is great. You could certainly argue that it is better than the 3200+. I could do the same in favor of the 3200+, but I won't, because I think both processors have their own strengths and weaknesses and their own uses.

And your argument about different detail settings doesn't make sense.

Do you mean that you couldn't actually understand what I was trying to say, or that you find what I was trying to say illogical?

I was simply saying that the CPU is almost irrelevant at the detail settings most of us play our games at. If the CPUs are anywhere near the same speed, you will get (more or less) the exact same FPS when a game is played with detail settings that allow smooth gameplay (~45 FPS average) but are as high as they can be while still letting the system run the game smoothly. In this situation, ALL the pressure is on the video card. The CPU and system memory actually wait for the video card.

If the detail settings are lowered and the game runs faster, Intel's greater memory bandwidth allows it to score higher than the AMD. However, that is irrelevant because no one can notice a difference much above 30 FPS anyway.

If the detail settings are raised and the game runs slower, AMD's design (more cache, shorter pipeline, and nForce2's unique implementation of dual-channel DDR) allows it to run faster, as both the video card and the CPU have to share the system memory. However, this is irrelevant because no one would play their games slowly enough to enjoy the difference.
 
Originally posted by Tredoslop
Intel is better

Give me a break.

AMD Athlon 1900+
40 GB Seagate
60 GB IBM
eVGA Ti4200 64 MB
512 MB PC2100 Crucial

CRABCAKES, STOP DEFENDING YOURSELF WITH LONG POSTS, I'M TOO LAZY TO READ!!!
 
I had to double post, but do you guys remember when we got to 1.0 GHz computers?! It was such a big deal. Lmao, that wasn't long ago at all!
 
Check the sig. Intel's 'C' lineup is very good for overclocking. I got a 2.6C to 3.25 GHz for $200; you can't beat that. 'C' means it has an 800 MHz FSB, which is way faster than what AMD has right now (400 MHz), and with my overclock my FSB is 250 × 4 = 1 GHz!! If you know how to overclock, you can get the best bang for your buck by going with Intel, IMO (rough bandwidth math at the end of this post).

A good place to research how to overclock your stuff: www.overclockers.com

You can get an incredible bang for your buck if you learn how to OC your machine, or if you know which parts are good for overclocking.
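Rough math behind those bus figures, assuming the standard 64-bit front-side buses of the time and counting only peak theoretical bandwidth, not real-world throughput:

# Peak FSB bandwidth = base clock * pumping factor * bus width in bytes.
# The P4's bus is quad-pumped, the Athlon XP's is double-pumped; both are 64-bit.

def fsb_bandwidth_gb_s(base_mhz, pumps, bus_bytes=8):
    return base_mhz * pumps * bus_bytes / 1000.0

print(fsb_bandwidth_gb_s(200, 4))  # stock P4 'C': 800 MT/s -> 6.4 GB/s
print(fsb_bandwidth_gb_s(250, 4))  # the overclock above: 1000 MT/s -> 8.0 GB/s
print(fsb_bandwidth_gb_s(200, 2))  # Athlon XP: 400 MT/s -> 3.2 GB/s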
 
The Intel 2.4C is by far the best overclocking value in the mid-high-end range. As I said earlier, I have a 2.4C overclocked to 3.3 GHz and it could go even higher but I would have to lower my memory frequency.

AMD's 1700+ Tbred B (~$45 USD) and 2500+ Barton (~$90 USD) are by far the best overclocking values in the low end. Heck, a $45 1700+ running at 2.5 GHz (that is possible) can outperform a $170 2.4C running at 3.3 GHz in some things. Now that's one heck of a good value. The 2500+ can even do a bit better, but at twice the price of the 1700+ it can't touch it in terms of price/performance (overclocked).
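Using only the prices and overclocks quoted above (the poster's figures, not market data), dividing price by overclocked clock speed gives a crude dollars-per-GHz comparison; it ignores per-clock efficiency and platform cost:

# Crude price/performance using the figures quoted above. "Performance" here is
# just the overclocked clock speed, so IPC and motherboard/RAM cost are ignored.

chips = {
    "Athlon XP 1700+ (Tbred B)": (45, 2.5),   # (quoted price in USD, overclocked GHz)
    "Pentium 4 2.4C":            (170, 3.3),
}

for name, (price, ghz) in chips.items():
    print(f"{name}: ${price / ghz:.0f} per overclocked GHz")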
 
Originally posted by TheOtherDude
I'll word it a bit differently then:

3200+ Advantages:

General Usage
Adobe Premiere

3.2C Advantages:

Media Encoding
Photoshop

They essentially tie in the rest of the categories I can think of, depending on the software. If you prefer the 3.2C, then that is great. You could certainly argue that it is better than the 3200+. I could do the same in favor of the 3200+, but I won't, because I think both processors have their own strengths and weaknesses and their own uses.



Do you mean that you couldn't actually understand what I was trying to say, or that you find what I was trying to say illogical?

I was simply saying that the CPU is almost irrelevant at the detail settings most of us play our games at. If the CPUs are anywhere near the same speed, you will get (more or less) the exact same FPS when a game is played with detail settings that allow smooth gameplay (~45 FPS average) but are as high as they can be while still letting the system run the game smoothly. In this situation, ALL the pressure is on the video card. The CPU and system memory actually wait for the video card.

If the detail settings are lowered and the game runs faster, Intel's greater memory bandwidth allows it to score higher than the AMD. However, that is irrelevant because no one can notice a difference much above 30 FPS anyway.

If the detail settings are raised and the game runs slower, AMD's design (more cache, shorter pipeline, and nForce2's unique implementation of dual-channel DDR) allows it to run faster, as both the video card and the CPU have to share the system memory. However, this is irrelevant because no one would play their games slowly enough to enjoy the difference.

I understand what you're saying... but I don't buy it.

Especially "However, that is irrelevant because no one can notice a difference much above 30 FPS anyway."

That is simply not true.
 
No, there is no "line" where the eye can tell. (I read this at my eye doctor's.)

The average human eye cannot tell the difference between 35 FPS and 10^10000 FPS. However, people who are used to tracking small objects rapidly, such as computer/video gamers and pilots, can notice the difference up to 60 or so FPS.

*edit* grammar
 
I said 30 FPS because that is the speed movies play at. I have never heard anyone remark about how "laggy" a movie is. My statement would still hold true (most of the time, at least) if I replaced the "30" with a "60."

Check out some of the benchmarks in the link below. I'm sure you will notice that the P4 is of the 533 FSB variety, but I have done testing on my P4C that shows the trend is still present, although much less pronounced.

http://www.ukgamer.com/article.php4?id=238&page=18
 
Does anyone have a picture of a Barton 2500+ heatsink? I'm going with AMD for my new system.
 
Originally posted by TheOtherDude
I said 30 FPS because that is the speed movies play at. I have never heard anyone remark about how "laggy" a movie is. My statement would still hold true (most of the time, at least) if I replaced the "30" with a "60."

Check out some of the benchmarks in the link below. I'm sure you will notice that the P4 is of the 533 FSB variety, but I have done testing on my P4C that shows the trend is still present, although much less pronounced.

http://www.ukgamer.com/article.php4?id=238&page=18

I'd rather not.

This argument is getting old.

You need to do more research on the whole FPS thing... I can tell you that much.
 
You need to do more research on the whole FPS thing... I can tell you that much.

Well, that wasn't the most productive statement in the world. :rolleyes:

Anyway, if you know as much about computers as you seem to think you do, then my basic point should be obvious.
 
Originally posted by TheOtherDude
Well, that wasn't the most productive statement in the world. :rolleyes:

Anyway, if you know as much about computers as you seem to think you do, then my basic point should be obvious.


I understand your basic point. I agree that each manufacturer's line of CPUs has strengths and weaknesses. Just not to the extent, or in all of the specific situations, that you say.

I've been arguing about CPUs for the last month on this forum. I don't know everything; I just put together all the info I can and go from there.

I'm just tired of all the "Omgz AMD is teh bett0rzs!!!111!" from people who have less of an idea what they are talking about than I do.
 
Most movies are interlaced 30 FPS; they show two fields per frame, so it is effectively 60. Progressive 30 FPS is true 30, though.
It's hard to notice the difference between constant frame rates (60 FPS constant vs 70 FPS constant), but you notice changes in frame rate easily (60 FPS that then drops to 50 or lower, etc.).
You also notice light and images when they flash by: run at 200 FPS (or higher), all black, then flash an image for a single frame, and you will notice the light and may even recognize the image. Your eyes are very sensitive to changes (+/-) in light, even way beyond 60 FPS.

As far as hardware goes... buy the CPU that performs best for you at a price you can afford.
But the reason many dislike Intel is the company: how they work, what they try to do, and who they try to please.
Others enjoy AMD because of how they work, what they try to do, and who they try to please. I'm talking about the company, not the hardware.
Same goes for ATI and Nvidia. I view Nvidia and Intel as excessive (MHz!!). Strong marketing teams, do whatever it takes to sell, sell to the OEMs. Try to BE the standard. Large following (sheep in some cases but not all; AMD has some sheep too). Optimization isn't in the hardware for them, it's in the drivers and instruction sets.
ATI and AMD, meanwhile, are about efficiency. They don't have the best marketing teams. They go all out for the gamer, the user, or whoever they sell to. They listen to the community. Small and dedicated following. They bring prices down for the market. Optimization is in the hardware, and they are powerful (do more per MHz).
 
Most movies are interlaced 30 FPS; they show two fields per frame, so it is effectively 60. Progressive 30 FPS is true 30, though.
It's hard to notice the difference between constant frame rates (60 FPS constant vs 70 FPS constant), but you notice changes in frame rate easily (60 FPS that then drops to 50 or lower, etc.).
You also notice light and images when they flash by: run at 200 FPS (or higher), all black, then flash an image for a single frame, and you will notice the light and may even recognize the image. Your eyes are very sensitive to changes (+/-) in light, even way beyond 60 FPS.

Good info. Thank you for posting it. Keep in mind that most people don't have their monitors refreshing more than 85 times per second in games.

Crabcakes66,

I actually have a high-end Athlon and a high-end P4 (after overclocking, at least). I enjoy comparing the systems; it has even become a bit of a hobby. I get annoyed when people say their Intel system kills anything AMD has to offer. I get annoyed when people reference benchmark results that I consider invalid. I get annoyed when people compare CPUs by the percentage of benchmarks that a CPU won in a certain review.

My basic point:

Intel systems are better optimized (whether intentionally or not) to run games/settings that are not stressful on the hardware, I mean nowhere near the point where the game is even slightly "laggy." For example, Intel systems dominate AMD in Quake III. However, who really cares about the difference between 300 FPS and 400 FPS? Why are Intel systems faster in this regard? The main reason is simply that they have enough memory bandwidth to feed a CPU and graphics card that can't get data fast enough.

AMD systems are better optimized (whether intentionally or not) to run games/settings that force the video card to extend its frame buffer into system memory. However, who really cares about the difference between 5 FPS and 10 FPS? Why are AMD systems faster in this regard? The main reason is simply that they have enough cache to minimize memory accesses from the CPU, allowing the video card to use the bandwidth.

While the CPU/motherboard/memory does play a large role in the performance of the system in the two situations I explained above, it is unimportant (to a certain point) when the graphics card is being heavily stressed but not being forced to extend its frame buffer into system memory. I would like to add that most of us play our games with settings that put the system in this exact situation, but I will not, because that is not really one of my basic points.
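To make the frame-buffer-spill point concrete, a simplified estimate of how much card memory just the color and depth buffers take at a few settings (assumes 32-bit color, a 32-bit depth/stencil buffer, double buffering, and multisampling that multiplies both; textures and geometry are ignored, so real usage is higher):

# Simplified frame-buffer memory estimate: 32-bit color, 32-bit depth/stencil,
# double buffering, and MSAA multiplying both buffers by the sample count.
# Textures, geometry and driver overhead are not counted.

def framebuffer_mb(width, height, msaa=1, color_buffers=2):
    pixels = width * height
    color_bytes = pixels * 4 * color_buffers * msaa
    depth_bytes = pixels * 4 * msaa
    return (color_bytes + depth_bytes) / (1024 ** 2)

print(framebuffer_mb(1024, 768))           # ~9 MB
print(framebuffer_mb(1280, 960, msaa=4))   # ~56 MB
print(framebuffer_mb(1600, 1200, msaa=6))  # ~132 MB, past a 128 MB card even
                                           # before any textures are loaded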
 
Originally posted by Flynn
I've never had any probs w/ AMD; my Benchmark 03 score is around 5000.

9800 (not Pro)
2100 XP
512 DDR

Yes, I'm a rich bastard compared to most folks, prolly not so much to you guys, but yes you can bash me.
Sure you'll be able to run it fine :)
 