6800GT or X800 Pro

which one?

  • x800 pro

    Votes: 13 31.7%
  • 6800GT

    Votes: 28 68.3%

  • Total voters
    41
Even John Carmack, the maker of Doom 3, says that to run Doom 3 the way it's meant to be played you need a 3.4GHz CPU, 2GB of RAM, and an X800 or 6800. I read it in the new PC Gamer.
 
alan00000 said:
Even John Carmack, the maker of Doom 3, says that to run Doom 3 the way it's meant to be played you need a 3.4GHz CPU, 2GB of RAM, and an X800 or 6800. I read it in the new PC Gamer.
LOL!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!! HAHAHAHAHAHAHAHAHAH
 
sHm0zY said:
that would be funny if after all this you get 3 fps

lololololololololololo hahahahahahahaha MAN that would make my day
 
ok i found it, this is what he said in PC Gamer:

Each girder, door, and window adds tangible substance to each scene, and even the effect of your flashlight shining into a darkened corner looks ridiculously real - as the light floods through a room, swinging back and forth, shadows are cast perfectly; dust particles gently drift into the cone of the flashlight, eerily visible. And these are just the basics of the environment: just wait until you enter the depths of hell, and dive into some of the later mass melees. Doom 3, with all due awareness of hyperbole, is the best-looking game you've ever seen.

Not surprisingly, you'll need a monster system to render these monsters in all their intricately textured glory. But the ability to play Doom 3 with all its visual magic maxed out is really a good excuse to trade up. A P4 3GHz with a GeForce 5950-class card will see you through okay. One of our test systems had a GeForce 6800 Ultra and ran flawlessly at 1024x768 with high detail. (A higher level of quality and resolution is available, but the PC to run it well isn't.) On a system running with a GeForce4 MX card and 512MB RAM, the texture detail was great, although the game was choppier in spots.

Bottom line: If Far Cry didn't convince you, then Doom 3 should - the time to upgrade to a next-generation 3D chip, or even an all-new rig, is now.
 
"A P4 3GHz with a GeForce 5950-class card will see you through okay." Did you read this part? "Will see you through", lol, this is crazy.
 
I dunno guys, I think a dual Athlon FX-55, 2GB RAM, and dual GeForce 6800 Ultras MAY run it at max settings with 20fps, but it's a long shot. (Note the sarcasm)
 
If you don't believe me, that's too bad. Buy a copy of PC Gamer if you can - it has Doom 3 on the cover, and it says it right there.
 
If you don't care how the game is going to look, just ignore everything I have said.
 
Keep in mind that Doom 3 takes place in tiny hallways throughout the game, which means better fps, unlike HL2's outdoor levels with lots of action requiring lots of memory. And at the level of detail I've seen in this game, it's going to blow away any game.

You will need to upgrade, or just buy a new system entirely, if you want excellent performance and visual effects (shadows). Oh, and don't forget about the flashlight - in both games, and in any game, it uses half the memory. If you guys have played Far Cry, and you know who you are, you know what I'm talking about.
 
I voted for the 6800GT even though I own an X800 Pro myself. It might not be able to do some things as well as the ATI card in DirectX, but in OpenGL it's better, and even more so than the X800XT in some cases. I doubt the X800 Pro will EVER beat the 6800 Ultra in DirectX, so it's quite simple that ATI really have screwed up, and I probably will never buy another ATI card again. I know the X800 Pro is a good card, but it doesn't really do anything the 6800s can't do, except fail dreadfully in OpenGL and have REALLY shitty buggy drivers, which I think makes it a bad card.
 
I don't think the dust has quite settled yet in the 6800 GT and X800 Pro debate. Right now the 6800 GT is in the lead, but I wouldn't call the X800 Pro a horrible performer. I suspect the X800 Pro still has some tricks up its sleeve. We will wait and see. I am really starting to believe (instead of just thinking) that SM 3.0 was overhyped.

Alig - Do you even know what you are talking about? I don't think so. You are really complaining about nothing. How did ATI screw up with the X800 Pro? A screw-up is the FX 5800, not the X800 Pro. Show me (or even write) a coherent argument for why the X800 Pro is horrible and I can easily disprove it.
 
Ok, something everyone looks for - "bang for the buck" - and the 6800GT is not ONLY cheaper, it is FASTER and BETTER! Nuff said.

Edit/ I never said it was a horrible card, it just doesn't look impressive whatsoever next to the 6800GT.

Let's use car examples... they're easiest.

Let's say the 6800GT is a slower top speed/faster acceleration car and the X800 Pro is a faster top speed/slower acceleration car. Say there is no speed limit on the roads, but roads are still roads, with bends etc. Which one would you prefer: a car that gets to 200mph in 25 seconds, or a car that gets to 250mph in 45 seconds?

Or, to make it more real-life, put an IndyCar next to an F1 car on a proper race circuit and say GO! See which one gets back first. (F1 cars have the lower top speed, btw.)
 
There is no way the X800 Pro is bad in any way. There is nothing the 6800 can do that the X800 Pro can't, except for its 6 extra pipelines, and what games need that right now?
 
Alig said:
Ok, something everyone looks for - "bang for the buck" - and the 6800GT is not ONLY cheaper, it is FASTER and BETTER! Nuff said.

Wow. Convincing argument. You win. I'm sorry, I'll admit you have better debating skills than me. I shall go into a corner and despair.

Read the stuff in this thread and stop being so ignorant.

http://www.halflife2.net/forums/showthread.php?t=29182
 
alan00000 said:
There is no way the X800 Pro is bad in any way. There is nothing the 6800 can do that the X800 Pro can't, except for its 6 extra pipelines, and what games need that right now?

Doom 3. STALKER will, no doubt.

It has 4 more pipelines actually, and if you wanna take a look at the difference between the XT and the Pro (the XT has 16 pipelines), go take a look and make your mind up whether those 4 fewer pipelines really are nothing to fuss over.
 
Alig said:
Doom 3. STALKER will, no doubt.

It has 4 more pipelines actually, and if you wanna take a look at the difference between the XT and the Pro (the XT has 16 pipelines), go take a look and make your mind up whether those 4 fewer pipelines really are nothing to fuss over.

Ahh yes, you are right on that - 4, not 6 - thanks, I knew that too :dozey:
 
read this ------> Each girder, door, and window adds tangible substance to each scene, and even the effect of your flashlight shining into a darkened corner looks ridiculously real - as the light floods through a room, swinging back and forth, shadows are cast perfectly; dust particles gently drift into the cone of the flashlight, eerily visible. And these are just the basics of the environment: just wait until you enter the depths of hell, and dive into some of the later mass melees. Doom 3, with all due awareness of hyperbole, is the best-looking game you've ever seen.

Not surprisingly, you'll need a monster system to render these monsters in all their intricately textured glory. But the ability to play Doom 3 with all its visual magic maxed out is really a good excuse to trade up. A P4 3GHz with a GeForce 5950-class card will see you through okay. One of our test systems had a GeForce 6800 Ultra and ran flawlessly at 1024x768 with high detail. (A higher level of quality and resolution is available, but the PC to run it well isn't.) On a system running with a GeForce4 MX card and 512MB RAM, the texture detail was great, although the game was choppier in spots.

Bottom line: If Far Cry didn't convince you, then Doom 3 should - the time to upgrade to a next-generation 3D chip, or even an all-new rig, is now.
 
blahblahblah said:
Wow. Convincing argument. You win. I'm sorry, I'll admit you have better debating skills than me. I shall go into a corner and despair.

Read the stuff in this thread and stop being so ignorant.

http://www.halflife2.net/forums/showthread.php?t=29182

Shader 3.0 is a feature, one that ATI card owners won't be able to use when it becomes more mainstream, but that's beside the point. I wasn't talking about Shader 3.0.

And my argument is convincing - it's coming from the ATI boat that I'm on... I've got an X800 Pro, I know how buggy the drivers are, I know that the card really isn't what it was cracked up to be, but for you to admit that would mean admitting you're wrong, and you couldn't do that, could you?
 
hehe
That is far from a screw-up when you bring in a card from the other manufacturer that has more pipelines because they changed it from 12 to 16 pipelines before shipping.

3Dc will have an impact there. A lot bigger than PS 3.0 anyway.
You can see the X800 Pro beating or at least matching even the Ultra in a number of places. 1 2 3 4 5 6 7 8 9 10 11 12 13
 
Alig said:
Shader 3.0 is a feature, one that ATI card owners won't be able to use when it becomes more mainstream, but that's beside the point. I wasn't talking about Shader 3.0.

And my argument is convincing - it's coming from the ATI boat that I'm on... I've got an X800 Pro, I know how buggy the drivers are, I know that the card really isn't what it was cracked up to be, but for you to admit that would mean admitting you're wrong, and you couldn't do that, could you?

You mean an SM 3.0 feature like geometry instancing, right? Whoops, the X800 Pro can do that too. :O SM 3.0 doesn't do anything different than SM 2.0, unless you like your shaders extra long and impractical for use in a game.

Maybe you also know that the 6800 series of cards still uses FP16 for shader calculations in games. In fact, Nvidia tells game developers to do so. DX9 was created to get rid of FP16.

You are also forgetting that the X800 Pro is DX9 compliant. :O That means any new games will be coded to take advantage of its DX9 features. It's not as if STALKER isn't going to give it any SM 2.0 shaders - that idea is completely wrong.

I've got an X800 Pro too. What buggy drivers? The only bugs I have are in-game with Far Cry, and that problem lies directly with Crytek, not ATI. You are just speaking gibberish and it irritates me.
 
Asus said:
hehe
That is far from a screw-up when you bring in a card from the other manufacturer that has more pipelines because they changed it from 12 to 16 pipelines before shipping.

3Dc will have an impact there. A lot bigger than PS 3.0 anyway.
You can see the X800 Pro beating or at least matching even the Ultra in a number of places. 1 2 3 4 5 6 7 8 9 10
Thanks Asus, I needed that (charts) for reassurance.
 
The difference being, 60+ fps at 1600x1200 4x/8x is a lot less disappointing than 20fps (unplayable) at 1600x1200 4x/8x (was it?). Not to mention the X800 Pro doesn't get anywhere near those fps when I play UT2004 at lower res with lower AA/AF :rolleyes:
 
I forgot to rant some more.

Do you know what a pipeline really does? Additional pipelines (at the 12 and 16 pipeline level) only start to become effective at higher resolutions; otherwise they are not being used effectively. That is why the X800 Pro can keep up in games at resolutions of 1280 by 1024 but has a severe drop in FPS at 1600 by 1200 and above. Maybe you should do some research before calling the X800 Pro a disaster.
 
The number of pipelines in a video card is slightly deceiving. At lower resolutions (like 1024 by 768), more pipelines will not always mean better performance. The additional pipelines will help, but only to a limited extent, because the pipelines are not being fully utilized (utilization depends on resolution). That said, more pipelines are useful, and more expensive graphics cards also have more features, higher clock speeds, better memory, etc.

Here is an example to explain this. I have two freeways, a four-lane freeway and an eight-lane freeway; they run parallel to each other and reach the same destination. Now let's assume that it is 11:00 in the morning, the morning rush is over, and there is little traffic on the road. If I took either freeway, I would reach my destination at the same time. Taking the eight-lane freeway will not help me get to my destination faster if the freeway is not being used all the way. Running a game at a lower resolution works the same way with these higher-end cards: a 16-pipeline card is not being put to good use at lower resolutions.

Now let's assume it is 6:00 in the evening, you are in the middle of rush hour, and traffic is really heavy. Taking the eight-lane freeway is going to be much faster than the four-lane freeway because it can handle more traffic. The exact same thing is true with graphics cards: a 16-pipeline card will beat a 12-pipeline card into submission time and time again only if you turn up the traffic (which is resolution).

I think Nvidia and ATI are creating cards with more pipelines because we are in the middle of another resolution change in gaming (besides that competition thing they've got going on). It used to be that 800 by 600 was standard; now it is 1024 by 768, and it is looking like 1280 by 1024 will become the next standard. I also think that if a person pays that much money for a video card, they should be able to turn up the resolution really high and enjoy their games with decent frame rates.
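
Just to put some toy numbers on the freeway analogy, here's a rough Python sketch. The clock speed, shader cost per pixel, overdraw factor, and CPU-limited cap are all made-up illustrative figures (not benchmarks from any review), and both cards get the same clock here so only the pipeline count differs:

# Toy model: fps is the lower of a CPU-side cap and the GPU pixel-throughput ceiling.
# Every constant below is an assumption for illustration only.
def fps_estimate(pipelines, core_mhz=475, cycles_per_pixel=30,
                 overdraw=3.0, cpu_cap_fps=75.0, width=1024, height=768):
    pixel_rate = pipelines * core_mhz * 1e6 / cycles_per_pixel   # pixels shaded per second
    pixels_per_frame = width * height * overdraw                 # rough overdraw assumption
    return min(cpu_cap_fps, pixel_rate / pixels_per_frame)

for w, h in [(1024, 768), (1600, 1200)]:
    print(f"{w}x{h}: 12 pipes ~{fps_estimate(12, width=w, height=h):.0f} fps, "
          f"16 pipes ~{fps_estimate(16, width=w, height=h):.0f} fps")

# In this toy model, at 1024x768 both cards sit at the ~75 fps cap (the extra pipes are wasted),
# while at 1600x1200 the 16-pipe card pulls ahead (~44 vs ~33 fps) - same point as the freeway example.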
 
blahblahblah said:
I forgot to rant some more.

Do you know what a pipeline really does? Additional pipelines (at the 12 and 16 pipeline level) only start to become effective at higher resolutions; otherwise they are not being used effectively. That is why the X800 Pro can keep up in games at resolutions of 1280 by 1024 but has a severe drop in FPS at 1600 by 1200 and above. Maybe you should do some research before calling the X800 Pro a disaster.

I did my research; I made the mistake of buying an ATI card.
 
blahblahblah, here's another reason you want to find those 4.8 Cats.
Link
Pretty good performance increase, huh?
Against an OC'd 6800U, no doubt. Wish they showed the X800 Pro. ;)
 
Alig said:
I did my research; I made the mistake of buying an ATI card.

Come on, if you did research, you should have known about the limitations of the X800 Pro.

I guess I gotta go search for those 4.8's. If I get bored, I may retest Far Cry.
 
No, actually I bought it because I thought it couldn't be that bad if it gets this much praise, but I was wrong, and I'll never listen to another person's opinion on anything gfx card related.
 
These two tips are very important. lol
THE FIRST BEST TIP FOR GAMERS: YOU WILL NEVER BE ABLE TO RUN ALL GAMES WITHOUT EXCEPTION AT MAXIMUM SUPPORTED RESOLUTIONS WITH ALL GRAPHICS OPTIONS AT MAXIMUM SETTINGS, NO MATTER WHAT HARDWARE YOU HAVE. THIS IS NORMAL AND DOES NOT MEAN THAT YOUR PC HAS A PERFORMANCE PROBLEM.

THE SECOND BEST TIP FOR ALL PC USERS: IF IT AIN'T BROKEN, DON'T FIX IT! This means that if your games/applications run/look fast/good enough for you, there is no need to do anything. If you think you do have a performance problem, read on.
Link
 
I was doing some looking around and I don't think I'm going to be able to find those 4.8's. They are supposedly given out just to reviewers. Maybe in a day or so somebody will leak them to the internet. But not today.

As for SM 2.0b, apparently there is a beta version in Patch 1.2. I'm going to test that right now. You don't need Cat 4.8's for it. You need Cat 4.8's for geometry instancing only.
 
Ohhh, fight to the death. [Asus vs. blahblahblah] Who will win? Who will end up with the cheese? Why am I still talking?
 
RAWR
Actually I'm hoping blahblahblah gets ahold of those 4.8's in the near future. I'm interested. 'Course we could always wait for another review, but just for drivers? Maybe another Far Cry 1.3 review! :hmph:
Weee!
 
alan00000 said:
Wait, I just ordered an X800 Pro - is it not faster than a 9800 Pro?

Omg, how could it not be faster???

/me slaps head in shock at stupidity
 
The 6800 owned the X800 in HL2; it was the FX that lost.

The 6800 GT is a good choice, plus Nvidia has now released a new driver that has support for the 6800, DX9.0b, and OpenGL 1.5.
 
crownest said:
The 6800 owned the X800 in HL2; it was the FX that lost.

The 6800 GT is a good choice, plus Nvidia has now released a new driver that has support for the 6800, DX9.0b, and OpenGL 1.5.

Yeah, good statement... Sorry, NO OFFICIAL BENCHMARKS ARE OUT YET. And why would ATI lose in D3D? They have proven to be better than Nvidia at it several times. Your statement is flawed, good job.
 
By the way, HL2 will use ATI's 3Dc compression. ;)
HL2 won't have the UltraShadow 1/2 or the multiple lights that D3 uses, which allow Nvidia to pull ahead.
ATI's card is really built for complex DX9 shader games built from the ground up.
Just take a look at Tomb Raider benchmarks with its complex shader effects. 1
This game supports many DX9 effects, even depth of field. 9800XT
 