ATI's cards will own

the X1800 XT really r0x0rz

thumbs up for ATI
 
But if these ATI cards pwn Nvidia's current line-up... that still means Nvidia has time to drop their ultimate card... right?
 
Oh my ****ing god.

Is that the price on those little picture thingies?

x1300 for >$150?

I need to sell my 9800. I got ripped the hell off!
 
still, Nvidia's GF7800 was released months ago
I'm sure NV has something in their pocket
 
if it actually performs that well, that'll be mad pwnage
 
Really nice for the price.

EDIT: No AGP versions...
*cry*
 
Here's what nVidia plans to do I bet...as you know the 7800 GTX has A LOT of power behind it. What if...nVidia is actually holding back that power with a driver and after ATI releases their card...nVidia releases a driver that will unlock the true potential. I actually heard this from my friend, he works for PNY and that's the rumour goin around. All this stuff reminds me of DBZ though..."holding back" the true power to f*&k around for a lil and then BOOM.
 
gweedodogg69 said:
Here's what nVidia plans to do I bet...as you know the 7800 GTX has A LOT of power behind it. What if...nVidia is actually holding back that power with a driver and after ATI releases their card...nVidia releases a driver that will unlock the true potential. I actually heard this from my friend, he works for PNY and that's the rumour goin around. All this stuff reminds me of DBZ though..."holding back" the true power to f*&k around for a lil and then BOOM.
Extremely unlikely...
But I guess you can hope that since you spent all that money on one.
:thumbs:

Anyway, these cards also support Crossfire.
So maybe... just MAYBE... 2 of these nice, cheap cards might just = a $400 GeForce.
Can't wait to see...

Could we be entering the age of cheaper cards?
Please deliver us, ATi, and others will follow suit.
 
What about the G71 or G70 core? I can't remember the name of it... Do you think the core Nvidia is working on will come out on top, even though once ATI releases the R580 it could all be over for Nvidia? :p
 
The war rages on.

/Waves giant red flag with ATi logo on it.

Burn Nvidia burn.
 
$330ish AUD for an X1600 XT 256MB... hmm, same price I paid for my 9600 Pro. Come on, benchmarks.
 
Can't wait for some proper benchmarks to really see how these cards perform. If they do as well as is stated on that page, then that would be so damn cool!
 
I want to actually see numbers, not percentages. For all I know that may only be 5 more FPS in the FEAR SP demo.
 
As someone said, it's done in percentages rather than FPS. This can make a gain seem like a lot even though it could be a small amount. I also wish they'd made the base index (i.e. the GeForce 7800) on the graph a little smaller, so the graph came in at 50% rather than 0%. Remember, they pay people a lot of money to present data, so it looks like a lot more than it actually is.
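A quick toy calculation of the point above (all numbers are made up for illustration, not actual benchmark results): a percentage bar on a marketing slide can correspond to a tiny absolute FPS gain.

```python
# Toy illustration (made-up numbers): a percentage bar can hide a small
# absolute FPS gain, especially when the chart's baseline is truncated.

baseline_fps = 50          # hypothetical GeForce 7800 result in some demo
claimed_gain = 0.10        # a "10% faster" bar on the marketing slide

new_fps = baseline_fps * (1 + claimed_gain)
extra_fps = new_fps - baseline_fps
print(extra_fps)  # 5.0 -- "10% faster" here is just 5 extra frames per second
```

Which is exactly why raw numbers matter more than the percentage bars.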
 
It's true that they only have 16 pipelines. I thought that was interesting to say the least.
 
They do look like nice cards though. Hopefully it won't be a 1:1 price conversion when they come out in the UK.
 
Ren.182 said:
They do look like nice cards though. Hopefully it won't be a 1:1 price conversion when they come out in the UK.

Yeah, same here, but I can't see it happening. Tbh, ATI have to get the cost just right here. I mean, they have kept people waiting for this, and the people that are still waiting and haven't gone off and bought a 7800 card need something that's worth it and tempting, not ridiculously expensive.
 
All I can say is I'm looking forward to their release, and hopefully pre-release reviews in CustomPC, with benchies

:)
 
3ssence said:
All I can say is I'm looking forward to their release, and hopefully pre-release reviews in CustomPC, with benchies

:)

That would be good if CPC got some proper reviews; you know you'll get good, honest benchmark results from them :D
 
Of course nvidia is going to come out with something in reply to this. Then ATI will reply to that.

I'm impressed by the price, mainly. If the resellers don't jack it up I'll be happy.
 
Awesome. Way to be ATI.
Now let's get some benchmarks with some real numbers.

I wonder how well the X1800XL and X1600 will overclock seeing as how the operating frequencies are quite a bit lower than the XT versions.
 
duffers20 said:
That would be good if CPC got some proper reviews; you know you'll get good, honest benchmark results from them :D

You know its true.

:thumbs:
 
Awesome marketing sheets there. I will be looking forward to the website reviews though.
 
Just another GPU paper launch.

Pure spec sheets say nothing about real rendering performance.

Wait for the numbers.
 
About the 16 pipelines... is ATI using a unified pipeline design, where vertex and pixel work are run by the same pipelines?

So basically 16 would be equal to 32.
 
ailevation said:
About the 16 pipelines... is ATI using a unified pipeline design, where vertex and pixel work are run by the same pipelines?

So basically 16 would be equal to 32.

No, that just means that each pipe can do either vertex or pixel work, not both at the same time.
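A toy model of that point (an assumed simplification, not actual hardware behavior): if each unified pipe does either one vertex op or one pixel op per cycle, total throughput stays at 16 ops/cycle no matter how the work is split, so 16 unified pipes are never "equal to 32".

```python
# Toy model (assumption, not real hardware): each unified pipe does EITHER
# one vertex op OR one pixel op per cycle, never both simultaneously.

def ops_per_cycle(pipes, vertex_share):
    """Total ops/cycle for a given split of pipes between vertex and pixel work."""
    vertex_ops = pipes * vertex_share          # pipes assigned vertex work
    pixel_ops = pipes * (1 - vertex_share)     # the rest do pixel work
    return vertex_ops + pixel_ops              # always equals `pipes`

print(ops_per_cycle(16, 0.25))  # 16.0 -- not 32, however you split the work
```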
 
holydeadpenguins said:
No, that just means that each pipe can do either vertex or pixel work not both at the same time.

There we go, that's what I mean.

Wait a minute, fool... I never said at the same time... I said by the same ONE. So one pipeline can do either vertex or pixel work... You had me believing I actually said that.
 
ailevation said:
So basically 16 would be equal to 32.

It would only be 16; don't know where you got the 32. You said that it would be 32, which made me think that you thought it would do both at the same time.
 
holydeadpenguins said:
It would only be 16; don't know where you got the 32. You said that it would be 32, which made me think that you thought it would do both at the same time.

Because some people compare Nvidia's 32 to ATI's 16. Or was it 24 for Nvidia? Anyway, it's like the whole Intel vs. AMD deal.

My mistake on the 32 if Nvidia really has 24 pipelines or whatever.
 
Nvidia has 24 pipes, but ATI has some other tricks up its sleeve. We need to wait for REAL numbers so we can compare...
 
gweedodogg69 said:
Here's what nVidia plans to do I bet...as you know the 7800 GTX has A LOT of power behind it. What if...nVidia is actually holding back that power with a driver and after ATI releases their card...nVidia releases a driver that will unlock the true potential. I actually heard this from my friend, he works for PNY and that's the rumour goin around. All this stuff reminds me of DBZ though..."holding back" the true power to f*&k around for a lil and then BOOM.

That's the stupidest thing I've ever heard. Why would they hold back their card's power at launch if they want to show that it's the best card you can get?
 
ATI's low-end X1300 Pro costs more than a 6600GT, and has a far lower 3dmark05 score.
 