nVidia SLI

guinny said:
Owned by Asus as usual.
hardly. and "as usual"? when has he ever before?

Asus said:
You quote DDR2 with these great achievements. I shouldn't even respond. hehe
yeah, they are slow now because of terrible latency timings, but they will produce way better DDR2 chips next year.

Xeons? I would never buy Xeons for gaming, ever. What about future dual core A64's that work in existing motherboards? Those are actually taped out unlike those Xeons.
ok, then all the better; that just further proves the FX-53 will be highly dated in 2 years.

Actually, I don't expect dual core Intel products to be out in a year.
i think that's the subject we are referring to. timewise, when dual-core CPUs come, what do you think will happen? the FX-53 still being top of the line? lol

16MB cache HDDs won't do much at all. They work great for compensating for low RPM to save power, like in laptops, but it's diminishing returns for desktop use.
lol, hard drives have been the slowest parts in computers for a while because of that 8MB cache that no company seems to care about. now you take two of those, put them in RAID0, and you have a ****ing 32MB cache hard drive. not to mention they will probably have 10k+ RPM on those. 16MB HDDs won't do much? hah.

take a look at this review: http://www.anandtech.com/storage/showdoc.html?i=2094
when a 250GB hard drive pretty much matches up with a 74GB 10k RPM Raptor, your theory that they won't do much is goatmilk. imagine that hard drive in RAID0 ;)

Not new and top of the line, yes. But not outdated by far. It isn't outdated until it doesn't do what you want. Most of these things you mention are developing, new tech that needs to mature first.
well, this is true, but I'm saying in 2 years.

when you post the INQ to back up a statement, that goes a long way :stare: not to mention their whole story screwup with the ATI bridging of the PCI-E cards. or the nvidia driver boost of 30%... or the... i think you get it.

btw, we sell 12X and dual-layer DVD burners at CompUSA currently.
just feeding the fire lol
 
btw, I'm not saying the FX-53 or X800 XT PE are bad at all. it's just that when you think they will be top of the line for a while, you're losing it.

guinny said:
Sorry but after a while it gets overkill.
sorry, but when you pay almost 800 bucks for a CPU, isn't that overkill?

it's just new technology, and you obviously accepted it when you got your FX-53, so why not accept SLI?

don't get me wrong, guys. the FX-53 is a great CPU. the X800 XT PE is an amazing GPU.
 
guinny said:
1. I never said it pwned it, you still insist on taking it that way. I said the XT is a better buy than the 6800 and that's my opinion. And I've got almost every FX line card somewhere in this damn room, so stop calling me an ATi fanboy for it.

2. for someone who isn't a fanboy, you sure show a funny way of proving it. "You also specifically say you aren't a fanboy but you will never admit the nvidias are better." that's a rather stupid statement to make, because by calling me a fanboy and then saying nvidia is better, you contradict yourself and make yourself an nvidia fanboy. :cheers:

Yeah, I'm an nvidia fanboy, but I just spent £350 for a Saturday delivery getting my ATI SAPPHIRE X800 PRO :upstare:

I'm not a fanboy; I'm saying the nvidias are better because they will be after this, _if_ they add it to the NV4x line. Just because I think the nvidia is better doesn't mean I'm a fanboy; it is simply stating facts, and facts you cannot deny no matter how much of a fanboy you are.

I HAVE AN _ATI_ X800 BEFORE YOU CALL ME A FANBOY! YEESH!
 
gh0st said:
Alig, I think buying an X800 when you think a 6800 is MUCH BETTER really proves that you are either 1) stupid or 2) an nvidia fanboy in disguise.

Anyway, I'm not too excited about SLI because I could just buy a new computer and get similar returns for a similar price to those two 6800s. Technology is always on the move though; everything's expensive when it starts out.

Quote me when you read and understand my post. OK thank you.

Edit/ And WTF is a fanboy in disguise? Am I infiltrating ATI's card and sending information back to nvidia because I love them oh so dearly? :bonce:
 
Alig said:
i still believe the nvidia to be better in the long run by a LONG way

yes, because a disguise always means infiltrating "ATI's card". stunning.
 
Hey guys, no one said it would be top of the line. It just isn't 'outdated'.

I question your examples because you refer to a review on hard drives where that 250GB drive is using NCQ and the Raptor is not. That has nothing to do with cache. And cache doesn't double when you put drives in RAID. There is no adding.
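For the curious, NCQ is about the drive reordering queued commands to cut down head travel; it has nothing to do with buffer size. A toy sketch of the idea (made-up LBAs, not any vendor's actual firmware):

```python
# Toy model of Native Command Queuing: instead of servicing requests
# first-in-first-out, the drive reorders them to minimize head travel.
def head_travel(lbas, start=0):
    """Total distance the head moves to service LBAs in this order."""
    total, pos = 0, start
    for lba in lbas:
        total += abs(lba - pos)
        pos = lba
    return total

queue = [720, 30, 680, 50, 700]    # pending requests (made-up LBAs)

print(head_travel(queue))          # FIFO, no NCQ: 3340
print(head_travel(sorted(queue)))  # NCQ-style sweep: 720
```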

Please don't post incorrect information as if you know it to be true.

What's wrong with the Inquirer's posts about ATI's bridge or Nvidia's driver boost?
You shouldn't denounce a post just because it's from someplace you think is unreliable. You should actually look around before criticising the post. It quotes Microsoft's website, btw.

The Inquirer hehe
 
Asus said:
I question your examples because you refer to a review on hard drives where that 250GB drive is using NCQ and the Raptor is not. That has nothing to do with cache. And cache doesn't double when you put drives in RAID. There is no adding.

Please don't post incorrect information as if you know it to be true.

umm... did you not see the part without NCQ? and cache will go to 32MB in RAID. look at this: http://www.bit-tech.net/review/326/

The hard drives in particular deserve a special mention. At first glance you may mistake them for the current Maxline II SATA drives but you'd be wrong. In fact the Maxline III is the first ever consumer hard disk to have a 16Mb buffer. When combined in RAID that's a staggering 32Mb disk cache! In addition to this the drives support the latest SATA 2 features such as Tagged Command Queuing and Native Command Queuing.

who's saying incorrect info? :cheese:
 
SLI is here for the PCIe line of cards using NV4x.

If you're referring to the INQ bit that I'm thinking of, they didn't say that ATi was using a bridge integrated into the GPU; they said nVidia was accusing ATi of it. That site has a worse rep than it deserves.

Do you really think a dual-6800 rig will be outdated in a year, in any manner of speaking? Would you call the Radeon 9800 Pro outdated? (That's over a year old, isn't it?) Even the 9700 is still holding its ground.

It could be a long time before we see this kind of performance beaten (outside of an Alienware setup). I think it will be at least a year before ATi is able to match that performance. There's no way they could have a refresh release that's faster...
 
gh0st said:
yes, because a disguise always means infiltrating "ATI's card". stunning.

You still haven't shown me where I say it's MUCH BETTER.

I am fully aware that I said it was better in the LONG RUN, but right at this moment I'm not in that "LONG RUN". So don't talk shit.

Care to elaborate on what you think an nvidia fanboy in disguise is who's just bought a F'UCKING £310 ATI card?
 
*sigh* if this is really necessary, alig.

you said it is "better in the long run by a long way." i think most people could infer that this means a card is "much better", yes?

there's no need to get angry, alig. you just come across as stupid and fanboyish (though perhaps not; you just strike me as ignorant) by saying 'i just got this ati card, but (insert inference here) i should have bought this nvidia one because it's so much better in the long run', almost like you're trying to persuade people to get a 6800. :dork: guess a fool and his money are soon parted though.
 
psyno said:
Do you really think a dual-6800 rig will be outdated in a year, in any manner of speaking?
i never said it would be. lol.
 
x84D80Yx said:
umm... did you not see the part without NCQ? and cache will go to 32MB in RAID. look at this: http://www.bit-tech.net/review/326/
who's saying incorrect info? :cheese:
You don't add cache, because cache is a function of the individual drive. It's there to buffer the information. Cache does absolutely nothing in most cases, but if the stream of data stops, then cache can be helpful because it has instructions for the drive to perform. ATA100/133 were slow enough that an interruption or change in instructions would cause the drive to idle without cache.
It depends on the use whether cache will impact performance or not.
I usually do not like synthetic benchmarks, as they test specific performance that is not real-world use. I'll take another look at the review later.
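To picture why the caches don't pool: in RAID0 each stripe lives on exactly one drive, so a request can only ever hit that one drive's 16MB cache. A toy model (stripe size and all the mechanics invented for illustration):

```python
# Toy model of RAID0 with per-drive caches: each stripe maps to one
# drive, so two 16MB caches never act like one 32MB cache.
STRIPE = 64 * 1024                   # assumed 64KB stripe size

class Drive:
    def __init__(self):
        self.cached = set()          # stripe numbers held in cache

    def read(self, stripe):
        hit = stripe in self.cached
        self.cached.add(stripe)      # drive caches what it served
        return hit

drives = [Drive(), Drive()]

def raid0_read(offset):
    stripe = offset // STRIPE
    return drives[stripe % 2].read(stripe)  # stripe -> one drive only

raid0_read(0)              # cold read, cached on drive 0 only
print(raid0_read(0))       # True: hit, but only in drive 0's cache
print(raid0_read(STRIPE))  # False: lands on drive 1, cold cache
```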

It's just like dual CPUs. You cannot add two 2GHz CPUs together and say you have a 4GHz CPU.
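To put a number on that, Amdahl's law (my addition, not something Asus cited) is the standard way to reason about it: speedup = 1 / ((1 - p) + p/n) for a parallelizable fraction p of the work on n CPUs.

```python
# Amdahl's law: how much faster n CPUs make a job where only a
# fraction p of the work can run in parallel.
def speedup(p, n):
    return 1.0 / ((1.0 - p) + p / n)

# Even if 80% of a game's CPU work parallelized (generous for the
# era), two CPUs would give ~1.67x, nowhere near 2x:
print(round(speedup(0.8, 2), 2))   # 1.67
print(round(speedup(0.5, 2), 2))   # 1.33
```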

Again, know about the technology before you read. Don't assume; think for yourself. hehe
Don't quote someone and expect it to always be true. In fact, you might be taking it out of context.

psyno said:
If you're referring to the INQ bit that I'm thinking of, they didn't say that ATi was using a bridge integrated into the GPU; they said nVidia was accusing ATi of it. That site has a worse rep than it deserves.
Well, Nvidia did accuse ATI of "using a bridged solution when they claimed that it was bad and they said they wouldn't".
ATI said they wouldn't bridge AGP to PCIE and sell it as the fast solution. AGP obviously would be a bottleneck in that case (not that it matters for the performance of today's cards). That is what they said was bad and what Nvidia was doing.
What ATI will be doing is bridging their high-end part PCIE to AGP, which is not a bottleneck at all. Nvidia accused wrongly, that's all. ;)

That site only has a bad rep with those who aren't really into tech. Everyone knows that site reports rumors and info well before other sites and before NDAs release the information. Things often change before they make it to press, and some rumors may not come true, but that's what is so great about that site. Plus they deliver it with humor. ;)
You just can't take what they say and act on it. Especially when the company hasn't made its final decision on that product.

Do you really think a dual-6800 rig will be outdated in a year, in any manner of speaking? Would you call the Radeon 9800 Pro outdated? (That's over a year old, isn't it?) Even the 9700 is still holding its ground.

It could be a long time before we see this kind of performance beaten (outside of an Alienware setup). I think it will be at least a year before ATi is able to match that performance. There's no way they could have a refresh release that's faster...
Exactly. The 9700 Pro from 2+ years ago is an awesome card and is far from outdated. I wish I had bought one of those 2 years ago. :(
 
Does anyone think SLI might be a bad thing for consumers? Sure, it's going to be the fastest thing out, but what if you don't have the money? I don't want to spend $1000 on graphics cards. What if the developers get lazy and just put as many polygons into the picture as they can, because they don't have to worry about how slow or fast it's going to run? Those of us with the $400-500 cards are going to run our games like crap.

Also, just wondering: is it possible to set up a dual Opteron system with SLI technology and have one processor compute for the first graphics card and the other processor compute for the second graphics card?
 
SLI would be ****ing AWESOME for a setup such as dual 6800 GTs (about $300 apiece).
 
As far as I know, SLI is PCI Express only.
Right now, Prescott is the platform you would have to buy into to get PCI-Express.
AMD's chipset manufacturers will be coming out with PCI-Express later this year. It would be possible to set up a dual Opteron with SLI technology, but not like you are thinking. Opterons will have boards with two PCI-E slots.

The two CPUs will split the threads that need to be worked on. It won't matter which CPU does what. There is very little that a GFX card actually sends back to the CPU anyway. Most of what is CPU-bound in a game is the AI and the actual game functioning, not the 3D graphics.
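A rough illustration of that point (a sketch in present-day Python, nothing platform-specific): the OS scheduler decides which CPU runs what, so nothing wires "CPU 1" to "card 1".

```python
# Sketch: the kernel's scheduler places runnable processes on whatever
# CPU is free; nothing binds a CPU to a particular graphics card.
from multiprocessing import Process
import os

def game_logic():
    print(f"AI / game logic in pid {os.getpid()}")

def background_work():
    print(f"other work in pid {os.getpid()}")

if __name__ == "__main__":
    procs = [Process(target=game_logic), Process(target=background_work)]
    for p in procs:
        p.start()    # the OS, not the program, picks a CPU for each
    for p in procs:
        p.join()
```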
 
guinny said:
Thanks for making me feel like shit. I was so excited about this comp, and now you all took the fun out of it. I feel like I'm running a 486 the way you guys are talking.

Oh poor baby, now we know your weakness.
 
Bad for the consumer? I see what you're thinking, but I don't really think so. In the end, most game companies need to make money. No company in its right mind would put out a product it couldn't hope to sell, and since it's a pretty safe bet that the share of the market with that powerful of a rig -- or even an X800 XT PE or a single 6800 Ultra -- will be very small, it just wouldn't be a viable approach. If anything, they might have "Ultra high" settings available, like very high resolution textures, etc. I don't think developers will expect people to have machines like that just because they're available. VALVe's hardware survey is evidence that developers really do pay attention to things like that. They want people to have good experiences with their products.

About multi-processors -- what ASUS said. There won't be any unique benefit to a dual CPU with dual graphics card setup. Game developers have been stubborn about multithreading their apps well. I suspect that as multiple processor setups become more popular (with Intel's HT now and AMD's and Intel's multi-core chips later) developers will get better about this. At the moment, don't expect multiple processors to boost the performance in a game very much. The best you can do in many (most) cases right now is isolate the game on one processor, and distribute other processes across the others.
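Isolating a game on one processor is something you can actually script; a minimal sketch using the third-party psutil library ("game.exe" is a made-up example name, and this assumes a two-CPU machine):

```python
# Minimal sketch: pin a hypothetical "game.exe" to CPU 0 and push
# everything else to CPU 1. Requires the psutil package.
import psutil

for proc in psutil.process_iter(["name"]):
    try:
        if proc.info["name"] == "game.exe":  # hypothetical game process
            proc.cpu_affinity([0])           # game owns CPU 0
        else:
            proc.cpu_affinity([1])           # the rest share CPU 1
    except (psutil.AccessDenied, psutil.NoSuchProcess):
        pass                                 # skip protected processes
```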
 
psyno said:
...I think I could give you a basic compare/contrast...
I just came across this at Tom's Hardware. It's a short article containing the contents of Alienware's clarification of the differences between nVidia's SLI and Alienware's "video array." I was most intrigued by Alienware pointing out that their array approach is not limited to two graphics cards...

Of course what I'm still wondering about for both of them is mix 'n' match!

(lol @ quoting myself..)
 
blackeye said:
Does anyone think SLI might be a bad thing for consumers? Sure, it's going to be the fastest thing out, but what if you don't have the money? I don't want to spend $1000 on graphics cards. What if the developers get lazy and just put as many polygons into the picture as they can, because they don't have to worry about how slow or fast it's going to run? Those of us with the $400-500 cards are going to run our games like crap.
A lazy developer would only target the biggest audience, which is the consumers with $50-$200 graphics cards. Good developers make their games scalable, which motivates gamers to upgrade their hardware.
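Scalability in practice mostly means exposing quality knobs; a toy example of the idea (preset names and all values invented):

```python
# Toy example of scalable settings: the same game serves a $50 card
# and an SLI rig by scaling detail. All numbers are invented.
PRESETS = {
    "low":   {"resolution": (800, 600),   "textures": "compressed", "aa": 0},
    "high":  {"resolution": (1280, 1024), "textures": "full",       "aa": 2},
    "ultra": {"resolution": (1600, 1200), "textures": "full",       "aa": 4},
}

def apply_preset(name):
    s = PRESETS[name]
    w, h = s["resolution"]
    print(f"{name}: {w}x{h}, {s['textures']} textures, {s['aa']}x AA")

apply_preset("low")    # playable on a budget card...
apply_preset("ultra")  # ...gorgeous on a dual-6800 setup
```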

I don't think it's very realistic to think that a game will be developed that's only playable on $1000 graphics cards. It won't happen.

EDIT: I must have missed psyno's post while reading this thread. I agree with what he said.
 