DirectX 9.1 soon, GFFX range on top form

I think it's fair to say from this news that NVidia have gone further than just improving their drivers to get better DirectX performance, something ATI has yet to do, but I'm not complaining about either vendor.

As others have already said, optimising for 32-bit precision means NV will be ready out of the box to run full 32-bit precision at speed, whereas ATI defaults to the fastest it can do, which is 24-bit precision, so in theory running at 32 bit would leave it slower than the NV at 32 bit. But just how much difference will 24-bit versus 32-bit actually make? Only time will tell.
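Since the whole argument keeps coming back to 16 vs 24 vs 32 bit, here's a minimal C++ sketch (my own toy model, nothing from NVidia, ATI or Microsoft) that just truncates a float's mantissa to the widths those formats use and accumulates a small value, to show how rounding error grows as precision drops. A real shader's error depends on the actual instruction sequence and on exponent range too, so treat it as an illustration, not a benchmark.

#include <cstdio>
#include <cstdint>
#include <cstring>
#include <cmath>

// Keep only the top 'bits' bits of the 23-bit IEEE float mantissa.
// (Simplification: this truncates instead of rounding and ignores the
// smaller exponent range of the 16-bit format.)
float truncate_mantissa(float v, int bits) {
    std::uint32_t u;
    std::memcpy(&u, &v, sizeof u);
    std::uint32_t keep = ~((1u << (23 - bits)) - 1u);   // mask for the kept mantissa bits
    u &= 0xFF800000u | (keep & 0x007FFFFFu);            // sign + exponent + top mantissa bits
    std::memcpy(&v, &u, sizeof u);
    return v;
}

int main() {
    const int   widths[] = { 10, 16, 23 };              // FP16, FP24, FP32 mantissa widths
    const char* names[]  = { "FP16", "FP24", "FP32" };

    for (int i = 0; i < 3; ++i) {
        // Accumulate 0.0001 ten thousand times, rounding after every step,
        // the way a long shader accumulates error across instructions.
        float sum = 0.0f;
        for (int n = 0; n < 10000; ++n)
            sum = truncate_mantissa(sum + 0.0001f, widths[i]);
        std::printf("%s: sum = %f (ideal 1.0), error = %g\n",
                    names[i], sum, std::fabs(sum - 1.0));
    }
    return 0;
}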

Also it seems many ATI fans and owners are getting annoyed by this news and claiming it to be "not real" or "full of s***" because of NVidia's past. They fail to accept that NVidia are slowly getting back up to the top spot, having already owned (and still owning) the crown in OpenGL, and while they might not beat ATI in DirectX they will most definitely, or at least surely, match them in speed and quality. Next month's drivers are said to improve speed by 35% or MORE, and the news of DX9.1 improving things by 65% just adds to that, AND the main thing is there is NO image quality loss, at least on the driver side. I very much doubt there will be any image quality loss on the DX9 side either: if it's being optimised to the NV spec then how could it possibly result in IQ loss? More likely an IQ gain, because it would be running efficiently according to the card, therefore no need to drop quality to run faster.
 
OK, so some of us have held out for the 9800XT to see what happens, and are now faced with nVidia perhaps being the best bet. Which card _exactly_ from nVidia is the one that will run HL2 at its best, and how much does it cost compared to the 9800XT?
 
Hell, the PCI Express version of the Loki might be out before HL2... I'm holding on to my cash.
 
Originally posted by mrk
Next month's drivers are said to improve speed by 35% or MORE, and the news of DX9.1 improving things by 65% just adds to that, AND the main thing is there is NO image quality loss
They said there was no image quality loss with the new Detonators (compared to the 51.75 crap) at first too, but on closer inspection (they retested the whole thing at some site), they found the WORST IQ cheat ever seen in the Nvidia drivers, far more 'optimised' than 51.75! :p
 
lol...maybe this will give nvidia reason enough to stop cheating with their drivers.
 
Originally posted by dawdler
They said there was no image quality loss with the new Detonators (compared to the 51.75 crap) at first too, but on closer inspection (they retested the whole thing at some site), they found the WORST IQ cheat ever seen in the Nvidia drivers, far more 'optimised' than 51.75! :p

And where is your proof of this? One test site reporting it, 3DCenter by chance? Hah, you believe them over the THOUSANDS of users who have the 52.13 drivers and have reported no quality loss whatsoever in anything, including myself?

The WHQL drivers are out next week, so things will be official then.
 
Originally posted by mrk
And where is your proof of this? One test site reporting it, 3DCenter by chance? Hah, you believe them over the THOUSANDS of users who have the 52.13 drivers and have reported no quality loss whatsoever in anything, including myself?

The WHQL drivers are out next week, so things will be official then.

I read the same thing, fake trilinear filtering or something or other. You can find a link to the article through www.elitebastards.com.

The article also criticized AnandTech for not picking up the IQ discrepancies in their review, as they were pretty blatant from the screens provided.
 
new nvidia drivers.....

...4000% performance increase in all widely benchmarked games (games that are not used as benchmarks will not receive these enhancements)

note: nothing but a white dot will be rendered in actual gameplay
 
Yeah, here's an attachment with a leaked screen from the new experimental 60.xx drivers! I mean, just DROOL over that AWESOME lighting quality!!! (note: running 16xFSAA/8xAF extreme quality setting, screen resized from 1600x1200 with no quality loss whatsoever from the reference)
 
Originally posted by dawdler
Yeah, here's an attachment with a leaked screen from the new experimental 60.xx drivers! I mean, just DROOL over that AWESOME lighting quality!!! (note: running 16xFSAA/8xAF extreme quality setting, screen resized from 1600x1200 with no quality loss whatsoever from the reference)


I trust the LEAKED screenshot from an ANONYMOUS source. You have to be seriously NOT thinking. Nvidia would be FOOLS to do that in a driver, and with the market share they have, they certainly are not fools. I don't know why everyone thinks they are 4th graders.
 
Originally posted by mrk
And where is your proof of this? One test site reporting it, 3DCenter by chance? Hah, you believe them over the THOUSANDS of users who have the 52.13 drivers and have reported no quality loss whatsoever in anything, including myself?

The WHQL drivers are out next week, so things will be official then.

First-hand experience by regular people is IRRELEVANT

There comes a time when it is considered 'cool' to trash/hate something on the net, when everyone with an 8th grade education and a two-bit argument jumps into the fray and is not questioned but believed, when obviously biased articles are believed and common sense is thrown to the gutter, when experiences by regular people with the drivers are disregarded as irrelevant.
 
Originally posted by dawdler
Yeah, here's an attachment with a leaked screen from the new experimental 60.xx drivers! I mean, just DROOL over that AWESOME lighting quality!!! (note: running 16xFSAA/8xAF extreme quality setting, screen resized from 1600x1200 with no quality loss whatsoever from the reference)

LoL
REAL nVidia drivers = www.omegacorner.com
 
Originally posted by DimitriPopov
I trust the LEAKED screenshot from an ANONYMOUS source. You have to be seriously NOT thinking. Nvidia would be FOOLS to do that in a driver, and with the market share they have, they certainly are not fools. I don't know why everyone thinks they are 4th graders.

DON'T BE AN IDIOT, THE SCREENSHOT WAS A JOKE!
 
Actually, I have the drivers and they work very well (for me anyhow). I get better FPS and can up the details in all of my games. Halo now runs like a charm. ;)
 
Originally posted by darkmaster0016
Actually, I have the drivers and they work very well (for me anyhow). I get better FPS and can up the details in all of my games. Halo now runs like a charm. ;)
Ah, but could you play Halo using cel shader rendering? Think on it. It's only a matter of time now :)
 
I thought that the next version of DX was a couple of years away? :confused:
 
It is; 9.1 is more of a patch/improvement to the current version. It will not introduce any new practices or technologies.

DirectX 10 will be the next major version.
 
Yeah, 9.xx are only "revisions" of the DX9 core programming. More or less they're tweaks to the original design. Remember that DX8 only had DX8.1b or whatever, so I'd expect this to be the last update we see to DX9 until DX10 makes its debut in a few years.

Can't wait to see what DX10 cards can do. I wonder if Microsoft will continue with the numbering system for DirectX versions after 10, or if they'll think up some sort of new name. Who really knows.
 
Just some things people may be forgetting or need clarification on.
DX9.0 is different from DX9.1, just like DX8.0 is from DX8.1.
Current games/gfx cards are DX9.0.
Both ATI and Nvidia will have DX9.1 cards in Q1 '04.
We do not know exactly what the DX9.1 spec will include, but ATI and Nvidia do, and they say their future gfx cards are DX9.1.
If DX9.1 requires 32-bit precision, then cards from both companies will support 32-bit precision.
It does not matter that your card supports the newest DX9.1 or DX10 when playing old DX9/8.1 games, as the game will only run in the most up-to-date version of DirectX it was made for.
If you have a DX10 card and want to play HL2, it will run in DX9, because all cards support previous versions of DirectX but HL2 does not support anything higher than DX9.
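For anyone wondering how that fallback actually works, here's a rough, generic Direct3D 9 sketch of the mechanism: at startup the game asks the runtime what the installed card reports and picks the best shader path it finds. This is not Valve's actual detection code, just the standard caps query any DX9 game can do (Windows only, link against d3d9.lib).

#include <d3d9.h>
#include <cstdio>

int main() {
    // Create the D3D9 object and read the capabilities of the default adapter.
    IDirect3D9* d3d = Direct3DCreate9(D3D_SDK_VERSION);
    if (!d3d) return 1;

    D3DCAPS9 caps;
    if (FAILED(d3d->GetDeviceCaps(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, &caps))) {
        d3d->Release();
        return 1;
    }

    // Pick the best shader path the hardware exposes. A future card with
    // higher caps still takes the top branch, i.e. it runs the DX9 path.
    if (caps.PixelShaderVersion >= D3DPS_VERSION(2, 0))
        std::printf("Using the DX9 / Pixel Shader 2.0 path\n");
    else if (caps.PixelShaderVersion >= D3DPS_VERSION(1, 4))
        std::printf("Falling back to the DX8.1 / Pixel Shader 1.4 path\n");
    else
        std::printf("Falling back to the DX7-class fixed-function path\n");

    d3d->Release();
    return 0;
}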
 
Originally posted by dawdler
They said it was no image quality loss with the new Detonators (compared to the 51.75 crap) at first too, but at a closer inspection (they retested the entire thing at some page), they found out they had the WORST IQ cheat ever seen in the Nvidia drivers, far more 'optimised' than 51.75! :p
You just contradicted yourself. In one single sentence! Impressive!
If the optimisation wasn't noticeable at first, how can it possibly be the WORST IQ optimisation ever?
In other words, the last remaining IQ optimisation in the NVidia drivers is a very minor one, and you're just in denial. Yes, the 45.XX suffered from IQ loss, but NVidia fixed this in the 52.XX. Great news for FX owners and it's just pathetic that ATI owners keep bitching about it.

Sorry about the not so polite tone of this message but the amount of bullshit posted by ATI fanboys in this thread is ridiculous. What's the point of posting stupid low quality pictures, when you need a special graphics tool to detect the IQ differences between Detonator 52.XX and the Catalyst drivers? :rolleyes:
 
Originally posted by Arno
You just contradicted yourself. In one single sentence! Impressive!
If the optimisation wasn't noticeable at first, how can it possibly be the WORST IQ optimisation ever?
In other words, the last remaining IQ optimisation in the NVidia drivers is a very minor one, and you're just in denial. Yes, the 45.XX suffered from IQ loss, but NVidia fixed this in the 52.XX. Great news for FX owners and it's just pathetic that ATI owners keep bitching about it.

Sorry about the not so polite tone of this message but the amount of bullshit posted by ATI fanboys in this thread is ridiculous. What's the point of posting stupid low quality pictures, when you need a special graphics tool to detect the IQ differences between Detonator 52.XX and the Catalyst drivers? :rolleyes:

http://www.3dcenter.org/artikel/det....14/index_e.php

Read the article yourself and come to your own conclusions instead of flaming someone's opinion about something you didn't even read.

Your reaction was typical FanBoi'ish
 
Originally posted by Fenric1138
cheers Stryyder that one works :)

No problem. It is a good, balanced article: they praise NVIDIA for improving shader performance but trounce them for 'fake' trilinear filtering that can only be detected visibly in motion, making screenshot comparisons useless.

All in all, it seems that those of you who plopped down 400 bucks for an FX card should not be that disappointed, as NVIDIA seems to have once again pulled a rabbit out of the hat with the new drivers.
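For anyone wondering what 'fake' trilinear means in practice, here's a very rough C++ sketch of the idea as I understand it from the article (the 0.15 band width is a number I made up for the demo, not anything a real driver uses). Real trilinear blends the two nearest mip levels for every pixel; the reported optimisation only blends in a narrow band around each mip transition and uses plain bilinear elsewhere, which is why still screenshots look fine but the moving mip boundary can give it away.

#include <cstdio>
#include <cmath>

// Weight used to blend mip level floor(lod) with floor(lod) + 1.
float trilinear_weight(float lod) {
    return lod - std::floor(lod);                 // blend across the whole level
}

float brilinear_weight(float lod, float band = 0.15f) {
    float f = lod - std::floor(lod);
    if (f < 1.0f - band) return 0.0f;             // plain bilinear: nearest mip only
    return (f - (1.0f - band)) / band;            // blend only near the transition
}

int main() {
    std::printf("  lod   trilinear   'brilinear'\n");
    for (int i = 0; i <= 10; ++i) {
        float lod = 2.0f + 0.1f * i;              // walk from mip 2 towards mip 3
        std::printf("%5.1f   %9.2f   %11.2f\n",
                    lod, trilinear_weight(lod), brilinear_weight(lod));
    }
    return 0;
}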
 
Originally posted by Stryyder
No problem. It is a good, balanced article: they praise NVIDIA for improving shader performance but trounce them for 'fake' trilinear filtering that can only be detected visibly in motion, making screenshot comparisons useless.

All in all, it seems that those of you who plopped down 400 bucks for an FX card should not be that disappointed, as NVIDIA seems to have once again pulled a rabbit out of the hat with the new drivers.

At least we've got something of a fair choice now (or will have by the time HL2 is out). I'm hoping the prices come down; if both cards do well then it's just a case of picking the cheaper of the two. I'm not a fanboy of either company, I'll go with whatever can do the job. Though nVidia have better OpenGL support and I need that in other programs, so it could be good news all round for me, woohoo :)
 
What you guys don't understand is that DX9 WAS MADE THE ATI WAY.
So, one could say that ATI is cheating. Now don't tell me Nvidia is cheating because now they will have their own stuff in DX9.1.
 
ATI DOES NOT HAVE BETTER HARDWARE, THEY HAVE BETTER DX SUPPORT.

Nvidia cards are technically faster than ATI's anyway.
 
Originally posted by Stryyder
http://www.3dcenter.org/artikel/det....14/index_e.php

Read the article yourself and come to your own conclusions instead of flaming someone's opinion about something you didn't even read.

Your reaction was typical FanBoi'ish
Are you kidding? I read that article several days ago, but you didn't even look at it. There's not a single actual game screenshot in that article. They had to use a special tool to show the differences in image quality, otherwise it wouldn't be noticeable.
Here's what the article said:
Thus it can be stated that the determined "optimizations" of the Detonator 52.14 won’t be recognized with the view of screenshots, if you do not look for them explicitly

Geez.... some people only read what they want to read and ignore the rest.
 
Originally posted by doa_master
What you guys don't understand is that DX9 WAS MADE THE ATI WAY.
So, one could say that ATI is cheating. Now don't tell me Nvidia is cheating because now they will have their own stuff in DX9.1.

NVIDIA chose not to participate in DX9 development.

ATI participated and recommended 24-bit precision (along with other hardware developers) for Pixel Shader 2.0.

MS made 24-bit floating point part of the DX9.0 Pixel Shader 2.0 spec.

NVIDIA chose not to follow and did not redesign their hardware; instead they kept the same principles as the Ti line of chips and implemented either 16-bit or 32-bit precision.

ATI moved to a completely different type of core architecture and built in 24-bit precision.

ATI is fully DX9 compliant; NVIDIA is not, either hoping that by the time DX9 games were released they would be onto their next chip, or hoping the raw power of the chip would overcome the deficiencies of the architecture. They gambled and seem to have lost, at least the PR war.

DX9.1 will be using a new pixel shader model, Pixel Shader 3.0.

Neither ATI nor NVIDIA will have a card that supports Pixel Shader 3.0 for at least two more product cycles.

Basically, DX9.1 will mean nothing in the graphics card wars until at least the end of 2004, when cards that support Pixel and Vertex Shader 3.0 are released.

Check out this article for more info: http://www.theinquirer.net/?article=5329
 
Originally posted by Stryyder
NVIDIA chose not to participate in DX9 development.

ATI participated and recommended 24-bit precision (along with other hardware developers) for Pixel Shader 2.0.

MS made 24-bit floating point part of the DX9.0 Pixel Shader 2.0 spec.

NVIDIA chose not to follow and did not redesign their hardware; instead they kept the same principles as the Ti line of chips and implemented either 16-bit or 32-bit precision.

ATI moved to a completely different type of core architecture and built in 24-bit precision.

ATI is fully DX9 compliant; NVIDIA is not, either hoping that by the time DX9 games were released they would be onto their next chip, or hoping the raw power of the chip would overcome the deficiencies of the architecture. They gambled and seem to have lost, at least the PR war.

DX9.1 will be using a new pixel shader model, Pixel Shader 3.0.

Neither ATI nor NVIDIA will have a card that supports Pixel Shader 3.0 for at least two more product cycles.

Basically, DX9.1 will mean nothing in the graphics card wars until at least the end of 2004, when cards that support Pixel and Vertex Shader 3.0 are released.

Check out this article for more info: http://www.theinquirer.net/?article=5329

Out of interest, would there be anything stopping someone writing a Pixel Shader 3.0 shader for their HL2 mods, or won't the Source engine handle it at all? Valve commented on how easy it will be to upgrade and keep up with new gear and possibilities, but will it stretch to things like that, or will Valve be forced to release a new version to cope with it? The reason I ask is that I'm sure some people will begin to make use of the new shaders after learning the current DX9 ones, and if HL2 is capable of making use of them, then the nVidia route would in theory seem like the best choice.

Also, could someone point me to a URL that goes into detail on what DX9.1 is capable of, particularly the shaders? Or isn't that information currently publicly available?

Cheers
 
Originally posted by doa_master
ATI DOES NOT HAVE BETTER HARDWARE, THEY HAVE BETTER DX SUPPORT.

Nvidia cards are technically faster than ATI's anyway.

Intel chips are technically a lot faster than AMD chips, 3200MHz compared to 2200MHz for their top-of-the-line offerings, but AMD seems to hold its own against Intel in just about every benchmark. You're a total layman newb if you're still comparing computer products based on megahertz alone.
 
I might be mistaken but I'm fairly sure that Intel have overtaken AMD again with the new P4's. But AMD still have it on price...
 
Originally posted by jonbob
I might be mistaken but I'm fairly sure that Intel have overtaken AMD again with the new P4's. But AMD still have it on price...

That's true, but notice I said "hold its own" and not "always owns". If you compare the top-of-the-line released chips, the AtholonFX and the P4 EE, they are about half and half across all benchmarks.
 
I couldn't begin to count the number of people that constantly misspell Athlon.

It's Athlon... not Atholon... not Athalon... not Athlong... not Athlone.

Athlon.
 