DirectX 9.1 soon, GFFX range on top form

mrk

http://www.inpact-hardware.com/actu/news/11346.htm

Translation of the main info: the upcoming DirectX 9.1 will add better support for PS2.0 on the GeForce FX in 32-bit precision!
60% better... good news...


MS and NVidia working together to finally solve the DX9 pixel shader issues, great news. Most people now have a reason to keep their FX cards, or to buy FX now.

I guess this also means the recent HL2 comparisons caning the FX range compared to ATI's top-end cards are all irrelevant until the new DX is out, and as HL2 is pushed back to the "holiday season" (who knows), it makes sense to wait before choosing the next gfx card perhaps
 
that's cool... this will make people who bought the 5900 breathe easier :)
 
yay... I have a GF4 Ti 4600... don't know if I ever had a problem, but woo anyway
 
how's this gonna play out for the Radeon 9800 128 ??
 
Originally posted by Turin
so when is 9.1 due to be released?

I would assume, because of the recent HL2 articles regarding performance, that it is due out around the same time as HL2, and the delay gives them more time. Or was the delay related to this in any way????

oooh controversy!
 
Originally posted by mrk
I would assume, because of the recent HL2 articles regarding performance, that it is due out around the same time as HL2, and the delay gives them more time. Or was the delay related to this in any way????

oooh controversy!


You're thinking a little too hard here... these guys don't all necessarily collaborate on every release date.


Microsoft isn't going to delay because of HL2, and vice versa... not when it is something that could be patched in later.
 
Contrary to popular belief, DirectX specs are NOT created by Microsoft. Microsoft solicits specifications from hardware vendors (ATI and Nvidia), and then approves them. In the case of DX9, MS _tried_ to enforce certain specifications for PS2.0, but ATI said 'screw that, we'll do it our own way', and forced Microsoft to release the ATI spec as "DX9.0". DX9.1 is Nvidia's version of the DX9 spec.

Neither ATI nor Nvidia followed Microsoft's guidelines for DX9, they both made their own decisions on the precisions for PS2.0. ATI got their spec published as 9.0, now Nvidia is finally getting their spec published in 9.1.

I don't know the specifics of why Nvidia dropped the ball on getting their PS2.0 spec into DX, but it's a lot more complicated than "ATI followed DX9 and Nvidia didn't". DX9.0 is ATI's own spec, they 'followed' it by definition.
 
I'm so glad I waited to buy a vid card. I came so close to ordering a 9800 Pro. Now I can wait for the 9800 Pro price to drop (because of the XTs) and I can see if Nvidia will be a good choice again.

All those guys who upgraded just for HL2 before Sep 30th are gonna be even more pissed now... heh
 
Originally posted by L33731
how's this gonna play out for the Radeon 9800 128 ??

You don't have to worry, the Radeon series does DX9 better from a hardware point of view.
 
Guys, none of this changes the fact that Nvidia still has HALF the pixel shader pipelines that ATI has. For one, I can't read French so I don't really know what was said in the article. The thing is that Nvidia still has half the hardware needed for true DX9 performance. All this probably is, is a workaround for Nvidia. I hope this helps GeForce owners, and when the drivers and DX9.1 come out and they do benchmarks, hopefully there will be no cheating. But we ATI owners have nothing to worry about, methinks.
 
Originally posted by Maskirovka
I'm so glad I waited to buy a vid card. I came so close to ordering a 9800 Pro. Now I can wait for the 9800 Pro price to drop (because of the XTs) and I can see if Nvidia will be a good choice again.

All those guys who upgraded just for HL2 before Sep 30th are gonna be even more pissed now... heh

Well, I'm not pissed... I've been playing games at extremely high res with beautiful framerates and anti-aliasing and stuff for over 2 months now :cheese: I couldn't be happier :)

And well, I don't think a Radeon 9800 Pro is going to be performance junk within the next year or so.
 
What I would think would be possible is to play DX9 (HL2)/DX9.1 games in full precision, without the performance hit they have now, using future Nvidia hardware (DX9.1 compatible). Future hardware may or may not have the same pipeline spec, and we are not sure how ATI will use DX9.1 or what their specs are for their DX9.1 gfx cards (Loki).
But the 5600/5900 series (DX9.0) is still screwed :)
 
Maskirovka, that's one reason why I'm also a bit happy about the delay.
I would have bought a new card just for HL2. So I will be able to save 100 bucks or get an upgraded version for the same price.

I currently have a GeForce3 and was set on an ATI 9800 Pro.
Now this looks interesting. New round in spring 2004...
 
well..

Sorry to, like, burn you, but doesn't HL2 use 24-bit precision? Which means if you go to 32-bit precision, you WILL have a performance hit, and it will NOT make it look or function better.

Basically it seems HL2 is practically designed for current-generation ATi cards. It might not be deliberate... but I don't think there will be any major improvements for FX owners, certainly nothing on the order of 60%.

Maybe 10% in the next Nvidia driver release.

The new DirectX spec will let FX cards function OK in games that use 32-bit precision, but that's a helluva lot of precision... I don't know whether it will really be necessary for a long time. I don't know if 24-bit is even necessary... but obviously ATi thought so.


I just had an idea: what if, like me, Nvidia didn't see the need for better precision, so they went with a 32/16-bit system, which could do both very efficiently? ATi cards calculate everything in 24-bit I think, which means they lose performance on 16-bit precision applications (most 3D games right now... I think).


Anyway, I'm not really sure about this stuff so feel free to correct me if I'm wrong =/
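
To make those precision numbers a bit more concrete, here's a tiny C sketch of my own (just an illustration, nothing from the article; the mantissa widths of roughly 10 bits for FP16, 16 for ATI's FP24 and 23 for FP32 are my assumption from the published format descriptions). You can fake each format by rounding a value to that many mantissa bits and seeing how far off it lands:

/* Toy illustration only: mimic 16/24/32-bit float precision by rounding
   a value to the corresponding number of mantissa bits. */
#include <math.h>
#include <stdio.h>

/* Round x to 'bits' mantissa bits, imitating a lower-precision float format. */
static double quantize(double x, int bits)
{
    int e;
    double m = frexp(x, &e);          /* x = m * 2^e with 0.5 <= |m| < 1 */
    double scale = ldexp(1.0, bits);  /* 2^bits */
    return ldexp(round(m * scale) / scale, e);
}

int main(void)
{
    double x = 1.0 / 3.0;             /* an arbitrary shader-style value */
    printf("fp16-ish: %.9f (error %.1e)\n", quantize(x, 10), fabs(quantize(x, 10) - x));
    printf("fp24-ish: %.9f (error %.1e)\n", quantize(x, 16), fabs(quantize(x, 16) - x));
    printf("fp32-ish: %.9f (error %.1e)\n", quantize(x, 23), fabs(quantize(x, 23) - x));
    return 0;
}

Compile with gcc -std=c99 file.c -lm and the error drops by a couple of orders of magnitude at each step, which is basically the whole 16 vs 24 vs 32-bit argument in a nutshell.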
 
I dunno, 'cause Valve and id have both issued statements about how much the GeForce FX series sucks for DX9 apps. Nobody in the know has said "Oh! Wait till we optimize for DirectX 9.1!" Not even Nvidia has said that.

A 60% performance increase would be great, but remember that the Radeons were already way over twice as fast. I think you can still expect FX owners to be playing HL2 in DX8.
 
You have the right idea, but HL2 doesn't only use 24-bit precision.
Nvidia cards run things, like you said, at 16/32-bit while ATI's current series is 24-bit.
The 5600/5900 Nvidia cards are the ones that will not benefit and will stay as they are in performance.
HL2 uses full precision (which differs between those cards) or mixed mode (Nvidia's special path to increase performance by mixing 16/32-bit instead of full 32-bit precision).
So when HL2 is run in full precision, it matters what card you have, since that determines what precision HL2 will run at on your PC.
It may be possible for Nvidia to use their DX9.1 hardware to gain performance at 32-bit precision in DX9 games, although it may not. Whether Valve would have to make some code adjustments for that to happen, I'm not sure.
They may be totally screwed with anything DX9.
They do well with DX8.1, and they may now do well with DX9.1.
Guess we will just have to see.
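
To show what that mixed mode trade-off amounts to in practice, here's another toy C sketch of my own (same mantissa-rounding trick as the earlier post, not actual HL2 or driver code): do the colour math at FP16-style precision and compare it against the full-precision result.

/* Toy "mixed mode" sketch: colour blending done at FP16-style precision,
   then compared against the full-precision result. Not real HL2/driver code. */
#include <math.h>
#include <stdio.h>

static double quantize(double x, int bits)   /* round to 'bits' mantissa bits */
{
    int e;
    double m = frexp(x, &e);
    double s = ldexp(1.0, bits);
    return ldexp(round(m * s) / s, e);
}

int main(void)
{
    double base = 0.8037, light = 0.4123;   /* made-up colour inputs */
    double full  = base * light;            /* full-precision modulate */
    double mixed = quantize(quantize(base, 10) * quantize(light, 10), 10);
    printf("full precision: %.7f\n", full);
    printf("mixed mode:     %.7f (diff %.1e)\n", mixed, fabs(mixed - full));
    return 0;
}

The difference comes out well below 1/256, i.e. smaller than one step of an 8-bit colour channel, which is why dropping colour math to 16-bit is usually invisible while the speed-up on hardware with fast FP16 is real.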
 
Originally posted by dscowboy
Contrary to popular belief, DirectX specs are NOT created by Microsoft. Microsoft solicits specifications from hardware vendors (ATI and Nvidia), and then approves them. In the case of DX9, MS _tried_ to enforce certain specifications for PS2.0, but ATI said 'screw that, we'll do it our own way', and forced Microsoft to release the ATI spec as "DX9.0". DX9.1 is Nvidia's version of the DX9 spec.

Neither ATI nor Nvidia followed Microsoft's guidelines for DX9, they both made their own decisions on the precisions for PS2.0. ATI got their spec published as 9.0, now Nvidia is finally getting their spec published in 9.1.

I don't know the specifics of why Nvidia dropped the ball on getting their PS2.0 spec into DX, but it's a lot more complicated than "ATI followed DX9 and Nvidia didn't". DX9.0 is ATI's own spec, they 'followed' it by definition.

Ehh, you're almost right, you left a few things out. DirectX is a collaboration between Microsoft, ATi, and nVidia in an effort to create a standard language and hardware specification. In the early development of DX9, Microsoft and nVidia had a falling out because nVidia refused to let Microsoft hold certain patents on the chip design (which is understandable). So with that, nVidia was out of the loop as far as the development process for DX9 went.

So while nVidia was developing their DX9-compatible cards, they were kind of being left in the dark about what the DX9 specifications were going to be. Early on, before the split, DX9 was going to incorporate 32-bit precision, but later it was found that the gradient difference between 32-bit and 24-bit precision isn't noticeable to the human eye (not to say that there aren't benefits to 32-bit precision in shaders). The unnecessarily high precision of nVidia's cards is the main reason for their poor performance, along with their lack of 32-bit precision pipelines.

I have a feeling that DX9.1 will most likely incorporate a universal and much more efficient "mixed mode" for all DX9 titles, only without any degradation to IQ. It's good to hear nVidia has jumped back aboard the MS bandwagon though, they would have gone out like 3dfx if they hadn't.
 
Originally posted by iamironsam
Ehh, you're almost right, you left a few things out. DirectX is a collaboration between Microsoft, ATi, and nVidia in an effort to create a standard language and hardware specification. In the early development of DX9, Microsoft and nVidia had a falling out because nVidia refused to let Microsoft hold certain patents on the chip design (which is understandable). So with that, nVidia was out of the loop as far as the development process for DX9 went.

So while nVidia was developing their DX9-compatible cards, they were kind of being left in the dark about what the DX9 specifications were going to be. Early on, before the split, DX9 was going to incorporate 32-bit precision, but later it was found that the gradient difference between 32-bit and 24-bit precision isn't noticeable to the human eye (not to say that there aren't benefits to 32-bit precision in shaders). The unnecessarily high precision of nVidia's cards is the main reason for their poor performance, along with their lack of 32-bit precision pipelines.

I have a feeling that DX9.1 will most likely incorporate a universal and much more efficient "mixed mode" for all DX9 titles, only without any degradation to IQ. It's good to hear nVidia has jumped back aboard the MS bandwagon though, they would have gone out like 3dfx if they hadn't.

This is a perfect example of the big fish finally getting caught up in the "net". NVIDIA, being that big fish, have been using bully tactics for years now. This should be a lesson to them that they cannot just push whoever around whenever they want. NVIDIA fumbled the ball and then ATi picked that ball up and ran it in for a touchdown! If NVIDIA doesn't get their act together, ATi will kick a field goal as well. :cool:
 
Re: Re: DirectX 9.1 soon, GFFX range on top form

Originally posted by nsxownzme
Actually, you mean the recent HL2 comparisons are valid, and DX9.1 is irrelevant until it's out.

Hell, if ya think about it, until Valve "finishes" it's all irrelevant. It'd be kind of funny (or not) if whatever they may be doing to the code now changes ATI's "advantage". JUST SPECULATING, NOT BASING ANYTHING ON FACT.
 
It's possible that FX cards take a big performance hit from using non-32-bit shaders... just like how 32-bit processors take a performance hit when the things they deal with aren't multiples of 32 bits in length.

In that case, converting everything to 32-bit would provide a big performance gain.
 
Nvidia's hardware just can't do pixel shaders as well as ATI's... 75% of their performance gains are through loss in visual quality... lowered precision... "optimising" drivers.

So basically DX9.1 will be full of official cheats for Nvidia...
 
OK, Nvidia is OUT of DX co-op with MSoft...

Nvidia doesn't use PS 2.0; they don't use anything that utilizes anything over DX8, because their shitty cards can't handle it, because they were overhyped so every nub bought one...

DX can't save Nvidia's FX line
 
Originally posted by crabcakes66
Nvidia's hardware just can't do pixel shaders as well as ATI's... 75% of their performance gains are through loss in visual quality... lowered precision... "optimising" drivers.

So basically DX9.1 will be full of official cheats for Nvidia...

The same thing came to my mind when I heard of that mysterious "increase" in performance. How can you gain 60% more performance when your hardware is not good enough? Doesn't make sense to me. If ATI cards were 60% faster in DX9.1 too, it would make sense, because they can handle PS much better than the FX.
 
Originally posted by iamironsam
Ehh, you're almost right, you left a few things out. DirectX is a collaboration between Microsoft, ATi, and nVidia in an effort to create a standard language and hardware specification. In the early development of DX9, Microsoft and nVidia had a falling out because nVidia refused to let Microsoft hold certain patents on the chip design (which is understandable). So with that, nVidia was out of the loop as far as the development process for DX9 went.

So while nVidia was developing their DX9-compatible cards, they were kind of being left in the dark about what the DX9 specifications were going to be. Early on, before the split, DX9 was going to incorporate 32-bit precision, but later it was found that the gradient difference between 32-bit and 24-bit precision isn't noticeable to the human eye (not to say that there aren't benefits to 32-bit precision in shaders). The unnecessarily high precision of nVidia's cards is the main reason for their poor performance, along with their lack of 32-bit precision pipelines.

Thank you for clarifying. :cheers:
 
Yeah, it's either heavily Nvidia-optimized or they actually improved the API universally.
 
Originally posted by FreeYayo
OK, Nvidia is OUT of DX co-op with MSoft...

Nvidia doesn't use PS 2.0; they don't use anything that utilizes anything over DX8, because their shitty cards can't handle it, because they were overhyped so every nub bought one...

DX can't save Nvidia's FX line
That was posted on the Gearbox forum. I never have believed it, and I don't think I will till I have a press release from M$ on my screen; you don't take out the biggest player in 3D graphics (NV is still bigger than ATI) because of a bad set of cards... Meh, I could definitely see this helping Nvidia a lot. Why? Because when you change something inside of DX it should make everything loads faster; trying to cheat in drivers is completely different because that's on top of DX.
 
Originally posted by FreeYayo
OK, Nvidia is OUT of DX co-op with MSoft...

Nvidia doesn't use PS 2.0; they don't use anything that utilizes anything over DX8, because their shitty cards can't handle it, because they were overhyped so every nub bought one...

DX can't save Nvidia's FX line
This is untrue... obviously.
Nvidia does use PS2.0 and DX9.0, and they will use DX9.1 just like ATI will. Their performance under these specs with the FX line does take a major hit though, which is why everyone is pro-ATI right now: that, and the fact that Nvidia takes away quality just to perform better.
 
If this is true then it's very good news. But I can't read French and NVidia's website doesn't mention this, so I have my doubts.

Developers always use the latest DirectX Software Development Kit. So once the new DirectX 9.1 SDK is released, everyone will use that and all the new games will be DirectX 9.1 compatible. No doubt Half-Life 2 as well. DirectX 9.1 will be backwards compatible with DirectX 9.0, so bringing HL2 up to DirectX 9.1 will be no problem at all.
I just hope the information in the article is true.

Originally posted by crabcakes66
Nvidia's hardware just can't do pixel shaders as well as ATI's... 75% of their performance gains are through loss in visual quality... lowered precision... "optimising" drivers.

So basically DX9.1 will be full of official cheats for Nvidia...
75%? LOL... did you just make that number up? Is there any website that said "75%"? I don't think so.
Recent reports on hardware sites show that the NVidia 52.XX drivers give good image quality. Any questionable optimisations are only noticeable when examining screenshots very carefully. Supposedly these new drivers will be officially released in a week.
 
Yeah, I have a 5900 Ultra and was scared until I tried the new leaked Detonators, which GREATLY improved my performance, but once these puppies are officially released, I'll do EVEN BETTER!
 
The Detonator 52.xx drivers use "compiler" technology to translate DX9 code into a proprietary language that nVidia cards can digest much more easily. It's more like translating, but nVidia claims it makes DX9 more efficient (though only for their cards), so they consider it compiling. I'm not sure if this "compiling" is done in real time or during load-up. So far there doesn't seem to be any loss in IQ, but I'll wait till the 52.xx's are officially released to make any judgement calls. It seems like a whole lot of trouble to go through; I bet nVid won't make that mistake again.
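
Nobody outside nVidia knows exactly what that compiler does, but just as a guess at the kind of thing it might attempt, here's a toy C sketch of my own (definitely not how the real Detonator 52.xx compiler works): walk the shader's instruction list and flag anything that only ever feeds colour math as eligible for cheaper 16-bit precision, leaving the rest at full precision.

/* Toy sketch of a driver-side shader "recompiler": my own guess at the concept,
   NOT how the real Detonator compiler works. It demotes instructions whose
   results only feed colour math to a cheaper 16-bit precision flag. */
#include <stdio.h>

enum prec { FULL32, PARTIAL16 };

struct instr {
    const char *op;        /* e.g. "texld", "mul", "add" */
    int feeds_colour_only; /* 1 if the result is only ever used for colour */
    enum prec precision;
};

static void recompile(struct instr *prog, int n)
{
    for (int i = 0; i < n; i++)
        if (prog[i].feeds_colour_only)
            prog[i].precision = PARTIAL16;  /* cheaper on hardware with fast FP16 */
}

int main(void)
{
    struct instr prog[] = {
        { "texld", 0, FULL32 },  /* texture fetch: keep full precision */
        { "mul",   1, FULL32 },  /* colour modulate: safe to demote */
        { "add",   1, FULL32 },  /* colour add: safe to demote */
    };
    int n = sizeof prog / sizeof prog[0];
    recompile(prog, n);
    for (int i = 0; i < n; i++)
        printf("%-6s -> %s\n", prog[i].op,
               prog[i].precision == PARTIAL16 ? "16-bit" : "32-bit");
    return 0;
}

Whether the real driver rewrites shaders like this at load time or per frame, I have no idea; the point is only that demoting or reordering instructions in the driver is plausible without the game itself changing.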
 
32-bit precision is the problem with nVidia cards, not one of their strengths. It's an advertised feature, but only on paper, because when you try to run a shader using 32-bit precision the card can't run it at a proper speed. DX9.1 is not going to make the card run faster; it's just going to set 32-bit precision as the standard instead of the current 24-bit...

FX are flawed!
 
Yea - before you all go Nvidia nutz, keep in mind that the ATI Radeon 9500 and higher will still be better than the current FX lineup with DX9.1, although it's good to hear Nvidia owners might finally get some help. DirectX 9.1 can't add 4 more pipelines to the FX cards...

When is DX9.1 scheduled to be released publicly? Before or after April 2004?
 
Also think about it this way: ATI cards can use 16/24 or 32-bit precision BUT they default to the fastest mode, which is 24-bit precision. Now, if NV and MS are optimising the DirectX 9.1 API for 32-bit precision under the Nvidia card architecture, then this is surely a good thing, and coupled with the proven quality and performance of the beta 52.xx drivers so far, it can only get better when the official releases emerge.

Competition is good; it not only drives prices down but also means faster advances in gfx hardware technology and uptake of the full range of DirectX features, and everyone stays happy
 
Originally posted by Netherscourge
Yea - before you all go Nvidia nutz, keep in mind that the ATI Radeon 9500 and higher will still be better than the current FX lineup with DX9.1, although it's good to hear Nvidia owners might finally get some help. DirectX 9.1 can't add 4 more pipelines to the FX cards...

With both using the latest drivers, the 9500 is only better than the 5600 Ultra range, not the 5800/5900 Ultras
 
It was just easier for nVidia to take their existing 16-bit design and essentially double it. The layout of nVidia's DX8 and DX9 cores is pretty similar, way more so than ATi's. ATi's DX9 solution was a complete departure from their old design. That's why the 9500 Pros were faster than the 9600 Pros in DX8 titles. That was the trade-off for going from full 16-bit to full 24-bit precision: speed.
 