DirectX 9.1 soon, GFFX range on top form

Originally posted by mrk
also think about it this way: ATI cards can use 16, 24 or 32 bit precision BUT they default to the fastest mode, which is 24 bit precision. Now if NV and MS are optimising the DirectX 9.1 API for 32 bit precision around nvidia's card architecture then this is surely a good thing, and coupled with the proven quality and performance of the beta 52.xx drivers so far, it can only get better when the official releases emerge.

Competition is good: it not only drives prices down but also means faster advances in gfx hardware technology and uptake of the full range of DirectX features, and everyone stays happy

Agreed - ATI and Nvidia cards could be running HL2 evenly, or at least close.

I think ATI's cards will be a tad better though - considering they have 8 pipes and the FX cards have 4. But it won't be noticeably better, so everyone will win!

Now if Valve would just get that damn game out to us...
 
Well I don't care who agrees. Nvidia had plenty of chances to fix their performance vs image quality trade-off, but they couldn't. I can't believe they can do it with 9.1; it's too late for them. Wait for the near future: NV40 vs R420.
 
Originally posted by G0rgon
Well I don't care who agrees. Nvidia had plenty of chances to fix their performance vs image quality trade-off, but they couldn't. I can't believe they can do it with 9.1; it's too late for them. Wait for the near future: NV40 vs R420.

a bit quick to jump to conclusions aren't we there?

Here, have this
 
Originally posted by mrk
also think about it this way: ATI cards can use 16, 24 or 32 bit precision BUT they default to the fastest mode, which is 24 bit precision. Now if NV and MS are optimising the DirectX 9.1 API for 32 bit precision around nvidia's card architecture then this is surely a good thing, and coupled with the proven quality and performance of the beta 52.xx drivers so far, it can only get better when the official releases emerge.

Competition is good: it not only drives prices down but also means faster advances in gfx hardware technology and uptake of the full range of DirectX features, and everyone stays happy

Actually ATi's DX9 line is only natively capable of 24 bit precision. 16 bit floats are actually turned into 24 bit floats by tacking eight zeros onto the end of the number, and 32 bit floats are essentially split into two 24 bit floats, taking twice as long to process (an extremely simplified version of what actually happens).

The reason the DX development committee decided against 32 bit is that it wasn't necessary to achieve the kind of effects they were going for in DX9. 24 bits were more than enough for everything they wanted to do, and are much quicker to process. The difference between 24 bit and 32 bit precision is extremely minimal in effect, but extremely taxing in speed. You have to take into consideration that each extra bit tacked onto the end of a floating point number is worth half as much as the one before it, so it gets to the point where humans aren't able to perceive the difference in image quality.

Once shaders become more complex, we'll probably see 32 and 64 bit precision, but by that time, today's cards will be chugging along at 5fps anyway.
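
To put some very rough numbers on that, here's a back-of-the-envelope sketch. The mantissa widths are the commonly quoted ones for the 16, 24 and 32 bit shader formats, and the display comparison assumes an ordinary 8-bit-per-channel monitor, so treat it as an illustration rather than anything official:

[code]
# Rough illustration: relative step size (precision) of the three shader float formats.
# Assumed mantissa widths: FP16 = 10 bits, FP24 = 16 bits, FP32 = 23 bits.
formats = {
    "FP16 (NV partial precision)": 10,
    "FP24 (ATi / DX9 baseline)": 16,
    "FP32 (NV full precision)": 23,
}

display_step = 1.0 / 256  # smallest colour step an 8-bit-per-channel display can show

for name, mantissa_bits in formats.items():
    step = 2.0 ** -mantissa_bits  # each extra mantissa bit halves this
    print(f"{name:28s} relative step ~ {step:.1e} "
          f"({step / display_step:.6f} of one display step)")
[/code]

Both 24 and 32 bit come out thousands of times finer than anything the monitor can actually show for a single operation; the gap between them only starts to matter once rounding errors pile up over long shader programs.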
 
Originally posted by mrk
a bit quick to jump to conclusions aren't we there?

Here, have this

ouch.

Wait a second man, this is a fact: every card from the GeForce FX 5800 up to the latest one has been tested with HL2, and they all performed well only under DX8.x.

:dork:
 
Ah, never knew that, thanks for pointing it out :)
 
Originally posted by G0rgon
ouch.

Wait a second man, this is a fact: every card from the GeForce FX 5800 up to the latest one has been tested with HL2, and they all performed well only under DX8.x.

:dork:

What the hell man! You fail to understand they were being tested with BETA drivers on a BETA build of HL2. The game's not even finished yet; if it was, we would be playing it right now, ISN'T THAT RIGHT GABE?
 
Drivers weren't beta :) That's why they used the det. 45 drivers and not the det. 50 beta drivers.
 
Originally posted by iamironsam
Actually ATi's DX9 line is only natively capable of 24 bit precision. 16 bit floats are actually turned into 24 bit floats by tacking eight zeros onto the end of the number, and 32 bit floats are essentially split into two 24 bit floats, taking twice as long to process (an extremely simplified version of what actually happens).

The reason the DX development committee decided against 32 bit is that it wasn't necessary to achieve the kind of effects they were going for in DX9. 24 bits were more than enough for everything they wanted to do, and are much quicker to process. The difference between 24 bit and 32 bit precision is extremely minimal in effect, but extremely taxing in speed. You have to take into consideration that each extra bit tacked onto the end of a floating point number is worth half as much as the one before it, so it gets to the point where humans aren't able to perceive the difference in image quality.

Once shaders become more complex, we'll probably see 32 and 64 bit precision, but by that time, today's cards will be chugging along at 5fps anyway.

So would you suggest sticking with buying an ATI 9800 Pro or 9800 XT? Or would you wait to see what nVidia come up with, or are they best avoided now, at least for the time being anyway?
 
Originally posted by Asus
Drivers weren't beta :) That's why they used the det. 45 drivers and not the det. 50 beta drivers.

Exactly:cheers:
 
Weeell, I was wrong on that part then! I wonder what they would have got had they used the 52.13+ drivers, because those are the ones with the new code optimised to run PS2.0 and DX9 apps. Of course, we shall wait for the official release!
 
Originally posted by Fenric1138
So would you suggest sticking with buying an ATI 9800 Pro or 9800 XT? Or would you wait to see what nVidia come up with, or are they best avoided now, at least for the time being anyway?

If you're upgrading for HL2, wait for it to come out first and see what's out there.

If you're upgrading because you honestly need a new card right now, go with an ATi from the 9600 or 9800 series (except for the 9800SE), depending on your financial situation. I'm not gonna tell you the 9800XT isn't worth it and that you should just go with the 9800pro. It's a minimal difference, but if you want the best, go with the XT.

I've got a 9800pro and I love it, though I'm not an ATi fanboy by any means. I just went with the best after doing a ton of research. nVidia just isn't on their A game these days, and I don't know if the det 52.xx or DX9.1 are gonna change that.
 
Originally posted by Asus
Drivers weren't beta :) That's why they used the det. 45 drivers and not the det. 50 beta drivers.
But no owner of an NVidia card will use the old det. 45 by the time HL2 is released, so the benchmark results are inaccurate.
 
By the way, AA and AF still tax the crap out of nVidias, even with the 52.13s. Their cards aren't built to process them very well, and software isn't going to change that.
 
Originally posted by Arno
But no owner of an NVidia card will use the old det. 45 by the time HL2 is released, so the benchmark results are inaccurate.

I totally agree, those benchmarks were very premature. Even if they thought they could get the game out by 9/30 at the time, they should have held off till the game went gold. Shame on Valve and ATi for that obvious marketing ploy.
 
id did the same thing with Doom3 when the 5900U was released. Is it just me, or are graphics card and game companies getting really sleazy?
 
Yeah, agreed. NVidia had the opportunity to test and update their drivers prior to the Doom3 benchmark, while the ATI cards were forced to run with outdated drivers. That was a pretty unfair benchmark as well.
 
this is all pretty irrelevant...

1. hl2 isn't done
2. dx9.1 isn't done
3. new detonator drivers aren't out yet
4. new cards will almost certainly be out before hl2...if not shortly after.

if you wanna worry about what's coming, check this out:
http://www.xbitlabs.com/news/video/display/20030619023125.html

2x as fast as the 9800pro? and the 9800xt just came out for $500? how much will these suckers cost? i can't see those coming out before february or something...if something 2x as fast as the 9800pro came out in december, the 9800s would have to drop in price drastically.

either that or it'll cost like $800 :D
 
Originally posted by Arno
Yeah, agreed. NVidia had the opportunity to test and update their drivers prior to the Doom3 benchmark, while the ATI cards were forced to run with outdated drivers. That was a pretty unfair benchmark as well.

Not only that, but they were forced to run under a path specifically designed for nVidia cards, and only 128MB of the 9800 Pro's 256MB of RAM was used, while the 5900U used all 256MB. Doom3 may actually utilize that 256MB of VRAM.

Edit: Now that I think about it, Doom3 probably won't use all 256mb unless they start using higher quality textures.
 
Originally posted by Arno
Yeah, agreed. NVidia had the opportunity to test and update their drivers prior to the Doom3 benchmark, while the ATI cards were forced to run with outdated drivers. That was a pretty unfair benchmark as well.

benchmarks aren't fair or unfair unless you don't read them.

if you just look at a few graphs and don't read what they actually did, you're misinforming yourself.

so if you read the benchmark, realized ATI was getting screwed because of something in the way the benchmark was being performed, then you got the correct information...trust the numbers, but know improvements are on the way.

with ATI and nvidia constantly jockeying for top spot, there will be very few benchmarks ever done that don't have one company or the other slightly behind...and the one behind about to release new drivers or whatever...that's just how it works
 
Originally posted by Maskirovka
this is all pretty irrelevant...

1. hl2 isn't done
2. dx9.1 isn't done
3. new detonator drivers aren't out yet
4. new cards will almost certainly be out before hl2...if not shortly after.


Then just about everything we talk about in this forum is irrelevant, because...

1. hl2 isn't done
2. we can't talk about the prerelease

Originally posted by Maskirovka
2x as fast as the 9800pro? and the 9800xt just came out for $500? how much will these suckers cost? i can't see those coming out before february or something...if something 2x as fast as the 9800pro came out in december, the 9800s would have to drop in price drastically.

either that or it'll cost like $800 :D

Who said the XT was 2x as fast as the Pro? It's more like 5% faster, and that's with Overdrive. And contrary to popular belief, they are almost identical cores.
 
Loki (R420) = 2x 9800 Pro (R350)
9800 XT = R360


Also, reliable sources say that while compensating for the 4 missing pipes on NV hardware, DX9.1 will also propel pigs to Mars. A joint NASA-MS* statement is expected soon.

*Microsoft Corp. is NOT responsible for any flaws in the propulsion system and/or consequences of such flaw.
 
Originally posted by iamironsam
Not only that, but they were forced to run under a path specifically designed for nVidia cards, and only 128MB of the 9800 Pro's 256MB of RAM was used, while the 5900U used all 256MB. Doom3 may actually utilize that 256MB of VRAM.
Actually, Radeon cards can't run in Doom3's special NVidia paths, because of all the NVidia-specific code. So the Radeon cards were running on the regular OpenGL path.
 
I know HL2 will run pretty well on earlier graphics cards, so to the question of whether a GF4 will run it OK: yes, your machine will run it great, this is a fact, although you will miss the pixel shader 2.0 graphics. A Radeon or FX will be fine.

To the question about quality with nvidia drivers: the new drivers coming out (very shortly) will have excellent quality and will still increase performance considerably, not through cutbacks in IQ but by using a different shader routine for the FX, which should have been done when that card was released.

Actually FX cards do have 16 bit or 32 bit precision on shaders, and they also have 8 pipelines: nvidia = 4x2 = 8, and radeon is 8x1 = 8.

Radeon is faster in single texture mode and fx is faster on multitexturing.

But all in all, whatever system you've got, from a GF3 all the way up to an FX or Radeon, this game is gonna rock!!
 
Originally posted by voodoomachine
I know HL2 will run pretty well on earlier graphics cards, so to the question of whether a GF4 will run it OK: yes, your machine will run it great, this is a fact, although you will miss the pixel shader 2.0 graphics. A Radeon or FX will be fine.

To the question about quality with nvidia drivers: the new drivers coming out (very shortly) will have excellent quality and will still increase performance considerably, not through cutbacks in IQ but by using a different shader routine for the FX, which should have been done when that card was released.

Actually FX cards do have 16 bit or 32 bit precision on shaders, and they also have 8 pipelines: nvidia = 4x2 = 8, and radeon is 8x1 = 8.

Radeon is faster in single texture mode and fx is faster on multitexturing.

But all in all, whatever system you've got, from a GF3 all the way up to an FX or Radeon, this game is gonna rock!!


well said and I agree with all of that
 
FX have 4 pixel pipelines
9500-9800 have 8 pixel pipelines
the other number that you included is textures per pipeline.
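
Just to put rough numbers on the 4x2 vs 8x1 point (and the single vs multi-texturing comment above): theoretical fill rate is just pipes x textures per pipe x clock. The core clocks below are ballpark figures for a 5900 Ultra and a 9800 Pro, so take the exact outputs with a grain of salt:

[code]
# Back-of-the-envelope fill rates for the 4x2 vs 8x1 argument.
# Core clocks are ballpark figures (assumed), not spec-sheet quotes.
cards = {
    "GeForce FX 5900 Ultra (4x2)": {"pipes": 4, "tex_per_pipe": 2, "clock_mhz": 450},
    "Radeon 9800 Pro (8x1)":       {"pipes": 8, "tex_per_pipe": 1, "clock_mhz": 380},
}

for name, c in cards.items():
    pixel_fill = c["pipes"] * c["clock_mhz"]                      # Mpixels/s
    texel_fill = c["pipes"] * c["tex_per_pipe"] * c["clock_mhz"]  # Mtexels/s
    print(f"{name}: {pixel_fill} Mpixels/s, {texel_fill} Mtexels/s")
[/code]

Which lines up with what was said above: the Radeon pushes more single-textured pixels, while the FX pulls ahead on raw texel rate when it's laying down two textures per pass.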
 
From what I've read in reviews, in many cases it uses 4 pixel pipelines, but Nvidia did say that there are some cases where its chip can turn out 8 pixels per clock. Here is a quote:

"GeForce FX 5800 and 5800 Ultra run at 8 pixels per clock for all of the following:
a) z-rendering
b) stencil operations
c) texture operations
d) shader operations"


But my overall point was: this game is gonna be great on any machine with pixel shaders.

By that I mean not fixed-function.
 
All I am concerned about is how the different series of cards will affect my game of HL2.

I mean, I played Deus Ex on a Voodoo 2 (12MB RAM) and gamewise it was just as fun as when I replayed it later with a GeForce 2; apart from an FPS increase it wasn't all that improved graphically.

What I mean is: what will I actually miss visually when I play HL2 on a GF4 Ti 4200?
 
Originally posted by ASnogarD
What I mean is: what will I actually miss visually when I play HL2 on a GF4 Ti 4200?

You'll miss out on eye candy like refraction in water, HDR, reflection and refraction on shiny surfaces, etc.
Nothing too important, BUT...
 
Originally posted by The Grim Reaper
You'll miss out on eye candy like refraction in water, HDR, reflection and refraction on shiny surfaces, etc.
Nothing too important, BUT...

Refraction in water is DX8.1; it will be there on a GF4 Ti.
 
Nvidia drivers

Nvidia is releasing their new drivers (52.14) next month. These drivers will have 60% increased performance :cheers: With those drivers the GeForce FX 5900 beats the ATI 9800 Pro on almost all benchmarks.
 
Well this is a confusing time, I was all set to go ATI but now I'm not so sure

What may tip the balance is that there has been a lot of talk over the months that the ATI cards do struggle even with simple OpenGL. Not important for HL2, you say? True, but if you're planning on using Softimage or other apps that run in OGL mode for the most part, you may find problems with it, whereas nVidia cards perform much better with OGL. So the question is: will the nVidia cards suss out the DX9 issues and be the best choice for the mod makers out there? Or will people be forced to make a choice: good HL2 performance but crashes, slow response and other issues with the modding apps that use OGL, or plump for nVidia and make great stuff in properly working OGL, only to not see it at its best in the HL2 engine?

Anyone else miss the days of the Amiga where everything worked, everything was compatible and that was that :)
 
Ha! I just had a mental image of some sort of Supramiga powered by weapons-grade plutonium and running HL2... I still have Space Hulk for the old thing somewhere.

Um, anyway... I think this is a good sign - as said, things can only get better for nVidia, and as a result for the nVidia users in the HL2 sector. The rift in the gaming community caused by ATI/nVidia fanboyism can only get bigger, of course...

Now, what to get. I could stick with my MX 440, but somehow I doubt its potential with the Source engine.
 
The only OpenGL problem I have heard about is that ATI cards used to not be able to do AA in Half-Life if you ran it in OpenGL... but it worked fine in other OpenGL games/programs.

Are there any new problems that have popped up?
 