!! NV Drivers !!

I got my 9800 Pro for $340 US. I could have gotten it for around $310, but it was OEM and from some site I just couldn't trust, so I went retail at Circuit City. It was $60 below the regular price, so I figured what the hell. Hopefully I won't have to upgrade for a long time now.
 
What was the big difference between your 4200 and your 5900? I have a P4 2.4, 512 MB of RDRAM, and a 64 MB 4200. My friend has a P4 3.06, 2 GB of Corsair RAM, a 9700 AIW overclocked to 9800 speeds, and a whole bunch of other stuff which added up to a price tag of around $4,000 USD. He bought it within the last six months; I bought my computer last September for less than $2,000. The only major difference is our benchmark scores. There aren't too many visual differences. Sure, his fps are a lot higher than mine, but it's not noticeable. I get about 90 fps in RTCW: ET in multiplayer. He can get over 500, but that disconnects him, so he keeps it around 200+. We both run the same resolution of 1280 x 1024, and there's no noticeable difference except for the blurriness his screen has around the edges, since he has a flat panel and I have a Trinitron. So you and ShaithEatery had more or less the same config, then you bought a 5900. What was the difference?
 
How do I install these right... and which file do I download? That site confuses me... there are 52.16 and 52.12 drivers... which is for what card, and how do I download them? I have a GeForce FX 5600...

Please help...
 
Originally posted by Majin
How do I install these right... and which file do I download? That site confuses me... there are 52.16 and 52.12 drivers... which is for what card, and how do I download them? I have a GeForce FX 5600...

Please help...

You d/l the 52.16 if you're on Windows 2000/XP, I believe (that's the version NVIDIA's own driver page lists for XP/2K), and one of the 52.12 or 52.14 builds otherwise. When you install, make sure you remove your old drivers first, then under your hardware configuration you have to go in and manually select them from the folder you extracted the d/l to. After that your computer will install them, and then you have to restart for the changes to take effect.

edit*: you select the folder you extracted the d/l to under hardware configuration, not the actual files. Sorry if this is confusing; I'm terrible at explaining things.
 
Originally posted by Ridic
Actually, I heard something like a *2% or 2*% decrease with each new driver.
From who? A neighbour of a friend of your cousin's roommate said so? Please don't spread any "facts" when you don't have the links to back them up.

AnandTech.com
The new 52.14 drivers are much better than either the 51.xx or the 45.xx series. The image quality issues are corrected from 51.xx, and a lot of speed has been eked out over the 45.xx drivers.

AnandTech.com got corrected on that one, as 3DCenter found two remaining optimizations after some searching:
3DCenter.org
Thus it can be stated that the detected "optimizations" of the Detonator 52.14 won't be noticed just by looking at screenshots, unless you look for them explicitly.

Those optimizations only occur in DirectX. OpenGL renders 100% correctly as far as I know. What's more, other optimizations got fixed:
TomsHardware.com
The current NVIDIA v45.23 had issues with lighting and shadows in AquaMark3. This is fixed in the new v52.16 driver release.

Based on these reports, I conclude that the IQ problems with the 45.23 drivers are fixed in the 52.XX drivers, except for the two practically invisible optimizations found by 3DCenter.org. So overall there should be a definite increase in IQ. :cool:
 
HL1 - the crowbar becomes a super-fast weapon on dead bodies, everything else normal; the game runs very fast and very smoothly now
Yes, that has happened to me for as long as I can remember,

ever since my Kyro 32 MB, right up to my Radeon 9500 Pro now,

and it's only on some corpses.
 
Hey I just thought I'd weigh in with my experience of the 52.16 drivers.

Approximately 1/3 improvement in framerate all around in AquaMark3 (Total Average FPS from 31.82 to 42.45). Whether this is due to "hacks" or not...who knows... Image quality looks good to me. Lighting good.

Thanks for pointing them out.
 
Originally posted by Arno
From who? A neighbour of a friend of your cousin's roommate said so? Please don't spread any "facts" when you don't have the links to back them up.

AnandTech.com got corrected on that one, as 3DCenter found two remaining optimizations after some searching:

Those optimizations only occur in DirectX. OpenGL renders 100% correctly as far as I know. What's more, other optimizations got fixed:

Based on these reports, I conclude that the IQ problems with the 45.23 drivers are fixed in the 52.XX drivers, except for the two practically invisible optimizations found by 3DCenter.org. So overall there should be a definite increase in IQ. :cool:

There are still lighting issues with the new drivers. When running benchmarks like AquaMark3 and 3DMark, sometimes you actually lose whole lighting scenes and things like shadows as well... As soon as I find the page I was looking at, I will link it for everyone to see.
 
This is still my favourite quote ever :)

Unlike the Detonator 51.75, the Detonator 52.10 and 52.14 differ in that the faked trilinear filter is also used for texture stage 0 (which usually is the base texture). That's why both new Detonators are even more "optimized" than the 51.75!

This shows the obvious decline in IQ:
http://www.3dcenter.org/artikel/detonator_52.14/pic21.php
It's so obvious, one doesn't even have to be a tech guy to see it. One has smooth transitions; the other has sharp transitions. One even lowers quality by an entire mipmap level in the secondary layers (in common terms, textures will start to get blurry closer to the camera).

Is there even anything to discuss?
 
You're correct on one point: the mipmap does move. But there are no sharp transitions. This is not bilinear filtering; there are still smooth areas in between the mipmaps. The smooth areas are just less wide. Here's an example of REALLY sharp edges: ATI "trilinear" filtering.
And if Tom's Hardware and AnandTech are correct, and the lighting issues are indeed fixed, then on the whole the IQ would definitely increase.
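
To put some numbers on "less wide", here's a rough C++ sketch of the idea (the band width is a made-up value for illustration, not anything from NVIDIA's actual drivers). Full trilinear blends the two nearest mipmaps across the whole LOD fraction; the "optimized" filter falls back to plain bilinear outside a narrow band around the mipmap boundary, which is what makes the transition zone thinner:

#include <cstdio>

// Weight given to the next-smaller mipmap as a function of the fractional
// LOD f in [0,1]. Full trilinear blends across the whole interval.
double trilinearWeight(double f) {
    return f;
}

// "Brilinear": pure bilinear near f = 0 and f = 1, blending only inside a
// narrow band around the mipmap boundary. The 0.3 half-width is a
// hypothetical value, just for illustration.
double brilinearWeight(double f, double halfBand = 0.3) {
    double lo = 0.5 - halfBand, hi = 0.5 + halfBand;
    if (f <= lo) return 0.0;      // sample only the larger mip
    if (f >= hi) return 1.0;      // sample only the smaller mip
    return (f - lo) / (hi - lo);  // blend inside the band
}

int main() {
    for (double f = 0.0; f <= 1.0; f += 0.125)
        std::printf("f=%.3f  trilinear=%.3f  brilinear=%.3f\n",
                    f, trilinearWeight(f), brilinearWeight(f));
    return 0;
}

The narrower the band, the more pixels get the cheaper pure-bilinear path, and the closer the visible mipmap transition creeps toward what straight bilinear would show. That's the trade-off in those screenshots.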
 
dawdler, what am I looking for in those pictures? I know you know what you're talking about, but I can't see it... maybe it's because my LCD has a .285 dot pitch...
 
The effect is actually much more visible in the ATI screenshot I linked to. The NVIDIA screenshot still has some smoothness in between the mipmaps.
 
Here is ATI:
http://w1.855.telia.com/~u85528876/aflayer.jpg
16xAF, all texture layers identical, taken on my very own box.
You have to use the application setting, simple as that. The control panel override optimisation is a nasty one, true, and I don't really like it (it isn't a problem in most games, as many mostly use layer 0, but in some it can be a problem).

On NVIDIA you can't do this; what you see is what you get. BTW, no, the edges aren't sharp, that's very true. But they are A LOT sharpER than in previous drivers, no one can say differently unless they are blind. And it's a lot sharpER than ATI's. Though it's hard to compare them, since the AF patterns are completely different (I ain't tech-savvy enough to know why they chose to do it differently).
 
Originally posted by dawdler
Here is ATI:
http://w1.855.telia.com/~u85528876/aflayer.jpg
16xAF, all texture layers identical, taken on my very own box.
You have to use the application setting, simple as that. The control panel override optimisation is a nasty one, true, and I don't really like it (it isn't a problem in most games, as many mostly use layer 0, but in some it can be a problem).
That image looks really nice. As I understand it, overriding the optimized filter does come with a performance cost.
It's true that texture layer 0 is the most important. Layers 2 through 7 are usually used for lightmaps (quite common, but hardly noticeable) and detail textures (less common, but probably more noticeable).
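
For anyone wondering what those layers look like in practice, here's a minimal Direct3D 9 fixed-function sketch, assuming a device and textures created elsewhere (the stage assignment is just illustrative, not any particular game's setup). Layer 0 carries the base texture, and a lightmap is modulated on top:

#include <d3d9.h>

// Sketch: base texture on stage 0, lightmap multiplied in on stage 1.
// 'dev', 'baseTex' and 'lightmapTex' are assumed to exist already.
void setupTextureLayers(IDirect3DDevice9* dev,
                        IDirect3DTexture9* baseTex,
                        IDirect3DTexture9* lightmapTex)
{
    // Stage 0: the base texture, modulated with vertex diffuse lighting.
    dev->SetTexture(0, baseTex);
    dev->SetTextureStageState(0, D3DTSS_COLOROP,   D3DTOP_MODULATE);
    dev->SetTextureStageState(0, D3DTSS_COLORARG1, D3DTA_TEXTURE);
    dev->SetTextureStageState(0, D3DTSS_COLORARG2, D3DTA_DIFFUSE);

    // Stage 1: the lightmap, multiplied into the result of stage 0.
    // A lightmap is low-frequency, so cheaper filtering on this stage
    // is much harder to spot than on the base texture in stage 0.
    dev->SetTexture(1, lightmapTex);
    dev->SetTextureStageState(1, D3DTSS_COLOROP,   D3DTOP_MODULATE);
    dev->SetTextureStageState(1, D3DTSS_COLORARG1, D3DTA_TEXTURE);
    dev->SetTextureStageState(1, D3DTSS_COLORARG2, D3DTA_CURRENT);
}

Which is why an AF override that only fully filters stage 0 covers the most visible case, but can still bite in games that put detail textures on the upper stages.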

Originally posted by dawdler
On NVIDIA you can't do this; what you see is what you get. BTW, no, the edges aren't sharp, that's very true. But they are A LOT sharpER than in previous drivers, no one can say differently unless they are blind. And it's a lot sharpER than ATI's. Though it's hard to compare them, since the AF patterns are completely different (I ain't tech-savvy enough to know why they chose to do it differently).
I think we are in agreement here. :cheers:

I honestly don't know either why the AF patterns are different. I would like to know, though.
 
Hrm... new drivers. I downloaded and installed them, and for some reason I can't play a certain game (wink wink) right now. Might have to roll back the drivers... I could play this certain game with the 52.14s, though.
 
That's funny. I remember that after I installed Det 45.23, I suddenly couldn't play another certain alpha from a game that starts with a "D".
 
Originally posted by Arno
That's funny. I remember that after I installed Det 45.23, I suddenly couldn't play another certain alpha from a game that starts with a "D".

Exactly. That damn "D" game. I just get a grey flickering screen. 52.xx don't work either. Has anyone found a workaround for this "D" game alpha?
 
Originally posted by voodoomachine
There's gonna be problems with these drivers, since in fact they are beta!!

IQ for me looks nice; I get 2,000 extra points in AquaMark3.

Also, from what I've read, these drivers speed up the pixel and vertex shaders. Trust me, when HL2 and other games come out and the official 50.xx drivers come out, NVIDIA cards will be fine running PS 2.0.

You do realize AquaMark points represent your average frame rate, right? So that 2,000 points is really a mere 2 frames per second. LOL
 
If he went, for example, from 10 to 12 frames/second, that's quite impressive. But if it's something like from 38 to 40 frames/second, that's a meaningless increase. So it depends.
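
To make that concrete, here's a quick C++ calculation (assuming, as the posts above imply, that the AquaMark3 score is roughly average fps x 1000):

#include <cstdio>

// The same ~2,000-point AquaMark3 gain is about 2 fps either way,
// but the relative improvement depends entirely on the baseline.
int main() {
    double cases[][2] = { {10.0, 12.0}, {38.0, 40.0} }; // before/after avg fps
    for (auto& c : cases)
        std::printf("%.0f -> %.0f fps: roughly +%.0f points, a %.1f%% gain\n",
                    c[0], c[1], (c[1] - c[0]) * 1000.0,
                    (c[1] - c[0]) / c[0] * 100.0);
    return 0;
}

10 to 12 fps is a 20% gain; 38 to 40 fps is barely 5%.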
 
I wonder if it will make any difference to my aging GeForce4 MX 440.

The last betas made everything uber-blurry. In Half-Life, the distance before the ground texture goes blurry dropped to less than a quarter of what it used to be.
 
LOL.

I guess it's all OK to bash the drivers when you aren't on the side being bashed. I remember when the 8500 was released and ATI did stupid shit with the driver to boost FPS because there were some pretty big flaws in the 8500's design. All the Nvidia fans loved to slam ATI for that, and all the ATI fans hated it.

Now we've got the GFFX, which has the same kind of hardware flaws, and Nvidia is doing the same thing they complained about ATI doing with the 8500 back in the day.

The difference is that when ATI was caught doing the stupid shit, they said, "Yeah, we ****ed up. It won't happen again," and they fixed it. Nvidia gets caught, and they release a PR junket and pay off benchmark companies to say that what they're doing are legitimate "optimizations".

I have owned several video cards over the years. In fact, the first real 3D accelerator I owned was a Riva 128. I had a TNT and owned the GeForce, GeForce 3 Ti 500, and now a GeForce FX 5600. I also have a Radeon 9600 and a Matrox card in other machines. Nvidia's bullshit "The Way It's Meant To Be Played" campaign pissed me off a little, but I stuck with them. But now I am not buying another Nvidia card until they fix the drivers and stop screwing the IQ just so they can lose to the 9800 by 10 fps instead of 15. It should be the user's choice to trade off IQ for fps, not the choice of the company that made the card.

The GeForce FX is screwed at the hardware level. I'm sure Nvidia's engineers have since figured out where the problems are and will fix them with the next release. They can't fix the problems with new drivers alone, though.

Makes me wonder how Nvidia intends to go to the one-driver-per-year release model they were talking about at the Nvidia Shader Days.
 
The difference being that ATi was a runt compared to nVidia when they were caught cheating. nVidia ruined its reputation with a lot of former nVidia fans, including me, when it was caught.
 
Originally posted by Unnamed_Player
LOL.

I guess it's all OK to bash the drivers when you aren't on the side being bashed. I remember when the 8500 was released and ATI did stupid shit with the driver to boost FPS because there were some pretty big flaws in the 8500's design. All the Nvidia fans loved to slam ATI for that, and all the ATI fans hated it.
Uh, the 8500 had no flaws; it was a great card for its time. The cheat you're talking about is "Quack": they made it so Quake 3 would perform better, but as soon as everyone was aware of it, ATI removed it right away. Everyone has been aware of Nvidia's cheating for well over a year, and Nvidia isn't trying one tiny bit to remove it. Hell no, in fact, they find NEW AND INTERESTING WAYS to hide cheats, or create NEW ones.
 
Originally posted by shapeshifter
Uh, the 8500 had no flaws; it was a great card for its time. The cheat you're talking about is "Quack": they made it so Quake 3 would perform better, but as soon as everyone was aware of it, ATI removed it right away. Everyone has been aware of Nvidia's cheating for well over a year, and Nvidia isn't trying one tiny bit to remove it. Hell no, in fact, they find NEW AND INTERESTING WAYS to hide cheats, or create NEW ones.

Not to mention that ever since then, ATi has strived to improve performance across the board, not just in the popular games and benchmarks. ATi also hasn't given in to the whole "The Way It's Meant To Be Played" campaign; they just adhere to the established DirectX standards without requiring any special coding for optimization.
 
Originally posted by ShaithEatery
I'll wait for the official drivers to come out

Originally posted by Saltpeter
Same here. I'm not going to risk anything here...

Er.. The official 52.16 drivers ARE on Nvidia's driver page..

http://www.nvidia.com/object/winxp_2k_52.16

I can say with absolute certainty that these drivers will have the same IQ issues as the other 52.16 drivers floating around :)
 
Originally posted by ElFuhrer
Exactly. That damn "D" game. I just get a grey flickering screen. 52.xx don't work either. Has anyone found a workaround for this "D" game alpha?

Hmmm, I finally got the certain game that begins with an "H" to work. No major increase in fps. Still, I really hate the 5200, and I really can't wait to get my 9800 Pro in this machine; I get like 10-20 fps MAX with this crap GeForce. Haven't seen a noticeable jump in improvement since going from the 45.23s to the 51.75s, and those were unofficial drivers that were never supposed to be released. Didn't gain much in the way of 3DMark01 SE, 3DMark03, or AquaMark3 points either; less than a 100-point gain. I think I even lost points in 3DMark03. Another wonderful POS by NVIDIA. They need to get their shit together.
 
Originally posted by shapeshifter
Uh, the 8500 had no flaws; it was a great card for its time. The cheat you're talking about is "Quack": they made it so Quake 3 would perform better, but as soon as everyone was aware of it, ATI removed it right away. Everyone has been aware of Nvidia's cheating for well over a year, and Nvidia isn't trying one tiny bit to remove it. Hell no, in fact, they find NEW AND INTERESTING WAYS to hide cheats, or create NEW ones.

In a word: bullshit. I've gone over this a few times in these forums. The 8500 was an all-new design (from the ground up) for ATI. As such, it had a number of bad flaws that held it back from the levels ATI had wanted it to perform at. That's a normal issue with an all-new chip design, and ATI fixed the problems suffered by the 8500 in their next big card, the 9700. That's why the performance of the 9700 really blew everyone away. It wasn't just an incremental speed increase, as most people were used to seeing, but a wholesale jump to another level, all because the engineers had learned what problems the 8500 had and fixed them in the 9700.

Quake 3 wasn't the only benchmark that ATI cheated in either; it was just the first one brought to light, and, as I said (if you had bothered to read the whole post), they admitted it and fixed the issues promptly. Nvidia doesn't even try to hide the cheats anymore. They spin them as "optimizations" so they don't have to.

Anyway, the problem the GeForce FX is suffering from now is the same kind of problem the 8500 suffered. There are hardware flaws holding back its performance. It's totally normal, and I don't doubt that Nvidia will have something kick-ass speed-wise out for next spring. I won't be buying it, though. All the driver BS has put a bad taint on them that they need to wash off, and right now they don't seem to be all that interested in doing that.
 
Yeah, how dare nVidia release drivers that don't work well with an old, stolen, unoptimised, unfinished, cobbled-together collection of Half-Life 2 files. What must they have been thinking, those evil bastards!

/sarcasm obviously ;)
 
Yeah, these drivers won't get them out of hot water with the FX series.
Although I read an interview where they said the NV40's core is going to be more "short and wide" rather than "tall and narrow", which is more like ATI's design. They said something like that... lol
Hopefully the NV40 is a big improvement over the FX. And I'm not talking performance, but rather fixing all the flaws the FX has.
Although I like the added 2D performance and options these drivers give me (GeForce4 4200 128 MB).
 
Speaking of which: http://www.tomshardware.com/graphic/20031023/index.html

It's not the NV40; it's the fall refresh part, the NV38 they're calling it. I like how they can take the same design, add faster memory, overclock it, and call it a new product. I see they went back to the leaf blower.

I would prefer it if graphics companies (particularly ATI and Nvidia) would just do a new product once a year instead of these fall overclocks.

I guess that's the nature of the biz though.
 
LOL, yeah, funny. What I'm saying applies to my old GF3 card (when nice new cards come out I'll buy one, but for now it can wait).

I see what you're saying about AquaMark, but watching the timedemo, I easily got about 20-25 fps more in most sections; sorry, not the 2-3 fps you were on about :).

Also, the funniest thing I've ever seen is a 233 MHz machine (yes, not 2.33 GHz) beating a P4 2.4 GHz PC in 3DMark2003:

233 MHz with a Radeon 9800
2.4 GHz with a Radeon 9600

Now that's funny.

3DMark2003 sux a$$
 
You know why the FX is having the most problems? And this is a fact: it's because the people that made the 3dfx cards, the Voodoo3 etc., designed the GeForce FX. LOL
 
Originally posted by Unnamed_Player
It's not the NV40, it's the fall refresh part. The NV38 they're calling it.

I don't think anyone here said anything about the GeForce FX 5950 Ultra being the NV40. Everybody already knows (I was assuming) that it's the NV38.
 
Originally posted by Rarehero
Er.. The official 52.16 drivers ARE on Nvidia's driver page..

http://www.nvidia.com/object/winxp_2k_52.16

I can say with absolute certainty that these drivers will have the same IQ issues as the other 52.16 drivers floating around :)

I have seen no IQ reduction (at least nothing noticeable) and a big performance increase on my GF4 Ti4200.
In the CoD demos I used to get quite bad frame lag with the 45.23s or whatever they were (the last new ones), especially when firing the Thompson. Now it's a LOT smoother, and the Tommy gun fires virtually as smoothly as when shooting straight down at the floor (where it's as smooth as you can get). And in the 2nd demo, when the tank bursts through the wall with the guys coming over behind it, it used to frame-lag very badly; now it's as smooth as the rest of the demo.
 
Originally posted by Rarehero
I don't think anyone here said anything about the GeForce FX 5950 Ultra being the NV40. Everybody already knows (I was assuming) that it's the NV38.

That was in reference to the cards covered by the link (the NV38 and kin), not the spring NVIDIA card (aka the NV40).
 