DOOM III dev admits DX9 is slow on GF-FX hardware

Please, not another one of these threads!!! STOP THE MADNESS!
 
Kid... seriously, add to the thread with useful comments. If not, don't post in my threads... God, don't try to make me look like a n00b.

This thread is here because I, and many other people, were under the impression that since nVidia was backing DIII so much, it would run very well on nVidia hardware.
 
This has been posted before and noted by many; take some time to use the search feature next time around.
 
They deserved it. First off, the NV3x is horrible. It's obvious that when they were developing it they weren't expecting ATi to put up the amount of competition that the R300 did, so they did their usual thing: make a card that is "compatible" with the newer techniques but not necessarily fast with them, focus on the current games and merely "support" newer ones, then release a card that can actually do the effects well once the newer games get closer.

ATi, on the other hand, focused on the future. I like that and hope nVidia catches on...
 
Originally posted by KiNG
Kid... seriously, add to the thread with useful comments. If not, don't post in my threads... God, don't try to make me look like a n00b.

This thread is here because I, and many other people, were under the impression that since nVidia was backing DIII so much, it would run very well on nVidia hardware.

Yeah, you're right, it pisses me off when I make a thread and people add pointless shit. Sorry, dude.
 
This still doesn't mean anything, since nVidia's Release 50 drivers haven't been tested with DX9 games :p
 
Well King, I haven't seen this info before. Thanks for posting it.
 
Originally posted by Vash63
They deserved it. First off, the NV3x is horrible. It's obvious that when they were developing it they weren't expecting ATi to put up the amount of competition that the R300 did, so they did their usual thing: make a card that is "compatible" with the newer techniques but not necessarily fast with them, focus on the current games and merely "support" newer ones, then release a card that can actually do the effects well once the newer games get closer.

ATi, on the other hand, focused on the future. I like that and hope nVidia catches on...

Isn't this extremely similar to the coup de grâce that nVidia inflicted on 3dfx a few years ago?
 
Originally posted by Beazil
Isn't this extremely similar to the coup de grâce that nVidia inflicted on 3dfx a few years ago?

Yup. I wonder if nVidia will be bought out by ATi? hehe
 
Hope not; then ATi will monopolize. I know they will, but I hope nVidia gets back on track.
 
Originally posted by TheOriginalEvil
Hope not; then ATi will monopolize. I know they will, but I hope nVidia gets back on track.

Me too. I remember not that long ago when ATI had some of the worst products out there. I know, 'cause I bought some of 'em. But competition is good. It keeps prices down (at least that's what they tell me), and it drives technology. Up to now, nVidia has been the leader in graphics. Let's hope they can pull something good off to get themselves out of this hole they're in.
 
Originally posted by TrueWeltall
Yup. I wonder if nVidia will be bought out by ATi? hehe

Only to be burned and buried in turn by the next card manufacturer coming up (Matrox? :) )

"The old king is dead, long live the new king! The new king is dead, longer live the newer king! He's dead too! Long li... okay, who wants to be king?"

There probably is something to it; the minute a company assumes market dominance it gets complacent, staid, etc., and gets ousted from the top by the still-hungry techno-savvy companies below it.
 
Matrox! lol! You know Trident also has some 3d cards on the go! :)

Seriously though, you're right, snark. Complacency is the killer of progress.
 
From the link posted by PoLo:

"XGI recognizes that acquisition of Trident Graphics was very important for the startup since besides Trident’s patents and inventory, XGI also got a boost in attracting talents for its research, sales and marketing teams, which it says are now in place. XGI plans to maintain Trident’s product lines and customer base but will release a series of low-end, medium-range and high-end graphics chips by year-end, including a new generation of chips in September, according to Chris Lin."

mmmm.... tasty foot.
 
Originally posted by TrueWeltall
This has been posted before and noted by many; take some time to use the search feature next time around.

I was not aware of this. I look at the forums nearly every day, and since it was noted on the Blue's News site as new info from an e-mail, I took it as new and thought people might like to know.
 
Actually, there was commentary on the .50 drivers saying that they lowered image quality so games could run faster. It had Halo PC images to compare: images on the GeForce were much darker, and blurrier in Tomb Raider: Angel of Darkness.

nVidia needs to take a round off and work hard on the next chipset, or catch up reallll fast.
 
This has been known for quite a long time, like six months ago or something. It's just that people are only bringing it up now.

And we do have a hardware forum...
 
Yes, the Release .50 drivers have nothing to do with Doom 3... they're improving DirectX 9 performance... which doesn't matter to Doom 3.

And like Carmack said, Doom 3 uses the lower precision because of customizations... which won't be an option for future DX9 games.

So he's saying that the FX series will be OK for Doom 3, but not so great for all the new DX9 games due out.
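
For anyone who wants that made concrete, here's a rough sketch (my guess at the shape of it, not id's actual source) of how an OpenGL engine can pick a vendor-specific back-end at startup, along the lines of the NV30 and ARB2 paths Carmack has described. The extension names are real; the selection logic is simplified.

/* Minimal sketch: choosing a render path by probing GL extensions.
   Not id's code -- just an illustration of vendor-specific back-ends. */
#include <string.h>
#include <GL/gl.h>

typedef enum { PATH_ARB, PATH_ARB2, PATH_NV30 } renderPath_t;

/* Naive substring check; real code should tokenize the extension string. */
static int HasExtension(const char *name) {
    const char *exts = (const char *)glGetString(GL_EXTENSIONS);
    return exts && strstr(exts, name) != NULL;
}

renderPath_t ChooseRenderPath(void) {
    /* NV30 path: GL_NV_fragment_program lets the driver run fragment
       ops at fixed or half precision instead of full 32-bit float,
       which is why the FX cards prefer it. */
    if (HasExtension("GL_NV_fragment_program"))
        return PATH_NV30;
    /* ARB2 path: the standard GL_ARB_fragment_program route at full
       precision -- the path where the FX series benchmarks much slower. */
    if (HasExtension("GL_ARB_fragment_program"))
        return PATH_ARB2;
    return PATH_ARB; /* fallback for older hardware */
}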
 
KiNG, thanks for the info. I feel sorry for NVidia as I've always used their cards until recently. It just goes to show that complacency can hit even the biggest corps.

Luckily I bought a Hercules Prophet 9800 Pro (basically an optimised Radeon 9800 Pro) about a month ago, and it looks like I chose the right brand. Plus it was £100 (yes £, I'm from the UK :p) cheaper than the FX 5900 Ultra :thumbs:
 
Sure, nVidia has gone down, but I hope they get a new card out faster and cheaper.
Competition is important not only for the price of the cards but also for their quality.
Let's hope new companies can grow.
Two GPU-chip companies is not enough.
 
Originally posted by KidRock
Yeah, you're right, it pisses me off when I make a thread and people add pointless shit. Sorry, dude.

You're not forced to read them, you know.
 
Originally posted by Mountain Man
Except Doom III doesn't use DirectX. It's an OpenGL game.

Ummm... are you sure about that? If it's an OpenGL game, why would Carmack comment on writing a custom DX9 back-end that downgrades from the 32-bit DX9 precision standard?
 
Btw, for anyone wondering: it was at the end of January this year that he noted the FX was much slower in the standard ARB paths.

If it had only been a little earlier, I could rightfully say:

WELCOME TO LAST YEAR! :p

Edit: and for anyone curious about the OpenGL/DX9 issue: DX9 is not only a renderer, it's a crapload more. Audio, networking, math, you name it, it's got it. Maybe he is using parts of it, but I don't know the specifics.
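
To illustrate the point (purely hypothetical -- I have no idea which parts, if any, id actually uses): an OpenGL-rendered game on Windows can still use the non-graphics pieces of DirectX. Something like this would initialize DirectSound for audio while all the rendering stays in OpenGL:

/* Hypothetical sketch: DirectSound audio alongside an OpenGL renderer.
   Compile as C, link with dsound.lib. */
#define DIRECTSOUND_VERSION 0x0800
#include <windows.h>
#include <dsound.h>

static LPDIRECTSOUND8 g_dsound = NULL;

int InitAudio(HWND hwnd) {
    if (FAILED(DirectSoundCreate8(NULL, &g_dsound, NULL)))
        return 0;
    /* Priority cooperative level lets the game set the primary
       buffer format; the window handle ties focus handling to it. */
    if (FAILED(IDirectSound8_SetCooperativeLevel(g_dsound, hwnd,
                                                 DSSCL_PRIORITY)))
        return 0;
    return 1; /* OpenGL keeps doing the drawing; DX just does the sound */
}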
 
Originally posted by Maskirovka
This still doesn't mean anything, since nVidia's Release 50 drivers haven't been tested with DX9 games :p

Actually, gamersdepot.com had a review of the 50 drivers, including image quality screenshots from Halo and Tomb Raider. And the drivers were leaked by a site earlier in the week...

Halo colours were messed up, textures were blurred, and it didn't look anywhere near as vibrant as on the ATi cards... with Tomb Raider, the entire screen looked like it had Vaseline smeared all over it.

Not to mention that the .50 drivers don't work with HL1 that great atm, due to problems with C-D detecting them as "hacked" drivers... great, nice one nVidia!

And there's also the little matter of the Tomb Raider benchmark utility being removed, as it showed poor nVidia performance even though it's one of the "The Way It's Meant to Be Played" titles or whatever... coincidence? Not quite sure about that...

This is quite frankly out of order as far as I can see, and Carmack has done a great job of sorting things out so the FX cards run well with Doom 3. I'm hoping to see how the ATi cards run with later builds, as well as image quality tests in the near future.

Saying that, the XGI cards look damn nice on paper and may be worth picking up... how odd.
 
Hmm... sounds like nVidia may be in a lot deeper than I first thought. I would also like to see more competitors in the graphics card biz, but they would have to be solid competitors, the kind where you really have to choose hard between companies. If you look at the pre-Voodoo era, it was a lot like this. Of course, then we got 3dfx, and everything was blown away overnight. Although quantum leaps in performance or technology are always welcome, the downside is that you lose your competition. And when you lose your competition, if you're not careful, you could lose your edge.
 
Originally posted by JavaGuy
Ummm... are you sure about that? If it's an OpenGL game, why would Carmack comment on writing a custom DX9 back-end that downgrades from the 32-bit DX9 precision standard?


D3 is an OpenGL game like all other id games; Carmack was referring to DX9-LEVEL HARDWARE, i.e. HW that has full support for PS 2.0, VS 2.0, etc.
Since both APIs (OGL and DX9) now include a high-level shader language, they can both utilise these HW features.
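
To show what "both APIs have a high-level shader language" means in practice, here's a trivial tinted-texture pixel shader written twice -- once in DX9's HLSL (ps_2_0 profile) and once in OpenGL's GLSL -- stored as C strings the way an engine might embed them. Both target the same PS 2.0-class hardware; the shader itself is just a made-up example.

/* The same trivial pixel shader in both high-level shader languages. */

/* DirectX 9 HLSL, compiled against the ps_2_0 profile */
const char *hlsl_ps =
    "sampler2D s0;\n"
    "float4 tint;\n"
    "float4 main(float2 uv : TEXCOORD0) : COLOR {\n"
    "    return tex2D(s0, uv) * tint;\n"
    "}\n";

/* Equivalent OpenGL GLSL fragment shader */
const char *glsl_fs =
    "uniform sampler2D s0;\n"
    "uniform vec4 tint;\n"
    "void main() {\n"
    "    gl_FragColor = texture2D(s0, gl_TexCoord[0].xy) * tint;\n"
    "}\n";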
 
Originally posted by The Grim Reaper
D3 is an OpenGL game like all other id games; Carmack was referring to DX9-LEVEL HARDWARE, i.e. HW that has full support for PS 2.0, VS 2.0, etc.
Since both APIs (OGL and DX9) now include a high-level shader language, they can both utilise these HW features.

Ahh... gotcha. Thanks for the info.
 
Originally posted by twofold
with Tomb Raider, the entire screen looked like it had Vaseline smeared all over it.

You mean you know what that looks like? :cheese:
 
Originally posted by theGreenBunny
You mean you know what that looks like? :cheese:

Oh come on Bunny, don't act like you've never lubed up the screen!

JK. :cheese:
 
Originally posted by JavaGuy
Ummm... are you sure about that? If it's an OpenGL game, why would Carmack comment on writing a custom DX9 back-end that downgrades from the 32-bit DX9 precision standard?
He talked about using a lower-precision pipeline, but it had nothing to do with DX9.
 
Hmm, Doom 3 doesn't use DirectX; it will use OpenGL. So I'm not worried.
 