ATI or NVIDIA?

  • ATI: 121 votes (72.0%)
  • NVIDIA: 47 votes (28.0%)
  • Total voters: 168
Originally posted by DuncanIdahoTPF

If you don't believe us, then you are effin retarded.

yep, I'm retarded:

So we actually author for hardware that doesn't exist. And then our game will scale from a [Direct X 9 system] all the way down to [Direct X 6]. Of course, it certainly won't look like this on a DX6, but the game will still play and it will still be fun.

Right, let me explain this.... they said DX9 system, meaning they will have DX9-grade graphics, but they won't have had time to code in all the new stuff that has come out with DX9...... OMG do you guys have any f*cking idea how long it takes to code in features exclusive to DX9...... I very much doubt that HL2 is a DX9 game.....
 
Originally posted by mrBadger
Right, let me explain this.... they said DX9 system, meaning they will have DX9-grade graphics, but they won't have had time to code in all the new stuff that has come out with DX9...... OMG do you guys have any f*cking idea how long it takes to code in features exclusive to DX9...... I very much doubt that HL2 is a DX9 game.....
Personally, I don't think the devs are lying. There will be DX9 features to take advantage of on the DX9 hardware. Whether it's enough to classify it as a "DX9 game" is another matter. If you had a game that was strictly DX9 without any shortcuts, you could only play it on the newest cards (actually only the R3x0 and NV35, but cutting pixel precision is probably just a loss in quality; it's still DX9 for the NV3x). No developer wants that...
 
PRECISELY! FINALLY SOME SENSE :) FROM THIS DUMBASS NON-THINKING NEWBIE-INFESTED FORUM. Please guys, think about what you are saying, read the interviews you quote... and I am sorry for any rants from me you incur :)
 
hAHAA.

This game will work fine with full AA and FSAA enabled only on an ATI card. Goodbye Nvidia
 
:LOL:...... you believe that :p....ahahahahaaahahaha ..... Come on kiddies, let's believe everything said by people who are paid to market a specific product! Right, let me point out three things:

1. This is highly unlikely
2. The SDK is gonna be out soon, and I will personally edit the engine until it runs with full AA and FSAA on nVidia cards (see the sketch after this post)
3. If this is true, then most likely this balance will be redressed soon, since Valve wants the most people possible to play this game

so either way.... nVidia won't be hurt as bad as you ATi fanboys think :p
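
For what it's worth, here is a rough sketch of what point 2 would amount to in Direct3D 9 terms: the engine has to ask the driver for a multisampled back buffer at device-creation time. The function and variable names below are made up for illustration; this is not Source SDK code.

#include <d3d9.h>

// Illustrative only, not Source SDK code: ask the driver for 4x multisample
// antialiasing when filling in the present parameters for device creation.
bool EnableFSAA(IDirect3D9* d3d, D3DPRESENT_PARAMETERS& pp)
{
    DWORD qualityLevels = 0;

    // Ask whether 4x multisampling is supported for this back buffer format.
    HRESULT hr = d3d->CheckDeviceMultiSampleType(
        D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL,
        pp.BackBufferFormat, pp.Windowed,
        D3DMULTISAMPLE_4_SAMPLES, &qualityLevels);

    if (FAILED(hr))
        return false;                        // card/driver refuses 4x AA

    pp.SwapEffect         = D3DSWAPEFFECT_DISCARD;   // required for multisampling
    pp.MultiSampleType    = D3DMULTISAMPLE_4_SAMPLES;
    pp.MultiSampleQuality = qualityLevels ? qualityLevels - 1 : 0;
    return true;
}

Whether a given card accepts the request is entirely up to the driver, which is the point of the argument above.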
 
**** you man, I am after the best cards only. I had an nVidia GF4 Ti and it worked fine. But now it's time to upgrade the shit, and nVidia is no longer part of this cycle; ATI is the new beast.
 
Don't swear at me, thank you... I don't take kindly to that. Thing is, ATi may have the drop atm, but nVidia's latest FX card is VERY powerful, and offers improved performance in games, where it counts, over its closest rival.
 
....what do you idiots think DX9 is?

How exactly do you think you can be "slightly" DX9?

DX9 is like a light switch. It's on or it's off. If you use it, you TOTALLY use it. If you don't, you don't. That simple.

It will have DX9 features, and it will run on a DX6 card. The reason is that DX9 contains all previous instructions from the other DXs. It's why you can still have programs that run on Win 98, ME, 2000, and XP. Windows is big on compatibility.

It will be, as far as I can see, the first DX9 game out of the gate. A game that used nothing but DX9 instructions wouldn't by default look better than any other game. They are just tools; they don't make anything better by default. It's just what you do with them.
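
A rough sketch of how that scaling-down usually works in practice under Direct3D 9: the game queries the device caps once at startup and picks the highest rendering path the card can handle. The tier names here are invented for illustration; this is not Valve's actual code.

#include <d3d9.h>

// Invented tier names, for illustration only.
enum RenderPath { PATH_DX6_FIXED, PATH_DX8_SHADERS, PATH_DX9_SHADERS };

RenderPath PickRenderPath(IDirect3D9* d3d)
{
    D3DCAPS9 caps;
    d3d->GetDeviceCaps(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, &caps);

    if (caps.PixelShaderVersion >= D3DPS_VERSION(2, 0))
        return PATH_DX9_SHADERS;   // ps_2_0-class hardware (R3x0, NV3x)
    if (caps.PixelShaderVersion >= D3DPS_VERSION(1, 1))
        return PATH_DX8_SHADERS;   // GeForce3/4 Ti, Radeon 8500 class
    return PATH_DX6_FIXED;         // fixed-function multitexture fallback
}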
 
Originally posted by Boogaleeboo
....what do you idiots think DX9 is?

It will be, as far as I can see, the first DX9 game out of the gate. A game that used nothing but DX9 instructions wouldn't by default look better than any other game. They are just tools; they don't make anything better by default. It's just what you do with them.

I think DX9 is an engine, and a set of instructions for games :) (I may be wrong, that's just what I know)

For the last time, it won't be a DX9 game; we won't see DX9 games for YEARS..... Warcraft 3 required DX8 I think, but no way was it a DX8 game :p
 
mrBadger, you're very old history; I think your brain is not functioning fully. DX9 is fully in this game, and if you make a comparison between STALKER and HL2, the level of graphics is similar; STALKER is fully DX9 and HL2 is as well.

COPY THAT. end.
 
erm:
Microsoft DirectX is an advanced suite of multimedia application programming interfaces (APIs) built into Microsoft Windows operating systems. DirectX provides a standard development platform for Windows-based PCs by enabling software developers to access specialized hardware features without having to write hardware-specific code. This technology was first introduced in 1995 and is a recognized standard for multimedia application development on the Windows platform.

it's an engine :p... that's what it says, and it's a set of instructions (they help the hardware and software talk to each other)... that's what the 'you are wrong' thing says :p
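
That "without having to write hardware-specific code" part is the whole point: the same few calls create a rendering device whether the card is an ATi or an nVidia one. Here is a minimal, purely illustrative Direct3D 9 setup; the hwnd parameter is assumed to be an existing window and error handling is left out.

#include <d3d9.h>

// Minimal, illustrative setup: the same calls work on any vendor's card,
// because the driver hides the hardware-specific details.
IDirect3DDevice9* CreateSimpleDevice(HWND hwnd)
{
    IDirect3D9* d3d = Direct3DCreate9(D3D_SDK_VERSION);

    D3DPRESENT_PARAMETERS pp = {};
    pp.Windowed   = TRUE;
    pp.SwapEffect = D3DSWAPEFFECT_DISCARD;

    IDirect3DDevice9* device = NULL;
    d3d->CreateDevice(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, hwnd,
                      D3DCREATE_HARDWARE_VERTEXPROCESSING, &pp, &device);
    return device;
}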
 
What it is, what it's always been, is instructions. Period. Nothing special. It's not this grand amazing thing that takes 3 years to figure out. It's not something that needs to be redesigned from the ground up to use. It's a series of programs Microsoft tied together so all new hardware and software would have one set base to work with. It's not rocket science. HL2 uses DX9 for some of the more advanced things, like bump mapping on NPC characters and such. If it uses one tiny DX9 feature, it is a DX9 game. You can only get a DX9 feature by using DX9. There is no halfway shortcut. It DOES use a DX9 feature. More than one, in fact.

It IS a DX9 game.
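
To make the "one tiny DX9 feature" point concrete: a ps_2_0 pixel shader (the kind used for fancier per-pixel effects) can only be compiled and created through the DX9 interfaces. The sketch below is purely illustrative, not Valve's code; the shader source and names are made up.

#include <string.h>
#include <d3d9.h>
#include <d3dx9.h>

// Purely illustrative: the "ps_2_0" profile only exists in DX9;
// DX8 tops out at ps_1_x.
static const char* kPixelShaderSrc =
    "float4 main(float2 uv : TEXCOORD0) : COLOR0 { return float4(uv, 0, 1); }";

IDirect3DPixelShader9* CreatePs20Shader(IDirect3DDevice9* device)
{
    ID3DXBuffer* code = NULL;

    // Compile the HLSL source against the DX9-only ps_2_0 profile.
    if (FAILED(D3DXCompileShader(kPixelShaderSrc, (UINT)strlen(kPixelShaderSrc),
                                 NULL, NULL, "main", "ps_2_0", 0,
                                 &code, NULL, NULL)))
        return NULL;

    IDirect3DPixelShader9* shader = NULL;
    device->CreatePixelShader((const DWORD*)code->GetBufferPointer(), &shader);
    code->Release();
    return shader;
}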
 
I'm sorry, DX9 is an engine; it has an SDK, and code is required (some of it quite complex) to run it effectively in games.
 
And 80% of it is the same shit as 8.0.

I repeat. They've had 4 years of solid work and they've talked to the card makers and Microsoft about what they were going to be doing. It's called "planning ahead". It's not a massive deal. Everyone else does it too. How do you think Carmack knew that cards would be out today that could handle Doom III?

Lucky guess?

And do you think nVidia and ATi make cards without telling the game makers, the people that sell the cards, what wow-l33t0 features to do? They aren't going to release a 400 dollar card if no games need it to run yet, or for the next 3 years while somebody learns how to actually code something that uses them.
 
Originally posted by mrBadger
I'm sorry, DX9 is an engine; it has an SDK, and code is required (some of it quite complex) to run it effectively in games.

exactly...... HL2 was written using DX9..... it may not use all DX9 features.. but it's still a DX9 game.


so I don't understand what you're talking about
 
Yeah, Boogaleeboo, you just said it wasn't an engine :rolleyes: I know... I think it doesn't really use DX9 features that much at all :p You are right about the rest of the stuff though...
 
Originally posted by DimitriPopov
Thnx badger, man everyone in this thread is making me feel like I bought a bad card!!!!! FX5900 256.

Making me feel a little down :-/

Don't worry about it matey, nVidia do excellent cards; ATi do excellent cards too, but at rip-off prices.

I really don't see the point in paying a further £100+ for a few more frames per second, IMHO.
 
Originally posted by Lifthz
Err... no, not necessarily at all. Those are just myths; in fact the GeForce FX 5900 Ultra is the most powerful hardware overall right now.

Thank you. :)
 
I suppose I didn't make it clear enough, but the part of your post I was saying was wrong was this: "For the last time, it won't be a DX9 game; we won't see DX9 games for YEARS..... Warcraft 3 required DX8 I think, but no way was it a DX8 game".

To which I replied "You are wrong." How it works is that a DX8 instruction by definition makes it a DX8 game. It's the same for DX9. Taking advantage of new DX9 instructions doesn't take years and years to figure out; most of it is the same shit as 8 and to some degree 7. It's just integrating the new instructions into your game.

I'm sorry I didn't make that clearer.
 
Yeah, I guess I wasn't too clear either :p.... I meant games built on the DX9 engine :).. sorry about all the confusion

/me shakes hands with Boogaleeboo in a gesture of peace :)
 
I wuv woo too huggybear.

In honor of this, you can be the first to try out my HL2 mod. I envision it as a giant open field with nothing but NPCs and small rocks. You will have a manipulator gun to pick up the rocks and shoot them at the genitals of the NPCs.

I call it "Shoot rocks at the genitals of NPCs mod".

I'm not set on the name though.
 
You could have the special "testicle targeting" feature, or the "sack seeking" rocks. It would be an amazing mod. If I had money I would give you some, Boogaleeboo.
 
Personally I intend to hit some of the local Drama schools and pick up some people for voice acting. Little things like "Oh my God this hurts so much!" "I can't feel my legs!" "Jesus, I taste blood..." "Why are you doing this you depraved monster?" and so forth.

That, and using all forty of those facial muscles to convey that someone has been nailed in the happy.
 
awwwwww thanks Booga, you don't mind if I call you Booga do ya? :p Sounds a great mod :) :)
 
Originally posted by Lifthz
Sounds like "I got a 9700PRO so now I lost all my faith in Nvidia and they can never make anything good again" to me...

And Nvidia never "cheated" on 3DMark, and 3DMark admitted it. Also, 3DMark is not a true DX9 benchmark ANYWAY.

Bullshit. Futuremark had new Ferraris after making that statement. Either that or they were about to get sued.
 
Originally posted by omlette
Bullshit. Futuremark had new Ferraris after making that statement. Either that or they were about to get sued.
Nvidia never "cheat", haven't we learned that? They just do "Nvidia Optimisations(TM)" :D

I actually would like ATI to give out a special driver set using their optimisation technique, it would be awesome to see a 9800 Pro score 15,000 marks in 2k3 :eek:
 
God, that whole thing is blown outta proportion.... it's in games that it counts :)... which is where nVidia rock the socks off :)
 
Originally posted by mrBadger
God, that whole thing is blown outta proportion.... it's in games that it counts :)... which is where nVidia rock the socks off :)
Except Unreal Tournament 2k3 of course, where they rock the trilinear filtering off it, literally :D
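
For context, here is roughly what an application asks for when it wants trilinear filtering (shown in Direct3D 9 for consistency with the rest of the thread; function names are illustrative). The accusation above is that the driver quietly ignores the mip filter setting and substitutes something cheaper.

#include <d3d9.h>

// How an application requests trilinear filtering on a texture stage.
void RequestTrilinear(IDirect3DDevice9* device, DWORD stage)
{
    device->SetSamplerState(stage, D3DSAMP_MAGFILTER, D3DTEXF_LINEAR);
    device->SetSamplerState(stage, D3DSAMP_MINFILTER, D3DTEXF_LINEAR);
    // Linear blending *between* mip levels is what makes it trilinear;
    // dropping this back to D3DTEXF_POINT gives plain bilinear.
    device->SetSamplerState(stage, D3DSAMP_MIPFILTER, D3DTEXF_LINEAR);
}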
 
Eheh, thanks to nVidia's cheatoptimizations.

Damn, people still don't get it...:eek:
 
it all depends on who you trust really...... I am immensely disappointed in the last developer I thought would 'sell out'... and I have told the programmer who I talked to that I am, and I hope they take note of it. I trust PCGamer UK.... they benchmarked using real games, and the FX 5900 did FAR better than the corresponding ATi card. I would like to point out that I am not an nVidia fanboy, and it seems to me that everyone knocks them for no real reason; I have never had a problem with them.

oh and btw, I think that gamepc.com is in the pocket of ATi if they report nVidia cards performing worse on games that are 'the way it's meant to be played', which quite frankly is bullsh!t.
 
My last 5 video cards were nVidia cards, and my current 2 cards are also nVidia (ASUS GeForce 4400, ASUS GeForce 4 MX Dual DVI).

My next video card will likely be an ATI 9800 (I'm waiting for Tyan's Dual DVI 9800, as I use dual LCDs on my workstations).

What's turned me off nVidia are the strong-arm tactics, and those wonderful driver "optimizations", which totally piss me off.

i.e. Winning by cheating = Losing

I'm hoping that eventually nVidia will clean up their act, and stop trying to shovel garbage on us.

my 2 cents.


G.F.
 
mrBadger, if you still trust nVidia, even after the 3DM03 fiasco, the UT2003 (and other "The way it's meant to be played" games) anisotropic filtering lowered IQ/adaptation, and the decreasing of fixed-point precision (lowering image quality) to achieve better speeds, I think you really should rethink your concept of TRUST.

nVidia has to CHEAT (and use proprietary driver paths) in games in order to beat ATi; that's why it cheatoptimizes the games/benchmarks reviewers often use.
That's exactly what GordN FreeLoadR, dawdler and other people here are trying to point out to you.

Again, if an unbiased reviewer starts using uncommon games/benchmarks to test the video cards, you'll see what we're trying to tell you.
If nVidia doesn't apply their "optimizations" in the drivers, the application/game will show a very noticeable gap between (e.g.) a 9800 and a 5900.
Sometimes, even applying the optimizations, nVidia isn't able to beat ATi.

I don't know about you, but that's the way I see it:
ATi is now the leader in speed and image quality while nVidia is number two. It finally happened and people still don't believe it, pointing to biased reviews with common benchmarks.
I'm not telling you this because I want to be a fanboy; I'm telling you because I build PCs for a living and I did all the necessary tests ;)
 
Well, I shall test for myself; that's always the second step for me..... I'll make my own benchmark.... then I'll buy the card I feel performs best..... nVidia will be back IMO.... bigger and better than ever before :)
 