Computer Used To Show HL2 at E3?

My Specs:

3.0C
IC7-MAX3
1024 3500C XMS Corsair Ram
9800XT 256 Meg
Audigy 2

:cool:
 
Originally posted by Prom3theus
well GF 4 dont have 8x Anti aliasing right? how the fu*k could they use it then? maybe it was a radeon 9800

Originally posted by skogum!
Some of them has it. Mine does.

No. Yours doesn't.

Geforce 4 / Geforce FX cards don't support 8xAA. At all. You were misled or misinformed.

Geforce 4 / Geforce FX cards support 4xAA & 8xAF. You were either told otherwise, or you mistook 8xAF for 8xAA.

AA (Anti-Aliasing) smooths the jagged edges along polygon edges. AF (Anisotropic Filtering) improves texture quality on surfaces that stretch away from the player's view at a shallow angle.
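To see why AA gets rid of the jaggies, here's a minimal sketch of the idea behind multisampling: each pixel is sampled at several sub-pixel positions and the final colour is the average, so a pixel the polygon edge only partly covers gets an in-between shade instead of a hard step. This is purely illustrative (real hardware does this in the ROPs, not in Python), and the 2x2 sample grid stands in for "4xAA".

```python
def sample(x, y):
    """Toy coverage test: 1.0 inside the 'polygon' (everything under
    the line y = x), 0.0 outside. A stand-in for rasterization."""
    return 1.0 if y <= x else 0.0

def shade_pixel(px, py, samples_per_axis=2):
    """Average samples_per_axis**2 sub-pixel samples (2x2 = '4xAA').
    Returns the pixel's coverage-weighted shade between 0.0 and 1.0."""
    n = samples_per_axis
    total = 0.0
    for i in range(n):
        for j in range(n):
            # evenly spaced sub-sample positions inside the pixel
            sx = px + (i + 0.5) / n
            sy = py + (j + 0.5) / n
            total += sample(sx, sy)
    return total / (n * n)

print(shade_pixel(5, 0))   # fully inside the polygon  -> 1.0
print(shade_pixel(0, 5))   # fully outside             -> 0.0
print(shade_pixel(0, 0))   # edge passes through       -> 0.75 (blended)
```

With 1 sample per pixel the edge pixel would snap to 0 or 1 (a jaggy); with more samples it fades smoothly.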
 
Wait, you're debating what card was used to record the scenes in the E3 demo? I seem to remember the narrator stating it was a GeForce 4 being used in the docks scene. It makes sense, because none of the scenes in that demo showed off any DX9-specific effects (PS 2.0, HDR, character bump-mapping). Top-of-the-line GF4s actually run DX8 better than the FX series does, so it adds up. The Bink videos run at 1024x768 and are normalized to 30fps. Where are you getting that it was run at high resolutions and framerates?

Also, if you think HL2 is going to be more taxing on your video card than your cpu, you're dead wrong. The physics are very taxing on your cpu. They already stated that lower-end machines will not be capable of displaying realistic physics. I'm guessing that your cpu will be just as limiting as your video card in this game.
 
"2.8GHz Dell Dream Machine"

OH THE IRONY!!!!!!!!!!!!!!!

dell will never be anything other than a crap machine

Oh and since every1 else is bragging like a bitch i might as well tell yah what i is having..

P4 2.8C, 9800 Pro, 7.1 sndcard (w/ optical), and
50" HDTV w/ DVI (1:1 pixel mapping). oh and 512 DDR400 ram, dont need more than that yet
 
Originally posted by iamironsam
It makes sense cause none of the scenes in that demo showed off any DX9 specific effects (PS 2.0, HDR, Character Bump-Mapping)
No Dx9 effects?? hahahahhahaahahhha
 
Seriously guys, your dick doesn't grow when you post your badass specs. Besides, we don't give a flying f*ck on what kind of penis-compensation machine you'll be playing HL2, no really we don't care.
 
Ugh, all of ya biting your fingernails over the question *will HL2 run good on my comp* - well, if you have anything above a 1.8GHz *Pentium, mind you* and a 128MB card with at least 512 DDR, you're freaking set.
Yuk, computers are so damn primitive right now. Hard to imagine that 3.6GHz is the highest they go for a desktop.
 
and as Ryan said, we dont care about your machine. now, if it has a cool mod that you did by hand, then we might say wow.
 
here's my 2 cents:
I also heard that they used a 2.8GHz (highest clockspeed out then) equipped with a Radeon 9800 Pro - or was it a 9700 Pro?
But this is the thing: they would never use a Geforce 4 at an ATI BOOTH. For starters, it would imply nvidia cards are better than ATI's, second it would discredit them, and third, nvidia would flip - they'd be using nvidia's product for commercial purposes without asking them...

and isn't the whole point of E3 showing what the engine is fully capable of?
It's simple logic to me......
 
Originally posted by PvtRyan
Seriously guys, your dick doesn't grow when you post your badass specs. Besides, we don't give a flying f*ck on what kind of penis-compensation machine you'll be playing HL2, no really we don't care.


lmao ha I understand what you're saying.
 
OMG my penis is teh bigg3st cuz i got 2 gigs ram omfg wtf!1!!!

..........
 
According to a major news source, Microsoft, with the assistance of nVidia, is about to release a new version of the great and essential DirectX 9: DirectX 9.1. What's so amazing about this? Well, this new DX will favour nVidia's GPUs, especially regarding pixel shaders 2.0. Combined with a regular driver update, the expected performance would live up to the highest hopes of the Californian company. Performance in DX9 titles (for example HL2) would then be similar to the competition's in terms of quality and especially in terms of speed - and all in 32-bit precision mode in PS 2.0.

Maybe it's better management of PS 2.0 instructions (improved handling of streaming instructions). To quantify it, that could amount to a 60% gain. A percentage that seems confusing and amazing, but one we can't verify before it is released. All we can say is that if it happens exactly as announced, it will be an enormous relief for all the purchasers of GeForce FX cards.
 
where did u get that info jackeld? cause i bet it can't - the FX having half the pipes of the ATI chip causes problems, unless they use lame hacks. And if Microsoft does this, won't the creators of the game have to write nearly twice as much code? One path for ATI cards and another for nvidia cards?
 
Nvidia has 4 pipes. ATI has 8 pipes.
The FX can do 2 textures per pipe. The Radeon 9800 does 1 texture per pipe.
Add it up: 4x2 = 8 textures per clock for the FX, 8x1 = 8 textures per clock for the 9800.
There are times, I believe, when z-color is used and Nvidia cannot use all the pipelines. Something to that effect.
DX9.1 should support the DX9 specs (e.g. 24-bit precision) as well, so they shouldn't need to write twice as much code.
DX9 games support DX9 but not DX9.1 unless the developers modify the game code, which is doubtful. Games made for DX9.1 from the start will of course take advantage. ;)
What does this mean? It's only good for the future. Oh, and the next-gen gfx cards in Q1 '04 support DX9.1 (from both ATI and Nvidia).
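The pipe arithmetic above is easy to check, and it also shows why "8 texels per clock on both" isn't the whole story: peak texel fillrate also depends on core clock. A quick sketch, with clock speeds that are rough assumptions for illustration (not exact spec-sheet figures):

```python
def texels_per_clock(pipes, textures_per_pipe):
    """Texture units applied per clock cycle across all pipes."""
    return pipes * textures_per_pipe

def texel_fillrate_mtexels(pipes, textures_per_pipe, clock_mhz):
    """Theoretical peak texel fillrate in Mtexels/s."""
    return texels_per_clock(pipes, textures_per_pipe) * clock_mhz

# GeForce FX: 4 pipes x 2 textures; Radeon 9800: 8 pipes x 1 texture
fx_tpc = texels_per_clock(4, 2)      # -> 8
r9800_tpc = texels_per_clock(8, 1)   # -> 8

# Assumed clocks, purely illustrative: ~450MHz FX, ~380MHz 9800 Pro
print(texel_fillrate_mtexels(4, 2, 450))  # -> 3600 Mtexels/s
print(texel_fillrate_mtexels(8, 1, 380))  # -> 3040 Mtexels/s
```

Same texels per clock, different fillrate once clocks differ - and none of this captures shader throughput, which is where the FX actually fell behind in PS 2.0 workloads.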
 
Originally posted by jameth
you keep thinking that, but in reality the radeons shit all over nvidia


they stated the ally AI was not scripted, they never said the enemy AI wasn't scripted.

a game without scripting would just be like deathmatch

Yeah ok,

"Hi guys this is my new game it rocks, the AI is not scripted.

a few months later, oh well i meant apart from this this and that, obviously duh!!!

but theres 1 part its not scripted

trust me ive only lied to you about the release dates the ai and god only knows what else :dork: "
 
What they stated was that the ally AI was scripted. However, if you read the first Gamespy preview you'll see them say that the soldier kicking the door down wasn't scripted and was entirely AI. Neither Gabe nor anyone else from Valve ever said that the "Traptown door kicking sequence" was not scripted.
 
Originally posted by Ridic
trust us they used a geforce 4. IN THE ATI BOOTH.

I dont want to read this entire thread, but I'm sure I'm not the first to correct you... They DID NOT use a ****ing GeForce in the ATI booth. I don't understand why so many people on these forums seem to "remember" complete bullshit, and every other dumbass just takes it at face value, and before you know it there's 300 asshats running around spreading that bullshit like a virus.
 
Who cares? We all know it'll run better on a 9800 PRO than it would on a GF4.
 
Yeah ok,

"Hi guys this is my new game it rocks, the AI is not scripted.

a few months later, oh well i meant apart from this this and that, obviously duh!!!

but theres 1 part its not scripted
hey, if you read my post correctly i never said ALL the AI wasn't scripted

i agree that the enemy AI is scripted, every game is scripted in some parts, but what i said is that the ALLY AI wasn't scripted (which is what Gabe said as well, you just misunderstood)


trust me ive only lied to you about the release dates the ai and god only knows what else"
wow, we're just making up shit here?

i never mentioned release dates, why dont u learn to read properly
 