Computer Used To Show HL2 at E3?

Requirements

What will Half-Life 2 use the most of? I mean, the graphics are great, but my GeForce4 Ti 4200 64MB 8x AGP should be able to handle the graphics without any problems. I think my CPU may be too slow (Athlon XP 1700+) because of the engine, though, and my memory of course.


What do you think it will use the most of? :bounce:
 
Re: Requirements

Originally posted by skogum!
What will Half-Life 2 use the most of? I mean, the graphics are great, but my GeForce4 Ti 4200 64MB 8x AGP should be able to handle the graphics without any problems. I think my CPU may be too slow (Athlon XP 1700+) because of the engine, though, and my memory of course.


What do you think it will use the most of? :bounce:

Good job showing off your crazy computer. The MINIMUM requirements for HL2 are an 800MHz CPU with a DX6+ video card, but I'm sure you knew that already.
The game, like any other game, will lean mostly on the video card.
 
huh?

Crazy computer? I'm not even sure what you mean by that. If you think it sucks, I can understand that; if you think it's good, I don't know if it will even run Half-Life 2! :bounce:
 
Originally posted by Prom3theus
Well, the GF4 doesn't have 8x anti-aliasing, right? How the fu*k could they use it then? Maybe it was a Radeon 9800 :eek:


Some of them have it. Mine does.
 
Originally posted by urseus
That guy got annoying. He made one good joke at the start, but he had to keep it going.

I like the people who make that initial funny joke but know not to keep going and going and going until people want that smartass assbandit to shut the **** up.
Bah. I guess you never enjoyed the age of the 486 and the C64... ah, those days spent playing with spreadsheets... memories...

-Vert
 
;( Bhuuu, this just brings back old memories.......

Did the E3 computer have a GF4 or a Radeon?!!?
 
Originally posted by Laguna
;( Bhuuu, this just brings back old memories.......

Did the E3 computer have a GF4 or a Radeon?!!?


I think it was a GeForce4 (I hope) :bounce:
 
Can you people please use that muscle that is present in your skull? Sit down, shut up and listen. (I do acknowledge that there were some people here who were not brainless idiots :cool: )

It's being shown on the ATi stand; don't you think that means they'd be using an ATi card? Using an nVidia card would be a kick in the teeth to ATi, and they wouldn't allow Valve to show it. Therefore, through simple logic, an ATi card will have been used, and the flagship model is implied by Gabe mentioning the 9800... The HL2 presentation has two purposes: one, show off HL2; two, pragmatically suggest that ATi cards are the ones to run HL2 on.

Second point: getting pixellated to increase fps?! That idea is too much; afaik that would create a performance hit rather than a performance gain (loading new textures etc.). You people act as if you're viewing a perfect direct-feed stream. You're not. It's a cam, a DV cam, and it pixelates during high-detail scenes because of the limitations of the format it records in.

Take this into account too: the monitor was huge! What resolution would the game have been running at?! 2048x1536 at least, along with AA and all the other bells and whistles required to make it look nice.

As for the GF4 thing: that was from the original HL2 article that was scanned, where it was said that a 2GHz Pentium with a GF4 would be all that was needed to run the game at full detail.

...now all I have to do is wait for the foolish people to start flaming me..
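
For a sense of scale on the resolution point (rough math only, and 2048x1536 is just my guess above, not anything Valve has confirmed):

2048 x 1536 = 3,145,728 pixels per frame
1024 x 768  =   786,432 pixels per frame
3,145,728 / 786,432 = 4

So that guess is roughly four times the pixels of a plain 1024x768 setup before you even add AA, which is why a flagship card is the safe assumption.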
 
Originally posted by twofold
Can you people please use that muscle that is present in your skull? Sit down, shut up and listen. (I do acknowledge that there were some people here who were not brainless idiots :cool: )

It's being shown on the ATi stand; don't you think that means they'd be using an ATi card? Using an nVidia card would be a kick in the teeth to ATi, and they wouldn't allow Valve to show it. Therefore, through simple logic, an ATi card will have been used, and the flagship model is implied by Gabe mentioning the 9800... The HL2 presentation has two purposes: one, show off HL2; two, pragmatically suggest that ATi cards are the ones to run HL2 on.

Second point: getting pixellated to increase fps?! That idea is too much; afaik that would create a performance hit rather than a performance gain (loading new textures etc.). You people act as if you're viewing a perfect direct-feed stream. You're not. It's a cam, a DV cam, and it pixelates during high-detail scenes because of the limitations of the format it records in.

Take this into account too: the monitor was huge! What resolution would the game have been running at?! 2048x1536 at least, along with AA and all the other bells and whistles required to make it look nice.

As for the GF4 thing: that was from the original HL2 article that was scanned, where it was said that a 2GHz Pentium with a GF4 would be all that was needed to run the game at full detail.

...now all I have to do is wait for the foolish people to start flaming me..


But what if they got the computer from Valve, then? Huh? Ever think of that? :bounce:
 
twofold just told you everything you need to know; why are you still replying? lol
 
Well, why would it run better on a Radeon than on a GF4? :bounce:
 
Originally posted by skogum!
Well, why would it run better on a Radeon than on a GF4? :bounce:

Maybe because the Radeon series is more powerful? Maybe because Valve was paid by ATi to show HL2 on their stand?
 
HECK NO! The Radeon is stronger, with higher-clocked memory and so on, but they don't give a f**k about graphics quality. I'm sure that with a poor GeForce3 card in your computer it would look as nice as with a Radeon... (OK, maybe not as nice, but almost) :bounce:
 
The computer they showed it on may have had a Radeon, but the computer they recorded it on may have had a GF4... :bounce:
 
skogum!, you bring new meaning to the phrase "ignorance is bliss".
 
I know, but I really want it to run on my PC... SO BAD! :bounce:
 
wtf? A GeForce 3 is about two years older than a 9800 Pro, and in computer terms that's a long time for changes to happen. The 9800 Pro literally pulls out its microchip penis and wees alllllll over any nVidia card lower than an FX 5800.
 
What I meant was that the Radeon is fast, but they don't care about quality... :bounce:
 
OK... I believe you, but I must say the Radeon is not that much better... :bounce:
 
Originally posted by Murray_H
wtf? A GeForce 3 is about two years older than a 9800 Pro, and in computer terms that's a long time for changes to happen. The 9800 Pro literally pulls out its microchip penis and wees alllllll over any nVidia card lower than an FX 5800.

Hehe, I do have to agree with that sly remark ;)
 
On speed, yes... but I'm sticking to the low-graphics part... I'm not saying it will look crappy, just that they almost never come out and say they've done anything new with the cards other than giving them a higher clock speed.

:bounce:
 
Well, after 60+ posts in this thread, many of them unsupported, I think it's somewhat safe to declare that the computer used to show HL2 at E3 had the following specs:

Pentium 4 processor, 3GHz (3,066MHz)
ATi Radeon 9800, 128MB of video RAM, 8x AGP
512MB+ of system RAM (at least PC-2700, probably top-of-the-line PC-3200 or PC-3500)
 
Yeah, that sounds about right. I don't know about most of the videos, but I think a few of them were made on a different system (somewhere else, maybe?). I remember reading it was a 2.0GHz with a GF4 and 1024MB of DDR RAM, but I don't know which particular demo was made with that machine.
 
Well, I've got a GeForce4 Ti 4200 128MB model with a Pentium 4 2.2GHz and 512MB of PC333 RAM, so I'm hoping like hell it will run at least "ok" on my system.

-Razor2YK
 
Whoa! Nice! Then it will run like hell on my 2GHz + GeForce FX 5600 256MB.
 
Wait... I thought HL2 takes advantage of DX9!!! The GF4 is certainly not DX9-compliant. The system they MADE the demo with was a 3GHz with a 9800, but they played it on a 2GHz with a GF4.

Still, I have a 1.7GHz and a Radeon 9000 Pro, and I expect a system similar to mine will pull around 20-30 fps. No, really.
 
Originally posted by Ridic
I know, what a bunch of irony. Yes, it was definitely using a GeForce4 in the ATi booth ;) I remember them stating that.

They also stated Sept 30th.

And stated that the AI was not scripted at E3.
 
Jesus Christ! Every second post has the bouncy green things! Stop it!
 
OK... I believe you, but I must say the Radeon is not that much better...
You keep thinking that, but in reality the Radeons shit all over nVidia.

And stated that the AI was not scripted at E3.
They stated the ally AI was not scripted; they never said the enemy AI wasn't scripted.

A game without scripting would just be like deathmatch.
 
To answer the original question: THEY USED A DELL XPS GAMING SYSTEM FOR THEIR E3 PRESENTATION, AND THE GFX CARD WAS A 9800 PRO 128MB.

THAT IS OFFICIAL
 
You retards. Look in the VALVe INFO ONLY thread. It was a 2.8GHz with an 800FSB, 1 gig of RAM, and a Radeon 9800 128MB. I've never read anything that said it was an XPS.
 
Wow, 5 pages just to come to the conclusion that they were using a Radeon 9800. Who the hell cares? If your comp is above a 2.0, you'll run it, just maybe not at 1600x1200 and a bit sub-par. Whoopty shit.

BTW, I wouldn't even try playing a game at minimum reqs. You think they'll be honest and tell you a min req that actually runs smooth? I bet it will run like shit at ~800MHz, like a slideshow. Jesus, Warcraft 3 runs pretty shitty at times on anything less than 1GHz, and you expect HL2 to run on something around that speed? They'll tell you a min req that you can actually boot the game up at, not one that offers actual quality gameplay.
 
Originally posted by staddydaddy
You retards. Look in the VALVe INFO ONLY thread. It was a 2.8GHz with an 800FSB, 1 gig of RAM, and a Radeon 9800 128MB. I've never read anything that said it was an XPS.

It was a Dell XPS gaming system; there are pictures of the system from E3 :)

And yes, 2.8GHz with an 800FSB, 1 gig of RAM, 9800 Pro 128MB, correct.
 
Is that the system that played back the recorded demos (or was it Bink files at E3?), or the system that recorded them originally while the game was being played?
 
Originally posted by Fenric1138
Is that the system that played back the recorded demos (or was it Bink files at E3?), or the system that recorded them originally while the game was being played?

At E3 they were running .dem files, which render the exact same scenes every time (other than some of the ragdoll physics, which were different each time they played them back) in real time.

I don't know where people got this notion that they were just playing videos; they were in-game demos running in real time in the engine.
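
For anyone who hasn't used them: a demo in GoldSrc/Source-style engines is a recording of game data that the engine re-renders live on playback, not a video file. Roughly, from the in-game console (the demo name here is just a placeholder, not what Valve actually used):

record mydemo     // writes mydemo.dem while you play
stop              // stops recording
playdemo mydemo   // the engine replays the recording, rendering it in real time
timedemo mydemo   // same playback, but reports an average framerate afterwards

That's also why playback quality depends entirely on the machine doing the playing.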
 
What are you guys talking about? They were using a 2.8GHz Dell Dream Machine equipped with a Radeon 9800 Pro.
 
Originally posted by Xtasy0
At E3 they were running .dem files, which render the exact same scenes every time (other than some of the ragdoll physics, which were different each time they played them back) in real time.

I don't know where people got this notion that they were just playing videos; they were in-game demos running in real time in the engine.


Oki, cheers :)


As for the notion, probably because fewer things can go wrong with pre-recorded clips of someone playing the game. I dunno; all I ever saw was a close-up of the screen, so it was hard to tell, I suppose.
 
I remember reading somewhere that they said they did use a Dell XPS with a Radeon 9800. And the guy who said that Radeons have bad quality but fast speed is a dumbass.
My specs:
Pentium 4 3GHz
Radeon 9700 Pro
512MB DDR RAM
80GB hard drive
Sound Blaster Audigy with Creative/Dolby 5.1 surround sound
Oh yeah!!!!!!!!!!!!!
I'm going to have me some fun when DOOM 3 and HL2 come out!!!!!!!!!!!!
 