The First Unreal Engine 3 Game

alan8325
Anyone know which game using UE3 will be the first one released? The only three possibilities I know of are Huxley, UT2007, and Gears of War.
 
Believe it or not, there is already at least one game out using the engine. I can't remember the name, but there was definitely a demo on FilePlanet.

The next 'big' game, I think, is UT2007.
 
Yeah... it was called RoboHordes, I think... robo-something, anyway.
 
It requires a system with an Intel processor with Hyper-Threading to play! It doesn't run on AMD machines.

failllllllllllllllllll
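
(For the curious: an x86 CPU reports its vendor string and feature flags through the CPUID instruction, so a demo really can refuse to run on anything but an Intel chip with Hyper-Threading. Below is a minimal C sketch of such a check, assuming GCC/Clang on an x86 target; it's an illustrative reconstruction, not RoboHordes' actual code.)

```c
/* A minimal sketch of gating a program on "Intel CPU with Hyper-Threading"
 * via the x86 CPUID instruction. Illustrative reconstruction only, not
 * RoboHordes' actual code. Builds with GCC/Clang on x86. */
#include <cpuid.h>
#include <stdio.h>
#include <string.h>

int main(void) {
    unsigned int eax, ebx, ecx, edx;
    char vendor[13] = {0};

    /* Leaf 0: the 12-byte vendor string comes back in EBX, EDX, ECX. */
    if (!__get_cpuid(0, &eax, &ebx, &ecx, &edx))
        return 1;
    memcpy(vendor + 0, &ebx, 4);
    memcpy(vendor + 4, &edx, 4);
    memcpy(vendor + 8, &ecx, 4);

    /* Leaf 1: feature flags; bit 28 of EDX is the HTT flag. Note this
     * only says the package supports Hyper-Threading, nothing about
     * whether the game actually benefits from it. */
    __get_cpuid(1, &eax, &ebx, &ecx, &edx);
    int has_ht = (edx >> 28) & 1;

    if (strcmp(vendor, "GenuineIntel") != 0 || !has_ht) {
        puts("This demo requires an Intel processor with Hyper-Threading.");
        return 1;
    }
    puts("GenuineIntel + HT detected; starting demo...");
    return 0;
}
```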
 
I'm guessing that Gears of War will be the first as it might be released during the summer. I suspect that Huxley and UT2007 might be released sometime in the fall.
 
babyheadcrab said:
failllllllllllllllllll
QFT. Nearly all (imo) hardcore gamers go for AMDs if they know what they're doing.
 
dekstar said:
QFT. Nearly all (imo) hardcore gamers go for AMDs if they know what they're doing.
lol, you sound like a marketing spy ...
 
dekstar said:
QFT. Nearly all (imo) hardcore gamers go for AMDs if they know what they're doing.

I used to be an AMD fan. I bought my first Intel chip at a time when AMD was a loser in speed and price. Not to mention my Intel will never overheat and burn up like an AMD would, since it has thermal protection should my fan fail or something.
 
Raziaar said:
I used to be an AMD fan. I bought my first Intel chip at a time when AMD was a loser in speed and price. Not to mention my Intel will never overheat and burn up like an AMD would, since it has thermal protection should my fan fail or something.

You must be talking about the old AMDs. The new AMDs never get hotter than around 30°C, compared to Intel P4s, which usually run at 50-60°C.

:)
 
Garfield_ said:
You must be talking about the old AMDs. The new AMDs never get hotter than around 30°C, compared to Intel P4s, which usually run at 50-60°C.

:)

I'm talking about back in 2003, when I bought my computer for Half-Life 2. Not to mention I sort of had a vow for a while that I was never going to put on another mother****ing AMD heatsink again, due to the extreme frustrations in the past of having to use a screwdriver to slip it on. It was inferior enough that I slipped and jammed the screwdriver tip into the motherboard (I had to use that much force). Thankfully it wasn't broken.
 
Regardless, I think UT2007 will be the one that uses it best. Judging from the limited media I've seen of Gears of War, plus some constraints on Xbox 360 programming (at this time), it won't be near the behemoth the engine can create.
 
Raziaar said:
I'm talking about back in 2003, when I bought my computer for Half-Life 2. Not to mention I sort of had a vow for a while that I was never going to put on another mother****ing AMD heatsink again, due to the extreme frustrations in the past of having to use a screwdriver to slip it on. It was inferior enough that I slipped and jammed the screwdriver tip into the motherboard (I had to use that much force). Thankfully it wasn't broken.

The heatsinks that come with 64-bit AMD chips are as easy to install as turning the oven to the right temperature.

I went for the oven as an example because difficulty varies between people. It ranges from easy to downright ludicrous, to the point where you could fit 10 on in 45 seconds.
 
DEATH eVADER said:
The heatsinks that come with 64-bit AMD chips are as easy to install as turning the oven to the right temperature.

I went for the oven as an example because difficulty varies between people. It ranges from easy to downright ludicrous, to the point where you could fit 10 on in 45 seconds.

The AMDs I'm talking about are quite old. An AMD 1.2GHz, I think, from quite a ways back. I can't remember; it was one of my older computers. They were a ****ing BITCH to install the heatsink on.
 
Maybe yours was broken/bent somewhere. My AMD 3700 that I recently bought was a pain to install the heatsink on because some metal part was bent...
 
Idonotbelonghere said:
Maybe yours was broken/bent somewhere. My AMD 3700 that I recently bought was a pain to install the heatsink on because some metal part was bent...

It required the force of a thousand Short Recoils to install. Thankfully I have those in droves.
 
dekstar said:
I'm not

*subliminal message: Buy AMD!*



*runs*

Hm... there's something suspicious here. I think it's the way he's running.
 
Play UT2004 and check out a Gears of War FAQ or Interview.

Very different games.
 
About RoboHordes: it's not an actual game; IIRC it's simply a sort of advertisement game for Intel... (Buy Intel and play the first UE3 "game"!)

The first real one will probably be UT2k7.

You should expect a benchmarking tool to come out before that though, so you can tweak your computer accordingly :)
 
Intel actually just 1up'd AMD...

The Pentium Extreme Edition marks real progress for Intel on multiple fronts. It is the fastest all-around desktop CPU that Intel has ever produced, and thanks to its faster bus, larger cache, and higher clock speeds, the Extreme Edition 955 consistently outruns the older Extreme Edition 840. These features, combined with NVIDIA's multithreaded graphics drivers, even make the Extreme Edition 955 a reasonably solid choice for 3D gaming—faster than the P4 Extreme Edition 3.73GHz, believe it or not. At the same time, the Extreme Edition 955 consumes less power at peak than the Extreme Edition 840, proving that Intel's 65nm fabrication process can deliver the tangible benefits that we've come to expect from a die shrink. That's comforting news after our faith was shaken by the Pentium 4's power and heat problems at 90nm. Not only that, but there's apparently quite a bit of clock frequency headroom left in this 3.46GHz processor. Ours ran stable for hours at 4.26GHz with nary a hiccup.

http://techreport.com/reviews/2006q1/fx60-vs-955xe/index.x?pg=1

THAT SAID! I'm still the proud owner of a dual core AMD chip :)
 
babyheadcrab said:
Intel actually just 1up'd AMD...



http://techreport.com/reviews/2006q1/fx60-vs-955xe/index.x?pg=1

THAT SAID! I'm still the proud owner of a dual core AMD chip :)

techreport said:
Our test results make it clear, however, that Intel probably won't be able to catch up with AMD using processors based on the Netburst microarchitecture; it will have to wait for its new microarchitecture for that. Despite being produced at 90nm and having a much lower clock speed, the Athlon 64 FX-60 nearly ran the tables in our array of benchmarks, and it did so while consuming less power—both at idle and under load—than the Pentium Extreme Edition 955. The FX-60's performance dominance wasn't always deep, but it was very wide, with the top spot in only a few tests going to an Intel processor.

I think you need to read reviews more carefully. Intel makes solid chips, and Tech Report praises this new one as their best chip ever, but it _still_ underperforms compared to AMD's top chips in most situations.

Competition is good, and I am happy to see that both Intel and AMD are producing worthy CPUs. Intel leads in market share, which is why I'm particularly happy about AMD's lead in performance/price. I remember the "good olde days" before the AMD K6 units started to do good things for AMD, and believe me, we don't want to go back to one-company dominance. Intel's progress and pricing were both atrocious. Progress skyrocketed and prices plummeted after Intel realized that AMD was actually getting a foothold in the industry.

.bog.
 
One thing that has always pissed me off about AMD... is their names. Why the hell can't they just be specifically clear about what gigahertz their processors are, instead of having these funky names like 2800 and stuff, y'know? Because with AMD, as far as I know, a 2800 is not 2.8GHz.
 
Raziaar said:
One thing that has always pissed me off about AMD... is their names. Why the hell can't they just be specifically clear about what gigahertz their processors are, instead of having these funky names like 2800 and stuff, y'know? Because with AMD, as far as I know, a 2800 is not 2.8GHz.

That's true, but it is equally true that gigahertz does not tell you how fast a CPU is anymore. Actually, AMD has been pretty spot-on with their PR rating, which is supposed to illustrate approximately how fast their CPUs are compared to other CPUs with less efficient clock cycles but more of them (either older Athlons or current Intels, depending on who you ask). Intel has also started to use non-gigahertz naming for their CPUs.

.bog.
 
As boglito said.

AMD doesn't state the GHz because, on its own, it's a useless measurement.
My AMD Athlon 3500+ runs at 2.2GHz. It can beat Intel processors that run at 3.5GHz.
Why?
AMDs do more per clock cycle, but run at a slower clock.
Intels do less per cycle, but run at a faster clock.
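
(To put rough numbers on that: effective throughput is approximately clock speed times instructions per cycle, or IPC. Here's a tiny C sketch with invented IPC figures; the 3.0 and 1.8 below are illustrative assumptions, not measured values.)

```c
/* A back-of-the-envelope sketch of why raw GHz doesn't rank CPUs:
 * rough throughput ~ clock speed x instructions per cycle (IPC).
 * The IPC figures below are invented for illustration only. */
#include <stdio.h>

int main(void) {
    const double amd_ghz = 2.2, amd_ipc = 3.0;  /* hypothetical Athlon 64 */
    const double p4_ghz  = 3.4, p4_ipc  = 1.8;  /* hypothetical Pentium 4 */

    printf("AMD:   %.1f GHz x %.1f IPC = %.2f billion instr/s\n",
           amd_ghz, amd_ipc, amd_ghz * amd_ipc);  /* 6.60 */
    printf("Intel: %.1f GHz x %.1f IPC = %.2f billion instr/s\n",
           p4_ghz, p4_ipc, p4_ghz * p4_ipc);      /* 6.12 */
    /* The lower-clocked chip comes out ahead, which is roughly what
     * AMD's PR rating (e.g. "3500+") was meant to communicate. */
    return 0;
}
```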
 
Well, when you compare the EE you might as well bring in the FX-60. Any of those high-end CPUs will cost you more than is totally necessary. Just grab a dual-core 3800+, spend the rest on a good video card, and you'll be set forever.
 
Well, I didn't know any of this, okay? For the general consumer, and for someone like myself who buys parts to build their own computer, megahertz has always been the defining factor for processors for as long as I can remember. It's always megahertz this, gigahertz that.
 
Shouldn't this hardware talk go in the Hardware & Software board?

More talk about the games and less about processors.
 
satch919 said:
Shouldn't this hardware talk go in the Hardware & Software board?

More talk about the games and less about processors.

Too bad there is absolutely nothing to talk about yet, though.

"prerendered trailers look good..."

.bog.
 
boglito said:
Too bad there is absolutely nothing to talk about yet, though.

"prerendered trailers look good..."

.bog.

Everything that's been shown of Gears of War and UT2007 has been real-time.
 
Iced_Eagle said:
All UE3 stuff is real-time man.

Unless Epic has been lying to us this whole time.

http://www.computerandvideogames.co...com/interviews/interviews_story.php?id=134530
New Mark Rein interview.

satch919 said:
Everything that's been shown of Gears of War and UT2007 has been real-time.

The UT2k7 videos are just basic enough that I might believe that. GoW and various other videos (the Huxley trailers spring to mind) are mostly prerendered. There is nothing anyone can say to prove that wrong until a playable demo/game is available, so don't bother.

Basically, if you take 10 videos of Unreal tech and pick the two "worst" ones, those are the two that are not prerendered.

.bog.
 
boglito said:
The UT2k7 videos are just basic enough that I might believe that. GoW and various other videos (the Huxley trailers spring to mind) are mostly prerendered. There is nothing anyone can say to prove that wrong until a playable demo/game is available, so don't bother.

Basically, if you take 10 videos of Unreal tech and pick the two "worst" ones, those are the two that are not prerendered.

.bog.
Have you not seen the old Huxley walkthrough video? The one where the car comes skidding in and the soldiers jump out. There's no doubt that's in-game. I think it looks better than the latest trailer.
 
boglito said:
The UT2k7 videos are just basic enough that I might believe that. GoW and various other videos (the Huxley trailers spring to mind) are mostly prerendered. There is nothing anyone can say to prove that wrong until a playable demo/game is available, so don't bother.

Basically, if you take 10 videos of Unreal tech and pick the two "worst" ones, those are the two that are not prerendered.

.bog.

Here's a Gears of War video from the Tokyo Game Show that shows Mark Rein playing the game.

http://www.xboxyde.com/leech_1682_en.html

Here's a pic from the video:

[image: 786_0010.jpg]
 