REAL Unreal Engine 3 Pic

reap said:
I can't wait to get a GeForce 6800! But it's too expensive for me ;(
Do you know if the 6800 will run HL2 with the highest details?
In a word, YES! :)
 
MaxiKana said:
No simulators are photorealistic; even now, simulators have like 3-4 computers running them (for the small ones).

The NASA simulations will have very accurate physics (game physics has barely any relation to real life), but their graphics are about on par with MS Flight Simulator.
 
Those graphics are amazing. Some of the effects they are achieving now, and will be achieving soon, are unbelievable. I'm sorry, but from what I've seen, if you think HL2 or Doom 3 looks better, then you need glasses.
I would much rather have HL2 than any Unreal game, hands down. But if you're going to compare them, UE3 wins.
And who says Epic is behind when it comes to graphics? Who's ahead of them? Their engines power some of the most beautiful games around. They may not be on par with Far Cry right now (graphics-wise), but the Unreal engine has been out for a few years as well... Crytek's engine is pretty new.
As far as simulations go, real-time sims are pretty rugged looking, mainly because physics processing and other math are more important in those cases than looks. Pre-rendered sims can look however they want, though.
I've been at work since 11:30 last night, it's 12 noon now, I'm going to be..
 
mrchimp said:
How would you know which rays are hitting the object... you can't make the object give out rays, because they have to pass through it.
You could only send out rays through every pixel on the object that can be seen from the point of the light (as if you were looking at the object from that position)... or the simpler way to do it would be just to send out a cone of rays toward the origin of the object and kill off any rays that never hit a reflective/refractive surface (so they don't screw up the lighting of the rest of the scene).
 
Good point, but the performance loss of rendering a scene (not a full scene, obviously) for every light might be a little too high.

Firing a cone of light, or rather a cylinder of light, would work (with a simple algorithm to work out how big it should be), but even tracing a couple of hundred rays would be slow.

Faking it is the best way to go, in my opinion.
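
Just to illustrate the cone-of-rays idea we're kicking around, here's a rough Python sketch of it. Everything in it (the scene object, its intersect method, the reflective/refractive flags) is hypothetical, just to show the technique, not code from any real engine:

    import math
    import random

    def normalize(v):
        length = math.sqrt(sum(c * c for c in v))
        return tuple(c / length for c in v)

    def cross(a, b):
        return (a[1] * b[2] - a[2] * b[1],
                a[2] * b[0] - a[0] * b[2],
                a[0] * b[1] - a[1] * b[0])

    def sample_cone(axis, half_angle_rad, count):
        # Build an orthonormal basis (u, v, axis) and scatter directions inside the cone.
        helper = (1.0, 0.0, 0.0) if abs(axis[0]) < 0.9 else (0.0, 1.0, 0.0)
        u = normalize(cross(helper, axis))
        v = cross(axis, u)
        dirs = []
        for _ in range(count):
            theta = random.uniform(0.0, half_angle_rad)   # angle away from the cone axis
            phi = random.uniform(0.0, 2.0 * math.pi)      # angle around the cone axis
            st, ct = math.sin(theta), math.cos(theta)
            dirs.append(tuple(ct * w + st * (math.cos(phi) * x + math.sin(phi) * y)
                              for w, x, y in zip(axis, u, v)))
        return dirs

    def trace_light_cone(light_pos, target_pos, scene, rays=200, half_angle_deg=5.0):
        # Fire a cone of rays from the light toward the target object and keep only
        # the rays that hit a reflective/refractive surface; the rest are killed off
        # so they can't screw up the lighting of the rest of the scene.
        axis = normalize(tuple(t - l for t, l in zip(target_pos, light_pos)))
        surviving_hits = []
        for direction in sample_cone(axis, math.radians(half_angle_deg), rays):
            hit = scene.intersect(light_pos, direction)   # assumed: returns a hit record or None
            if hit is not None and (hit.reflective or hit.refractive):
                surviving_hits.append(hit)
        return surviving_hits

Even with only a couple of hundred rays per light, that's a lot of intersection tests every frame, which is why faking it still wins.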
 
This reminds me of when everyone saw the Unreal 2 engine for the first time; they said, "OMG! This owns everything else on the market!" Well, a fat lot of good its engine did that game; if you ask most people, they'll go, "What? There was an Unreal 2?!"
 
Like I said before, I'm more excited about how the developers will utilize the Source and Doom III engines in their creations.
 
mortiz said:
This reminds me of when everyone saw the Unreal 2 engine for the first time; they said, "OMG! This owns everything else on the market!" Well, a fat lot of good its engine did that game; if you ask most people, they'll go, "What? There was an Unreal 2?!"

Just imagine how bad it would have been without the engine... at times the gameplay was just plain annoying (and the cutscenes, urrrggggg); it was only saved by the graphics.
 
Epic's goal for Unreal Tournament is to move it in a more sci-fi direction, with less of the WWF "**** you!" attitude. They are planning Unreal Tournament 2005 and 2006; those would probably use the Unreal 3.0 Engine, which also means they've run out of ideas for names. But given the success of Unreal Tournament 2004, 2005 and 2006 may be twice as good. :D
 
Check this out everybody: (VERY IMPORTANT)

The guys over at 3Dchips-fr.com have a demo of a new 3D technique called parallax mapping (also called offset mapping; think of it as a kind of super bump mapping) that was used in the Unreal Engine 3 tech demo!
Note: to run the demo you're going to need a DirectX 9 graphics card with Pixel Shader 2.0.
Source: warp2search.

What makes the textures shown by Unreal Engine 3 so beautiful:
http://translate.google.com/transla...&hl=en&ie=UTF-8&oe=UTF-8&prev=/language_tools

Parallax mapping: http://www.3dchips-fr.com/download/sendfile.php?DownloadID=1825
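
For anyone wondering what parallax/offset mapping actually does: the core of it is just a little per-pixel math that nudges the texture coordinates along the view direction based on a height map, so a flat surface looks like it has depth. Here's a rough sketch of that math in plain Python rather than shader code (the scale/bias values and the height_map function are made-up placeholders, not taken from the demo):

    def parallax_offset_uv(uv, view_dir_tangent, height_map, scale=0.04, bias=-0.02):
        # uv               -- (u, v) texture coordinates for the pixel
        # view_dir_tangent -- normalized view vector in tangent space, (x, y, z)
        # height_map(u, v) -- returns a height value in [0, 1] for that texel (placeholder)
        h = height_map(*uv)            # sample the height (bump) map
        offset = h * scale + bias      # remap the height into a small offset range
        # Shift the UVs toward the eye; "taller" texels appear to stick out more.
        return (uv[0] + view_dir_tangent[0] * offset,
                uv[1] + view_dir_tangent[1] * offset)

The colour and normal maps are then sampled at the shifted coordinates instead of the original ones, which is where the fake depth comes from. In the real thing this runs in a pixel shader, which is why the demo needs a PS 2.0 card.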
 
I've seen both those demos before; however, the screenshot next to the download appears to be from one I haven't, although it looks similar to one I have...

EDIT: when in the uber bump map demo, hold down Ctrl to make the light follow you :afro:
 
G0rgon, can you find any HTTP links for that parallax demo? For some reason FTPs using port 21 are blocked by my ISP now :( (I think 'cause I got caught port-scanning for pubs, *oops!*).
 
NSPIRE said:
G0rgon, can you find any HTTP links for that parallax demo? For some reason FTPs using port 21 are blocked by my ISP now :( (I think 'cause I got caught port-scanning for pubs, *oops!*).

Not a chance. I searched for parallax.zip files on google.com and it gave me a lot of files, but none with the same size (3.5 MB).

:(
 
Hey guys, if the new Radeon runs HL2 better, but the new Nvidia runs Unreal, Splinter Cell, Far Cry, and Doom 3 better... which would you rather get?
 
Sai said:
Hey guys, if the new Radeon runs HL2 better, but the new Nvidia runs Unreal, Splinter Cell, Far Cry, and Doom 3 better... which would you rather get?

I would get someone to pinch me, because I would likely be sleeping.

I would probably get the ATI card... (Before the children start screaming: no, I'm not a fanboy. I have had decent cards from both companies, and I own both the Radeon 9800 Pro and the GeForce FX 5800 Ultra.)

I just kind of expect HL2 to carry on the legacy of HL1 (both as a single-player experience and in terms of being the most modded/played game out there).

If someone had said, "Which card would you rather have: one that runs HL perfectly, a game you'll still be playing 5 years in the future, or one that plays other games well, games that are pretty forgettable?", I would obviously choose the card that gave my gaming longevity.

A good comparison would be HL1 vs. Unreal Tourney 2k3, Quake 3, etc.
Yes, I liked the games and spinoffs of the other engines, but I'm still here playing HL1-engine games 5 years later. :)

Yes, Doom 3 looks to be a great game; Splinter Cell doesn't interest me; and I own Far Cry and love it (it runs perfectly on my Radeon 9800 Pro). As for Unreal (I'm assuming you mean Unreal 3), it doesn't look THAT impressive, and buying a card with Unreal 3 in mind right now would be ridiculous, since we won't see it for like another 4 years. (By then we'd be comparing HL3 to it, lol.)

But still, I would have to go with HL2 (which would be ATI in your argument).
 
Sai said:
Hey guys, if the new Radeon runs HL2 better, but the new Nvidia runs Unreal, Splinter Cell, Far Cry, and Doom 3 better... which would you rather get?

Of course ATI. But it won't support UE3.0 fully, because the ATI R420 core does not support Shader Model 3.0; the next core, which will be released next June, will hopefully have PS 3.0.
 
Well, it's just that in every other PC game out there the intro movie is an Nvidia one, and if HL2 is the only game a Radeon runs better with... I dunno, HL2 is only one game, and Onslaught in Unreal 2004 is just indescribably fun. Then you add in the atmosphere of D3 and the very well-thought-out MP of SC:PT. If you haven't guessed, I'm gonna go for the Nvidia.
 
Sai said:
Well, it's just that in every other PC game out there the intro movie is an Nvidia one, and if HL2 is the only game a Radeon runs better with... I dunno, HL2 is only one game, and Onslaught in Unreal 2004 is just indescribably fun. Then you add in the atmosphere of D3 and the very well-thought-out MP of SC:PT. If you haven't guessed, I'm gonna go for the Nvidia.

You see, this is the result of the marketing campaigns by both nVidia and ATi. None of the games you've listed is optimized more for one of the two cards (with the possible exception of Doom 3, but we've yet to see)... it's just the companies attempting to tell you that they will be.
 
Sai said:
Well, it's just that in every other PC game out there the intro movie is an Nvidia one, and if HL2 is the only game a Radeon runs better with... I dunno, HL2 is only one game, and Onslaught in Unreal 2004 is just indescribably fun. Then you add in the atmosphere of D3 and the very well-thought-out MP of SC:PT. If you haven't guessed, I'm gonna go for the Nvidia.

Read some reviews of the cards. Don't make your decision based on how many games have an ATI/Nvidia logo on them.

www.anandtech.com and www.tomshardware.com are good places to start.

Also, the forums at www.overclockers.com are a great wealth of knowledge on people's experiences with current cards.

To answer your question, though: ATI and Nvidia both make very good cards. With the exception of the 6800, though, I would say ATI has a slight edge as far as image quality and performance go in today's games. <-- (Based on my own experience as well as reviews I've read on the sites I've posted.)

In conclusion, just because a game has a certain logo on it doesn't mean the game is optimized for that card. It just means they paid a lot of money to get their logo on the game.
 
There are many "the way it's meant to be played" games that run better on ATi cards.
 
If you think about it, it's in the developers' best interest NOT to favor one card over the other. If someone makes a game that runs great on ATI hardware and utter crap on Nvidia, then the only people who will buy that game, and the next one, and the next one in that series will be people who own ATI cards. The developer would therefore be cutting out a huge fan base, basically shooting themselves in the foot.
 
Mac said:
Read some reviews of the cards. Don't make your decision based on how many games have an ATI/Nvidia logo on them.

www.anandtech.com and www.tomshardware.com are good places to start.

Also, the forums at www.overclockers.com are a great wealth of knowledge on people's experiences with current cards.

To answer your question, though: ATI and Nvidia both make very good cards. With the exception of the 6800, though, I would say ATI has a slight edge as far as image quality and performance go in today's games. <-- (Based on my own experience as well as reviews I've read on the sites I've posted.)

In conclusion, just because a game has a certain logo on it doesn't mean the game is optimized for that card. It just means they paid a lot of money to get their logo on the game.

tomshardware is dodgy...

www.Extremetech.com - reasonably reliable
www.beyond3d.com - the most reliable, best forums
www.theinquirer.net - Good for a laugh, biggest rumour mongers on the planet
www.hardocp.com - reliable reviews but post rumours as well
www.xbitlabs.com - complete bastards, they benchmarked the HL2 alpha 3 times, but still reliable.

There are many more, but I lost all my bookmarks :( Still, those sites (including Anand) should give you a good perspective on the current situation; just don't put any faith in the Inq.
 