So, what do you guys really think happened?

cadaveca said:
what difference the 939 made weighed in on their decision to/not to develop 64-bit compatibility. Obviously you aren't into hardware.
I love the way you say that like it's so obvious! I must have missed the meeting where Gabe told you all this, because otherwise you'd have to be making it all up as you go along!

Well, what *is* Valve's decision on 64-bit support? Because I have no idea at the moment. Socket 939 makes no difference - what Valve will be looking at is the take-up rate of 64-bit systems, the success of the 64-bit drivers, and the costs/benefits of any optimisations. I still don't understand what this has to do with the delays of the game, though. They aren't going to delay it for 64-bit support.

Edit: Removed unnecessary insults. Left the necessary ones.
 
I'm with Koopa, and I'm into hardware too, but what you are saying hasn't made the slightest bit of sense to me, cadaveca...

edit: to a couple of your earlier points...

AUGUST 2003:
DirectX9 announced

DirectX9 was released December 19th, 2002...

It is also ATI's version of DirectX9 that Microsoft opts to go with, based on the number of physical registers (8 for ATI, 5 for Nvidia) that ATI uses, offering a more streamlined texture path.

What do you mean by that? You're making it sound like ATI and Nvidia designed their own versions of DX, and Microsoft used the best one. Microsoft designed DX9 entirely; it's up to the gfx chip makers to implement its features in their hardware designs.
 
Valve did NOT decide to support 64-bit right away. But recently, like in one of the interviews since gold, they said there is a 64-bit version in the works.


Like I said in that last post, 939 brings PCI-E to AMD. PCI-E uses a different bus than AGP, and current driver sets are AGP-directed. This is in part the reason you do not get more performance on PCI-E over AGP, but, like when the ATI 256MB cards got no better than ATI 128MB cards of the same flavor, once the driver is there to take advantage of the hardware, things will speed up.

SO, why did it have an impact? Well, PCI-E. It (PCI-E) is the new standard, and PCI is on its way out. Although it may be a year yet, like a lot of people have said before, when you release a game, it should work on about 99% of hardware. PCI-E was announced in April... no hardware to test on... and the goal date for completion was probably set for August/September.

What would you do, if you knew new technology was coming out, and your product would, quite possibly, not work? You prepare for it. I'm pretty sure that at the beginning of March, they were still working on DX9 compatibility, and when the new stuff was announced, they said "what the hell, a few more months, a couple of rewrites, and we have everything covered."

Remember that Valve are a business, and making money is one of the most important goals, if not THE goal. How do you get the most money? You produce the best possible game you can. And I think they have.

NB. said:
What do you mean by that? You're making it sound like ATI and Nvidia designed their own versions of DX, and Microsoft used the best one. Microsoft designed DX9 entirely; it's up to the gfx chip makers to implement its features in their hardware designs.


Oh koopa/NB., I'm sorry. DX9b. 9b. And Microsoft outsourced 9b. You're not into hardware if you don't know that.
 
cadaveca said:
Valve did NOT decide to support 64-bit right away. But recently, like in one of the interviews since gold, they said there is a 64-bit version in the works.
Well, everybody says that (because it costs nothing to say). There haven't been many real 64-bit games yet, though. It shouldn't really cost them much to make the 64-bit version, since it's just different compiler settings.

cadaveca said:
What would you do, if you knew new technology was coming out, and your product would, quite possibly, not work?
Why would it not work? They will leave drivers/support etc. to Nvidia/ATI/the mobo people. There's nothing in the code that precludes using PCI-E or any other technology. If it doesn't work with someone's new card, it's the graphics card manufacturer's fault, not Valve's.

cadaveca said:
You prepare for it. I'm pretty sure that at the beginning of March, they were still working on DX9 compatibility, and when the new stuff was announced, they said "what the hell, a few more months, a couple of rewrites, and we have everything covered."
I doubt your timescales - DX9 was 2001/2002 time, so for them to not know about it in March 2003 would be pretty surprising.
 
Read my edit up a post, koopa.

PCI-E requires a different driver than AGP. Into hardware, my ass you are. Uh, it doesn't work on NEW hardware! Glad you don't control any of the companies I deal with... you'd cost me money.


Going 64 is more than just compiling...lmao.
 
cadaveca said:
Oh koopa/NB., I'm sorry. DX9b. 9b. And Microsoft outsourced 9b. You're not into hardware if you don't know that.

What, so you're trying to tell me that development of DirectX9b was outsourced to ATI and Nvidia?
 
The more I think about it, I believe ATI had a large part to play in the cover-up. Because of my gullible nature I bought an ATI 9600 Pro in anticipation of HL2 last year. Maybe it wasn't completely my fault. I'm still kinda pissed about it. They suckered me out of $200. I think Lombardi was a full and willing partner. I like their products, but Lombardi... well, he's a liar. He doesn't care about fans, just the bottom line.
 
NB. said:
What, so you're trying to tell me that development of DirectX9b was outsourced to ATI and Nvidia?

Uh, yes. And Microsoft chose ATI's version. Do some research before you try to knock my FACTS. This was covered @ ShaderDay... google that. DX9b requires that the GPU have 8 physical registers to deal with commands, and at the time of ShaderDay, Nvidia's only had 5. Valve basically said at that point that the only reason Nvidia's cards were so poor-performing was because of a driver, and left it up to Nvidia to remedy the problem. In the end, Nvidia's cards get DX9b code sent to the card and answer the commands in reverse from how ATI does it. They also break up a few commands. It was the extra processing that led to Nvidia's poor performance, and the eventual switch to partial DX9, partial DX8.1 for Nvidia's cards. You'll notice that Nvidia's packaging for pre-6800 cards also says DX9-compatible hardware, not DX9 hardware.
 
cadaveca said:
Read my edit up a post, koopa.

PCI-E requires a different driver than AGP. Into hardware, my ass you are. Uh, it doesn't work on NEW hardware! Glad you don't control any of the companies I deal with... you'd cost me money.


Going 64 is more than just compiling...lmao.

You're not making any sense. Koopa has quite rightly said that the drivers are entirely ATI's business. Explain what exactly Valve has to do with any of this, and why any issues relating to ATI's drivers would impact them?
 
cadaveca said:
Uh, yes. And Microsoft chose ATI's version. Do some research before you try to knock my FACTS.

Show me the evidence that Microsoft outsourced the development of DX9b to ATI and Nvidia...
 
Read my edit up. I won't link it, because there was a very long thread, which Mr. Freeman and I were talking about earlier in this thread, with all the links you want. Maybe I'll check my CP here and link THAT thread to you. Notice that now Mr. Freeman agrees with me. NOW. Not then, but NOW.
 
cadaveca said:
PCI-E requires a different driver than AGP. Into hardware, my ass you are. Uh, it doesn't work on NEW hardware! Glad you don't control any of the companies I deal with... you'd cost me money. Going 64 is more than just compiling... lmao.
Yes. With PCI-E, Valve has to use PCI-POWERX, the new replacement for DirectX 9. Feel the AWESOME power of PCI-POWERX!!! If only there was some kind of hardware abstraction layer that game companies could rely on! It would make these little details so much easier, don't you think?

Would you like to tell me about 'going 64', since you know so much about it. I'm looking forward to this response!

BTW, I don't think you deal with any companies, because I think you're about 14. No offence to the rest of the 14-year-olds who don't just make stuff up and then furiously resort to backpedal/insult mode when questioned. You guys are ok.
 
back ontopic:

IMO this is an unneeded thread, since we'll know the whole truth when the Raising the Bar book is released. Just wait a month, avoid repeating discussions while we don't have new info, and save your time.
 
koopa said:
Yes. With PCI-E, Valve has to use PCI-POWERX, the new replacement for DirectX 9. Feel the AWESOME power of PCI-POWERX!!! If only there was some kind of hardware abstraction layer that game companies could rely on! It would make these little details so much easier, don't you think?

Would you like to tell me about 'going 64', since you know so much about it. I'm looking forward to this response!

BTW, I don't think you deal with any companies, because I think you're about 14. No offence to the rest of the 14-year-olds who don't just make stuff up and then furiously resort to backpedal/insult mode when questioned. You guys are ok.

Okay, oh great and mighty one, explain how easy it is when most devices are 32-bit, but the OS and code are 64-bit?

Most PCI and AGP devices are only 32-bit, so they can't handle 64-bit addresses. AMD's 64-bit architecture has a section known as an IOMMU, which maps memory addresses for 32-bit devices; Intel's EM64T lacks one, so PCI and AGP cards may have difficulty talking to memory. To work around this issue in EM64T, the OS has to reserve a section of memory that's within the 32-bit address space that the PCI/AGP device can talk to. Once the device has done its transfer (eg, a disk read), the processor has to move the data from that transfer buffer into the bit of memory where it's actually needed.

And it's dealing with this issue that is the problem, at least I believe. Like I said, going 64 is more than just compiling.

Here's a link on the problem:
http://lwn.net/Articles/90958/
 
cadaveca said:
Okay, oh great and mighty one, explain how easy it is when most devices are 32-bit, but the OS and code are 64-bit?

Most PCI and AGP devices are only 32-bit, so they can't handle 64-bit addresses. AMD's 64-bit architecture has a section known as an IOMMU, which maps memory addresses for 32-bit devices; Intel's EM64T lacks one, so PCI and AGP cards may have difficulty talking to memory. To work around this issue in EM64T, the OS has to reserve a section of memory that's within the 32-bit address space that the PCI/AGP device can talk to. Once the device has done its transfer (eg, a disk read), the processor has to move the data from that transfer buffer into the bit of memory where it's actually needed.

And it's dealing with this issue that is the problem, at least I believe. Like I said, going 64 is more than just compiling.

Way to copy and paste, my friend :D There's nothing like using your own words to prove a point.

http://www.aoaforums.com/forum/archive/index.php/t-27313.html

7th post down for everyone else...
 
cadaveca said:
Actually, those aren't my words; they are the words of one with far greater knowledge than myself. Notice how I have posted there as well.

You don't say? You're still happy to pass them off as your own, though.
 
cadaveca said:
To work around this issue in EM64T, the OS has to reserve a section of memory that's within the 32-bit address space that the PCI/AGP device can talk to. Once the device has done its transfer (eg, a disk read), the processor has to move the data from that transfer buffer into the bit of memory where it's actually needed.
Key words in this: the OS. This doesn't have anything to do with producing a 64-bit executable; it's an OS issue.
 
NB. said:
You don't say? You're still happy to pass them off as your own, though.

Can't beat the real answer. Like most PR people, answers become standardized...because they are right.

Oh, I know nothing... geez...
 
koopa said:
Key words in this: the OS. This doesn't have anything to do with producing a 64-bit executable; it's an OS issue.


But that's why I think they didn't go 64... in EM64T the OS has to do it... what about Intel's 64-bit solution? Why is Longhorn not out yet for purchase? There has to be a standard... or Valve needs to write 2 programs... like they were almost faced with in the ATI/Nvidia battle... optimize for ATI, who uses more physical registers, or Nvidia?
 
I am not disputing the validity of that post; I'm simply pointing out that you pasted it here as your own words without any mention of the fact that you didn't write any of it. Nicely done :D
 
cadaveca said:
But that's why I think they didn't go 64... in EM64T the OS has to do it... what about Intel's 64-bit solution?

EM64T is Intel's 64-bit solution....
 
cadaveca said:
But that's why I think they didn't go 64... in EM64T the OS has to do it... what about Intel's 64-bit solution?
It doesn't matter about the 64-bit implementation - it's not something that Valve has to worry about. HL2 is still C++, so which 64-bit opcodes to use is all down to the compiler. Of course, it might not run very efficiently under 64-bit mode - we don't know how optimised the compiler/drivers are - but for the most part it shouldn't be tricky for Valve to find that out. (It also depends on whether HL2 uses any 32-bit-only third-party libraries, but it's hard for us to know about that.)
 
If the code hadn't been stolen, we would probably have had the game around May, I think.
 
NB. said:
I am not disputing the validity of that post; I'm simply pointing out that you pasted it here as your own words without any mention of the fact that you didn't write any of it. Nicely done :D

Good to see you're actually doing your own research, which is why I post in the first place. Not much else to do until HL2 comes out.

You missed my point about INTEL... I was merely showing that it referred to Intel, not to both Intel and AMD.
 
koopa said:
It doesn't matter about the 64-bit implementation - it's not something that Valve has to worry about. HL2 is still C++, so which 64-bit opcodes to use is all down to the compiler. Of course, it might not run very efficiently under 64-bit mode - we don't know how optimised the compiler/drivers are - but for the most part it shouldn't be tricky for Valve to find that out. (It also depends on whether HL2 uses any 32-bit-only third-party libraries, but it's hard for us to know about that.)


Well, you're the expert, why don't you explain? Oh, you did... it won't work efficiently if just compiled. Geez... you blew away your own point.
 
cadaveca said:
http://www.microsoft.com/presspass/press/2003/sep03/09-10HalfLifePR.asp

http://msdn.microsoft.com/library/default.asp?url=/library/en-us/dndxgen/html/directx9devfaq.asp


For info on DX9. Useless, but you'll see the linkage to Valve with DX9 in the first link. This was right around ShaderDay.

Microsoft outsourced SP2, and used ATI's submission for the DX path for SP2. This led to DX9b.

Where does it say anything about Microsoft outsourcing DX9b on either of those pages?

As far as I can see, the second page doesn't mention ATI once...
 
NB. said:
Where does it say anything about Microsoft outsourcing DX9b on either of those pages?

As far as I can see, the second page doesn't mention ATI once...
It mentions HL2, and Valve. Seeing how ATI is partnered with Valve for HL2, you would assume they are included.

I said it was useless, but it does have the September date for an update. Do some checking... it's been so long (a year) the digging will have to be deep...
 
cadaveca said:
Well, you're the expert, why don't you explain? Oh, you did... it won't work efficiently if just compiled. Geez... you blew away your own point.
For goodness sake, it's like arguing with a donkey. At least you can sort of drag a donkey in the correct direction with enough brute force. Every time someone presents a fact you run off to google and dredge up even more irrelevant technical info in a vain attempt to make yourself look smart.

If it doesn't run efficiently, it's because either a) the drivers suck b) the compiler sucks. There's nothing Valve can do about either of those two things apart from yell at the appropriate people because Valve develop COMPUTER GAMES and not GRAPHICS DRIVERS. At the moment, they're probably just waiting for everything to become mature. But swerving back on topic, none of these things have anything to do with delays!

Last post from me on the subject because I'm starting to enjoy flaming you and that's not good :)
 
They do have to do with delays, if they wanted to go 64 but were unable to because of the things that you yourself mentioned.

here's one for ya:


FiringSquad: ATI beat NVIDIA to market, and presumably developers, by at least 3 months. Do you think that developers starting development specifically with ATI hardware in mind contributed to NVIDIA’s woes?

Tim Little: The fact that ATI had hardware sooner than nVIDIA played into a number of things, not the least of which is how much influence they may have had on the DX9 specs. When you have hardware available for the API provider to work with during development, the API may just work better with your hardware. The opposite can also be true, if you have a full spec before you implement your hardware, your hardware will probably work better with it. I think nVIDIA fell in-between, they were both too late to strongly influence the spec, and not late enough to benefit from it. I believe that as nVIDIA's drivers improve in bridging the gap between API and hardware their performance will continue to improve. The other factor is that R300 and NV3x have different capabilities, and the fact that ATI was in developers' hands earlier did lead to developers using those capabilities, some of which are problematic to adapt to the NV3x cards. In short yes, nVIDIA was hurt by ATI beating them to market by such a large margin.
http://www.firingsquad.com/features/nvidia_editors_day/page8.asp
 
If you didn't understand what the last quote was saying...

DX9 was developed on ATI cards. Hence the GeForce FX having to resort to 8.1/9.x: when DX was developed, it was developed on cards with 8 physical registers, and when it hit a GeForce FX, with 5 registers, issues began. He then goes on to say that maybe Nvidia were playing the waiting game, and would develop hardware to meet the specifications of DX9.0b... and out came the 6800.

THERE'S YOUR PROOF NB/KOOPA.
 
KagePrototype said:
I say: it was unfinished.

Or they weren't happy with it and did loads of it again.

I remember Gamespot's upcoming "final hours" article on HL2 will have Gabe 'fessin' up about the delay. I think.

That would be cool. I mean, how do you say a game will come out on a day but it really comes out a year later? Not like I care; the more they work on it, the better. But it would be cool to know what really happened.

What I think is that they had the game done and everything. But they felt exactly what they felt when HL1 was being made. They finished it, but they felt it wasn't really that good and remade it. I think they upgraded the gfx (as we can see in the ss) and added a lot more "things" that made HL2 get a 97-98 rating.
 
I would just like to say, the 1-year-plus delay is NOT because of new hardware and graphics solutions (DX 9.0); if a game tried to keep up with absolutely every new development, it would need a new engine every 15 minutes, aka Duke Nukem Forever :D
 
It's good they delayed it a year plus. Otherwise it wouldn't be as good as it is now.. :D
 
NB. said:
Way to copy and paste, my friend :D There's nothing like using your own words to prove a point.

http://www.aoaforums.com/forum/archive/index.php/t-27313.html

7th post down for everyone else...

Some things can only be said so many times in just certain ways... if u can't accept that, nobody can help u.

As far as this 64-bit issue... some of u are discussing it as if it was the only issue at hand... truth is, ATI had a lot invested in HL2, and they wanted DX9 mode refined more than it was around the September 30th, 2003 time frame... and refining technology, as we all know, takes time.

If some of u are that interested in the delay... maybe it's best to go dig up that thread cadaveca started a few weeks ago on the delay.

Almost everyone in that thread was annoyed at the guy... but to his credit, he has provided logical reasons why HL2 was delayed.
Now whether u think it was those reasons or not, his explanations certainly do make sense.
 
Watch the E3 2003 videos, and watch the videos from recently. There are some nice changes in them if you watch for physics, fluid animations, cosmetics, and gameplay. For example, I've noticed that the zombie skins have changed a bit, as have the animations for City 17 folks. Or am I imagining this?

I've seen the alpha being played; it wasn't even close to what they had done at E3. Certainly wasn't the reason to delay the game.

Pure and simple, they weren't done. No harm in that.
 