Doom Three

I bought it today and saw that you have to have Windows 2000 or XP to even load it. WHY??? (Yes, I read the requirements before purchasing. But why the limitation?)

I don't want to leave my trusty Windows 98SE.... :(

Edit: Arshad, I agree, I think I'm more excited about the release of Half-Life 2. Doom 3 should be fun, though.
 
8 hours left.... arg...

Arshad, I didn't know you worked for ATI. That's cool! I have lots of ATI stock! You guys keep up the good work... It looks like Nvidia has surpassed you guys this time, though. Hope you have something else up your sleeves... ;)
 
I'm really liking the single player, though my overclocked card is having the same trouble Carmack hinted at. It runs rock solid in every other game I've tried (including Far Cry).
 
NetViper said:


Arshad, I didn't know you worked for ATI. That's cool! I have lots of ATI stock! You guys keep up the good work... It looks like Nvidia has surpassed you guys this time, though. Hope you have something else up your sleeves... ;)

Most importantly... can we get discounts from you on merchandise?
 
steveny said:
Most importantly... can we get discounts from you on merchandise?

Good question :)

My brother works for RIM (BlackBerry) and he can't get discounts :(
 

Arshad, I didn't know you worked for ATI. That's cool! I have lots of ATI stock! You guys keep up the good work... It looks like Nvidia has surpassed you guys this time, though. Hope you have something else up your sleeves...


Yeah, I'm the SW Architect for the Apple team at ATI. It totally rocks -- definitely my dream job! Hold onto that stock, BTW, lots of stuff up our sleeves! I can't believe those dumba$$es at Merrill did an across-the-board semiconductor downgrade -- they have no idea what's going on. Keep a close eye on our DTV/handheld business and GPU market share -- it's going to get real interesting ;)

Unfortunately for us, D3 uses different shader paths for different cards/vendors, and the way things are set up, they beat us pretty decisively in this game. Ah well, what can you do? It still plays quite well on the X800s, and do expect some further performance boosts down the road.


Most importantly... can we get discounts from you on merchandise?


Sure, not a problem. I can get up to 6 cards a year w/ employee discount and I'd be willing to set aside 4 for NSXprime members who are interested in any of our products. Just PM me...
 
I'm honestly not surprised your (ATI) cards are trailing a bit. It's been widely known that id was working more closely with nVidia than ATI on this project, and the new hotfix shows it.
It's alright, you'll spank 'em when HL2 comes out, right?

Arshad - Are you running a PCI-E board or a regular AGP one? With all this hype about making the switch, is it really a big deal?
 
Arshad: So I guess now's not a good time to start a good old-fashioned NVIDIA vs ATI flame war, right? :D :D :D

From all my reading, nVidia is slightly ahead when it comes to Doom 3. But let's face it... when you're playing a game and the difference between an nVidia and ATI is 58 fps vs 55 fps, WHO CARES!?! :D


paladin: PCI-E has little to no performance gain. It's like comparing AGP 4x vs AGP 8x cards. It's only a better investment if you plan to upgrade in the future (AGP is expected to become obsolete within the next 18-24 months). But as soon as motherboards come out with dual 16x PCI-E slots... it will ROCK!!! It will be time to get me a couple of 6800 GTs and SLI them (to non-PC geeks, that means dual video cards ;) )
 
Neo: Hehe, I'm always up for a flame war :D

I'll have to disagree with you on the PCIE comments, although I know where you're coming from.

From the basic specs, PCIE x16 will have significantly higher throughput than AGP 8x (max 2.5 Gb/s per lane, though protocol overhead brings real-world throughput lower). However, the big difference is that PCIE is bidirectional and has the same link characteristics going both ways. On AGP, you can transfer from host->card very quickly, but the other direction is extremely slow. With PCIE, you can not only transfer in both directions at the same speed, but both can happen simultaneously!
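
To put rough numbers on that, here's a back-of-the-envelope sketch in C. These are just the public spec figures (2.5 Gb/s raw per first-gen PCIe lane with 8b/10b encoding, AGP 8x as a 32-bit bus at 66 MHz with 8 transfers per clock); real-world throughput is lower once you add packet overhead:

```c
/* Spec-sheet comparison: first-gen PCIe x16 vs AGP 8x.
 * PCIe: 2.5 Gb/s raw per lane, 8b/10b encoding leaves 80% usable.
 * AGP 8x: 32-bit bus at 66 MHz with 8 transfers per clock. */
#include <stdio.h>

int main(void) {
    double pcie_gbs = 2.5 * (8.0 / 10.0) * 16 / 8.0; /* GB/s, per direction */
    double agp_gbs  = 66e6 * 8 * 4 / 1e9;            /* GB/s, host->card only */

    printf("PCIe x16: %.1f GB/s each way, both directions at once\n", pcie_gbs);
    printf("AGP 8x:   %.1f GB/s downstream; readback is far slower\n", agp_gbs);
    return 0;
}
```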

The big benefit here is for things like video editing, where the GPU processes the image using shaders and then transfers the video frame back to the system for writing out to disk or whatever. All of a sudden it's possible to do this with HDTV frames in real time without breaking a sweat. There are other uses as well: many games now use render-to-texture heavily and often read the pixels back onto the host with glReadPixels() or whatever. Similarly, going forward to Longhorn, the OS can make use of the fast transfers for texture page-offs and so forth.
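
For anyone curious what that readback path looks like, here's a minimal sketch of the pattern in question. It assumes an OpenGL context is already current, and the 1920x1080 frame size is just an HDTV example, not anything from a real app:

```c
/* Sketch of the GPU->host readback pattern described above.
 * Assumes a current OpenGL context; the frame size is illustrative. */
#include <stdlib.h>
#include <GL/gl.h>

void grab_processed_frame(void)
{
    const int w = 1920, h = 1080;               /* one HDTV frame */
    unsigned char *pixels = malloc((size_t)w * h * 4);

    /* ... GPU processes the frame with shaders, renders it ... */

    /* The card->host copy: this is the direction AGP handles poorly,
     * while PCIe runs it at full link speed, concurrently with uploads. */
    glReadPixels(0, 0, w, h, GL_RGBA, GL_UNSIGNED_BYTE, pixels);

    /* ... hand pixels off to the encoder or write them to disk ... */
    free(pixels);
}
```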

Re: your comments about no significant delta between AGP 4x and AGP 8x: the real issue there was that most games/systems have been largely render bound, not bus bound. I.e., when you crank up the resolution and put in a fast CPU, the graphics card's fillrate is the limiting factor on how fast you can go. So it didn't really matter whether your bus was 1 GB/s or 2 GB/s, because the game wasn't sending data at even 1 GB/s.

However, as we move forward, pure fillrate is not going to be as important, as games move to more complex shaders and really start stressing the geometry pipe by sending up very highly detailed models. This will cause the bus to become a more significant factor, especially as games move from, e.g., 5k poly models for the characters to 60k or 100k per character, just because they can.
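
A quick, purely illustrative calculation of why that matters. All the scene numbers here are made up, and it assumes the geometry gets streamed over the bus every frame rather than sitting resident in video memory:

```c
/* Toy estimate: bus traffic if high-poly character geometry were
 * streamed every frame. All scene numbers here are made up. */
#include <stdio.h>

int main(void) {
    long verts_per_char = 100000; /* roughly the "100k per character" case */
    long bytes_per_vert = 32;     /* position + normal + UVs, a common layout */
    long characters     = 10;
    long fps            = 60;

    double gbs = (double)verts_per_char * bytes_per_vert * characters * fps / 1e9;
    printf("~%.1f GB/s of geometry traffic\n", gbs); /* ~1.9 GB/s: near AGP 8x's ceiling */
    return 0;
}
```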

As far as proliferation of PCIE goes, it's going to be a pretty huge switch. I think our forecast is that something like 50% of our volume shipping by year's end will be PCIE. The big OEMs are pushing PCIE heavily, and by the end of next year AGP is not really going to be a significant factor. It's not necessary to jump on the bandwagon right away, but it's going to be a fast transition. Both our and nVidia's roadmaps quickly phase out AGP chips going forward, although we'll both continue to support them via bridge chips.
 
steveny said:
So let's talk about going to hell.
I am up at 8:30am and ready to go pick up my copy of DOOM III, with visions of playing all day. Well, at 8:45 the phone rings; it is one of my tenants. He tells me the alarm is going off for the SEWER EJECTION PUMP! ERRRR!!!!! So I head over there instead of heading to Electronics Boutique. I get there and the tank in the basement is almost ready to overflow; the pump is shot. I call every plumber in the book and no one has any time to take care of it today. If I don't fix this, the sewage is going to flood the basement soon. So I start calling around to find a new pump. I come back home, put on the worst set of clothes I own, and head back over with tools and the new pump. I get the new pump installed and ready for the connections to be tightened up. On the very last clamp on the Fernco, I drop the ONLY screwdriver I have with me into the four-foot-deep sewage pit. ARRGGG! So I spend another two hours going to get another screwdriver. Finally home now and just not in the mood to play.

Anyone play today? How is it?


You should have called me. I own a plumbing co. :)
Actually, you're probably too far away. Are you sure it's not just the ejector pump float switch? In that case, you can bypass it at the power cord and empty the pit manually until you're ready to swap the unit.
 
NetViper, I hope you don't hold any nVidia stock... they are getting PUMMELED right now after missing by about $50M. Down 23% after hours... OUCH! Let's see what Jen-Hsun has to say in the conference call...
 
Arshad,

No Nvidia stock for me. I WISH I had bought it back in the day when it IPO'd at like 12 bucks and went to $150 when it got the Xbox deal. :(

I will hold on to my ATI stock. They wanted me to sell it when it hits 28. I bought it at 19.

I might take you up on your offer on the video card for Prime members, though. I am going to need to upgrade really soon. My card came with the Dell, and I think it is actually a 9500 Pro or Radeon 9700 non-Pro.

I really don't want top of the line, though, as they cost so much. What do you recommend? I have a P4 2.8.
 
MYNSX said:
You should have called me. I own a plumbing co. :)
Actually, you're probably too far away. Are you sure it's not just the ejector pump float switch? In that case, you can bypass it at the power cord and empty the pit manually until you're ready to swap the unit.

The tenant failed to pay the rent and I evicted her. She was pissed off and flushed washcloths down the toilet, which jammed up the impeller and burned out the motor. I explained to her how the system worked in great detail when she moved in 3 years ago. This was a deliberate act of revenge. F..ing B.tch. Very odd how the rent was on time with no problem for 3 years prior.

What surprises me most is the breaker did not trip. I replaced that too.
 
Arshad: haha... flame war... ATI SUX!!! NVIDIA SUX!!! XGI RULEZ BECAUSE I SAID SO!!! :D :D :D

That's cool that you disagree with me; I guess our opinions are coming from our line of work. You gotta sell your product's new features; I'm a consultant, so I look at what people want to do in the NOW. And as cool as HDTV is, it's not a "now" thing. BTW, I thought 1st generation ATI and nVidia PCIE cards didn't support HDTV; that's the next round. ;)

So yeah, you're preaching to the converted regarding the benefits of PCIE... I definitely understand and agree with it, but it will take some time for game developers & video software to catch up and start abusing the new technology.

One thing PCIE has going for its early adoption is that card prices are comparable to AGP cards. But mainboards with PCIE are E-X-P-E-N-S-I-V-E, as is the new DDR2 RAM, etc. I've read that Intel wanted the complete phase-out of AGP within a 6-month timeline. I'm more realistic; it will be at least 12-18 months, mainly because there are tight people who will only want to upgrade their video card, as they consider their 3-year-old system new. :rolleyes: :D


> This will cause the bus to become a more significant factor, especially as games move from, e.g., 5k poly models for the characters to 60k or 100k per character, just because they can.

hahah... I loved this line... "just because they can".
Sounds like Bill Clinton's strategy. :D


> Both our and nVidia's roadmaps quickly phase out AGP chips going forward, although we'll both continue to support them via bridge chips.

Wait?!?! Bridge chips?!?! I thought ATI was selling a "pure PCI-E" solution and claimed nVidia's bridge chips were evil!?! :D :D :D Now ATI is considering bridge chips? Ahh... hehehe... I await nVidia's response to that. :D LOL
 
OK. Just played Doom 3 for an hour, and man, that game is INTENSE. It is really scary. It is SOOOOO DARK. Too dark, actually. You can't see anything. When someone is shooting you from a distance, you cannot see them. You can see them with the flashlight, but as soon as you switch to a weapon, it's black. Yet they still shoot the hell out of you. I think the guns need lights. This game would be amazing with CO-OP play like the Xbox version will have on Xbox Live.
 
Yeah, one man on the flashlight, the other riding shotgun. Awesome. :D

It sux the PC version doesn't have CO-OP. Don't you just hate Microsoft for squaring that deal with id? :(
 

You gotta sell your product's new features; I'm a consultant, so I look at what people want to do in the NOW.


Heh, I wasn't trying to give a marketing answer ;) I agree with you that today's games are primarily render bound, so the bus won't make a huge difference. I'm looking at engines that are being developed for release 2-3 years down the road, and they're all using huge poly counts. Heck, even some games being released later this year and early next year are going to significantly boost poly counts, as well as use higher-res textures and normal maps, all of which will dramatically increase bus traffic. That's when you'll start to notice the delta between AGP 4x and 8x, for example. In today's applications, you'll typically only see the bus differences come into play for high-end video editing solutions.


And as cool as HDTV is, it's not a "now" thing. BTW, I thought 1st generation ATI and nVidia PCIE cards didn't support HDTV; that's the next round.


Well, HDTV is rapidly becoming a 'now' thing here in North America, and it's definitely a 'now' thing for cable/satellite broadcasters, and for them PCIE is a huge deal. BTW, we demo'd HDTV editing on an RV380 PCIE card many months ago. I think it was at the last IDF, when Intel introduced their PCIE chipset, but I'm not 100% positive which show it was.


I definitely understand and agree with it, but it will take some time for game developers & video software to catch up and start abusing the new technology.


Yep, I agree 100% there... it always takes a while for people to start properly utilizing new technology.


Wait?!?! Bridge chips?!?! I thought ATI was selling a "pure PCI-E" solution and claimed nVidia's bridge chips were evil!?! Now ATI is considering bridge chips? Ahh... hehehe... I await nVidia's response to that.


It's a reverse bridge that ATI is implementing. I.e., going forward our chips will all be PCIE native, and in order to support AGP requirements, we'll include a reverse bridge that converts PCIE to AGP signals. This is not a big deal, since AGP can be considered a subset of PCIE in terms of performance. nVidia is going to be doing the same thing, since they are moving completely to native PCIE as well.

The reason ATI was harping on nVidia's existing PCIE bridge is that the GPU itself is still running AGP cycles, so it can't take full advantage of PCIE throughput, bidirectional transfers, low-power modes, etc.
 
This techno-speak is all fun and munkies, but how about some reviews of the game, guys! I have to wait 'til November before I'm able to enjoy some non-Live Xbox co-op goodness! Satiate my appetite! :D
 
I'm really enjoying it. The first hour or so I was in the same boat as everyone else (why can't I just duct-tape the flashlight to my shotgun?), but now I've gotten proficient at switching back and forth. The initial levels had a couple of zombies that hid behind boxes, popped up to shoot, and then ducked down again. That was hard until you get grenades.
Unfortunately, I had to de-clock my Radeon, 'cause I was getting artifacts (even with the ATI Doom 3 Hotfix drivers).
 
paladin said:
Unfortunately, I had to de-clock my Radeon, 'cause I was getting artifacts (even with the ATI Doom 3 Hotfix drivers).


What can I say? It's ATI. :rolleyes:



Oh wait... Arshad... that's flame war material!!! :D :D :D ROFL :D Just kidding, man! :D Sorry... I couldn't resist that comment. :D
 
Hahaha... well, as Carmack noted, people may have to lower their overclocked cards back to default, because the D3 engine stresses certain aspects of the GPU which haven't really been used by other games. It's like cranking your turbo up to 20 psi: works great going up and down your driveway, but the minute you take it on the highway, your engine blows up :D
 
Arshad said:
Hahaha... well, as Carmack noted, people may have to lower their overclocked cards back to default, because the D3 engine stresses certain aspects of the GPU which haven't really been used by other games. It's like cranking your turbo up to 20 psi: works great going up and down your driveway, but the minute you take it on the highway, your engine blows up :D

LOL, great analogy.

It will be interesting to know how the latest wave of "overclocked out of the box" nVidia 6800 cards will handle Doom 3. Because they're o/c'ed by the manufacturers, will that cause problems? Hmmm... ponder that deep thought. :rolleyes:

e.g. BFG 6800 Ultra OC: 425MHz instead of 400MHz;
LeadTek WinFast A-400 6800 Ultra: 425MHz instead of 400MHz.
 

It will be interesting to know how the latest wave of "overclocked out of the box" nVidia 6800 cards will handle Doom 3. Because they're o/c'ed by the manufacturers, will that cause problems?


I doubt it, but you never know! What some people consider to be "overclocked by the manufacturer" is not actually overclocked at all. When we get the wafers back from the fab, there are corner lots where you can get dies that run much faster than others, etc. The various dies are 'binned' according to their failure rates at different engine clocks, using an exhaustive number of tests that check each functional portion of the ASIC. So while we may note that a specific part has an issue at, say, 500MHz but passes at 475MHz, the user may be able to overclock to 500 without any issues because they are not exercising the specific path that failed. For example, the 3D engine may work fine, but IDCT may have failures, so you'll never notice a problem with the overclocked card until you play back a DVD and see corruption.
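
In code terms, that binning logic boils down to something like this toy sketch. The block names, clock numbers, and pass/fail figures are all hypothetical, just to illustrate the idea that the weakest block sets the rated speed:

```c
/* Toy model of speed binning: each functional block has a maximum
 * passing clock from the test suite, and the weakest block sets the
 * part's rated speed. All names and numbers here are hypothetical. */
#include <stdio.h>

struct block { const char *name; int max_pass_mhz; };

int main(void) {
    struct block die[] = {
        { "3D engine", 500 },
        { "IDCT",      475 },  /* fails at 500 -> limits the bin */
        { "2D/video",  525 },
    };
    int n = sizeof die / sizeof die[0];

    int rated = die[0].max_pass_mhz;
    for (int i = 1; i < n; i++)
        if (die[i].max_pass_mhz < rated)
            rated = die[i].max_pass_mhz;

    /* Overclocking past 'rated' may still run games fine (the 3D engine
     * passes at 500) yet corrupt DVD playback (IDCT fails): exactly the
     * kind of path-specific failure described above. */
    printf("Part rated at %d MHz\n", rated);
    return 0;
}
```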

Memory timings generally don't exceed the manufacturer's spec, so that's not an issue.
 