
Some 3dfx Fun

Posted: Sun Mar 05, 2006 5:59 pm
by Gold Leader
Well, after a few days of testing the Voodoo5 6000 in Descent3 @ 1600 x 1200 and some other games too, I made these nice shots of my 3 Voodoo5 AGP cards :)

From top to bottom:

3dfx Voodoo5 6000 AGP 128MB Rev.A 3700 [Prototype]
3dfx Voodoo5 5500 AGP 64MB Rev.A 2500 [Production Model]
3dfx Voodoo5 5000 AGP 32MB Rev.A0 3700 [Prototype]

Top sides
[photos]

Bottom sides :)
[photos]

And here's the fun and glory I had with the 6K in the following games :)


Descent3 Ver 1.5 Beta @ 1600 x 1200 x 16 in Glide
[screenshots]

QuakeIIIArena Ver 1.32 @ 1600 x 1200 x 16 in WickedGL 3.02
[screenshots]

Star Wars: Jedi Knight: Jedi Academy ver 1.01 @ 1600 x 1200 x 16 with MESA FX 6.2.0.2 OpenGL32.dll
[screenshots]

Call Of Duty: United Offensive ver 1.51 @ 1600 x 1200 x 32, High Details, map Carentan, France:

[screenshots]

That card simply has no jitter or stutter in those games at those settings; simply amazing for such a rare piece of hardware. It's the best card for Descent3 with Glide, I can tell ya that hehe :)

Posted: Sun Mar 05, 2006 6:58 pm
by Mobius
Heh. I bet my middle-of-the-road X800XL would rip them all a new one.

But for ancient, obsolete, gigantic, dead-end technology, yeah - it works. :P

Posted: Sun Mar 05, 2006 7:00 pm
by AceCombat
holy jesus..... that's a HOOOOGE card at the top!!


how does that thing even fit in a case!?!?

Posted: Sun Mar 05, 2006 8:45 pm
by FunkyStickman
Never seen a full-length card before? You know how cases used to have those dumb plastic guide slots in the inside front? These are why. Server cases still have them, since some huge RAID cards and graphics cards are still full-length.

Oh yeah, and as for old hardware... I love kickin' it old-school!

Posted: Sun Mar 05, 2006 8:46 pm
by Jeff250
Reminds me of my old AWE32.

Posted: Sun Mar 05, 2006 9:42 pm
by Krom
I'm sure my 6800 GT would have no problem disposing of the Voodoo5 in all but one department: 3dfx had better dithering support in 16-bit color, and that never worked its way into Nvidia hardware.
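(For the curious: 16-bit modes hide banding by dithering, trading color depth for patterned noise. Below is a minimal sketch of the general family of technique, ordered Bayer dithering down to RGB565; this is an illustration, not 3dfx's actual hardware logic.)

Code:

/* Ordered (Bayer) dithering from 24-bit color down to 16-bit RGB565.
   Illustrative sketch only; not 3dfx's actual hardware algorithm. */
#include <stdint.h>

static const int bayer4[4][4] = {      /* 4x4 threshold matrix, 0..15 */
    {  0,  8,  2, 10 },
    { 12,  4, 14,  6 },
    {  3, 11,  1,  9 },
    { 15,  7, 13,  5 },
};

uint16_t dither_rgb565(uint8_t r, uint8_t g, uint8_t b, int x, int y)
{
    int t  = bayer4[y & 3][x & 3];     /* position-dependent threshold   */
    int r5 = (r + (t >> 1)) >> 3;      /* 8 -> 5 bits, bias then truncate */
    int g6 = (g + (t >> 2)) >> 2;      /* 8 -> 6 bits (green keeps more)  */
    int b5 = (b + (t >> 1)) >> 3;
    if (r5 > 31) r5 = 31;              /* clamp any bias overflow */
    if (g6 > 63) g6 = 63;
    if (b5 > 31) b5 = 31;
    return (uint16_t)((r5 << 11) | (g6 << 5) | b5);
}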

Posted: Sun Mar 05, 2006 10:25 pm
by Ferno
So what was the difference between the 5000 and the 5500?

Posted: Mon Mar 06, 2006 1:35 am
by Grendel
Interesting. Here's a shot w/ a 6800U at 1280x1024 for reference:

[screenshot]

Re:

Posted: Mon Mar 06, 2006 4:20 am
by Gold Leader
Information not available

Posted: Mon Mar 06, 2006 10:12 am
by Krom
@ obi-wan: 32-bit color doesn't work in OpenGL in D3; no matter what, OpenGL always renders in 16-bit color, and only Direct3D works in 32-bit color. So it's no surprise 3dfx looks better in 16-bit Glide than Nvidia/ATI in 16-bit OpenGL.

Re:

Posted: Mon Mar 06, 2006 11:06 am
by Gold Leader
Krom wrote:@ obi-wan: 32-bit color doesn't work in OpenGL in D3; no matter what, OpenGL always renders in 16-bit color, and only Direct3D works in 32-bit color. So it's no surprise 3dfx looks better in 16-bit Glide than Nvidia/ATI in 16-bit OpenGL.
Okay, but still, the game was made for 3dfx Glide. Direct3D doesn't look that great; there are some glitches in the game with that API. As I said, the Fusion engine was designed for the Voodoo3 and up ;)

OpenGL and Direct3D were just extra APIs added for broader support. I tested D3D mode on a GeForce 6800 GT, an ATI Radeon 9700 Pro, and the 3dfx Voodoo5 6000; all three cards gave glitched effects in D3D. The Voodoo5 6000 still gives nicer effects in Glide than the Radeon and GeForce do in 32-bit D3D.

Still amazing to see it in action on the Voodoo5 6000 with 8x FSAA :) It was surely the nicest Glide I've seen :)

Posted: Mon Mar 06, 2006 11:17 am
by Neo
Help me, Obi-Wan Kenobi...!

Sorry, I just couldn't resist. ^_~
Krom wrote:@ obi-wan: 32-bit color doesn't work in OpenGL in D3; no matter what, OpenGL always renders in 16-bit color, and only Direct3D works in 32-bit color. So it's no surprise 3dfx looks better in 16-bit Glide than Nvidia/ATI in 16-bit OpenGL.
Actually, Obi-Wan tested everything in D3 1.5, so that 32-bit/16-bit color bug was fixed.

Posted: Mon Mar 06, 2006 11:38 am
by Gold Leader
No worries, Neo :)

But still, the sky looks bad in D3D and also glitches, which is a rendering error; no problems in Glide, however.

Posted: Mon Mar 06, 2006 12:00 pm
by Krom
Neo, set 1.5 to 32-bit color at 640x480 in OpenGL and look at the fog; you will see it is still 16-bit color. 32-bit color does not work in OpenGL at all; even if the setting says 32-bit, D3 still only ever renders 32-bit in Direct3D.

Posted: Mon Mar 06, 2006 12:32 pm
by Ferno
Gold Leader wrote:The differences were stated in my first post:

From top to bottom:

3dfx Voodoo5 6000 AGP 128MB Rev.A 3700 [Prototype]
3dfx Voodoo5 5500 AGP 64MB Rev.A 2500 [Production Model]
3dfx Voodoo5 5000 AGP 32MB Rev.A0 3700 [Prototype]
I saw that. I was wondering if there were any design changes, enhancements, etc.

Re:

Posted: Mon Mar 06, 2006 1:34 pm
by Gold Leader
Information removed

Posted: Mon Mar 06, 2006 1:36 pm
by Xamindar
Hey Obi-Wan Kenobi,

How did you get the 6000? As far as I know they were never released to the public. I have a good old 5500 that I still use and love.

Re:

Posted: Mon Mar 06, 2006 1:42 pm
by Gold Leader
Unit Lost

Posted: Mon Mar 06, 2006 2:14 pm
by Burlyman
Well, have fun playing with your Voodoo5s while I get 500-1000 fps in D3. =P

Re:

Posted: Mon Mar 06, 2006 2:24 pm
by Grendel
Krom wrote:@ obi-wan: 32-bit color doesn't work in OpenGL in D3; no matter what, OpenGL always renders in 16-bit color, and only Direct3D works in 32-bit color. So it's no surprise 3dfx looks better in 16-bit Glide than Nvidia/ATI in 16-bit OpenGL.
Forgot to mention that the shot I posted was D3 1.4 in OpenGL... :P

Posted: Mon Mar 06, 2006 2:24 pm
by Ferno
wow that's one hell of a difference.

Posted: Mon Mar 06, 2006 2:41 pm
by Xamindar
I used to love 3dfx. I was so mad at Nvidia when 3dfx died. I don't think it was Nvidia's fault though.

What really happened? Poor financial management? I wish they were still around.

Posted: Mon Mar 06, 2006 5:00 pm
by AceCombat
How does AGP have SLI? I thought only PCI-E had SLI functions?

I'm guessing it's the dual GPUs? :? :?

I was looking through newegg.com and stumbled on an AGP card that says it's SLI Ready. Is this the same thing?

I'll have to look that card up again and post it.

Re:

Posted: Mon Mar 06, 2006 5:12 pm
by Xamindar
AceCombat wrote:How does AGP have SLI? I thought only PCI-E had SLI functions?

I'm guessing it's the dual GPUs? :? :?

I was looking through newegg.com and stumbled on an AGP card that says it's SLI Ready. Is this the same thing?

I'll have to look that card up again and post it.
SLI was invented by 3dfx. The Voodoo2 did it with two cards; the Voodoo5 does it with 2-4 chips on one board.
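(Worth noting: 3dfx's SLI stood for Scan-Line Interleave, where each chip renders a share of the screen's lines, while Nvidia's PCI-E SLI is "Scalable Link Interface", a different scheme that reused the acronym. Here's a toy sketch of the interleave idea, assuming the simplest one-line-per-chip split:)

Code:

/* Toy model of 3dfx Scan-Line Interleave: with N chips, chip k handles
   every scanline y where y % N == k. The real hardware could also work
   on bands of lines; this is just the simplest case. */
#include <stdio.h>

int chip_for_scanline(int y, int num_chips) { return y % num_chips; }

int main(void)
{
    int num_chips = 4;   /* e.g. the four VSA-100 chips on a Voodoo5 6000 */
    for (int y = 0; y < 8; y++)
        printf("scanline %d -> chip %d\n", y, chip_for_scanline(y, num_chips));
    return 0;
}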

Posted: Mon Mar 06, 2006 5:20 pm
by Jeff250
I'll agree that the sky looks much better on the 3dfx card, which makes me miss my Voodoo3, but I think that, all things considered, a modern-day card should look better. 16x (or whatever they're up to) anisotropic filtering makes a huge difference. It's futile to try to compare JPEG screenshots though; it's a contest of who has fewer JPEG artifacts. :P

Posted: Mon Mar 06, 2006 5:25 pm
by AceCombat
That dodgegarage thing............ DAYUM!!! That's some serious hardware he has in there.

Re:

Posted: Mon Mar 06, 2006 5:27 pm
by Gold Leader
Unit Erased

Posted: Mon Mar 06, 2006 6:56 pm
by Krom
I beg to differ on the FPS issue. 24.5 FPS is about as low as you can go and still not see each individual frame, for most people. However, when I am playing D3, I can tell the difference between 100 Hz and 120 Hz vsync. You can't see it just watching someone else move, but as soon as I start turning around and moving myself, I can tell the difference clearly.

I can also see the refresh rate of a CRT monitor; I can see the difference in flicker even between an 85 Hz and a 100 Hz monitor. Some people can't, some people can; it comes down to the quality of your eyes. But try playing a game locked at 24 FPS vs locked at 60 or 100 FPS, and then try to tell me you can't tell the difference over 24.

The difference between 250 FPS and 1000 FPS is irrelevant. But 24 FPS is no good; anything under 60 is suboptimal for computer games.
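(For concreteness, the frame-time arithmetic behind that: 1000 ms / 24 FPS ≈ 41.7 ms per frame, 1000 / 60 ≈ 16.7 ms, 1000 / 100 = 10 ms, and 1000 / 120 ≈ 8.3 ms. The jump from 24 to 60 FPS cuts each frame's duration by about 25 ms, while going from 100 Hz to 120 Hz shaves only about 1.7 ms, which is why that last difference is subtle but, as Krom says, still perceptible to some players.)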

Re:

Posted: Mon Mar 06, 2006 7:06 pm
by Gold Leader
Unit Deleted

Posted: Mon Mar 06, 2006 8:11 pm
by Jeff250
I'm spoiled these days. I can see the difference and get eye strain from anything below 100 Hz.

Posted: Mon Mar 06, 2006 9:05 pm
by Krom
Only 87 FPS? Is that with AA or without AA? I thought the 6000 would have more fillrate than that. My 6800 GT @ Ultra gets an average of 290 FPS at 1600x1200 with 4x FSAA and 16x aniso filtering. I thought the V5 6000 would do a little better than 87 FPS on its home turf.

Posted: Mon Mar 06, 2006 10:58 pm
by Duper
Wow! I had forgotten how good D3 looked on 3dfx. o_0

Where on earth did you GET a V5 6K? There weren't that many made!?

Posted: Tue Mar 07, 2006 2:10 am
by Aus-RED-5
Duper wrote:Wow! I had forgotten how good D3 looked on 3dfx. o_0

Where on earth did you GET a V5 6K? There weren't that many made!?
You missed something? ;) :P
Obi-Wan Kenobi wrote:
Xamindar wrote:Hey Obi-Wan Kenobi,

How did you get the 6000?
Hehe, I got it from a friend from Oakdale, aka Gary Donovan. Here is his site:
http://www.thedodgegarage.com/3dfx/index.htm

He has the largest known 3dfx collection in the world; he simply has all the known production models and prototypes that were made, well over 300 cards hehe, including 100+ different prototypes.
/me still has his Voodoo5 5500 AGP card in the box! :D
Oh and a Voodoo4 4500 PCI card! :D

But I like my BFG Nvidia 6800 Ultra 256MB AGP card better! :P

Re:

Posted: Tue Mar 07, 2006 5:27 am
by Gold Leader
Unit Destroyed

Posted: Tue Mar 07, 2006 5:50 am
by BUBBALOU
I do not see any screenshots of you playing Battlefield 2 with your Voodoo cards; can you post some?

You aren't the luser who dropped 2 grand on eBay to be the only consumer to get their hands on a Voodoo5 6000... especially since they are all of just 10 prototypes and no production boards.

Sorry, my bad: only 30 working ones and 120 duds!!! All reference boards; they never made it to production...

Re:

Posted: Tue Mar 07, 2006 6:04 am
by Gold Leader
Information Lost

Re:

Posted: Tue Mar 07, 2006 8:46 am
by Suncho
Obi-Wan Kenobi wrote:the fastest the human eye can see is 24.5 fps, while a very well-trained eye from a pro gamer can't see the difference above 48 frames per second, so what is your point? To me it's pointless talk, laddie ;)
It's true that the eye can't see more than 24.5 fps, but you have to ask yourself what those frames contain. Movie cameras capture all the motion blur of real life and therefore appear smooth. But when early computer graphics came out (e.g. Alien 3), there were noticeable gaps between frames when you saw the computer-generated object on screen. Why? Because there was no motion blur.

The human eye collects data from all frames regardless of whether or not it can distinguish each individual frame from the one before it and the one after it. Descent 3 doesn't have motion blur, so the difference between 100 frames and 500 frames is certainly noticeable.
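(A minimal sketch of what a film shutter does implicitly: average the scene over the exposure interval. Accumulation-style motion blur fakes this by averaging several sub-frame renders per displayed frame. Illustrative only; render_sample is a hypothetical callback, and Descent 3 does nothing of the sort.)

Code:

/* Accumulation-style motion blur: average nsamples sub-frame renders
   taken across the exposure interval [t0, t1], like a film shutter.
   render_sample is a hypothetical callback; illustrative only. */
#include <stdlib.h>

void blur_frame(float *accum, int npixels,
                void (*render_sample)(float *out, int n, float t),
                int nsamples, float t0, float t1)
{
    float *sample = malloc(npixels * sizeof *sample);
    if (!sample) return;
    for (int i = 0; i < npixels; i++)
        accum[i] = 0.0f;
    for (int s = 0; s < nsamples; s++) {
        float t = t0 + (t1 - t0) * (s + 0.5f) / nsamples; /* sample time */
        render_sample(sample, npixels, t);
        for (int i = 0; i < npixels; i++)
            accum[i] += sample[i] / nsamples;             /* running mean */
    }
    free(sample);
}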

However, the monitor can't display frames at a rate faster than its vertical refresh rate. So if D3 is running at 500 fps and your monitor is only refreshing at 120 Hz, you're not really going to see the difference.

Remember, though, that your joystick is also polled once per frame. A higher framerate means more responsive controls, and if you're running with vsync off (you'd have to be, at 500 fps), the higher your framerate, the shorter the tearing jaggies appear to be (although there are more of them).
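(The polling point in loop form: a generic sketch of the classic single-threaded game loop, with stub functions standing in for a real engine; this is not Descent 3's actual code.)

Code:

/* Classic single-threaded game loop: input is sampled once per frame,
   so worst-case input latency is roughly one frame time. Generic
   sketch with stubs; not Descent 3's actual main loop. */
#include <stdio.h>

typedef struct { int stick_x, stick_y; } Input;
typedef struct { int frame; } World;

static void poll_joystick(Input *in)            { in->stick_x = in->stick_y = 0; }
static void simulate(World *w, const Input *in) { (void)in; w->frame++; }
static void render_and_swap(const World *w)     { printf("frame %d\n", w->frame); }

int main(void)
{
    Input input = {0};
    World world = {0};
    for (int i = 0; i < 3; i++) {   /* a few iterations for the demo   */
        poll_joystick(&input);      /* controls read once per frame    */
        simulate(&world, &input);   /* so faster frames = faster polls */
        render_and_swap(&world);    /* with vsync on, this would block */
    }
    return 0;
}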

Posted: Tue Mar 07, 2006 10:46 am
by Krom
My old vanilla GF3 reached around 200 FPS at 1600x1200, granted without any AA. My FX pushed it up to around 300 FPS at 1600x1200, and this 6800 can blow way past that without AA/AF. If your 6800 GT got artifacts, you got a bad chip; this one is over a year old already and works just fine even overclocked. My old FX worked just fine even after years of overclocking (I should figure out what to do with that card rather than having it just sit around collecting dust :P), and my old GF3 is still running in one of my computers with no problems.

Also, on the AA quality: I thought the Voodoo5 used straight-up supersampling just like the GF3/4/FX, so the quality would be almost exactly the same. It wasn't until the GeForce 6 series that Nvidia finally moved to the rotated-grid multisampling that ATI started a long time ago, but ATI still leads in AA quality.
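(For reference on the supersampling point: brute-force FSAA renders internally at a multiple of the output resolution and then filters down. Below is a minimal sketch of the 2x2 resolve step, on a grayscale buffer for simplicity; my illustration, not any particular card's hardware.)

Code:

/* 2x2 box-filter resolve: the downsampling step of brute-force
   supersampled FSAA. src is (2*w) x (2*h); grayscale for simplicity. */
#include <stdint.h>

void resolve_2x2(const uint8_t *src, uint8_t *dst, int w, int h)
{
    int sw = 2 * w;                              /* supersampled width */
    for (int y = 0; y < h; y++)
        for (int x = 0; x < w; x++) {
            int s = src[(2*y)   * sw + 2*x] + src[(2*y)   * sw + 2*x + 1]
                  + src[(2*y+1) * sw + 2*x] + src[(2*y+1) * sw + 2*x + 1];
            dst[y * w + x] = (uint8_t)(s / 4);   /* average 4 samples  */
        }
}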

Posted: Tue Mar 07, 2006 10:46 am
by Gold Leader
Information Access Denied

Posted: Tue Mar 07, 2006 11:03 am
by Suncho
Obi-wan, it depends on the type of information each frame contains. If each frame only contains a snapshot of one point in time, then it's easy for the human eye to see flickering at 60 fps. On the other hand, if each frame contains continuous information from the time interval between this frame and the last one, the rate can go as low as 24 fps before people notice the difference.

Also, the cones of the eye have a higher "refresh rate". Instead of looking at your monitor straight on, use your peripheral vision. You'll see the flicker out of the corner of your eye.