![Smile :)](./images/smilies/icon_smile.gif)
but I don't see stutters, I only get a headache after a while
![Laughing :lol:](./images/smilies/icon_lol.gif)
![Smile :)](./images/smilies/icon_smile.gif)
> Xamindar wrote: Umm MD, take another look. I think you are getting them mixed up.

Nope, maybe YOU should look again.
> fliptw wrote: the fps counter has nothing to do with image quality

Right, I was referring back to one of his posts where he mentioned how many frames he got. That would further back up his arguments as supporting evidence. Probably should've made that a little clearer.
> Krom wrote: /me owns a Voodoo3 3000 AGP 16MB card and a Diamond Monster 3D II 8 MB PCI card. I've seen 3dfx and glide plenty in action; aside from the sky, everything else looks better on a GeForce.

Ever played Descent3 with T&L disabled on a 6800 Ultra? Then it's a fair play, right?
You can't even compare a Voodoo5 to a GeForce in HL2 or any other recent game, since the Voodoo5 can't run HL2 or any other recent game in the first place. So that is a totally unfair comparison. The Voodoo5 is a fixed-function graphics accelerator; it is not programmable like everything from the GF3 and up has been. The Voodoo5 would have been a good product if it had come to market 2 years sooner; it was already out of the race by the time it reached the market.

And the sky is the only thing that looks crappy, plus the sky still looks just as crappy on your favorite ATI hardware. That's the advantage of D3 being programmed for Glide: unless it is updated, OpenGL/D3D will never match the image quality you get in Glide. D3 is the exception among games from its day; by the time D3 came out, most developers were moving away from Glide support.
> fliptw wrote: if you can't hold a worthwhile discussion here, then we have no good reason to go to the board you've linked to; outside of the fact this forum already went through this 6 years ago, when all of this was germane.
Why no one has posted a 32-bit D3D screenshot is beyond me.
> Obi-Wan Kenobi wrote: Well look at the sky, it's crappy. The Voodoo5 6000 still PWNs the crappiness of the nVidia card.

Must be your monitor then, because on mine it looks great compared to yours. I even looked at the two on the laptop's widescreen LCD display, and the difference between them was like night and day.
> like I said, Descent3 was made for 3dfx Glide, not D3D; that was planned later. Then tell me why there were so many fixes for Direct3D and not Glide.

Maybe because Outrage did a piss-poor job of implementing D3D? Gee, maybe that's why damn near everyone is using OpenGL instead?
> Once again, if you really want to know the real truth it's time you visited http://www.falconfly-central.de/cgi-bin/yabb/YaBB.pl. There you will learn the true power of 3dfx, and yes you will be confronted, and yes your theory is very wrong. I have told you, but it looks like you only stick to your own belief, heh.

Heh, my theory is wrong? My my my, aren't we on the defensive. Maybe you should check your own info before you start coming down on other users. D3 is built for DX6 support; if you do a little research, you'll note that D3 does not support T&L. T&L only works in DX7 and later games.
> Duper wrote: Obi, how did you get the 3DFx cards to reconcile with Dx 8 and 9 fx? Never could get my V3 3k to run the more advanced stuff.

Well, for DX8 support the SFFT 35 drivers are the best for NT systems; they use the SFFT Alpha 29 DX8.1 core.
(btw, sry for the skim earlier. I went back through and read all the posts. ...Thanks MD!)
All this 3DFX talk makes me wish that Aureal 3D was still around. I have a Vortex 1 card that put out sound that I have never seen.. or rather heard.. equaled (save maybe by the AWE32, which I've never owned or heard).
> Obi-Wan Kenobi wrote: ever played Descent3 with T&L disabled on a 6800 Ultra, then it's a fair play right

[dyk]Descent 3 is a DirectX 6 game, thus no matter what video card you are using, hardware Transform and Lighting is automatically disabled.[/dyk]
> Duper wrote: All this 3DFX talk makes me wish that Aureal 3D was still around. I have a Vortex 1 card that put out sound that I have never seen.. or rather heard.. equaled (save maybe the AWE32, which I've never owned or heard).

Yeah, I loved them as well. It seems that all the cool companies get squashed by the punk ones. The 3D sound on those cards was simply amazing! Despite the low sound quality, they were great.
> Obi-Wan Kenobi wrote: didn't Creative buy Aureal?

Haha yeah, right after suing them into oblivion over a MIDI chip, of all things.
> Obi-Wan Kenobi wrote: and WoW you still got the Aureal Vortex! Good lord, that card rules. Always wanted the Aureal Vortex 1 and 2. I still make use of my good ole AWE 64 Gold! hehe, but that Aureal 3D is awesome I must say. Good to see that those cards are still being used; too bad they aren't around. Didn't Creative buy Aureal?

Yeah, Creative bought them out like four or five years ago. There was a big stink about that too, almost as big as the 3dfx/nVidia one. I remember a rather large flame war on the old Volition BB about that.
> Krom wrote:
> > Obi-Wan Kenobi wrote: ever played Descent3 with T&L disabled on a 6800 Ultra, then it's a fair play right
>
> [dyk]Descent 3 is a DirectX 6 game, thus no matter what video card you are using, hardware Transform and Lighting is automatically disabled.[/dyk]

Nope, didn't know that, since I only play the game on Voodoo cards, which it was built for, lol.
> Suncho wrote: After buying a Riva 128 instead of a Voodoo1 and a Riva TNT instead of a Voodoo2, I promised myself that I'd only buy 3dfx for the rest of my life because of how much the nVidia cards sucked. So instead of going TNT2 I went Voodoo3, and instead of going GeForce2 I went Voodoo5. I don't regret it for a second (except for the part where I promised myself 3dfx for life), and I used my Voodoo5's through 2002. I still have 2 NIB Voodoo5 5500's (well... one was opened, then closed... when I realized it wouldn't fit in a modern motherboard).

Yeaps, that is right.
> Obi-Wan Kenobi wrote: hahah lol well a Voodoo5 6000 is somewhat like 5 years older than the 6800GT, and it's like you really need that 290 fps; the human eye can't even see that, so what's the point? --> Answer: pointless for the user and great marketing for nVidia, yippie!!

No, that's not the answer. =P
> Suncho wrote: I have a 6800 as well, and I dream of the day when nVidia starts supporting Glide. =)

Hm, I can help you with that. A friend of mine from Romania called Zeckensack made a very good Glide wrapper which enables 3dfx Glide on GeForce4 and ATi Radeon 8500 and up.
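For what it's worth, the idea behind a Glide wrapper is easy to sketch: the game calls the Glide entry points it expects, and the wrapper translates each call into an equivalent call on whatever API the installed driver actually speaks. Here is a toy Python sketch of that interposition pattern; the classes and bodies are hypothetical stand-ins, not Zeckensack's actual code, and only the `grDrawTriangle`/`grBufferSwap` names come from the real Glide API:

```python
class HostGL:
    """Stand-in for the real host API (OpenGL/D3D) the wrapper targets."""
    def __init__(self):
        self.log = []
    def draw_triangle(self, verts):
        self.log.append(("draw", verts))
    def swap_buffers(self):
        self.log.append(("swap",))

class GlideWrapper:
    """Exposes Glide-shaped calls and forwards them to the host API."""
    def __init__(self, host):
        self.host = host
    def grDrawTriangle(self, a, b, c):      # Glide-style entry point
        self.host.draw_triangle((a, b, c))  # translated into a host call
    def grBufferSwap(self, swap_interval):
        self.host.swap_buffers()

gl = HostGL()
glide = GlideWrapper(gl)
glide.grDrawTriangle((0, 0), (1, 0), (0, 1))  # the game thinks it's talking to 3dfx
glide.grBufferSwap(1)
print(gl.log)  # -> [('draw', ((0, 0), (1, 0), (0, 1))), ('swap',)]
```

A real wrapper does the same thing as a drop-in glide2x/glide3x DLL, which is why an old Glide-only game can run unmodified on newer hardware.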
> Obi-Wan Kenobi wrote: NV FSAA sux big time to be honest. 3dfx and ATi still have the best FSAA.

Yes, but keep in mind it doesn't matter too much in an FPS. It's mainly good for making nice stills.
Did you even look at that picture? That JPEG is so over-compressed it's not even usable as an image quality comparison shot. Also, the GeForce 6 series and up use rotated-grid FSAA, so the quality is virtually identical if not better, because the grid pattern is better.
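The rotated-grid point can be made concrete with a little arithmetic. A minimal Python sketch, where the 4x sample offsets and the ~26.6° angle are illustrative of classic rotated-grid supersampling, not the exact pattern any particular GPU uses:

```python
import math

# Ordered-grid 4x sample offsets within a pixel.
ordered = [(-0.25, -0.25), (0.25, -0.25), (-0.25, 0.25), (0.25, 0.25)]

def rotate(p, deg):
    """Rotate a 2D offset by the given angle in degrees."""
    a = math.radians(deg)
    x, y = p
    return (round(x * math.cos(a) - y * math.sin(a), 3),
            round(x * math.sin(a) + y * math.cos(a), 3))

# ~atan(1/2) is the classic rotated-grid angle.
rotated = [rotate(p, 26.6) for p in ordered]

# A near-horizontal edge only "sees" the distinct y levels of the pattern,
# so more levels means more intermediate shades per pixel step:
print(len({y for _, y in ordered}))   # -> 2
print(len({y for _, y in rotated}))   # -> 4
```

With the same four samples, the rotated pattern doubles the number of distinct coverage levels on near-horizontal (and, symmetrically, near-vertical) edges, which is exactly where ordered-grid FSAA looks worst.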
> Krom wrote: Did you even look at that picture? That JPEG is so over-compressed it's not even usable as an image quality comparison shot. Also, the GeForce 6 series and up use rotated-grid FSAA, so the quality is virtually identical if not better, because the grid pattern is better.

Hmm, nice story. What's next, a party at the neighbour's house? I mean, all you can say is that you do not agree with me, right?
Just for fun, let's toss in a PNG of the GF6 AA shot. This image was converted directly from TGA to PNG using "save for the web" in Adobe ImageReady CS2. That means there is NO gamma correction applied, and that is what makes images from screen shots so dark, not the FSAA. I could apply some gamma correction, but that would wash out the colors.
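Since gamma correction keeps coming up: a tiny sketch of what it does to individual pixel values, assuming 8-bit channels and a typical display gamma of 2.2 (the exact curve a real image tool applies may differ):

```python
def apply_gamma(value, gamma=2.2):
    """Brighten an 8-bit channel value with a display gamma curve."""
    return round(255 * (value / 255) ** (1 / gamma))

# A dark mid-tone is lifted a lot...
print(apply_gamma(64))   # -> 136
# ...while a near-saturated channel barely moves, which is why applying
# gamma brightens the shot but flattens ("washes out") the color contrast.
print(apply_gamma(230))  # -> 243
```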
> Floyd wrote: mine is bigger

Mkay.