Hmm

#1
So, I'm currently using a GeForce 9800 with DX10 on Vista, and Descent 3 is acting strange. It runs OK and it's very smooth, but the textures are s### and the dynamic lighting seems to be screwed up. Robots and items appear as full-bright objects, and textures in mines look very pixelated and blocky. I'm looking for a solution as we speak. Any tips would be welcome.
The Expanse. Watch it!

#2
Try a different renderer, for one; secondly, get off Vista. But since you just got your computer, I'll assume you're using it intentionally.
Heck, my ship sometimes shakes like a Hawaiian dashboard dancer in an old bug on gravel at 100mph...
"If toast always lands butter-side down, and cats always land on their feet, what happens if you strap toast on the back of a cat and drop it?"
-Steven Wright

#6
Ah, but it could be. DirectX 10 is newer (therefore there could be bugs in it), plus Vista drivers have been notoriously bad (though I have heard they have been improving), and Vista itself is new, which means it could have unknown bugs. Just because one user has no problems doesn't mean it can be ruled out. If you were using identical hardware and driver/OS versions, then I would agree with you. There are too many variables out there anyway.
Heck, my ship sometimes shakes like a Hawaiian dashboard dancer in an old bug on gravel at 100mph...
"If toast always lands butter-side down, and cats always land on their feet, what happens if you strap toast on the back of a cat and drop it?"
-Steven Wright

#7
I'm willing to go with the graphics card first. The first clue I found was that newer GeForce cards don't support the dithering that Descent 3 and many old games use. I've got to look into it in more detail.
The Expanse. Watch it!

#9
The reason D3 looks good for a 1998 game is dithering. Essentially, it alternates nearby colours from pixel to pixel so the eye averages them into intermediate shades, creating smooth-looking textures.
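For anyone curious what that looks like in practice, here's a minimal sketch of ordered (Bayer) dithering in Python. The matrix and function names are illustrative only, not Descent 3's actual code; the idea is just that neighbouring pixels get nudged by different thresholds before quantization, so they round in different directions.

```python
# Illustrative sketch of ordered (Bayer) dithering when quantizing an
# 8-bit colour channel down to the 5 bits of a 16-bit (RGB565) display.
# Not Descent 3's real code; names and matrix are assumptions.

BAYER_4X4 = [
    [ 0,  8,  2, 10],
    [12,  4, 14,  6],
    [ 3, 11,  1,  9],
    [15,  7, 13,  5],
]

def quantize_5bit(value):
    """Truncate an 8-bit channel (0-255) to 5 bits, kept on the 8-bit scale."""
    return (value >> 3) << 3

def dither_5bit(value, x, y):
    """Add a position-dependent threshold before truncating, so
    neighbouring pixels round in different directions and the eye
    averages them into an intermediate shade."""
    threshold = BAYER_4X4[y % 4][x % 4] // 2  # 0..7, within one quantization step
    return quantize_5bit(min(255, value + threshold))
```

Without dithering, a flat shade of 100 quantizes to 96 everywhere (a visible band); with dithering, the same shade comes out as a mix of 96 and 104 across the screen, which the eye averages back to roughly 100.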
The Expanse. Watch it!

#10
Nvidia have removed 16-bit dithering from their cards, which Descent 3 uses. Although you can select 32-bit in D3D, the setting doesn't stick past the session. Here are some screenshots. These are in D3D, and this is so far the best I can get D3 to look. They look OK, but under closer examination certain textures are just pants. I actually took these in 32-bit, but for some reason the screenshots came out in 16-bit. OpenGL looks the worst, which is a shame, because I always liked how OGL looked in D3. These pics may look OK to the untrained eye, but this is not how nice I remember it looking a few years ago. Besides, my laptop and its onboard card do better.
Image
Image
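To see why 16-bit output looks so blocky without dithering, here's a small sketch (mine, not from the game) of packing 24-bit colour into the common RGB565 16-bit format and back. The function names are made up for illustration.

```python
# Illustrative RGB565 pack/unpack, showing how much precision a 16-bit
# framebuffer throws away. Names are assumptions, not any real API.

def pack_rgb565(r, g, b):
    """Pack 8-bit-per-channel colour into a 16-bit RGB565 word:
    5 bits red, 6 bits green, 5 bits blue."""
    return ((r >> 3) << 11) | ((g >> 2) << 5) | (b >> 3)

def unpack_rgb565(p):
    """Expand RGB565 back to 8 bits per channel by replicating the
    high bits into the low bits (so 0x1F maps back to 255)."""
    r = (p >> 11) & 0x1F
    g = (p >> 5) & 0x3F
    b = p & 0x1F
    return ((r << 3) | (r >> 2), (g << 2) | (g >> 4), (b << 3) | (b >> 2))
```

A smooth 256-step red gradient collapses to just 32 distinct 16-bit values, so without dithering to break them up you see 32 hard bands instead of a smooth ramp.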
The Expanse. Watch it!

#12
Ugh...

Is there anything that nVidia gains by removing backwards-compatibility features like this? And if not, why do they even bother to do so? They can't be under the illusion that no one plays older PC titles on modern machines, right?
A.K.A. Mongoose, for you HLP denizens