
Hmm

Posted: Thu Jan 01, 2009 7:15 am
by Hunter
So, I'm currently using a GeForce 9800 with DX10 on Vista, and Descent 3 is acting strange. It runs OK and it's very smooth, but the textures are s### and the dynamic lighting seems to be screwed up. Robots and items appear as full-bright objects, and textures in mines look very pixelated and blocky. I'm looking for a solution as we speak. Any tips would be welcome.

Posted: Thu Jan 01, 2009 8:36 am
by d3jake
For one, try a different renderer, and secondly, get off Vista - but since you just got your computer, I'll assume you're using it intentionally.

Posted: Thu Jan 01, 2009 9:06 am
by Hunter
I refuse to get off Vista, thanks anyway. This is a graphics card problem, not an OS one.

Posted: Thu Jan 01, 2009 11:14 am
by BloodEagle
Did you try running it under OpenGL?

Posted: Thu Jan 01, 2009 6:22 pm
by Matthew
Jake, it's not Vista. I use Vista and it works just fine, thank you very much.

Posted: Fri Jan 02, 2009 7:24 am
by d3jake
Ah, but it could be. DirectX 10 is newer, so there could be bugs in it. Plus, Vista drivers have been notoriously bad (though I've heard they're improving), and Vista itself is new, which means it could have bugs that are still unknown. Just because one user has no problems doesn't mean it can be ruled out. If you were using identical hardware and driver/OS versions, then I would agree with you. There are too many variables out there anyway.

Posted: Fri Jan 02, 2009 8:27 am
by Hunter
I'm willing to go with the graphics card first. The first clue I found was that newer GeForce cards don't support the dithering that Descent 3 and many older games use. Gotta look into it in more detail.

Posted: Fri Jan 02, 2009 3:22 pm
by Matthew
What's dithering?

Posted: Sat Jan 03, 2009 7:04 am
by Hunter
The reason D3 looks good for a 1998 game is dithering. Essentially, it dithers (blends) colours together so that 16-bit textures and gradients look smooth instead of banded and blocky.
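For anyone curious what that means in practice, here's a rough toy sketch (nothing from the D3 source, just my own illustration, assuming a plain 4x4 Bayer matrix and RGB565 output) of ordered dithering when quantizing 8-bit-per-channel colour down to 16-bit:

[code]
/* Toy example only: ordered (Bayer) dithering while quantizing
 * 8-bit-per-channel colour down to 16-bit RGB565. Not Descent 3 code. */
#include <stdio.h>
#include <stdint.h>

/* 4x4 Bayer threshold matrix, values 0..15 */
static const int bayer4[4][4] = {
    {  0,  8,  2, 10 },
    { 12,  4, 14,  6 },
    {  3, 11,  1,  9 },
    { 15,  7, 13,  5 }
};

/* Quantize one 8-bit channel down to 'bits' bits, adding a
 * position-dependent offset so the rounding error is spread
 * across neighbouring pixels instead of showing up as banding. */
static uint8_t dither_channel(uint8_t value, int bits, int x, int y)
{
    int step = 256 >> bits;                      /* size of one quantization step */
    int offset = (bayer4[y & 3][x & 3] * step) / 16;
    int v = value + offset;
    if (v > 255) v = 255;
    return (uint8_t)(v >> (8 - bits));           /* keep only the top 'bits' bits */
}

int main(void)
{
    /* Dither a flat grey (150,150,150) to RGB565 over a 4x4 block.
     * Without the Bayer offset every pixel would land on the same
     * quantized level; with it, neighbouring pixels flip between
     * adjacent levels, which the eye averages into a smoother shade. */
    for (int y = 0; y < 4; y++) {
        for (int x = 0; x < 4; x++) {
            uint16_t r = dither_channel(150, 5, x, y);
            uint16_t g = dither_channel(150, 6, x, y);
            uint16_t b = dither_channel(150, 5, x, y);
            uint16_t rgb565 = (uint16_t)((r << 11) | (g << 5) | b);
            printf("%04X ", (unsigned)rgb565);
        }
        printf("\n");
    }
    return 0;
}
[/code]

Without the dither step, a whole gradient collapses into a few flat bands in 16-bit colour; that's roughly the blockiness showing up in my screenshots.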

Posted: Sat Jan 03, 2009 6:24 pm
by Hunter
Nvidia have removed 16-bit dithering from their cards, which Descent 3 uses. Although you can use 32-bit in D3D, the setting doesn't stick past the session. Here are some screenshots. These are in D3D, and this is so far the best I can get D3 to look. They look OK, but under closer examination certain textures are just pants. I actually took these in 32-bit, but for some reason the screenshots came out in 16-bit. OpenGL looks the worst, which is a shame, because I always liked how OGL looks in D3. These pics may look fine to the untrained eye, but this is not how nice I remember it looking a few years ago. Besides, my laptop and its onboard card do better.
[screenshot]
[screenshot]

Posted: Sat Jan 03, 2009 10:26 pm
by BloodEagle
Ew.

Posted: Sun Jan 04, 2009 1:48 am
by Top Gun
Ugh...

Is there anything that nVidia gains by removing backwards-compatibility features like this? And if not, why do they even bother to do so? They can't be under the illusion that no one plays older PC titles on modern machines, right?

Posted: Sun Jan 04, 2009 5:22 am
by Hunter
From what I've found, this problem has popped up in several games, but there's no sign of any patches for it.

Posted: Sun Jan 04, 2009 11:19 pm
by BloodEagle
I don't suppose that there is a way to emulate it. :/