=Rivatuner ver.2 RC12.3=
=Low Level System Settings=
Overclocking Tab:
How far you can overclock your card depends on many things: the card brand, the amount of cooling in your computer, and so on. A typical, decently branded GF4 Ti 4200 128MB card can handle 280 core / 500 memory without problems (except maybe Inno3D cards, since I've heard a lot of bad feedback from Inno3D GF4 Ti owners).
-Apply overclocking at start-up: I recommend enabling this option. But if you prefer to enable overclocking only before playing games, then disable it instead.
***The trick is to increase the core/memory sliders five increments at a time and play for at least an hour. If graphics corruption, falling white dots (it looks like snow), or similar artifacts appear, set it back to your previous setting. If no corruption or instability occurs, just repeat the process until you reach the maximum setting your card can handle.
**If you have a 'cheap' video card that can't handle high core and memory settings at the same time, just leave the core clock at its default and raise the memory clock to its maximum (and stable) setting. Afterwards, you can raise the core clock five increments at a time (if your card can still handle it). A rough sketch of this trial-and-error loop follows below.
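**For the curious, here is a rough Python sketch of the stepping procedure above. The set_clocks() and looks_stable() helpers are hypothetical placeholders for the manual steps (move the sliders in Rivatuner and apply, then play or stress-test for a while and watch for artifacts); Rivatuner itself exposes no scripting API like this.

    STEP = 5  # raise one clock five increments at a time, as described above

    def find_max_stable(default_clock, set_clocks, looks_stable, hard_limit):
        """Raise one clock in small steps until artifacts appear, then back off."""
        clock = default_clock
        while clock + STEP <= hard_limit:
            candidate = clock + STEP
            set_clocks(candidate)              # hypothetical: apply the slider
            if not looks_stable():             # corruption, "snow", lock-ups, etc.
                set_clocks(clock)              # fall back to the last good setting
                break
            clock = candidate                  # stable, so keep it and try higher
        return clock

Run the loop for the memory clock first (core at default), then repeat it for the core clock, exactly as described in the note above.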
More Button:
Clock Frequency Generation Accuracy: Select "Normal"
**Selecting this option gives you more precise numbers. Selecting "Low (Safest)" causes the numbers to bounce around once you apply your preferred setting and will only cause confusion.
Fake DDR Workaround: Select "Auto"
** Selecting "Force On" will cause Rivatuner to improperly detect ur Mem
Clock default setting.
AGP Tab:
It's better to leave the settings here at their defaults, since the values shown are the ones detected from your BIOS settings and/or your video card's capabilities.
NVStrap Driver:
You can use the options here to convert your GeForce into a Quadro card. I haven't tried it myself, because it's well known that this mod will only decrease your video card's gaming performance, although in theory it improves its graphics quality and rendering capability. So if you're into video editing/rendering, you may enable these options; if you're into gaming, leave them alone.
=System Settings=
Compatibility Tab:
I recommend enabling the "Enable Chipset Compatibility Mode" option, because it can increase the video card's performance. There's no guarantee the same result applies to every system, though, so benchmark your system with and without the setting to verify whether you actually gained any speed. It should still result in a more stable gaming experience even if it doesn't give you a speed increase.
=Direct3D=
=Mipmapping=
Mipmap LOD Bias Adjustment:
-The default value is zero (a balance between speed and graphics quality).
-Move the slider to the left for sharper images but more aliased (jagged) textures. I recommend setting this to -3.0. Lower (more negative) values are said to cause the textures to flicker or shimmer. If you don't experience any problems at a lower value and your system can handle the performance drop, then set it lower for better graphics quality.
**There is an issue with Detonator drivers 23.10 (or higher) that is said to prevent Rivatuner from properly applying negative LOD Bias settings. More information and instructions on how to fix it are given later (under the "Misc. Tips" header).
-Set it to a number higher than 0 (zero) for better performance at the expense of graphics quality (blurry textures).
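**For those curious what the slider actually does: the driver computes a mipmap level for each pixel and simply adds the bias before clamping, so negative values select larger (sharper) mips and positive values select smaller (blurrier) ones. The sketch below is just the standard mipmapping rule written in Python, not anything specific to the Detonator drivers, and the numbers are only illustrative.

    import math

    def mip_level(texels_per_pixel, lod_bias, num_levels):
        """Pick a mipmap level; level 0 is the full-size texture."""
        lod = math.log2(max(texels_per_pixel, 1e-6)) + lod_bias
        return min(max(round(lod), 0), num_levels - 1)

    # A distant surface where one pixel covers roughly 8x8 texels:
    print(mip_level(8.0, 0.0, 10))   # level 3: a 1/8-size mip (blurrier, cheaper)
    print(mip_level(8.0, -3.0, 10))  # level 0: the full-size mip (sharper, more shimmer)

This is also why very negative values shimmer: the full-size mip is under-sampled on distant surfaces.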
Enable user mipmaps:
-Lets programs/games supply their own mipmaps. I recommend leaving this setting at its default value (enabled). Disabling this option 'might' increase a game's texture sharpness, but it will drag performance down drastically. So leave it enabled and use the LOD Bias Adjustment (above) to improve texture sharpness instead.
=Depth Buffering=
Enable 24-bit Z-buffer: I recommend that you leave this setting enabled. Disable it if you experience font-related problems (e.g. blurry and/or unreadable text).
Enable W-buffer: Leave this setting enabled.
**Both settings above are ignored by DX8 games.
=LMA=
Enable Lossless Z-buffer compression: there's no reason to disable this option!
Enable Early Z-Occlusion Culling: Better left enabled for better performance (this option is not available with some driver versions).
=Blitting=
-I recommend that you do not put a check mark in any of these boxes, because they will slow down your D3D games/apps. But if you have trouble running D3D games, enabling them 'might' fix your problems.
=Vsync=
Synchronization with vertical retrace:
-Set to "always off" for better performance (recommended).
-Set it to either "auto" (and enable/disable vsync from within the game) or "always on" (to force vsync on even if the game has it disabled) if you're experiencing tearing in your D3D apps/games.
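**A side note on why "always off" is faster (my own explanation, just the arithmetic for plain double buffering; exact behavior depends on the driver and on whether triple buffering is in use): with vsync on, a buffer swap can only happen on a vertical retrace, so the displayed frame rate snaps down to refresh/1, refresh/2, refresh/3, and so on.

    import math

    def vsync_fps(render_ms, refresh_hz):
        """Effective frame rate with vsync and plain double buffering."""
        refresh_ms = 1000.0 / refresh_hz
        displayed_ms = math.ceil(render_ms / refresh_ms) * refresh_ms
        return 1000.0 / displayed_ms

    print(vsync_fps(12.0, 60))  # renders at ~83fps, but displays at 60fps
    print(vsync_fps(18.0, 60))  # renders at ~55fps, but displays at only 30fps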
Pre-render limit: Set to "5" (or higher) for better performance. If
u experience lags with a high value, then give it a setting that is lower than 5.
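**For background (my own rough mental model, not documented driver behavior): the pre-render limit is how many frames the CPU may prepare ahead of the GPU. A deeper queue keeps the GPU fed (better, smoother throughput), but the frame on screen was issued several frames ago, which you feel as input lag. Very roughly:

    def worst_case_input_lag_ms(pre_render_limit, frame_ms):
        """Crude upper bound: a new input lands in a frame that has to wait
        behind a full queue of already-prepared frames."""
        return (pre_render_limit + 1) * frame_ms

    # At 50fps (20ms per frame):
    print(worst_case_input_lag_ms(5, 20))  # up to ~120ms behind your input
    print(worst_case_input_lag_ms(2, 20))  # up to ~60ms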
=Textures=
Texture Memory Settings: Set these to the lowest possible number. They are only used by PCI graphics cards.
Texture Format Settings: Leave them at their default settings.
Texture Filtering Settings:
-Set to "Force Level 8" for better graphics quality (recommended).
-Set to "Force Level 2 or 4" for faster performance.
The "Determined by D3D application" option is also a good choice if you want the program itself to pick the filtering method, and it might even make some games run faster if they only use trilinear filtering and don't support anisotropic filtering. (A short sketch of what these filtering levels trade off follows below.)
=Optimize=
Allow D3D to optimize filter stages 0-3: I recommend ticking these boxes. The same goes for the "always optimize selected stages" box, which forces the driver to always apply your selected values.
-Optimization Strategy: I recommend setting this to 4 or lower (lower number = better performance).
-Disable Trilinear Filtering for optimized stages: I also recommend enabling this option.
=Compatibility=
Enable Fog Table Emulation: Leave this setting enabled.
=Antialiasing=
-I highly recommend that you enable FSAA 2x or 4x (if your system/game can still handle it). The GF4 Ti is, even now, fast enough to run games with FSAA enabled. You may also use the Quincunx option; its quality is comparable to 4x FSAA but its performance hit is only about the same as 2x FSAA.
**Enabling FSAA in D3D games sometimes makes the game's textures look blurry, especially if you're using Quincunx or 4x-9tap. The only way to combat this is to lower the LOD Bias value (see above). But as you decrease it, performance starts to become a problem. So if you can't stand how blurry the game looks, I recommend using a higher resolution instead.
Force AntiAliasing in all Direct3D applications: Select this option to enable FSAA for all D3D games. If graphics corruption becomes a problem, disable this setting.
Enable Texture Sharpening: Another way to combat blurry-looking images when using FSAA. It is NOT, however, recommended in conjunction with 40.xx Detonator drivers.
Enable Multisample Masking: This setting is said to increase D3D FSAA performance, but it may cause DX7 games to run slower. Only enable it for DX8 (and above) games.
=OpenGL=
=Mipmapping=
**Same as my D3D explanation above.
=Vsync=
Synchronization with vertical retrace:
-Set to "off by default" for better performance (recommended).
-Set to "on by default" if you're experiencing tearing in OpenGL apps/games that don't have an option to enable vsync.
Pre-render limit: Set to "5" (or higher)
for better performance. If u experience lags then give it a value lower than 5.
=Back/Depth Buffering=
It's better to leave this at its default setting of "auto select".
=Rendering Quality=
Texture and S3TC Quality settings: Leave them at their default values (both boxes unticked).
Anisotropic Filtering Quality Preferences: Set to "performance optimization" for better performance with a very slight to unnoticeable decrease in graphics quality. If you're sensitive enough to notice the difference, switch to "quality optimization".
Default Degree of Anisotropy: Set to "Level 8". Level 4 and lower produce lower-quality graphics but increase performance.
=Compatibility=
OpenGL Hardware Acceleration Mode: Set to "max acceleration mode" for better performance.
Disable support for CPU enhanced instruction sets: disable!
Amount of memory for PCI textures: Set this to the lowest possible setting. It is only used by PCI video cards.
=Antialiasing=
**Same as my D3D explanation above.
**But unlike DirectX, OpenGL FSAA doesn't have the blurry side effect when enabling Quincunx or 4x-9tap. You might still notice a slight blur (especially if your LOD Bias is set at zero and Texture Sharpening is disabled), but it doesn't ruin the game's graphics and might even be unnoticeable to some.
=Misc. Tips / Info=
Recommended driver versions:
**I am aware that there is no single correct answer to the question "what is the best driver for my video card?". But seeing people, myself included, having the same luck with the following driver versions makes them worthy of a recommendation:
Win98/SE: Try Omega 43.45 or NVIDIA's 43.45 (both perform approximately the same)
Win2k: Try Omega 44.03
WinXP: Try 44.67 Beta (yes, it's a beta)
=LOD Bias +/- 15=
There is a way to break past the +/-3 limit of the LOD Bias Adjustment. Go to the Power user tab > click on Rivatuner\Detonator\Global > give the LODBiasRange entry a value of "15".
*Note that any value higher than 15 is ignored by the drivers. The performance hit of giving the LOD Bias a value of -15 is quite drastic, but the graphics quality increase is noticeable.
=Enable Digital Vibrance=
I recommend that you enable Digital Vibrance and give it a "low" value for richer, more vibrant-looking colors!
=Rivatuner LOD Bias Fix=
As mentioned above, the 23.10 (and later) drivers are said to cause Rivatuner to set improper negative LOD Bias settings. To fix this, head into Rivatuner's folder, \Rivatuner\PatchScripts\NVIDIA, then into the LODBiasFix folder, and open the win9x file (for Win98/ME) or the win2k file (for WinNT/2000/XP). A description is shown before the patch is applied. If you really want to update your file, simply hit the "Continue" button and browse your hard drive to where the indicated file is located (you can use the Windows search function for this). The patch automatically creates a copy with an ".old" file extension after it is applied, so if something goes wrong you can always revert to your original file.
**There are also other patches located inside Rivatuner's PatchScripts folder. An explanation is shown automatically when you click on them. Installing them is up to you. Good luck!
=Omega Drivers=
This driver is said to increase NVIDIA cards' performance and/or graphics quality. I tried it myself and got a pretty impressive performance boost with it.
I recommend that you try all the drivers available for your OS, installing and benchmarking your system with both the "Performance" and "Quality" installations. Then compare your scores with NVIDIA's default driver settings (against Omega's "Performance" installation) and with LOD Bias set at -0.8 and AF set at 8x (against Omega's "Quality" installation). A small helper for tabulating the comparison is sketched below.
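**If you want to keep the comparison organized, here is a tiny helper (my own addition, not part of Rivatuner or the Omega drivers) for tabulating the runs; the scores are placeholders for your own benchmark results.

    # Fill in your own results from whatever benchmark you use.
    scores = {
        "NVIDIA default":           0,
        "NVIDIA, LOD -0.8 + 8x AF": 0,
        "Omega Performance":        0,
        "Omega Quality":            0,
    }

    baseline = scores["NVIDIA default"]
    for name, score in sorted(scores.items(), key=lambda kv: kv[1], reverse=True):
        print(f"{name:26s} {score:6d}  ({score - baseline:+d} vs NVIDIA default)")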
Be reminded, however, that it is not recommended to use third-party tweaking software (like Rivatuner) while running the Omega drivers. That is because the extra tweaks the author applied to the drivers are not recognized by third-party tweaking programs, so opening or using such programs will simply revert some of the tweaked settings to NVIDIA's defaults and render the tweaked drivers useless. According to the author, the only settings that can be altered without any bad effect on the Omega drivers are FSAA, overclocking, and vsync. In my opinion, though, you can also change the LOD Bias with no harmful effect on performance (unless of course you set it low <negative>, which will lower your card's performance, and vice versa).
Additional notes: don't be alarmed if you use Rivatuner to view the "Textures" tab (under D3D) or the "Compatibility" tab (under OpenGL) and the program displays an error. Simply close the error message box and press "Cancel" (to close the program) instead of "OK". Pressing "OK" will only force the driver back to NVIDIA's default settings or your tweaking program's settings.
According to the author of the driver, the differences between the Performance and Quality installations are:
-Performance: the same graphics quality as NVIDIA's driver, but with additional features and a slight speed advantage.
-Quality: slightly slower than NVIDIA's driver but with better-quality textures (I'm using this setting myself; I can't notice any difference compared to NVIDIA's 8x AF setting, but the benchmark score on this one is approximately 3k higher!).
Version 1.4523 is said to run only in Performance mode under D3D due to coding problems. I do believe him, but I think he forgot to revert the LOD Bias setting to zero instead of -0.8 (which is the default setting for the Quality installation).
=Starforce Drivers=
Yet another "tweaked" NVIDIA drivers. This one only works
for Win2k/XP OS. Its effectiveness IMO is a couple of notches lower than Omega's, but on the bright side this one gets newer
versions A LOT faster than Omega. So if ur looking for a wider range of tweaked 5x.xx drivers, u can try and check this
one out.
Last Updated: 050704