We are writing this blog post to bring broader attention to some important image quality findings recently uncovered by top technology websites including ComputerBase, PC Games Hardware, TweakPC, and 3DCenter.org. They all found that changes introduced in AMD’s Catalyst 10.10 default driver settings increased performance at the cost of image quality. These changes in AMD’s default settings do not permit a fair apples-to-apples comparison against NVIDIA’s default driver settings. NVIDIA GPUs provide higher image quality at default driver settings, which means comparative AMD vs. NVIDIA testing methods need to be adjusted to compensate for the image quality differences.
Getting directly to the point, the major German tech websites ComputerBase and PC Games Hardware (PCGH) both report that they must use the “High” Catalyst AI texture filtering setting for AMD 6000 series GPUs, instead of the default “Quality” setting, in order to provide image quality that comes close to NVIDIA’s default texture filtering setting. 3DCenter.org reports a similar finding, as does TweakPC. The behavior was verified across many game scenarios. According to ComputerBase, AMD gains up to a 10% performance advantage by lowering its default texture filtering quality.
AMD’s optimizations weren’t limited to the Radeon 6800 series. According to the review sites, AMD also lowered the default AF quality of the HD 5800 series when using the Catalyst 10.10 drivers, such that users must disable Catalyst AI altogether to get default image quality closer to NVIDIA’s “default” driver settings.
Going forward, ComputerBase and PCGH both said they would test AMD 6800 series boards with Cat AI set to “High”, not the default “Quality” mode, and they would disable Cat AI entirely for 5800 series boards (based on their findings, other 5000 series boards do not appear to be affected by the driver change).
For those with long memories, NVIDIA learned some hard lessons from GeForce FX and 3DMark03 optimizations gone bad, and vowed never again to perform any optimization that could compromise image quality. At that time, the industry agreed that any optimization that improved performance without altering image quality was a valid “optimization”, while any optimization that improved performance but lowered image quality without letting the user know was a “cheat”. Special-casing of testing tools should also be considered a “cheat”.
Hehehe, nice. They are crafty, you have to give them that. First, the 6850/6870 is cheaper for them to manufacture than the GTX 460, and as if that weren’t enough, they shaved off a bit of image quality by default in the drivers (and asked reviewers not to use H.A.W.X. 2 as a benchmark), something we used to do back in the day with NVIDIA Vanta cards to show off our FPS in UT.
And then buyers are supposed to make a decision based on reviews where you never know who is pressuring whom. NVIDIA is far from innocent here either, having pressured reviewers to include factory-overclocked 460s (the eVGA 460 FTW!) in comparative tests, which plenty of them did.
And then, on top of examining all the internals (shader unit counts, memory bandwidth, clock speeds, ROPs, TDP), you also have to consider whether AMD or NVIDIA are tweaking their drivers for specific games (Crysis favors AMD; HAWX, FC2, etc. favor NVIDIA). Plus, if you are looking to the future, you have to ask whether games packed with heavy tessellation will bring your 200/250 EUR (or pricier) graphics card to its knees :)
Cheat, lie, bribe, anything for revenue, eh? :D