TheGoatMan
Vendor
OK, let's see if I'm clear on this. ATI uses 24-bit precision for all of its rendering, while Nvidia uses 16-bit and 32-bit. So if, for example, a game is optimized for ATI, it would run in 24-bit mode, while an Nvidia card such as the FX 5900 would render it in 16-bit mode, making it look worse because of the lower precision, right? Now if it was optimized for Nvidia, chances are it would run in 32-bit, and the FX theoretically would render better images? My friend says that no matter what, an ATI card will always have better image quality, even though 32-bit is higher precision. Please explain to me how he might get this notion.
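
To illustrate the precision side of the question, here is a minimal Python sketch (assuming NumPy is available; float16 and float32 stand in for the 16-bit and 32-bit shader formats, and ATI's 24-bit format has no direct NumPy equivalent) showing how a repeated shader-style accumulation drifts further from the exact answer at lower precision:

```python
import numpy as np

# Simulate a simple shader-style computation (many small blended contributions)
# at different floating-point precisions. float16 and float32 are stand-ins for
# the 16-bit and 32-bit shader formats discussed above.
def accumulate(dtype, steps=1000):
    color = dtype(0.0)
    step = dtype(0.001)  # small per-pass contribution to a color channel
    for _ in range(steps):
        color = dtype(color + step)  # each add is rounded to the chosen precision
    return float(color)

exact = 1000 * 0.001  # mathematically exact result: 1.0
for dtype in (np.float16, np.float32):
    result = accumulate(dtype)
    print(f"{dtype.__name__}: result={result:.6f}, error={abs(result - exact):.6f}")
```

Running this shows the float16 version accumulating visible rounding error while float32 stays essentially exact, which is the kind of difference people point to when comparing 16-bit and 32-bit shader precision.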