
Image rendering


TheGoatMan (Vendor) · Nov 11, 2003
OK, let's see if I'm clear on this. ATI uses 24-bit precision for all of its image rendering, while Nvidia uses 16-bit and 32-bit. So if, for example, a game is optimized for ATI, it would run in 24-bit mode, while an Nvidia card such as the FX5900 would render it in 16-bit mode, making it look worse because of the lower precision, right? Now, if it was optimized for Nvidia, chances are it would use 32-bit, and the FX theoretically would render better images? My friend says that no matter what, an ATI card will always have better image quality, even though 32-bit offers more precision. Please explain to me how he might get this notion.
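To make the precision question concrete, here is a minimal sketch (not from the original post) that uses NumPy's float16 and float32 as stand-ins for a 16-bit and a 32-bit shader path. It shows how lower precision quantizes a smooth range of color values more coarsely, which is what can show up as banding or loss of detail on screen. ATI's 24-bit format has no direct NumPy equivalent, so it is left out; the exact numbers of distinct levels are illustrative, not measurements of any particular card.

# Minimal sketch, assuming float16/float32 as proxies for 16-bit and
# 32-bit shader precision (ATI's 24-bit format has no NumPy equivalent).
import numpy as np

# A smooth gradient of intensities between 0.0 and 1.0
gradient = np.linspace(0.0, 1.0, 10_000, dtype=np.float64)

fp16 = gradient.astype(np.float16)  # 10-bit mantissa
fp32 = gradient.astype(np.float32)  # 23-bit mantissa

# How many distinct levels survive at each precision
print("distinct FP16 levels:", np.unique(fp16).size)
print("distinct FP32 levels:", np.unique(fp32).size)

# Worst-case rounding error introduced by each format
print("max FP16 error:", np.abs(fp16.astype(np.float64) - gradient).max())
print("max FP32 error:", np.abs(fp32.astype(np.float64) - gradient).max())

Running this, the float16 copy collapses many nearby gradient values onto the same representable number and carries a much larger worst-case rounding error than the float32 copy, which is the general intuition behind "more bits of precision, smoother image".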
 
