I've been experiencing a strange problem where OpenGL Extensions Viewer detects my renderer as my video card properly (Nvidia GeForce Go 7950 GTX), but some other benchmarks/demos/games do not, instead defaulting to "GDI Generic".

OpenGL Extensions Viewer runs tests verifying me up to OpenGL 2.1 with good performance and 100% compatibility. FurMark, another benchmark, does not detect OpenGL 2.0 and defaults to GDI Generic. I have been told that FurMark uses the same initialization code as the following ( ) and that this detection/initialization process is quite standard.

I am wondering if this is related to my system. I am running Windows XP Pro 64-bit with two GeForce 7950s running in SLI. I have verified that I have the latest ForceWare drivers (x64 for my system). I have hardware acceleration set to full in my Windows preferences, and I have tried uninstalling the drivers, running Driver Sweeper, and re-installing. Is there something in particular here that could be going wrong?

Here is an excerpt from OpenGL Extensions Viewer:

Renderer: GeForce Go 7950 GTX/PCI/SSE2
Shading language version: 1.20 NVIDIA via Cg compiler
OpenGL driver version check (Current: 6., Latest known: 6.): According to the database, you are running the latest display drivers for your video card.

This feature improves texture mapping performance in some applications by using lossy compression. This feature provides an alternate method of coloring specular highlights on polygons. This feature accelerates complex rendering such as lightmaps or environment mapping. This feature improves OpenGL performance by using video memory to cache transformed vertices.