UltraSmooth
Programmer
Whenever I try to enable a secondary display on my laptop, Windows detects the monitor, but then the desktop just flashes and the external screen goes black, as if it's stuck in an auto-detect loop. I have tried multiple monitors and get the same result. What's even stranger is that it worked the very first time I tried, and it has not worked once since. I have updated to the latest drivers available on Nvidia's website, and Windows 7 is fully up to date as well. I have tried using the external monitor as the primary screen, duplicating the desktop, and just extending it, all via the Win+P shortcut.
My setup is a Dell Vostro 1720 laptop with an Nvidia 9600M GS graphics chip, running Windows 7 x64.
Does anyone have any ideas?