
NOTNlCE

macrumors 65816
Original poster
Oct 11, 2013
1,087
476
Baltimore, MD
This problem only occurs under Windows, so I'm posting this thread here. I have a Mac Pro 3,1 running Windows 7, with both a PC NVIDIA GTX 660 and the original Mac Radeon 2600 XT installed. The Radeon powers my VGA display, since the 660 doesn't seem to like driving VGA displays under OS X. This configuration works fine under OS X: the main DVI display runs off the GTX 660, and the VGA runs as a secondary off the 2600 XT, which also gives me an EFI boot menu if I need it.

NVIDIA released new drivers a few days ago, so earlier today I booted into Windows to install them, and to my dismay, Windows decided to use the 2600 XT as the display adapter. Under Device Manager, the Radeon shows up as a "VGA Compliant Video Adapter," while the GTX 660 is reported as functioning but will not display a picture on my DVI monitor. Oddly enough, if I boot with just the GTX 660 installed and both the VGA and DVI monitors plugged into it, everything works fine.

So my question is: is there a way to keep my current configuration and boot into Windows WITHOUT switching my VGA display over to the GTX 660 (which involves climbing behind my Mac, or even worse, removing the Radeon card)? Any ideas or workarounds would be great. Windows isn't a priority for me, and I'm pretty satisfied with my OS X setup, but if I can get everything working the way it does in OS X, that's just that much better.
Thanks in advance!
-N

EDIT: Still no fix for VGA out in OS X, but a few driver updates in Windows allow both screens to run as they do in OS X.