I've just upgraded my graphics card to an Nvidia GTX 260, which has two DVI ports on the back. Beforehand I had a GT240 with a VGA and an HDMI out. My main monitor, which has only a single VGA port on its back, used the VGA port on the 240. The monitor in question is a DGIM, model L-1931WS.
The second monitor is a Dell (I can't find the model number). It has a VGA in and a DVI-D in. I was previously using it as a secondary monitor with a DVI-to-HDMI adapter plugged into the HDMI out on the GT240, with the DGIM monitor plugged into the VGA port on the 240 with a plain VGA cable.
I got a passive DVI-to-VGA adapter to use my main monitor with the new 260 card, but the monitor receives no signal from the computer. I'm also using a signal splitter to send the same output to my living-room TV, and the signal does reach the TV; it just doesn't reach the monitor.
If I plug the Dell monitor into the same VGA plug that was in the DGIM monitor, the signal comes through fine. I already understand some of this: I'm fairly sure it's down to the Dell having a DVI-D input on its back, meaning it can receive the DDC data from the VGA cable through the adapter, whereas the DGIM monitor can't, as it only supports an analogue signal.
I'm unclear on the specifics of whether the fault lies in the adapter, the graphics card's DVI port (I can't tell whether it's DVI-D, -A, or -I right now), or whether it's simply down to the DGIM monitor's restriction to VGA input. Though I have a hunch the right kind of adapter *can* bridge the gap and successfully convert digital to analogue.
Could I have some clarification on what I need to do to overcome this? (A new adapter, a new monitor, etc.? Using this rather small screen is quite cumbersome :D )
Edited by Psycho Nomad, 13 July 2013 - 06:25 PM.