
DVI to VGA adapter not working on my main (of two) monitors


3 replies to this topic

#1 Psycho Nomad

  • Members
  • 21 posts

Posted 13 July 2013 - 06:23 PM

Hi there!

 

I've just upgraded my graphics card to an Nvidia GTX 260, which has two DVI ports on the back. Beforehand I had a GT240 with a VGA and an HDMI out. My main monitor, which has only a single VGA port on its back, used the VGA port on the 240. The monitor in question is a DGIM, model L-1931WS.

 

The second monitor is a Dell (I can't find the model number). It has a VGA in and a DVI-D in. I was previously using a DVI-to-HDMI adapter to connect it to the HDMI out on the GT240 as a secondary monitor, with the DGIM monitor plugged into the VGA out on the 240 via a plain VGA cable.

 

I got a passive DVI-to-VGA adapter to use my main monitor with the new 260 card, but the monitor won't receive a signal from the computer. I'm also using a signal splitter to send the same output to my living room TV, and the signal reaches the TV. It does not reach the monitor.

 

If I plug the Dell monitor into the same VGA plug that was in the DGIM monitor, the signal comes through fine. I already understand some of this: I'm fairly sure it's down to the Dell having a DVI-D in on its back, meaning it can handle the DDC handshake from the VGA cable through the adapter, whereas the DGIM monitor cannot, as it only supports an analogue signal.
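
For the curious, DDC is the I2C channel a monitor uses to send its EDID block back to the graphics card, and byte 20 of that block declares whether the panel expects a digital or an analogue input. Here's a minimal sketch of reading that flag, assuming a Linux machine where the kernel exposes the raw EDID per connector (the /sys path is just an example and varies by card and port):

```python
# Check whether a monitor's EDID declares a digital or analogue input.
# EDID layout per the VESA EDID 1.3 spec; DDC is the channel that
# carries this block from the monitor to the graphics card.

EDID_HEADER = bytes([0x00, 0xFF, 0xFF, 0xFF, 0xFF, 0xFF, 0xFF, 0x00])

def describe_edid(edid: bytes) -> str:
    if len(edid) < 128 or edid[:8] != EDID_HEADER:
        return "not a valid EDID block"
    # Byte 20 (offset 0x14): bit 7 set = digital input, clear = analogue.
    input_type = "digital" if edid[20] & 0x80 else "analogue"
    return f"monitor declares a {input_type} input"

# Example path on Linux; the connector name varies by card and port.
with open("/sys/class/drm/card0-DVI-I-1/edid", "rb") as f:
    print(describe_edid(f.read()))
```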

 

I'm unclear as to the specifics of whether the fault lies in the adapter, the graphics card's DVI port (I can't see whether it is DVI-D, DVI-A, or DVI-I right now), or simply in the DGIM monitor's restriction to VGA in. Though I have a hunch the right kind of adapter can bridge the gap and successfully convert digital to analogue.

 

Could I have some clarification on what I need to do to overcome this? (New adapter, new monitor, etc. Using this rather small screen is quite cumbersome. :D)

 

Thanks.


Edited by Psycho Nomad, 13 July 2013 - 06:25 PM.


#2 synergy513

  • BC Advisor
  • 1,066 posts
  • Gender:Male
  • Location:Florida

Posted 14 July 2013 - 01:13 PM

OK, there is good news.

There are DVI-D cables available for purchase online for cheap. My secondary Dell monitor has the same getup: an analog VGA input and a DVI-D input. My primary Samsung monitor is using the analog VGA out on the back of my card; my other card output is DVI-D, without the four pins around the wide pin. A DVI-I port is the ideal thing to have coming out of your GPU: it has the four pins around the wide pin and can go both ways, analog VGA with a passive adapter, or digital. Passive adapters are fine for adapting that to an analog VGA input on your monitor. If those four pins are missing on the DVI port on your video card, then you are digital DVI output only, like I am. No need to buy an expensive digital-to-analog converter; just get a cheap-o DVI-D cable. You will, however, need to go into your Dell monitor's on-screen display (OSD) and switch it to digital if you go this DVI-D route. Do this using the buttons (menu, plus, minus, up, down, select, etc.).


Moore's Law : 4d Graph in Progress


#3 Psycho Nomad

  • Topic Starter
  • Members
  • 21 posts

Posted 17 July 2013 - 12:53 PM

Thanks for your reply, Synergy. My Dell is my secondary, smaller monitor (15 inches or thereabouts). That is the monitor with the DVI in (it also has a VGA in).

 

The monitor I'm intending to get working properly as the main monitor is the DGM monitor, which has only a VGA in. I have, however, stumbled across getting it to work, and it does seem to turn on and run fine, so I know the DVI-adapter-to-VGA setup in my case works... sort of. The problem is that every time I turn off the computer, or it goes into sleep mode, the screen reliably reverts to thinking it's not plugged in when it next needs to wake.

 

What happens is the power LED at the bottom blinks every few seconds. Leading up to each blink there's a very faint, very high-pitched (15/20 kHz at least, way up there; the typical electrical charging sound) power-buildup whine, as though the monitor is trying to turn on; at the blink there's a tiny 'pip' sound, then a moment of silence, and the process repeats perpetually.

 

If I plug the VGA cable into the motherboard's onboard VGA out, the monitor turns on straight away but goes into power-saving mode due to no signal; this stops the electrical build-up and discharge, and the LED changes to a constant yellow (standby mode). Unplugging the VGA returns the monitor to its cycling, 'trying to turn on' green-blink state. It's in the same state whether it's plugged into the DVI adapter or into nothing.
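
One thing worth checking is whether the DGM's EDID has ever actually made it through the adapter to the PC. Windows caches an EDID blob in the registry for every monitor it has enumerated, so a rough sketch like the one below (assuming Windows and the standard-library winreg module) shows history rather than live status:

```python
# List the EDID blobs Windows has cached under
# HKLM\SYSTEM\CurrentControlSet\Enum\DISPLAY. Entries persist for
# previously connected monitors, so a cached EDID means the handshake
# worked at least once; it says nothing about the current connection.

import winreg

ROOT = r"SYSTEM\CurrentControlSet\Enum\DISPLAY"

def subkeys(key):
    """Yield the names of a registry key's subkeys."""
    i = 0
    while True:
        try:
            yield winreg.EnumKey(key, i)
        except OSError:  # no more subkeys
            return
        i += 1

with winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE, ROOT) as display:
    for model in subkeys(display):
        with winreg.OpenKey(display, model) as model_key:
            for instance in subkeys(model_key):
                try:
                    with winreg.OpenKey(
                            model_key, instance + r"\Device Parameters") as p:
                        edid, _ = winreg.QueryValueEx(p, "EDID")
                    print(f"{model}\\{instance}: {len(edid)}-byte EDID cached")
                except OSError:
                    print(f"{model}\\{instance}: no EDID cached")
```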

 

Using my TV through the signal splitter, I can actually see the screen while the monitor is still plugged into the computer. I've gone into the resolution/refresh-rate settings and messed about, unplugging and replugging the DVI adapter, trying different combinations, and also switching the Windows key + P options for choosing which Windows display profile to use. For whatever reason the monitor will decide to randomly work, or not, as the case often is.

I have done extensive testing of which steps trigger the monitor to accept the DDC handshake from the card, but I honestly cannot find any pattern to when it wants to work and when it doesn't. I have got it to work 10% of the time at best, and recently I have not been able to get it to work at all, even after doing the exact same things in the same order as what made it work the first time. It's extremely confusing and quite frustrating, to say the least!
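
To separate "Windows doesn't see the monitor at all" from "the monitor isn't accepting the signal", a quick live check of what the OS is currently enumerating can help. Here's a sketch using ctypes against the Win32 EnumDisplayDevices API; it only reports what Windows sees and can't fix the handshake itself:

```python
# Enumerate display adapters and the monitors attached to them via the
# Win32 EnumDisplayDevices API, reporting which are active right now.

import ctypes
from ctypes import wintypes

class DISPLAY_DEVICEW(ctypes.Structure):
    _fields_ = [
        ("cb", wintypes.DWORD),
        ("DeviceName", wintypes.WCHAR * 32),
        ("DeviceString", wintypes.WCHAR * 128),
        ("StateFlags", wintypes.DWORD),
        ("DeviceID", wintypes.WCHAR * 128),
        ("DeviceKey", wintypes.WCHAR * 128),
    ]

DISPLAY_DEVICE_ACTIVE = 0x00000001  # device is part of the desktop

def enum_devices(parent=None):
    """Yield adapters (parent=None) or the monitors on one adapter."""
    i = 0
    while True:
        dev = DISPLAY_DEVICEW()
        dev.cb = ctypes.sizeof(dev)
        if not ctypes.windll.user32.EnumDisplayDevicesW(
                parent, i, ctypes.byref(dev), 0):
            break
        yield dev
        i += 1

for adapter in enum_devices():
    print(adapter.DeviceName, "-", adapter.DeviceString)
    for monitor in enum_devices(adapter.DeviceName):
        active = bool(monitor.StateFlags & DISPLAY_DEVICE_ACTIVE)
        print("    ", monitor.DeviceString, "| active:", active)
```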

What are your opinions on this? What's the best course of action? Keep trying? What do you think is going on with the monitor? I've not come across this before.


Edited by Psycho Nomad, 17 July 2013 - 12:56 PM.


#4 synergy513

  • BC Advisor
  • 1,066 posts
  • Gender:Male
  • Location:Florida

Posted 21 July 2013 - 05:44 PM

I went like WHOA.

I am not sure about the display configuration you are aspiring to. I just know that you can take a good look at the ports on the back of your graphics card and avoid shelling out over $100 USD for a digital-to-analog converter, by running a cheap DVI-D cable out of the DVI-D port on the card and into the back of your monitor like I do. I just had to go into the internal monitor settings and switch it to digital.

 

I thought all graphics cards had the three standard ports: VGA (analog), DVI-I (with the four pins around the wide pin), and HDMI.

 

The DVI-I can convert to analog VGA output using a simple passive adapter.

 

My Visiontek HD 5550 came with a non-standard DVI-D port (without the four pins around the wide pin) as well as the standard VGA and HDMI.

 

I panicked... I tried to use a VGA splitter cable to route to my Dell 15-inch, which only has a VGA input. No luck.

 

I recoiled when I thought I would have to pony up over a hundred dollars for an active digital-to-analog converter. So I didn't.

 

I noticed that my Samsung B2030 had a DVI-D port on the back as well as the standard VGA input. I thought I was going to use my 15-inch as my primary and my 20-inch as my secondary using a DVI-D cable.

 

Then I remembered I had a Dell 19-inch in storage. I pulled it out and, lo and behold, it had a DVI-D input also. So now the 19-inch is my secondary monitor using the DVI-D, and my Samsung 20-inch is my primary using the VGA. All I had to do was go into the Dell 19-inch monitor's internal menu and change it from analog to digital, and I am loving life with my dual screens again.

 

This is what I can bear witness to.


Moore's Law : 4d Graph in Progress




