Recently upgraded Graphics Card .. now no Dual Screen


7 replies to this topic

#1 synergy513

synergy513

  • BC Advisor
  • 1,057 posts
  • OFFLINE
  •  
  • Gender:Male
  • Location:Florida
  • Local time:06:02 PM

Posted 02 May 2013 - 11:53 PM

Hello BC,

 

I recently installed an AMD Radeon HD 5550 card in my Intel 915GRO MoBo and bought one of those VGA splitter cables to enable dual display. My primary monitor is working fine, and my secondary monitor gets a signal through the boot process, but as soon as I reach the login splash screen, the secondary monitor goes blank with an error message saying "Cannot Display This Video Mode". My old GPU (GeForce 210) did just fine with dual display using an HDMI (or is it DVI? the one with multiple pins in a trapezoid) to VGA adapter, but now that same adapter must be old and decrepit because its pin config doesn't match the new HD 5550. Has anyone else faced this kind of dysfunction? I have looked in the AMD Catalyst Help and the XP Device Manager, and the secondary monitor isn't being recognized. SIW isn't noticing the second one either; it just says something to the tune of "NetMeeting".

 

I am on XP 32-bit.
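
In case it helps anyone chasing the same problem, here is a rough way to double-check what Windows itself thinks is attached, using the Win32 EnumDisplayDevices call from Python with ctypes. The little script below is just my own illustration (it is not part of Catalyst, SIW, or Device Manager), but the API call is standard:

    # List the display adapters/outputs Windows knows about and whether each
    # one is attached to the desktop (StateFlags bit 0x1).
    import ctypes
    from ctypes import wintypes

    class DISPLAY_DEVICE(ctypes.Structure):
        _fields_ = [("cb", wintypes.DWORD),
                    ("DeviceName", wintypes.WCHAR * 32),
                    ("DeviceString", wintypes.WCHAR * 128),
                    ("StateFlags", wintypes.DWORD),
                    ("DeviceID", wintypes.WCHAR * 128),
                    ("DeviceKey", wintypes.WCHAR * 128)]

    user32 = ctypes.windll.user32
    i = 0
    while True:
        dev = DISPLAY_DEVICE()
        dev.cb = ctypes.sizeof(dev)
        if not user32.EnumDisplayDevicesW(None, i, ctypes.byref(dev), 0):
            break
        attached = "attached" if dev.StateFlags & 0x1 else "not attached"
        print("%s - %s (%s)" % (dev.DeviceName, dev.DeviceString, attached))
        i += 1

It only reads information and changes no settings, so it is safe to run from a command prompt with any Python install and compare against what Catalyst and SIW report.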

 

                   Thanks everyone.

 

                                                    Gentle Gentile Abe Froman


Moore's Law : 4d Graph in Progress



 


#2 GreenGiant117

GreenGiant117

  • Members
  • 294 posts
  • OFFLINE
  •  
  • Gender:Male
  • Local time:07:02 PM

Posted 03 May 2013 - 07:22 AM

Why are you using a VGA splitter? Why not just plug one monitor into each output?

 

Using a splitter like that will either make your computer see one monitor (if it's set up for that), or it will see a "generic" display, and you won't have specific drivers for either.

 

The pinouts haven't changed; the only thing added was audio, and that is over HDMI (what you are describing is DVI).



#3 synergy513

synergy513
  • Topic Starter

  • BC Advisor
  • 1,057 posts
  • OFFLINE
  •  
  • Gender:Male
  • Location:Florida
  • Local time:06:02 PM

Posted 04 May 2013 - 04:29 AM

Thanks for the reply, GG.

 

The way I had the dual displays working with my old GPU (GeForce 210) was exactly like that. I had the primary monitor plugged into the VGA port on the card, and I had the DVI-to-VGA adapter plugged into the card for the second monitor. For whatever reason, that same adapter doesn't match up with the DVI port on my new GPU (HD 5550). I looked long and hard with a magnifying glass and flashlight, and the DVI pin configuration on the new card is a standard array of 8 pins across by 3 rows, but the pin configuration beside it is not the same as on the adapter. I wish I could shoot a few images and attach them. The card's DVI port has just one horizontal slot, whereas the adapter has the same long slot plus 4 pins, 2 on top and 2 on the bottom. I know this might not make much sense without images attached. I will ask my daughter to shoot some photos when she gets out of bed. I thought the VGA splitter cable was the answer, but apparently not.

 

Also, what is the recommended hardware to get those dual displays running? My new card definitely supports multiple displays. And why does my OS see the second monitor during boot but then blank out at the login splash? That is what has me curious.

 

I have my primary monitor's driver disc; it is a Samsung 19-inch B2030, 1600x900. My secondary monitor is a Dell 16-inch. I also have a Dell 19-inch that I keep as a spare, but it isn't as bright as my little 16-incher that I would love to get functioning.

 

                          Thanks.


Moore's Law : 4d Graph in Progress


#4 dpunisher

dpunisher

  • BC Advisor
  • 2,234 posts
  • OFFLINE
  •  
  • Gender:Male
  • Location:South TX
  • Local time:06:02 PM

Posted 05 May 2013 - 12:12 PM

The DVI output on your new card likely doesn't have a hybrid digital/analog (DVI-I) configuration like your old card.

 

The DVI-I has 4 little pins around the lone spade connector. Without those 4 pins... no VGA for you. I will try to find a pic and post back.

 

Edit:  http://en.wikipedia.org/wiki/Digital_Visual_Interface


Edited by dpunisher, 05 May 2013 - 12:19 PM.

I am a retired Ford tech. Next to Fords, any computer is a piece of cake. (The cake, its not a lie)

3770K @4.5, Corsair H100, GTX780, 16gig Samsung, Obsidian 700 (yes there is a 700)


#5 synergy513

synergy513
  • Topic Starter

  • BC Advisor
  • 1,057 posts
  • OFFLINE
  •  
  • Gender:Male
  • Location:Florida
  • Local time:06:02 PM

Posted 05 May 2013 - 02:16 PM

OK, I saw the article, and my new card's DVI port is definitely missing the 4 pins, unlike my old card, which had the 4 pins around the spade. It looks like my new card is DVI-D dual link, whereas my adapter and my old card are DVI-I dual link. I am guessing DVI-D is just for digital signals, such as a TV or the like. I would still like to get my second monitor working with the VGA splitter cable; it is getting a signal from it during boot-up, but nothing after that.

 

Thanks, DP and GG.


Moore's Law : 4d Graph in Progress


#6 synergy513

synergy513
  • Topic Starter

  • BC Advisor
  • 1,057 posts
  • OFFLINE
  •  
  • Gender:Male
  • Location:Florida
  • Local time:06:02 PM

Posted 05 May 2013 - 03:06 PM

Upon further reading, the DVI-D interface requires an active VGA converter (in other words, a simple 8-dollar adapter won't fly), which looks to cost over $100. I might find one second hand. So, if anyone is in the market for a new graphics card, please verify that your DVI isn't a -D and avoid the mistake that I made, if you wish to use your DVI port as a VGA monitor port.


Moore's Law : 4d Graph in Progress


#7 synergy513

synergy513
  • Topic Starter

  • BC Advisor
  • 1,057 posts
  • OFFLINE
  •  
  • Gender:Male
  • Location:Florida
  • Local time:06:02 PM

Posted 02 June 2013 - 05:40 PM

As luck would have it, my Samsung B2030 monitor has a dual-link DVI-D input jack on the back, so I recently ordered a DVI-D male-to-male cable for a paltry $7. It should arrive in a couple of days.

 

So now I can hopefully use my HD 5550's VGA jack for my second monitor and use the DVI-D for my primary monitor. That was a pleasant surprise. I thought my dual-screen aspirations were dead unless I bought the $100-plus DVI-D to analog VGA converter, which would have been a way down the road.


Moore's Law : 4d Graph in Progress


#8 synergy513

synergy513
  • Topic Starter

  • BC Advisor
  • 1,057 posts
  • OFFLINE
  •  
  • Gender:Male
  • Location:Florida
  • Local time:06:02 PM

Posted 03 June 2013 - 03:17 PM

OK, my $7 DVI-D male-to-male cable worked. My bright 15-inch monitor had no DVI-D socket on it, so I went with my secondary backup, a Dell 1907FPV, which has a DVI-D socket. I had to go into the monitor's internal OSD settings and switch it to digital input. It took some tweaking inside the AMD Catalyst settings and XP, but the configuration finally sailed. Now I am working with my dual displays, and my HD 5550 is staying at a cruising cool 38 °C (about 100 °F). If anyone else finds themselves in the dilemma I found myself in, then hopefully this thread helps them as much as it helped me.
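
For anyone wanting a quick sanity check that Windows is really running an extended desktop across both screens (and not just cloning one), the GetSystemMetrics call can report the monitor count and the size of the combined desktop. Again, this is only an illustrative Python/ctypes snippet of my own, not anything from Catalyst:

    # Report how many monitors Windows sees and the size of the virtual desktop.
    import ctypes

    SM_CXVIRTUALSCREEN = 78   # width of the combined desktop, in pixels
    SM_CYVIRTUALSCREEN = 79   # height of the combined desktop, in pixels
    SM_CMONITORS = 80         # number of monitors attached to the desktop

    user32 = ctypes.windll.user32
    print("Monitors: %d" % user32.GetSystemMetrics(SM_CMONITORS))
    print("Virtual desktop: %d x %d" % (
        user32.GetSystemMetrics(SM_CXVIRTUALSCREEN),
        user32.GetSystemMetrics(SM_CYVIRTUALSCREEN)))

With the 1600x900 Samsung and the Dell side by side, it should report 2 monitors and a desktop wider than 1600 pixels.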

 

                   Thanks everyone


Moore's Law : 4d Graph in Progress




