
New monitor won't connect via HDMI



#1 thepokey

  • Members
  • 88 posts

Posted 10 December 2017 - 02:45 AM

Hey guys, hope you're able to shed some light on this!

 

I just bought a BenQ GW2270 monitor and am trying to set it up now. The monitor works perfectly via VGA input. However, I want to connect it via HDMI so that (a) I can use my old monitor as a second monitor via VGA (since it doesn't have HDMI), and (b) I can plug my headphones straight into the 3.5mm jack on the monitor (which apparently only works over HDMI).

 

I have plugged in the HDMI cable and connected it to the correct port on my PC - nothing. I made sure to manually tell the monitor which port to look at (e.g. HDMI slot 1, slot 2, etc.). When the correct slot is chosen it says "No HDMI signal"; when the incorrect slot is chosen it says "No cable connected". So it definitely registers that the cable is connected when the right slot is chosen - it just isn't picking up a signal.

 

About my set up:

 

Running Windows 10; the motherboard is an ASUS P8Z77-V LX. It has an HDMI port on it, although I am aware it is an older model (I think I got it in maybe 2011 or 2012). So I am assuming there is something extra I can do to get my PC to put out a signal? Drivers are all up to date. I am a bit tech ignorant, so any help would be much appreciated!
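
A quick, generic way to double-check from Windows which video adapters it actually sees, and their driver versions, is to ask WMI. The Python sketch below is purely illustrative (Windows only, nothing board-specific assumed):

import subprocess

# List every video adapter Windows knows about, with driver info.
# "wmic" ships with Windows 10 of this era.
result = subprocess.run(
    ["wmic", "path", "win32_VideoController",
     "get", "Name,DriverVersion,DriverDate,Status"],
    capture_output=True, text=True,
)
print(result.stdout)

If the onboard Intel graphics never appears in that list, the BIOS is keeping the iGPU disabled, which would leave the motherboard's VGA/HDMI outputs dead no matter what is plugged in.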

 

edit: or I guess a side question: with this setup, is there a way I can connect both monitors without using HDMI?


Edited by thepokey, 10 December 2017 - 08:38 AM.



#2 DavisMcCarn

  • Members
  • 846 posts

Posted 10 December 2017 - 11:48 AM

Page 3-20 of the user's manual has a BIOS setting named Initiate iGPU which exists expressly to enable dual-monitor support. Check it.
http://dlcdnet.asus.com/pub/ASUS/mb/LGA1155/P8Z77-V_LX/E8533_P8Z77-V_LX.pdf?_
Computer dinosaur, servicing PC's since 1976

#3 thepokey

  • Topic Starter
  • Members
  • 88 posts

Posted 12 December 2017 - 05:56 AM

Thanks for that! I went into the BIOS and enabled the dual monitor support, connected my old monitor back up - however, still nothing. Although this could be because I connected the old monitor directly to the VGA port on my computer, which seems to do nothing. The way I have been using my monitor since I got this computer has been through a DVI-D to VGA converter (which is also how I had to connect the new monitor). So seemingly the VGA and HDMI ports on my PC do not work? I have a secondary DVI-D port which I could connect the second monitor to; the problem is that it is right next to the DVI-D port I use now, and two converters will not physically sit next to each other (no room). So I guess I have to figure out why my PC doesn't recognise anything in the VGA or HDMI ports?

 

Which is strange, because when I plug the new monitor into the HDMI port it obviously detects *something*: when that HDMI port is chosen in the monitor menu it says "No HDMI signal detected", as opposed to "No cable connected", which is what it says if you select another HDMI port.



#4 thepokey

  • Topic Starter
  • Members
  • 88 posts

Posted 12 December 2017 - 06:16 AM

Ok, so weirdly, right after I finished typing that, my second monitor just came to life (I'd left it plugged in via the VGA port). It took maybe 10 minutes, but suddenly it appeared. I have no idea why that's the case, but it seems to be working.

 

However, as is usually the case, one solved problem causes another. Now Chrome disappears once it is opened. It opens, a YouTube video starts playing and I can hear its audio, but the window is not visible. It's not minimised anywhere, and it's not on the second screen. It just doesn't seem to exist - a restart hasn't fixed it. Along with that, along the bottom of the main screen where the IE and File Explorer tabs are open, there is a flickering blue line which wasn't there before. I have no idea what it is; the computer doesn't seem to like having a second monitor attached?
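
A window you can hear but not see is usually one that was restored to coordinates that no longer fall on any active display. The quick manual fix in Windows 10 is to focus it from the taskbar and press Win+Shift+Left/Right to throw it onto the other monitor (or Alt+Space, M, then the arrow keys). For the curious, here is a generic Python sketch (ctypes, Windows only, purely illustrative) that hunts down visible top-level windows sitting entirely outside the virtual desktop and pulls them back:

import ctypes
from ctypes import wintypes

user32 = ctypes.windll.user32

# Bounds of the virtual desktop (all monitors combined).
SM_XVIRTUALSCREEN, SM_YVIRTUALSCREEN = 76, 77
SM_CXVIRTUALSCREEN, SM_CYVIRTUALSCREEN = 78, 79
vx = user32.GetSystemMetrics(SM_XVIRTUALSCREEN)
vy = user32.GetSystemMetrics(SM_YVIRTUALSCREEN)
vw = user32.GetSystemMetrics(SM_CXVIRTUALSCREEN)
vh = user32.GetSystemMetrics(SM_CYVIRTUALSCREEN)

EnumWindowsProc = ctypes.WINFUNCTYPE(wintypes.BOOL, wintypes.HWND, wintypes.LPARAM)

def rescue(hwnd, lparam):
    # Only consider visible windows that have a title.
    if not user32.IsWindowVisible(hwnd):
        return True
    n = user32.GetWindowTextLengthW(hwnd)
    if n == 0:
        return True
    title = ctypes.create_unicode_buffer(n + 1)
    user32.GetWindowTextW(hwnd, title, n + 1)
    r = wintypes.RECT()
    user32.GetWindowRect(hwnd, ctypes.byref(r))
    # Entirely off the virtual desktop? Pull it back to the top-left.
    if r.right < vx or r.bottom < vy or r.left > vx + vw or r.top > vy + vh:
        print("Rescuing:", title.value)
        user32.MoveWindow(hwnd, 50, 50, r.right - r.left, r.bottom - r.top, True)
    return True  # keep enumerating

user32.EnumWindows(EnumWindowsProc(rescue), 0)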

 

edit: disconnected the second monitor/turned off dual monitor and everything went back to normal. So it seems everything just freaks out when the second monitor is plugged in.

I thought that since it took so long for the PC to recognise something plugged in via VGA, maybe it would do the same with HDMI. So I connected via HDMI again and left it for about 10 minutes like I did with the VGA one, but still nothing.


Edited by thepokey, 12 December 2017 - 06:51 AM.


#5 DavisMcCarn

  • Members
  • 846 posts

Posted 12 December 2017 - 08:27 AM

You have raised some confusion in your thread: you say you have a DVI-D to VGA converter cable which, probably unbeknownst to you, must have electronics in it, because DVI-D is purely digital and VGA is purely analog - something has to actively convert the digital signal to analog.
I think you should use only the VGA port and the HDMI port on the PC so we don't have another electronic device in the mix.
And a good way to back up and cover our bases would be to try the PC with only the HDMI monitor plugged into the HDMI port using an HDMI cable. Does that work?
Computer dinosaur, servicing PC's since 1976

#6 thepokey

  • Topic Starter
  • Members
  • 88 posts

Posted 12 December 2017 - 09:04 AM

I'll give that a shot tomorrow and let you know. But as far as I can tell there doesn't appear to be anything special about the converter. I could be wrong that it is actually converting from DVI-D to VGA, but from everything I can see online about what a DVI-D port should look like, I'm pretty sure that's what this converter is doing. I got the PC back in late 2011 so my memory is a bit rusty, but I seem to remember that when I first plugged my VGA monitor into the VGA port it didn't detect anything, which is why I somehow ended up trying the converter; it worked, and it's just been that way ever since. I'm completely ignorant of this stuff, so if it does have electronics in it, that is something I never considered. However, there is only one VGA port, so if HDMI is not going to work then it seems I'll need at least one of the monitors connected via the converter?

 

I have tried the HDMI monitor plugged in on its own, without the other monitor connected at all - nothing happened. It's weird, as I say, because it's not as if the PC doesn't recognise that something is plugged in there.



#7 DavisMcCarn

  • Members
  • 846 posts

Posted 12 December 2017 - 09:36 AM

One edge of a DVI connector has a flat prong (blade), and the female side has a slot that it fits into. If it is a DVI-I port, there are also four pins and holes around that blade, which carry the analog connections. If it is DVI-D (digital only), those 4 pins and holes are missing. Look at the connector on your system board and refer to this:
http://nvidia.custhelp.com/app/answers/detail/a_id/221/~/difference-between-dvi-i-and-dvi-d
Is it DVI-I or DVI-D?

Edited by DavisMcCarn, 12 December 2017 - 09:36 AM.

Computer dinosaur, servicing PC's since 1976

#8 thepokey

  • Topic Starter
  • Members
  • 88 posts

Posted 29 December 2017 - 02:02 AM

Sorry for the delayed reply - Christmas period and all!

 

Based on that information, the port is definitely DVI-I, as it has the four pins. I think my confusion comes from the fact that when I googled my motherboard it brought up the Asus page, which labels that port as DVI-D (and looking closer at the picture, the port shown there really doesn't have those 4 pins, so I just called mine a DVI-D port). But that is strange, because when I check my motherboard type through Speccy it says it is a P8Z77-V, and that page also lists the port as DVI-D - plus the page only shows one of these ports, whereas on my motherboard there are two of them side by side (everything else is the same). Either way, it's definitely DVI-I based on the info you gave!
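
For a third opinion on the exact board model, beyond Speccy and the Asus product page, WMI can report what the firmware itself says. A throwaway Python sketch (Windows only, purely illustrative):

import subprocess

# Ask the firmware which motherboard this actually is; an ASUS board
# reports something like "P8Z77-V" or "P8Z77-V LX" under Product.
result = subprocess.run(
    ["wmic", "baseboard", "get", "Manufacturer,Product,Version"],
    capture_output=True, text=True,
)
print(result.stdout)

Whatever Product string comes back is the model to trust when picking the right manual and port layout.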

 

 

Is there any reason, though, that it isn't detecting via HDMI? Or straight from VGA when I plug it in directly there instead of through the DVI adapter?



#9 DavisMcCarn

  • Members
  • 846 posts

Posted 29 December 2017 - 10:40 AM

Apparently, you have a very odd version of the P8Z77-V LX and, given its age, you may need to buy a video card in order to run dual monitors.  If you are not concerned with gaming, a reasonable card ought to run about $40.


Computer dinosaur, servicing PC's since 1976

#10 thepokey

  • Topic Starter
  • Members
  • 88 posts

Posted 29 December 2017 - 11:09 PM

I guess that is good news, because a GPU is probably going to be my next PC-related purchase anyway - I do want to do more gaming (and Twitch streaming too). So, theoretically, if I were to upgrade to a new GPU (let's just say a GTX 1070, for example), then I should be able to just plug one monitor in via HDMI and the other in via VGA, or a VGA to DVI-D adapter, enable dual monitors in the BIOS, and then ... that's it?



#11 DavisMcCarn

  • Members
  • 846 posts

Posted 30 December 2017 - 10:07 AM

Yup, that's about it. You will need to make the video card the primary display, or tell your games to use it rather than the onboard graphics; but neither is usually very difficult.


Computer dinosaur, servicing PC's since 1976

#12 thepokey

  • Topic Starter
  • Members
  • 88 posts

Posted 30 December 2017 - 08:59 PM

Oh man, I guess I'll worry about that when it comes to it. Plus there are all the problems that, I'm sure, just installing a new GPU into an existing system will bring. I'm watching a friend go through it at the moment and there are issues at every step. Dread!

 

So my current GPU seems to have these outputs: DVI-I, DVI-I, Mini-HDMI - which means that the VGA and HDMI slots on my PC go straight into the motherboard, not the GPU. And it's exactly the HDMI and the VGA that don't work - is it possible that it's sort of the reverse of what you just said that's happening? That the video card is set as the primary display, and so displays plugged into the HDMI/VGA motherboard slots just aren't being detected because of that? I'm guessing overall that's for the better - as crap as my GPU is, I'd rather have the displays running through it than through the motherboard?
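
On the question of whether the video card is set as the primary display: Windows exposes exactly this through the Win32 EnumDisplayDevices API, which lists each adapter output with flags for "attached to desktop" and "primary device". A generic ctypes sketch (Python, Windows only, purely illustrative):

import ctypes
from ctypes import wintypes

user32 = ctypes.windll.user32

class DISPLAY_DEVICE(ctypes.Structure):
    _fields_ = [
        ("cb", wintypes.DWORD),
        ("DeviceName", wintypes.WCHAR * 32),
        ("DeviceString", wintypes.WCHAR * 128),
        ("StateFlags", wintypes.DWORD),
        ("DeviceID", wintypes.WCHAR * 128),
        ("DeviceKey", wintypes.WCHAR * 128),
    ]

DISPLAY_DEVICE_ATTACHED_TO_DESKTOP = 0x1
DISPLAY_DEVICE_PRIMARY_DEVICE = 0x4

dev = DISPLAY_DEVICE()
dev.cb = ctypes.sizeof(dev)
i = 0
# Walk every adapter output; DeviceString names the GPU it belongs to.
while user32.EnumDisplayDevicesW(None, i, ctypes.byref(dev), 0):
    attached = bool(dev.StateFlags & DISPLAY_DEVICE_ATTACHED_TO_DESKTOP)
    primary = bool(dev.StateFlags & DISPLAY_DEVICE_PRIMARY_DEVICE)
    print(f"{dev.DeviceName}  {dev.DeviceString}  "
          f"attached={attached} primary={primary}")
    i += 1

If only the discrete card's outputs ever show up as attached, the onboard outputs are not merely deprioritised - the iGPU itself is inactive, which fits the BIOS-setting angle from earlier in the thread.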

 

It also seems most new GPUs (I'm eyeing off a 1060) don't have DVI-I or VGA any more; they just have HDMI and DVI-D. Since my new monitor has HDMI, I assume that would be no problem. But my old monitor, which I would still want to use, would then need a VGA to DVI-D adapter. Is that the kind you mean has electronic parts in it and can cause issues?



#13 DavisMcCarn

  • Members
  • 846 posts

Posted 31 December 2017 - 09:16 AM

What usually causes most folks serious grief is forgetting to uninstall the software for the old video card (GPU) before replacing it with the new one - a problem you should not have, as you are adding, not replacing. Also, several years ago, many video cards used the same audio chip as the one on the system board, which caused major headaches too.

And, yes, almost none of the new GPUs have any support for analog monitors (VGA), so leave the onboard video connected to the old monitor and connect the new monitor to the video card.


Computer dinosaur, servicing PC's since 1976

#14 mjd420nova

  • Members
  • 1,800 posts

Posted 31 December 2017 - 02:51 PM

I am not a big fan of adapters, be it VGA to HDMI, DVI-D to HDMI, or any other conversion. Some adapters just don't work; others don't handshake properly with the GPU or VGA ports, rendering them inoperable.





