This is my first post here, so please bear with me, and I hope I have posted this in the right forum.
First off, I'm on Windows 10 64-bit. I use two monitors here at work--a Dell and an Insignia HDTV. My computer is an HP with an AMD FX-6350 6-core, 10 GB of RAM, and a GeForce GT 630 (installed after I bought the system, since I am a photographer by trade and need more power than the onboard graphics). The only problem is, my Dell monitor has no HDMI port, and the GPU has only one HDMI port (the computer itself didn't come with one for some reason). After a lot of cable logistics, I now have a DVI-D cable running from the monitor to the PC and an HDMI cable running from the TV to the PC.
I don't always have the TV on to use as a monitor, but every time I turn it on, I have to go into display settings (this was also true on Windows 7 and 8) and "disconnect" an erroneous 640x480 PnP display that Windows keeps detecting, even though it doesn't physically exist.

The main annoyances are: 1) I have to go into settings and turn the virtual monitor off every single time I turn on the TV, and 2) Windows detects it as the main monitor, so my actual monitor and TV are treated as secondary displays (2 and 3) onto which the virtual monitor extends the desktop. As a result, programs often open on the virtual monitor instead of on my actual monitors, and to get them back I have to go disable the virtual one. This is very annoying, and I can't think of a reason why I have to keep disconnecting it, why my monitors are treated as secondary (it really messes up monitor detection, and my icons are always hard to find until I get things back the way I want them), or why Windows detects this erroneous monitor in the first place. And to be as clear as possible: when I say I have to disable the virtual monitor every time, I mean that whenever I turn the second display off and back on, I must go to display settings and disable it again. Each time.
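In case it helps anyone diagnose this, here is a quick way to see exactly what Windows is enumerating. This is a minimal Python sketch (assuming Python 3 on the affected machine) that calls the standard Win32 EnumDisplayDevices API through ctypes; the phantom display should show up in its output as an attached device, and the primary flag shows which one Windows is treating as the main monitor.

    # Minimal diagnostic sketch (assumes Python 3 on Windows).
    # Calls the Win32 EnumDisplayDevices API via ctypes to list every
    # display device Windows has enumerated, so a phantom entry should
    # appear alongside the real monitors.
    import ctypes
    from ctypes import wintypes

    # State flags from wingdi.h
    DISPLAY_DEVICE_ATTACHED_TO_DESKTOP = 0x00000001
    DISPLAY_DEVICE_PRIMARY_DEVICE = 0x00000004

    class DISPLAY_DEVICE(ctypes.Structure):
        _fields_ = [
            ("cb", wintypes.DWORD),
            ("DeviceName", wintypes.WCHAR * 32),
            ("DeviceString", wintypes.WCHAR * 128),
            ("StateFlags", wintypes.DWORD),
            ("DeviceID", wintypes.WCHAR * 128),
            ("DeviceKey", wintypes.WCHAR * 128),
        ]

    user32 = ctypes.WinDLL("user32", use_last_error=True)

    def list_displays():
        i = 0
        dev = DISPLAY_DEVICE()
        dev.cb = ctypes.sizeof(DISPLAY_DEVICE)
        # Enumerate display devices until the API reports no more.
        while user32.EnumDisplayDevicesW(None, i, ctypes.byref(dev), 0):
            attached = bool(dev.StateFlags & DISPLAY_DEVICE_ATTACHED_TO_DESKTOP)
            primary = bool(dev.StateFlags & DISPLAY_DEVICE_PRIMARY_DEVICE)
            print(f"{dev.DeviceName}: {dev.DeviceString}"
                  f" | attached={attached} primary={primary}")
            i += 1

    if __name__ == "__main__":
        list_displays()

Comparing the output with the TV on versus off should at least show when the ghost entry appears and which device Windows is flagging as primary.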
I have searched Google to no avail for existing questions or articles on my exact problem, and I have updated my video drivers. I have tried other cable configurations, but given the ports on these two displays, DVI-D plus HDMI is the only way to connect both to the same computer and use them concurrently. Why a modern flat-screen HD monitor doesn't come with an HDMI port is anyone's guess (probably because most monitors don't have speakers?), but it leaves me no way to use two displays other than the setup I've described.
Has anyone else experienced this, and is there a way to fix it? Or am I doomed to go to display settings and turn the virtual monitor off every single time I use the second display, never knowing where my programs will open?