
Video Card Resolution Question


2 replies to this topic

#1 Ranger SVO


  • Members
  • 65 posts

Posted 18 November 2016 - 11:26 PM

I have a computer with Windows 10 Pro x64. It was equipped with a Gigabyte GT 610 2GB video card, and I ran two monitors. I recently upgraded to three monitors and replaced the video card with an ASUS GTX 660 Ti.

 

Monitor 1 (far right, which is OK) is connected via a DVI-I to VGA adapter to a video amp/splitter, with one output going to a desktop monitor and the other going to a 70-inch touch screen. https://www.prometheanworld.com/products/interactive-flat-panels/activpanel

 

The two new monitors (which are NOT OK) were set to their native resolution of 1680 x 1050. The picture was not clear; it was as if the gamma setting was way too high. Both are Acer V223W monitors. One is hooked up via the DVI output, and the other is hooked up via the DisplayPort output through a DP to DVI adapter.

 

[Image attachment: Desktop 2]

 

I tried adjusting brightness, contrast, and gamma through the NVIDIA Control Panel but could not get the picture to look right.

 

Finally I tried setting the resolution to 1440 x 900. The picture is much better.

 

Why can I not run the monitors at the 1680 x 1050 (native) resolution?
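
(For reference, one way to check whether the driver is even offering 1680 x 1050 over these connections is to list the modes Windows reports for the display. The sketch below is just a rough example for the primary display, using Python with the standard Win32 EnumDisplaySettingsW call; it isn't specific to this card.)

# Rough sketch: list the display modes Windows reports for the primary display,
# to check whether 1680 x 1050 is actually being offered over the current connection.
# Assumes Windows; uses the Win32 EnumDisplaySettingsW API via ctypes.
import ctypes
from ctypes import wintypes

class DEVMODEW(ctypes.Structure):
    # Display-oriented layout of DEVMODEW (the printer union members are
    # omitted; the display members occupy the same bytes).
    _fields_ = [
        ("dmDeviceName", wintypes.WCHAR * 32),
        ("dmSpecVersion", wintypes.WORD),
        ("dmDriverVersion", wintypes.WORD),
        ("dmSize", wintypes.WORD),
        ("dmDriverExtra", wintypes.WORD),
        ("dmFields", wintypes.DWORD),
        ("dmPositionX", wintypes.LONG),
        ("dmPositionY", wintypes.LONG),
        ("dmDisplayOrientation", wintypes.DWORD),
        ("dmDisplayFixedOutput", wintypes.DWORD),
        ("dmColor", ctypes.c_short),
        ("dmDuplex", ctypes.c_short),
        ("dmYResolution", ctypes.c_short),
        ("dmTTOption", ctypes.c_short),
        ("dmCollate", ctypes.c_short),
        ("dmFormName", wintypes.WCHAR * 32),
        ("dmLogPixels", wintypes.WORD),
        ("dmBitsPerPel", wintypes.DWORD),
        ("dmPelsWidth", wintypes.DWORD),
        ("dmPelsHeight", wintypes.DWORD),
        ("dmDisplayFlags", wintypes.DWORD),
        ("dmDisplayFrequency", wintypes.DWORD),
        ("dmICMMethod", wintypes.DWORD),
        ("dmICMIntent", wintypes.DWORD),
        ("dmMediaType", wintypes.DWORD),
        ("dmDitherType", wintypes.DWORD),
        ("dmReserved1", wintypes.DWORD),
        ("dmReserved2", wintypes.DWORD),
        ("dmPanningWidth", wintypes.DWORD),
        ("dmPanningHeight", wintypes.DWORD),
    ]

user32 = ctypes.windll.user32
dm = DEVMODEW()
dm.dmSize = ctypes.sizeof(DEVMODEW)
modes = set()
i = 0
# Passing None enumerates the modes for the display device of the current desktop.
while user32.EnumDisplaySettingsW(None, i, ctypes.byref(dm)):
    modes.add((dm.dmPelsWidth, dm.dmPelsHeight, dm.dmDisplayFrequency))
    i += 1

for w, h, hz in sorted(modes):
    flag = "  <-- native?" if (w, h) == (1680, 1050) else ""
    print(f"{w} x {h} @ {hz} Hz{flag}")

If 1680 x 1050 doesn't appear in that list, the driver isn't being offered the mode over that connection in the first place.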


Edited by Ranger SVO, 18 November 2016 - 11:27 PM.



 


#2 technonymous


  • Members
  • 2,468 posts

Posted 19 November 2016 - 02:58 AM

Well, first of all, you're running 4 monitors off a video card that is only designed to run 3 displays, so you're already borking it. Secondly, you're taking the DVI output, converting it to analog VGA (downgrading) to support an older monitor, and splitting that signal to a 70-inch.

 

Both the DVI and HDMI ports easily support up to 1920x1200 @ 60 Hz each on a single port. I'm not sure what version of HDMI or DP ports are on your video card, probably 1.2, and I'm not sure what hookups your monitors support, so I'm just guessing here...
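
As a rough sanity check on that: single-link DVI tops out at a 165 MHz pixel clock, and the pixel clock is roughly horizontal total x vertical total x refresh rate (the totals include blanking). The little sketch below uses the commonly published CVT / CVT-RB timing figures (assumed here, not read from this card):

# Rough check: does a given mode fit inside single-link DVI's 165 MHz pixel clock?
# Pixel clock = horizontal total x vertical total x refresh rate, where the
# totals include blanking. The totals below are the commonly published
# CVT / CVT-RB timings for each mode (assumed, not read from this hardware).
SINGLE_LINK_DVI_MHZ = 165.0

# mode name: (h_total, v_total, refresh_hz)
modes = {
    "1440 x 900  @ 60 (CVT)":    (1904, 934, 60),
    "1680 x 1050 @ 60 (CVT)":    (2240, 1089, 60),
    "1680 x 1050 @ 60 (CVT-RB)": (1840, 1080, 60),
    "1920 x 1200 @ 60 (CVT-RB)": (2080, 1235, 60),
}

for name, (h_total, v_total, hz) in modes.items():
    clock_mhz = h_total * v_total * hz / 1e6
    verdict = "OK" if clock_mhz <= SINGLE_LINK_DVI_MHZ else "exceeds single-link DVI"
    print(f"{name}: ~{clock_mhz:.1f} MHz -> {verdict}")

All of these come in under 165 MHz, which lines up with the point above that raw bandwidth on the DVI side shouldn't be what's stopping 1680 x 1050.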

 

To me it makes more sense to junk the older monitor and get another, newer DVI/HDMI monitor like the other two. Connect DVI-I to monitor 1, DVI-D to monitor 2, and HDMI to monitor 3. Buy a DP to HDMI cable for the 70-inch.

 

Those are my thoughts.



#3 Ranger SVO

  • Topic Starter

  • Members
  • 65 posts

Posted 19 November 2016 - 11:35 AM

I sincerely appreciate you taking the time to respond, and your suggestion will be considered.

 

[Image attachment: My Classroom 1a]

 

The touch screen and the far-right monitor on the desk really need to mirror each other. Their resolution is set to native, and the picture on those is perfect.

 

The two Acers are the issue, but I will consider another monitor and move the connections around. An HDMI signal splitter/amp is now on my shopping list. Let's see what happens.





