DVI Connections



#1 8-bit

  • Members
  • 162 posts
  • Location: London, UK

Posted 02 August 2009 - 04:01 PM

Would I notice any difference in my display if I connected my monitor to my PC using a DVI cable instead of a VGA cable?

#2 *Michael*

  • Banned
  • 1 posts

Posted 03 August 2009 - 05:43 AM

I will be quoting excerpts from wikipedia.org to explain the
difference. If you want a good understanding, I recommend that you read
the articles in their entirety:

VGA: http://en.wikipedia.org/wiki/VGA
DVI: http://en.wikipedia.org/wiki/Dvi

The main difference is that VGA is an analog standard for computer
monitors, first marketed in 1987, whereas DVI (introduced in 1999) is a
digital standard. Because no digital-to-analog conversion is needed,
DVI can deliver a noticeably sharper picture, especially on flat-panel
displays running at their native resolution.

Here are two relevant excerpts from Wikipedia for your convenience:

"Existing standards, such as VGA, are analog and designed for CRT
based devices. As the source transmits each horizontal line of the
image, it varies its output voltage to represent the desired
brightness. In a CRT device, this is used to vary the intensity of the
scanning beam as it moves across the screen. However, in digital
displays, instead of a scanning beam there is an array of pixels and a
single brightness value must be chosen for each. The decoder does this
by sampling the voltage of the input signal at regular intervals. When
the source is also a digital device (such as a computer), this can
lead to distortion if the samples are not taken at the centre of each
pixel, and in general the crosstalk between adjacent pixels is high."
SOURCE: http://en.wikipedia.org/wiki/Dvi

"DVI takes a different approach. The desired brightness of the pixels
is transmitted as a list of binary numbers. When the display is driven
at its native resolution, all it has to do is read each number and
apply that brightness to the appropriate pixel. In this way, each
pixel in the output buffer of the source device corresponds directly
to one pixel in the display device, whereas with an analog signal the
appearance of each pixel may be affected by its adjacent pixels as
well as by electrical noise and other forms of analog distortion."
SOURCE: http://en.wikipedia.org/wiki/Dvi
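
To make the contrast concrete, here is a minimal Python sketch (my own
illustration, not from the Wikipedia articles) of the failure mode
described above: a mis-timed analog sample blends neighbouring pixel
levels and smears a sharp edge, while a digital link delivers the
numbers exactly. The pixel values and the simple blending model are
made-up assumptions for demonstration only.

def transmit_analog(pixels, sample_offset=0.4):
    """Crude model of an analog (VGA-style) link.

    The source emits one voltage level per pixel. If the display samples
    the waveform slightly off each pixel's centre (sample_offset > 0),
    the reading blends into the next pixel's level, like the crosstalk
    and mis-sampling described in the excerpt above.
    """
    received = []
    for i in range(len(pixels)):
        left = pixels[i]
        right = pixels[min(i + 1, len(pixels) - 1)]
        # Linear blend between neighbouring levels models the mis-timed sample.
        received.append(round((1 - sample_offset) * left + sample_offset * right))
    return received


def transmit_digital(pixels):
    """Model of a digital (DVI-style) link: the binary values arrive exactly."""
    return list(pixels)


if __name__ == "__main__":
    # A sharp black-to-white edge: the worst case for a mis-sampled analog link.
    source = [0, 0, 0, 255, 255, 255]
    print("source :", source)
    print("analog :", transmit_analog(source))   # edge smears: [0, 0, 102, 255, 255, 255]
    print("digital:", transmit_digital(source))  # exact copy:  [0, 0, 0, 255, 255, 255]

When an LCD is driven over DVI at its native resolution, it behaves
like the transmit_digital path: every pixel value arrives unchanged,
which is why text and sharp edges usually look crisper than over VGA.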

Please let me know if you require further clarification or assistance
with this question.

#3 8-bit

  • Topic Starter
  • Members
  • 162 posts
  • Location: London, UK

Posted 06 August 2009 - 03:16 PM

Many thanks for the reply. I guess I should check Wiki more often to save anyone the trouble of having to reply. :thumbsup:



