VGA vs DVI connection to monitor

 
 
DaveS
01-21-2011
The stuff I find using Google doesn't seem to be authoritative in any
way on this question.

I am receiving a new monitor (Dell U2311h) next week, and it can be
connected by various types of cable. My graphics card can use VGA or
DVI. The question is, will I experience any benefit by paying for and
using a DVI connection over using the included VGA connection?

I'm interested specifically in responses relating to photo editing.

Dave S.
 
David J Taylor
01-21-2011
> The stuff I find using Google doesn't seem to be authoritative in any
> way on this question.
>
> I am receiving a new monitor (Dell U2311h) next week, and it can be
> connected by various types of cable. My graphics card can use VGA or
> DVI. The question is, will I experience any benefit by paying for and
> using a DVI connection over using the included VGA connection?
>
> I'm interested specifically in responses relating to photo editing.
>
> Dave S.


Dave,

Many Dell monitors are supplied with both VGA and DVI cables, so you may
be able to test for yourself at no extra cost. Here, I've not noticed
any significant difference - perhaps even /any/ difference - between VGA
and DVI cables. I have heard that some monitors don't allow picture
adjustments on their digital inputs, but I don't think that will apply
to the Dell.

What I /would/ urge you to try is to clear some desk space and keep both
old /and/ new monitors, using the two outputs on your graphics card to
create a two-monitor setup. Going from one monitor to two was one of the
most productive changes I made on my PC.

Cheers,
David

 
DaveS
01-21-2011
On 1/21/2011 1:41 PM, David J Taylor wrote:
>> The stuff I find using Google doesn't seem to be authoritative in any
>> way on this question.
>>
>> I am receiving a new monitor (Dell U2311h) next week, and it can be
>> connected by various types of cable. My graphics card can use VGA or
>> DVI. The question is, will I experience any benefit by paying for and
>> using a DVI connection over using the included VGA connection?
>>
>> I'm interested specifically in responses relating to photo editing.
>>
>> Dave S.

>
> Dave,
>
> Many Dell monitors are supplied with both VGA and DVI cables, so you may
> be able to test for yourself at no extra cost. Here, I've not noticed
> any significant difference - perhaps even /any/ difference - between VGA
> and DVI cables. I have heard that some monitors don't allow picture
> adjustments on their digital inputs, but I don't think that will apply
> to the Dell.
>
> What I /would/ urge you to try is to clear some desk space and keep both
> old /and/ new monitors, using the two outputs on your graphics card to
> create a two-monitor setup. Going from one monitor to two was one of
> the most productive changes I made on my PC.
>
> Cheers,
> David


Thanks for the advice.

I'm afraid I'm far behind you on the second part of your message,
though. My previous (current, as of today) monitor is a CRT. I had
resisted replacing the CRT with an LCD after reading that LCDs were not
as colour accurate. That changed this week, when my brother's view of a
photo differed so drastically from mine that I checked a calibration web
site and found I was missing several distinctions at the dark end.

One step at a time.

Dave S.

 
Eric Stevens
01-21-2011
On Fri, 21 Jan 2011 14:04:37 -0600, DaveS <(E-Mail Removed)> wrote:

>On 1/21/2011 1:41 PM, David J Taylor wrote:
>>> The stuff I find using Google doesn't seem to be authoritative in any
>>> way on this question.
>>>
>>> I am receiving a new monitor (Dell U2311h) next week, and it can be
>>> connected by various types of cable. My graphics card can use VGA or
>>> DVI. The question is, will I experience any benefit by paying for and
>>> using a DVI connection over using the included VGA connection?
>>>
>>> I'm interested specifically in responses relating to photo editing.
>>>
>>> Dave S.

>>
>> Dave,
>>
>> Many Dell monitors are supplied with both VGA and DVI cables, so you may
>> be able to test for yourself at no extra cost. Here, I've not noticed
>> any significant difference - perhaps even /any/ difference - between VGA
>> and DVI cables. I have heard that some monitors don't allow picture
>> adjustments on their digital inputs, but I don't think that will apply
>> to the Dell.
>>
>> What I /would/ urge you to try is to clear some desk space and keep both
>> old /and/ new monitors, using the two outputs on your graphics card
>> to create a two-monitor setup. Going from one monitor to two was one of
>> the most productive changes I made on my PC.
>>
>> Cheers,
>> David

>
>Thanks for the advice.
>
>I'm afraid I'm far behind you on the second part of your message,
>though. My previous (current, as of today) monitor is a CRT. I had
>resisted replacing the CRT with an LCD after reading that LCDs were not
>as colour accurate. That changed this week, when my brother's view of a
>photo differed so drastically from mine that I checked a calibration web
>site and found I was missing several distinctions at the dark end.
>
>One step at a time.
>

I'm now on my second Dell monitor using DVI and I wouldn't go back to
the less stable analogue connection.

What I would highly recommend, if you are doing photo editing, is that
you invest in screen calibration equipment (I use a Spyder). Small,
almost subtle changes in the screen can result in large changes in what
you print. It's highly frustrating when you can't get the same print
results as you did several months previously.



Eric Stevens
 
DaveS
01-22-2011
On 1/21/2011 3:55 PM, N wrote:
> On 22/01/2011, DaveS wrote:
>> The stuff I find using Google doesn't seem to be authoritative in any
>> way on this question.
>>
>> I am receiving a new monitor (Dell U2311h) next week, and it can be
>> connected by various types of cable. My graphics card can use VGA or
>> DVI. The question is, will I experience any benefit by paying for and
>> using a DVI connection over using the included VGA connection?
>>
>> I'm interested specifically in responses relating to photo editing.
>>
>> Dave S.

>
> Why do you think the DVI connection will cost you more money? There'll
> be a DVI cable in the box.
>
>


I set out to prove you wrong, but I stand corrected:
What's in the Box

Monitor with stand
Power Cable
DVI Cable
VGA Cable (attached to the monitor)
Drivers and Documentation media
USB upstream cable
Quick Setup Guide
Safety Information

I believe that I have purchased and set up LCD monitors for others where
there was a DVI connector but no cable. Clearly, there is no cost for me
to find out for myself if there is any visible difference with this monitor.

Dave S.




 
Robert Coe
01-24-2011
On Fri, 21 Jan 2011 22:17:18 -0900, (E-Mail Removed) (Floyd L. Davidson)
wrote:
: DaveS <(E-Mail Removed)> wrote:
: > Clearly, there is no cost for me
: >to find out for myself if there is any visible difference with this monitor.
:
: Whether you think you can see it on any given displayed
: image or not, use the DVI connection.
:
: I won't go so far as to say digital data is vastly
: better than analog data, but it is certainly better and
: you get significantly improved precision. Another point
: is that with age the VGA interface will drift far more
: than will the DVI interface.

DVI might be slightly more resistant to RF interference, especially if the
cable is long. But in normal use, it's very unlikely that you'll be able to
see any difference in image quality. That said, there's no reason not to take
Floyd's advice: if your card supports DVI, you might as well use it.

I have two dual-monitor setups at work, one of which uses one monitor on DVI
and one on VGA. On that setup, I can see a slight color difference between the
two monitors, but not enough to be annoying. On the setup with two DVI
monitors connected to the same video card, the colors look identical (given
identical settings of the monitors, of course).

Bob
 
Eric Stevens
01-24-2011
On Sun, 23 Jan 2011 18:25:34 -0900, (E-Mail Removed) (Floyd L.
Davidson) wrote:

>Robert Coe <(E-Mail Removed)> wrote:
>>On Fri, 21 Jan 2011 22:17:18 -0900, (E-Mail Removed) (Floyd L. Davidson)
>>wrote:
>>: DaveS <(E-Mail Removed)> wrote:
>>: > Clearly, there is no cost for me
>>: >to find out for myself if there is any visible difference with this monitor.
>>:
>>: Whether you think you can see it on any given displayed
>>: image or not, use the DVI connection.
>>:
>>: I won't go so far as to say digital data is vastly
>>: better than analog data, but it is certainly better and
>>: you get significantly improved precision. Another point
>>: is that with age the VGA interface will drift far more
>>: than will the DVI interface.
>>
>>DVI might be slightly more resistant to RF interference, especially if the
>>cable is long. But in normal use, it's very unlikely that you'll be able to
>>see any difference in image quality. That said, there's no reason not to take
>>Floyd's advice: if your card supports DVI, you might as well use it.

>
>In normal use it should be an obvious difference. A
>digital interface sends a specific discrete value to the
>monitor. It is the exact same value each time, and is
>calculated from the value in the digital image file. It
>doesn't change, and has the same accuracy each time.
>
>The VGA interface has to convert the digital value to an
>analog value, and then the monitor has to use the
>timing of a dot clock to pick out the precise time that
>the right value is made available. It is not nearly as
>precise as the process used by the digital interface.
>It can never be as accurate.
>
>>I have two dual-monitor setups at work, one of which uses one monitor on DVI
>>and one on VGA. On that setup, I can see a slight color difference between the
>>two monitors, but not enough to be annoying.

>
>But it *is* different! The difference is error.
>
>>On the setup with two DVI
>>monitors connected to the same video card, the colors look identical (given
>>identical settings of the monitors, of course).

>
>No error.


Unless the monitors are calibrated, it might be two different errors.



Eric Stevens
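
Floyd's dot-clock argument can be made concrete with a toy simulation.
The following Python sketch is illustrative only: it assumes an
idealised 8-bit DAC/ADC, the nominal 700 mV VGA video swing, and simple
additive Gaussian noise on the analogue cable, and it models the DVI
path as delivering each code word bit-exactly (which holds while the
TMDS link stays below its error threshold). None of the numbers are
measurements of real hardware.

# Toy model: one pixel value over an idealised VGA link vs a DVI link.
# All electrical parameters here are assumptions for illustration.
import random

FULL_SCALE_MV = 700.0  # nominal VGA video swing, black to white

def vga_roundtrip(value, noise_mv=3.0):
    """Carry an 8-bit value over an idealised analogue link: DAC to
    0..700 mV, add cable noise, then the monitor samples at the dot
    clock and quantises back to 8 bits."""
    level = value / 255.0 * FULL_SCALE_MV   # DAC output
    level += random.gauss(0.0, noise_mv)    # cable/termination noise
    recovered = round(level / FULL_SCALE_MV * 255.0)
    return min(255, max(0, recovered))

def dvi_roundtrip(value):
    """A digital link delivers the code word itself, bit-exactly."""
    return value

random.seed(1)
errors = [abs(vga_roundtrip(v) - v) for v in range(256) for _ in range(100)]
print("VGA: mean |error| = %.3f codes, max = %d" %
      (sum(errors) / len(errors), max(errors)))
print("DVI: error = 0 for every code word")

With one 8-bit step equal to about 2.7 mV here, a 3 mV noise figure
leaves the recovered value off by a code or so much of the time;
whether an error that size is visible on a real panel is exactly what
the rest of the thread argues about.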
 
David J Taylor
01-24-2011
> In normal use it should be an obvious difference. A
> digital interface sends a specific discrete value to the
> monitor. It is the exact same value each time, and is
> calculated from the value in the digital image file. It
> doesn't change, and has the same accuracy each time.

[]
> --
> Floyd L. Davidson <http://www.apaflo.com/floyd_davidson>
> Ukpeagvik (Barrow, Alaska) (E-Mail Removed)


Maybe it /should/, but in practice it does not (at least on correctly
adjusted monitors).

Cheers,
David

 
David J Taylor
01-24-2011
"Floyd L. Davidson" <(E-Mail Removed)> wrote in message
news:(E-Mail Removed)...
> "David J Taylor" <(E-Mail Removed)> wrote:
>>> In normal use it should be an obvious difference. A
>>> digital interface sends a specific discrete value to the
>>> monitor. It is the exact same value each time, and is
>>> calculated from the value in the digital image file. It
>>> doesn't change, and has the same accuracy each time.

>>[]
>>
>>Maybe it /should/, but in practice it does not (at least on correctly
>>adjusted monitors).

>
> I don't agree with your statement at all. In practice
> with a digital interface it sends *exactly* the same
> value every time.
>
> The problem for the analog interface is that it isn't
> exactly the same every time.
>
> And that of course is precisely the distinction between
> digital and analog when it is affected by noise. The
> digital system can function with a much lower SNR than
> can an analog system. It's fundamental.
>
> --
> Floyd L. Davidson <http://www.apaflo.com/floyd_davidson>
> Ukpeagvik (Barrow, Alaska) (E-Mail Removed)


Yes, you can get the "right" value into the monitor, but the issues of
drift and calibration inside the monitor are just the same as with an
analogue input monitor. I find that, in practice, drift of the analogue
components in a VGA interface isn't an issue, and neither have I seen VGA
signals affected by electrical noise even on moderate cable runs. Perhaps
I've been lucky!

Cheers,
David
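
Floyd's SNR point is easy to demonstrate numerically as well. Another
rough Python sketch, with made-up signal levels rather than real TMDS
electrical parameters: a binary receiver only has to decide which side
of a threshold the signal landed on, so moderate noise is rejected
completely, while on an analogue link every millivolt of noise appears
directly in the recovered value.

# Toy comparison of analogue vs binary (digital) links under the same
# noise. Levels and noise figure are assumptions for illustration.
import random

random.seed(1)
NOISE = 0.15  # noise sigma, as a fraction of full scale

def send_analog(x):
    """Analogue: noise adds straight onto the recovered value."""
    return x + random.gauss(0.0, NOISE)

def send_digital_bit(bit):
    """Digital: transmit 0.0 or 1.0, decide against a 0.5 threshold."""
    return 1 if bit + random.gauss(0.0, NOISE) > 0.5 else 0

analog_errs = [abs(send_analog(0.5) - 0.5) for _ in range(10000)]
bit_errs = sum(send_digital_bit(b) != b
               for _ in range(10000) for b in (0, 1))
print("analogue mean |error|: %.3f of full scale" %
      (sum(analog_errs) / len(analog_errs)))
print("digital bit errors:    %d out of 20000" % bit_errs)

The analogue error tracks the noise directly, while the digital link
stays error-free until the noise is large enough to cross the decision
threshold - which is the "much lower SNR" Floyd describes. It says
nothing, though, about drift or calibration inside the monitor, which
is the part David is pointing at.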

 
DaveS
01-24-2011
On 1/24/2011 9:47 AM, David J Taylor wrote:
> "Floyd L. Davidson" <(E-Mail Removed)> wrote in message
> news:(E-Mail Removed)...
>> "David J Taylor" <(E-Mail Removed)> wrote:
>>>> In normal use it should be an obvious difference. A
>>>> digital interface sends a specific discrete value to the
>>>> monitor. It is the exact same value each time, and is
>>>> calculated from the value in the digital image file. It
>>>> doesn't change, and has the same accuracy each time.
>>> []
>>>
>>> Maybe it /should/, but in practice it does not (at least on correctly
>>> adjusted monitors).

>>
>> I don't agree with your statement at all. In practice
>> with a digital interface it sends *exactly* the same
>> value every time.
>>
>> The problem for the analog interface is that it isn't
>> exactly the same every time.
>>
>> And that of course is precisely the distinction between
>> digital and analog when it is affected by noise. The
>> digital system can function with a much lower SNR than
>> can an analog system. It's fundamental.
>>
>> --
>> Floyd L. Davidson <http://www.apaflo.com/floyd_davidson>
>> Ukpeagvik (Barrow, Alaska) (E-Mail Removed)

>
> Yes, you can get the "right" value into the monitor, but the issues of
> drift and calibration inside the monitor are just the same as with an
> analogue input monitor. I find that, in practice, drift of the analogue
> components in a VGA interface isn't an issue, and neither have I seen
> VGA signals affected by electrical noise even on moderate cable runs.
> Perhaps I've been lucky!
>
> Cheers,
> David


OK, so theory says there is a difference in the signal received by the
monitor, depending on whether it's coming from a digital or an analogue
source. Experience shows there is no noticeable difference.

Does anyone know how laptops connect the video processor to the display?

Dave S.

 