VGA vs DVI connection to monitor

Discussion in 'Digital Photography' started by DaveS, Jan 21, 2011.

  1. DaveS

    DaveS Guest

    The stuff I find using Google doesn't seem to be authoritative in any
    way on this question.

    I am receiving a new monitor (Dell U2311h) next week, and it can be
    connected by various types of cable. My graphics card can use VGA or
    DVI. The question is, will I experience any benefit by paying for and
    using a DVI connection over using the included VGA connection?

    I'm interested specifically in responses relating to photo editing.

    Dave S.
    DaveS, Jan 21, 2011
    #1

  2. David J Taylor

    David J Taylor Guest

    > The stuff I find using Google doesn't seem to be authoritative in any
    > way on this question..
    >
    > I am receiving a new monitor (Dell U2311h) next week, and it can be
    > connected by various types of cable. My graphics card can use VGA or
    > DVI. The question is, will I experience any benefit by paying for and
    > using a DVI connection over using the included VGA connection?
    >
    > I'm interested specifically in responses relating to photo editing.
    >
    > Dave S.


    Dave,

    Many Dell monitors are supplied with both VGA and DVI cables, so you may
    be able to test for yourself with no extra cost. Here, I've not noticed
    any significant difference - perhaps even /any/ difference - between VGA
    and DVI cables. I have heard that some monitors don't allow control over
    digital inputs, but I don't think that will apply to the Dell.

    What I /would/ urge you to try is to clear some desk space and keep both
    old /and/ new monitors, and use the two outputs on your graphics card to
    create a two-monitor setup. Going from one monitor to two was one of the
    most productive changes I made on my PC.

    Cheers,
    David
    David J Taylor, Jan 21, 2011
    #2

  3. DaveS

    DaveS Guest

    On 1/21/2011 1:41 PM, David J Taylor wrote:
    >> The stuff I find using Google doesn't seem to be authoritative in any
    >> way on this question..
    >>
    >> I am receiving a new monitor (Dell U2311h) next week, and it can be
    >> connected by various types of cable. My graphics card can use VGA or
    >> DVI. The question is, will I experience any benefit by paying for and
    >> using a DVI connection over using the included VGA connection?
    >>
    >> I'm interested specifically in responses relating to photo editing.
    >>
    >> Dave S.

    >
    > Dave,
    >
    > Many Dell monitors are supplied with both VGA and DVI cables, so you may
    > be able to test for yourself with no extra cost. Here, I've not noticed
    > any significant difference - perhaps even /any/ difference - between VGA
    > and DVI cables. I have heard that some monitors don't allow control over
    > digital inputs, but I don't think that will apply to the Dell.
    >
    > What I /would/ urge you to try is to clear some desk space and keep both
    > old /and/ new monitors, and using the two outputs on your graphics card
    > to create a two-monitor setup. Going from one monitor to two was one of
    > the most productive changes I made on my PC.
    >
    > Cheers,
    > David


    Thanks for the advice.

    I'm afraid I'm far behind you on the second part of your message,
    though. My previous (current, as of today) monitor is a CRT. I had
    resisted replacing the CRT with an LCD after reading that LCDs were not
    as colour accurate. That changed this week, when my brother's view of a
    photo differed so drastically from mine that I checked a calibration web
    site and found I was missing several distinctions at the dark end.

    One step at a time.

    Dave S.
    DaveS, Jan 21, 2011
    #3
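
For anyone who wants to repeat the dark-end check Dave describes, here is a
minimal sketch of the kind of shadow step wedge such calibration pages display.
It assumes the Pillow imaging library is installed; the patch size and step
values are arbitrary choices, not taken from any particular site.

    # Build a shadow step wedge: a row of grey patches from RGB 0 to 30.
    # On a well-adjusted monitor each patch should be just distinguishable
    # from its neighbours; on a badly set-up display the darkest few merge
    # into solid black.
    from PIL import Image, ImageDraw

    PATCH = 80                   # patch size in pixels (arbitrary)
    LEVELS = range(0, 32, 2)     # dark-end grey levels to display

    img = Image.new("RGB", (PATCH * len(LEVELS), PATCH), "black")
    draw = ImageDraw.Draw(img)

    for i, level in enumerate(LEVELS):
        draw.rectangle(
            [i * PATCH, 0, (i + 1) * PATCH - 1, PATCH - 1],
            fill=(level, level, level),
        )

    img.save("shadow_steps.png")   # view full screen and count visible steps

Viewed full screen, counting how many of the sixteen patches can be told apart
gives a quick, rough check of shadow rendering before and after any adjustment.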
  4. Eric Stevens

    Eric Stevens Guest

    On Fri, 21 Jan 2011 14:04:37 -0600, DaveS <> wrote:

    >On 1/21/2011 1:41 PM, David J Taylor wrote:
    >>> The stuff I find using Google doesn't seem to be authoritative in any
    >>> way on this question..
    >>>
    >>> I am receiving a new monitor (Dell U2311h) next week, and it can be
    >>> connected by various types of cable. My graphics card can use VGA or
    >>> DVI. The question is, will I experience any benefit by paying for and
    >>> using a DVI connection over using the included VGA connection?
    >>>
    >>> I'm interested specifically in responses relating to photo editing.
    >>>
    >>> Dave S.

    >>
    >> Dave,
    >>
    >> Many Dell monitors are supplied with both VGA and DVI cables, so you may
    >> be able to test for yourself with no extra cost. Here, I've not noticed
    >> any significant difference - perhaps even /any/ difference - between VGA
    >> and DVI cables. I have heard that some monitors don't allow control over
    >> digital inputs, but I don't think that will apply to the Dell.
    >>
    >> What I /would/ urge you to try is to clear some desk space and keep both
    >> old /and/ new monitors, and using the two outputs on your graphics card
    >> to create a two-monitor setup. Going from one monitor to two was one of
    >> the most productive changes I made on my PC.
    >>
    >> Cheers,
    >> David

    >
    >Thanks for the advice.
    >
    >I'm afraid I'm far behind you on the second part of your message,
    >though. My previous (current, as of today) monitor is a CRT. I had
    >resisted replacing CRT with LCD because of reading that LCD were not as
    >colour accurate until this week when differences between my brother's
    >view of a photo varied so drastically with mine, that I checked a
    >calibration web site and found I was missing several distinctions at the
    >dark end.
    >
    >One step at a time.
    >

    I'm now on my second Dell monitor using DVI and I wouldn't go back to
    the less stable analogue connection.

    What I would highly recommend is that if you are doing photo editing
    you should invest in screen calibration equipment (I use Spyder).
    Small, almost subtle changes in the screen can result in large
    changes in what you print. It's highly frustrating when you can't get
    the same print results as you did several months previously.



    Eric Stevens
    Eric Stevens, Jan 21, 2011
    #4
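
Eric's point about drift can be made concrete with a rough sketch. It does not
use any real Spyder software; the "measured" luminances below are invented
placeholders standing in for colorimeter readings, compared against an assumed
gamma-2.2, 120 cd/m2 calibration target.

    # Compare measured luminance of grey patches against a gamma-2.2 target
    # to see how far the screen has wandered since the last calibration.
    TARGET_GAMMA = 2.2
    WHITE_LUMINANCE = 120.0      # cd/m^2, assumed calibration target

    def target_luminance(level_8bit: int) -> float:
        """Ideal luminance for an 8-bit grey level under the target gamma."""
        return WHITE_LUMINANCE * (level_8bit / 255.0) ** TARGET_GAMMA

    # Hypothetical readings for the same patches, months after calibration.
    measured = {32: 1.1, 64: 5.3, 128: 26.0, 192: 64.5, 255: 118.0}

    for level, reading in measured.items():
        ideal = target_luminance(level)
        drift_pct = 100.0 * (reading - ideal) / ideal
        print(f"level {level:3d}: target {ideal:6.2f}, "
              f"measured {reading:6.2f}, drift {drift_pct:+.1f}%")

Even a few percent of drift across the grey ramp is the kind of small, almost
subtle change Eric is warning about.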
  5. DaveS

    DaveS Guest

    On 1/21/2011 3:55 PM, N wrote:
    > On 22/01/2011, DaveS wrote:
    >> The stuff I find using Google doesn't seem to be authoritative in any
    >> way on this question..
    >>
    >> I am receiving a new monitor (Dell U2311h) next week, and it can be
    >> connected by various types of cable. My graphics card can use VGA or
    >> DVI. The question is, will I experience any benefit by paying for and
    >> using a DVI connection over using the included VGA connection?
    >>
    >> I'm interested specifically in responses relating to photo editing.
    >>
    >> Dave S.

    >
    > Why do you think the DVI connection will cost you more money? There'll
    > be a DVI cable in the box.
    >
    >


    I set out to prove you wrong, but I stand corrected:
    What's in the Box

    Monitor with stand
    Power Cable
    DVI Cable
    VGA Cable (attached to the monitor)
    Drivers and Documentation media
    USB upstream cable
    Quick Setup Guide
    Safety Information

    I believe that I have purchased and set up LCD monitors for others where
    there was a DVI connector but no cable. Clearly, there is no cost for me
    to find out for myself if there is any visible difference with this monitor.

    Dave S.
    DaveS, Jan 22, 2011
    #5
  6. Robert Coe

    Robert Coe Guest

    On Fri, 21 Jan 2011 22:17:18 -0900, (Floyd L. Davidson)
    wrote:
    : DaveS <> wrote:
    : > Clearly, there is no cost for me
    : >to find out for myself if there is any visible difference with this monitor.
    :
    : Whether you think you can see it on any given displayed
    : image or not, use the DVI connection.
    :
    : I won't go so far as to say digital data is vastly
    : better than analog data, but it is certainly better and
    : you get significantly improved precision. Another point
    : is that with age the VGA interface will drift far more
    : than will the DVI interface.

    DVI might be slightly more resistant to RF interference, especially if the
    cable is long. But in normal use, it's very unlikely that you'll be able to
    see any difference in image quality. That said, there's no reason not to take
    Floyd's advice: if your card supports DVI, you might as well use it.

    I have two dual-monitor setups at work, one of which uses one monitor on DVI
    and one on VGA. On that setup, I can see a slight color difference between the
    two monitors, but not enough to be annoying. On the setup with two DVI
    monitors connected to the same video card, the colors look identical (given
    identical settings of the monitors, of course).

    Bob
    Robert Coe, Jan 24, 2011
    #6
  7. Eric Stevens

    Eric Stevens Guest

    On Sun, 23 Jan 2011 18:25:34 -0900, (Floyd L.
    Davidson) wrote:

    >Robert Coe <> wrote:
    >>On Fri, 21 Jan 2011 22:17:18 -0900, (Floyd L. Davidson)
    >>wrote:
    >>: DaveS <> wrote:
    >>: > Clearly, there is no cost for me
    >>: >to find out for myself if there is any visible difference with this monitor.
    >>:
    >>: Whether you think you can see it on any given displayed
    >>: image or not, use the DVI connection.
    >>:
    >>: I won't go so far as to say digital data is vastly
    >>: better than analog data, but it is certainly better and
    >>: you get significantly improved precision. Another point
    >>: is that with age the VGA interface will drift far more
    >>: than will the DVI interface.
    >>
    >>DVI might be slightly more resistant to RF interference, especially if the
    >>cable is long. But in normal use, it's very unlikely that you'll be able to
    >>see any difference in image quality. That said, there's no reason not to take
    >>Floyd's advice: if your card supports DVI, you might as well use it.

    >
    >In normal use it should be an obvious difference. A
    >digital interface sends a specific discrete value to the
    >monitor. It is the exact same value each time, and is
    >calculated from the value in the digital image file. It
    >doesn't change, and has the same accuracy each time.
    >
    >The VGA interface has to convert the digital value to an
    >analog value, and then the monitor has to using the
    >timing of a dot clock to pick out the precise time that
    >the right value is made available. It is not nearly as
    >precise as the process used by the digital interface.
    >It can never be as accurate.
    >
    >>I have two dual-monitor setups at work, one of which uses one monitor on DVI
    >>and one on VGA. On that setup, I can see a slight color difference between the
    >>two monitors, but not enough to be annoying.

    >
    >But it *is* different! The difference is error.
    >
    >>On the setup with two DVI
    >>monitors connected to the same video card, the colors look identical (given
    >>identical settings of the monitors, of course).

    >
    >No error.


    Unless the monitors are calibrated, it might be two different errors.



    Eric Stevens
    Eric Stevens, Jan 24, 2011
    #7
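
The digital-versus-analogue argument quoted above can be illustrated with a
very crude simulation. This is only a sketch under simplified assumptions (a
0.7 V VGA video swing, Gaussian noise on the analogue path, otherwise perfect
sampling); it is not a model of any particular card or monitor.

    # Send one 8-bit pixel value over a digital link and over an analogue
    # (VGA-style) link. The digital path delivers the byte unchanged; the
    # analogue path converts it to a voltage, picks up a little noise, and
    # is re-quantised by the monitor, so the recovered value can be off.
    import random

    FULL_SCALE_V = 0.7      # nominal VGA video swing (assumed)
    NOISE_MV = 3.0          # assumed RMS noise/drift on the analogue path

    def analogue_roundtrip(value: int, rng: random.Random) -> int:
        volts = value / 255 * FULL_SCALE_V              # DAC in the card
        volts += rng.gauss(0.0, NOISE_MV / 1000.0)      # cable noise, drift
        recovered = round(volts / FULL_SCALE_V * 255)   # sampling in the monitor
        return min(255, max(0, recovered))

    rng = random.Random(1)
    TRIALS = 10_000
    errors = 0
    for _ in range(TRIALS):
        sent = rng.randrange(256)
        if analogue_roundtrip(sent, rng) != sent:   # the digital link would match
            errors += 1

    print(f"{errors} of {TRIALS} analogue samples came back a different level")

With 3 mV of assumed noise (one 8-bit step is about 2.7 mV here) many samples
land on a neighbouring level; drop the noise well below a step and the analogue
path becomes effectively exact, which is where the practical experiences in
this thread diverge.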
  8. David J Taylor

    David J Taylor Guest

    > In normal use it should be an obvious difference. A
    > digital interface sends a specific discrete value to the
    > monitor. It is the exact same value each time, and is
    > calculated from the value in the digital image file. It
    > doesn't change, and has the same accuracy each time.

    []
    > --
    > Floyd L. Davidson <http://www.apaflo.com/floyd_davidson>
    > Ukpeagvik (Barrow, Alaska)


    Maybe it /should/, but in practice it does not (at least on correctly
    adjusted monitors).

    Cheers,
    David
    David J Taylor, Jan 24, 2011
    #8
  9. David J Taylor

    David J Taylor Guest

    "Floyd L. Davidson" <> wrote in message
    news:...
    > "David J Taylor" <> wrote:
    >>> In normal use it should be an obvious difference. A
    >>> digital interface sends a specific discrete value to the
    >>> monitor. It is the exact same value each time, and is
    >>> calculated from the value in the digital image file. It
    >>> doesn't change, and has the same accuracy each time.

    >>[]
    >>
    >>Maybe it /should/, but in practice it does not (at least on correctly
    >>adjusted monitors).

    >
    > I don't agree with your statement at all. In practice
    > with a digital interface it sends *exactly* the same
    > value every time.
    >
    > The problem for the analog interface is that is isn't
    > exactly the same every time.
    >
    > And that of course is precisely the distinction between
    > digital and analog when it is affected by noise. The
    > digital system can function with a much lower SNR than
    > can an analog system. It's fundamental.
    >
    > --
    > Floyd L. Davidson <http://www.apaflo.com/floyd_davidson>
    > Ukpeagvik (Barrow, Alaska)


    Yes, you can get the "right" value into the monitor, but the issues of
    drift and calibration inside the monitor are just the same as with an
    analogue input monitor. I find that, in practice, drift of the analogue
    components in a VGA interface isn't an issue, and neither have I seen VGA
    signals affected by electrical noise even on moderate cable runs. Perhaps
    I've been lucky!

    Cheers,
    David
    David J Taylor, Jan 24, 2011
    #9
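
Both experiences can be reconciled with a back-of-envelope calculation. The
sketch below assumes Gaussian noise and the same 0.7 V swing as before (figures
assumed, not measured) and asks how often noise moves a sample across half an
8-bit step, i.e. onto a different level.

    # Probability that analogue noise shifts a sample to a neighbouring
    # 8-bit level, for a range of assumed RMS noise voltages.
    import math

    FULL_SCALE_V = 0.7
    STEP_V = FULL_SCALE_V / 255          # one 8-bit step, about 2.7 mV

    def p_level_shift(noise_rms_v: float) -> float:
        """P(|Gaussian noise| > half a step) for the given RMS noise."""
        return math.erfc((STEP_V / 2) / (noise_rms_v * math.sqrt(2)))

    for noise_mv in (0.2, 0.5, 1.0, 2.0, 4.0):
        p = p_level_shift(noise_mv / 1000)
        print(f"{noise_mv:4.1f} mV RMS -> {100 * p:6.3f}% of samples change level")

On that crude model, sub-millivolt noise on a short desktop cable almost never
moves a pixel by a whole level, which matches David's experience, while the
digital link sidesteps the question entirely, which is Floyd's point.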
  10. DaveS

    DaveS Guest

    On 1/24/2011 9:47 AM, David J Taylor wrote:
    > "Floyd L. Davidson" <> wrote in message
    > news:...
    >> "David J Taylor" <> wrote:
    >>>> In normal use it should be an obvious difference. A
    >>>> digital interface sends a specific discrete value to the
    >>>> monitor. It is the exact same value each time, and is
    >>>> calculated from the value in the digital image file. It
    >>>> doesn't change, and has the same accuracy each time.
    >>> []
    >>>
    >>> Maybe it /should/, but in practice it does not (at least on correctly
    >>> adjusted monitors).

    >>
    >> I don't agree with your statement at all. In practice
    >> with a digital interface it sends *exactly* the same
    >> value every time.
    >>
    >> The problem for the analog interface is that is isn't
    >> exactly the same every time.
    >>
    >> And that of course is precisely the distinction between
    >> digital and analog when it is affected by noise. The
    >> digital system can function with a much lower SNR than
    >> can an analog system. It's fundamental.
    >>
    >> --
    >> Floyd L. Davidson <http://www.apaflo.com/floyd_davidson>
    >> Ukpeagvik (Barrow, Alaska)

    >
    > Yes, you can get the "right" value into the monitor, but the issues of
    > drift and calibration inside the monitor are just the same as with an
    > analogue input monitor. I find that, in practice, drift of the analogue
    > components in a VGA interface isn't an issue, and neither have I seen
    > VGA signals affected by electrical noise even on moderate cable runs.
    > Perhaps I've been lucky!
    >
    > Cheers,
    > David


    OK, so theory says there is a difference in the signal received by the
    monitor, depending on whether it's coming from a digital or an analogue
    source. Experience shows there is no noticeable difference.

    Does anyone know how laptops connect video processor with display?

    Dave S.
    DaveS, Jan 24, 2011
    #10
  11. Eric Stevens

    Eric Stevens Guest

    On Mon, 24 Jan 2011 00:41:41 -0900, (Floyd L.
    Davidson) wrote:

    >Eric Stevens <> wrote:
    >>On Sun, 23 Jan 2011 18:25:34 -0900, (Floyd L.
    >>Davidson) wrote:
    >>>Robert Coe <> wrote:
    >>>
    >>>>On the setup with two DVI
    >>>>monitors connected to the same video card, the colors look identical (given
    >>>>identical settings of the monitors, of course).
    >>>
    >>>No error.

    >>
    >>Unless the monitors are calibrated, it might be two different errors.

    >
    >He specified "identical settings of the monitors, of course". They will
    >by definition be the same.


    It all depends what you mean by 'settings'. Even though they are set
    identically for brightness, contrast and so on, they may still need two
    slightly different ICC color profiles. Even slightly different lighting
    of the respective work areas may be sufficient to require two different
    ICC profiles. I accept I am close to quibbling.



    Eric Stevens
    Eric Stevens, Jan 24, 2011
    #11
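
Eric's "quibble" is easy to demonstrate in miniature. The two tone curves below
are invented stand-ins for two monitor responses, not real ICC data; the point
is only that identical input values plus slightly different response curves
give different displayed intensities.

    # Two hypothetical per-channel response curves standing in for two
    # monitors. Even with "identical settings", the same pixel value maps
    # to slightly different displayed intensities on each.
    def displayed_intensity(value: int, gamma: float, black: float) -> float:
        """Map an 8-bit value to a 0..1 intensity for one monitor."""
        return black + (1.0 - black) * (value / 255.0) ** gamma

    monitor_a = dict(gamma=2.2, black=0.002)   # invented response A
    monitor_b = dict(gamma=2.3, black=0.004)   # invented response B

    for value in (16, 64, 128, 200, 255):
        a = displayed_intensity(value, **monitor_a)
        b = displayed_intensity(value, **monitor_b)
        print(f"input {value:3d}: A {a:.4f}  B {b:.4f}  diff {abs(a - b):.4f}")

A per-monitor ICC profile is essentially the measured version of such a curve,
which is why two nominally identical monitors can still want two different
profiles.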
  12. Eric Stevens

    Eric Stevens Guest

    On Mon, 24 Jan 2011 11:58:17 -0600, DaveS <> wrote:

    >On 1/24/2011 9:47 AM, David J Taylor wrote:
    >> "Floyd L. Davidson" <> wrote in message
    >> news:...
    >>> "David J Taylor" <> wrote:
    >>>>> In normal use it should be an obvious difference. A
    >>>>> digital interface sends a specific discrete value to the
    >>>>> monitor. It is the exact same value each time, and is
    >>>>> calculated from the value in the digital image file. It
    >>>>> doesn't change, and has the same accuracy each time.
    >>>> []
    >>>>
    >>>> Maybe it /should/, but in practice it does not (at least on correctly
    >>>> adjusted monitors).
    >>>
    >>> I don't agree with your statement at all. In practice
    >>> with a digital interface it sends *exactly* the same
    >>> value every time.
    >>>
    >>> The problem for the analog interface is that is isn't
    >>> exactly the same every time.
    >>>
    >>> And that of course is precisely the distinction between
    >>> digital and analog when it is affected by noise. The
    >>> digital system can function with a much lower SNR than
    >>> can an analog system. It's fundamental.
    >>>
    >>> --
    >>> Floyd L. Davidson <http://www.apaflo.com/floyd_davidson>
    >>> Ukpeagvik (Barrow, Alaska)

    >>
    >> Yes, you can get the "right" value into the monitor, but the issues of
    >> drift and calibration inside the monitor are just the same as with an
    >> analogue input monitor. I find that, in practice, drift of the analogue
    >> components in a VGA interface isn't an issue, and neither have I seen
    >> VGA signals affected by electrical noise even on moderate cable runs.
    >> Perhaps I've been lucky!
    >>
    >> Cheers,
    >> David

    >
    >OK, so theory says there is a difference in the signal received by the
    >monitor, depending on whether its coming from a digital or an analogue
    >source. Experience shows there is no noticeable difference.
    >

    For a given definition of 'noticeable'.

    >Does anyone know how laptops connect video processor with display?
    >
    >Dave S.




    Eric Stevens
    Eric Stevens, Jan 24, 2011
    #12
  13. DaveS

    DaveS Guest

    On 1/24/2011 12:28 PM, Savageduck wrote:
    > On 2011-01-24 09:58:17 -0800, DaveS <> said:
    >
    >> On 1/24/2011 9:47 AM, David J Taylor wrote:
    >>> "Floyd L. Davidson" <> wrote in message
    >>> news:...
    >>>> "David J Taylor" <> wrote:
    >>>>>> In normal use it should be an obvious difference. A
    >>>>>> digital interface sends a specific discrete value to the
    >>>>>> monitor. It is the exact same value each time, and is
    >>>>>> calculated from the value in the digital image file. It
    >>>>>> doesn't change, and has the same accuracy each time.
    >>>>> []
    >>>>>
    >>>>> Maybe it /should/, but in practice it does not (at least on correctly
    >>>>> adjusted monitors).
    >>>>
    >>>> I don't agree with your statement at all. In practice
    >>>> with a digital interface it sends *exactly* the same
    >>>> value every time.
    >>>>
    >>>> The problem for the analog interface is that is isn't
    >>>> exactly the same every time.
    >>>>
    >>>> And that of course is precisely the distinction between
    >>>> digital and analog when it is affected by noise. The
    >>>> digital system can function with a much lower SNR than
    >>>> can an analog system. It's fundamental.
    >>>>
    >>>> --
    >>>> Floyd L. Davidson <http://www.apaflo.com/floyd_davidson>
    >>>> Ukpeagvik (Barrow, Alaska)
    >>>
    >>> Yes, you can get the "right" value into the monitor, but the issues of
    >>> drift and calibration inside the monitor are just the same as with an
    >>> analogue input monitor. I find that, in practice, drift of the analogue
    >>> components in a VGA interface isn't an issue, and neither have I seen
    >>> VGA signals affected by electrical noise even on moderate cable runs.
    >>> Perhaps I've been lucky!
    >>>
    >>> Cheers,
    >>> David

    >>
    >> OK, so theory says there is a difference in the signal received by the
    >> monitor, depending on whether its coming from a digital or an analogue
    >> source. Experience shows there is no noticeable difference.
    >>
    >> Does anyone know how laptops connect video processor with display?
    >>
    >> Dave S.

    >
    > My MacBook Pro has a proprietary "mini displayport", "MDP" which
    > supports VGA, DVI, and HDMI via their MDP adapters.
    > The same MDP is found on the current iMacs and MacMini.
    >


    I don't know Macs, but surely you're talking about an external port. My
    question was intended to ask about the internal connection between the
    video processor and the display.

    Dave S.
    DaveS, Jan 24, 2011
    #13
  14. Robert Coe

    Robert Coe Guest

    On Mon, 24 Jan 2011 20:49:23 +1300, Eric Stevens <>
    wrote:
    : On Sun, 23 Jan 2011 18:25:34 -0900, (Floyd L.
    : Davidson) wrote:
    :
    : >Robert Coe <> wrote:
    : >>On Fri, 21 Jan 2011 22:17:18 -0900, (Floyd L. Davidson)
    : >>wrote:
    : >>: DaveS <> wrote:
    : >>: > Clearly, there is no cost for me
    : >>: >to find out for myself if there is any visible difference with this monitor.
    : >>:
    : >>: Whether you think you can see it on any given displayed
    : >>: image or not, use the DVI connection.
    : >>:
    : >>: I won't go so far as to say digital data is vastly
    : >>: better than analog data, but it is certainly better and
    : >>: you get significantly improved precision. Another point
    : >>: is that with age the VGA interface will drift far more
    : >>: than will the DVI interface.
    : >>
    : >>DVI might be slightly more resistant to RF interference, especially if the
    : >>cable is long. But in normal use, it's very unlikely that you'll be able to
    : >>see any difference in image quality. That said, there's no reason not to take
    : >>Floyd's advice: if your card supports DVI, you might as well use it.
    : >
    : >In normal use it should be an obvious difference. A
    : >digital interface sends a specific discrete value to the
    : >monitor. It is the exact same value each time, and is
    : >calculated from the value in the digital image file. It
    : >doesn't change, and has the same accuracy each time.
    : >
    : >The VGA interface has to convert the digital value to an
    : >analog value, and then the monitor has to using the
    : >timing of a dot clock to pick out the precise time that
    : >the right value is made available. It is not nearly as
    : >precise as the process used by the digital interface.
    : >It can never be as accurate.
    : >
    : >>I have two dual-monitor setups at work, one of which uses one monitor on DVI
    : >>and one on VGA. On that setup, I can see a slight color difference between the
    : >>two monitors, but not enough to be annoying.
    : >
    : >But it *is* different! The difference is error.
    : >
    : >>On the setup with two DVI
    : >>monitors connected to the same video card, the colors look identical (given
    : >>identical settings of the monitors, of course).
    : >
    : >No error.
    :
    : Unless the monitors are calibrated, it might be two different errors.

    Probably neither. I suspect that the bulk of it is because two graphics
    devices are generating the inputs. The fact that one is digital and the other
    analog is a second-order effect.

    Bob
    Robert Coe, Jan 25, 2011
    #14
  15. Whisky-dave

    Whisky-dave Guest

    On Jan 24, 11:03 pm, Savageduck <savageduck1@{REMOVESPAM}me.com>
    wrote:
    > On 2011-01-24 12:39:46 -0800, DaveS <> said:
    >
    >
    >
    > > On 1/24/2011 12:28 PM, Savageduck wrote:
    > >> On 2011-01-24 09:58:17 -0800, DaveS <> said:

    >
    > >>> On 1/24/2011 9:47 AM, David J Taylor wrote:
    > >>>> "Floyd L. Davidson" <> wrote in message
    > >>>>news:...
    > >>>>> "David J Taylor" <> wrote:
    > >>>>>>> In normal use it should be an obvious difference. A
    > >>>>>>> digital interface sends a specific discrete value to the
    > >>>>>>> monitor. It is the exact same value each time, and is
    > >>>>>>> calculated from the value in the digital image file. It
    > >>>>>>> doesn't change, and has the same accuracy each time.
    > >>>>>> []

    >
    > >>>>>> Maybe it /should/, but in practice it does not (at least on correctly
    > >>>>>> adjusted monitors).

    >
    > >>>>> I don't agree with your statement at all. In practice
    > >>>>> with a digital interface it sends *exactly* the same
    > >>>>> value every time.

    >
    > >>>>> The problem for the analog interface is that is isn't
    > >>>>> exactly the same every time.

    >
    > >>>>> And that of course is precisely the distinction between
    > >>>>> digital and analog when it is affected by noise. The
    > >>>>> digital system can function with a much lower SNR than
    > >>>>> can an analog system. It's fundamental.

    >
    > >>>>> --
    > >>>>> Floyd L. Davidson <http://www.apaflo.com/floyd_davidson>
    > >>>>> Ukpeagvik (Barrow, Alaska)

    >
    > >>>> Yes, you can get the "right" value into the monitor, but the issues of
    > >>>> drift and calibration inside the monitor are just the same as with an
    > >>>> analogue input monitor. I find that, in practice, drift of the analogue
    > >>>> components in a VGA interface isn't an issue, and neither have I seen
    > >>>> VGA signals affected by electrical noise even on moderate cable runs..
    > >>>> Perhaps I've been lucky!

    >
    > >>>> Cheers,
    > >>>> David

    >
    > >>> OK, so theory says there is a difference in the signal received by the
    > >>> monitor, depending on whether its coming from a digital or an analogue
    > >>> source. Experience shows there is no noticeable difference.

    >
    > >>> Does anyone know how laptops connect video processor with display?

    >
    > >>> Dave S.

    >
    > >> My MacBook Pro has a proprietary "mini displayport", "MDP" which
    > >> supports VGA, DVI, and HDMI via their MDP adapters.
    > >> The same MDP is found on the current iMacs and MacMini.

    >
    > > I don't know Macs, but surely you're talking about an external port. My
    > > question intended to enquire about the internal connection between
    > > video processor and display.

    >
    > > Dave S.

    >
    > The intent of your question did not quite come over the way you phrased it.
    > Your question asked; "Does anyone know how laptops connect video
    > processor with display?" (see above) As to which display is not clear.
    >
    > You did not ask how the video processor/graphics card connected to the
    > laptop display. This led me to understand you wanted to find out how an
    > external display would connect to a laptop.
    >
    > My laptop experience lies with MacBook Pro's. These have two graphics
    > cards on board which are directly connected, by ribbon connector to the
    > LCD laptop display as DVI. However to add an external mirroring, or
    > extended display whether that is DVI, VGA, or HDMI, would be done via
    > the MDP and appropriate adaptor. As far as other laptops go your answer
    > will have to come from another source.



    What he may have meant was whether the graphics chip supplies analogue
    or digital signals to the output port. I think all present graphics
    cards generate digital signals, which can be converted to analogue for
    VGA output.
    Whisky-dave, Jan 25, 2011
    #15
  16. Eric Stevens

    Eric Stevens Guest

    On Tue, 25 Jan 2011 01:30:52 -0900, (Floyd L.
    Davidson) wrote:

    >Robert Coe <> wrote:
    >>On Mon, 24 Jan 2011 20:49:23 +1300, Eric Stevens <>
    >>wrote:
    >>: On Sun, 23 Jan 2011 18:25:34 -0900, (Floyd L.
    >>: Davidson) wrote:
    >>:
    >>: >Robert Coe <> wrote:
    >>: >>On Fri, 21 Jan 2011 22:17:18 -0900, (Floyd L. Davidson)
    >>: >>wrote:
    >>: >>: DaveS <> wrote:
    >>: >>: > Clearly, there is no cost for me
    >>: >>: >to find out for myself if there is any visible difference with this monitor.
    >>: >>:
    >>: >>: Whether you think you can see it on any given displayed
    >>: >>: image or not, use the DVI connection.
    >>: >>:
    >>: >>: I won't go so far as to say digital data is vastly
    >>: >>: better than analog data, but it is certainly better and
    >>: >>: you get significantly improved precision. Another point
    >>: >>: is that with age the VGA interface will drift far more
    >>: >>: than will the DVI interface.
    >>: >>
    >>: >>DVI might be slightly more resistant to RF interference, especially if the
    >>: >>cable is long. But in normal use, it's very unlikely that you'll be able to
    >>: >>see any difference in image quality. That said, there's no reason not to take
    >>: >>Floyd's advice: if your card supports DVI, you might as well use it.
    >>: >
    >>: >In normal use it should be an obvious difference. A
    >>: >digital interface sends a specific discrete value to the
    >>: >monitor. It is the exact same value each time, and is
    >>: >calculated from the value in the digital image file. It
    >>: >doesn't change, and has the same accuracy each time.
    >>: >
    >>: >The VGA interface has to convert the digital value to an
    >>: >analog value, and then the monitor has to using the
    >>: >timing of a dot clock to pick out the precise time that
    >>: >the right value is made available. It is not nearly as
    >>: >precise as the process used by the digital interface.
    >>: >It can never be as accurate.
    >>: >
    >>: >>I have two dual-monitor setups at work, one of which uses one monitor on DVI
    >>: >>and one on VGA. On that setup, I can see a slight color difference between the
    >>: >>two monitors, but not enough to be annoying.
    >>: >
    >>: >But it *is* different! The difference is error.
    >>: >
    >>: >>On the setup with two DVI
    >>: >>monitors connected to the same video card, the colors look identical (given
    >>: >>identical settings of the monitors, of course).
    >>: >
    >>: >No error.
    >>:
    >>: Unless the monitors are calibrated, it might be two different errors.
    >>
    >>Probably neither. I suspect that the bulk of it is because two graphics
    >>devices are generating the inputs. The fact that one is digital and the other
    >>analog is a second-order effect.

    >
    >If we are discussing two interfaces on the same card,
    >then it is a single device, not two. There are two
    >different types of data communications systems to send
    >identical data from the graphics device to the monitor.
    >It is a simple characteristic of the difference between
    >digital and analog data transport systems.
    >
    >In either case the original data is digital. For a
    >"digital monitor", that processes the data internally as
    >digital data, sending the original data via a digital
    >system results in *precisely* the same data at the
    >monitor's CPU as was available at the CPU in the
    >graphics device. If it is converted to analog for
    >transport it has to be digitized again at the monitor.
    >That means it *can never be precisely the same data*.
    >
    >The VGA interface was designed for an all analog monitor,
    >where the data is *never* digitized. For such a system it
    >makes no difference if the data is converted to analog at
    >the graphics device or in the monitor. And of course analog
    >only monitors never had DVI interfaces because of that.


    DVI-A is specifically intended to carry only analog signals.

    Virtually all DVI connectors will handle the analog signals of DVI-A.

    See http://en.wikipedia.org/wiki/Digital_Visual_Interface



    Eric Stevens
    Eric Stevens, Jan 25, 2011
    #16
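
For reference, the connector variants at issue break down as follows; this
little table is summarised from the DVI article Eric links, and the helper
function is just a convenience for reading it.

    # Signal types carried by each DVI connector variant, as summarised in
    # the linked Wikipedia article on the Digital Visual Interface.
    DVI_VARIANTS = {
        "DVI-D": {"digital": True,  "analog": False},  # digital only
        "DVI-A": {"digital": False, "analog": True},   # analog (VGA-level) only
        "DVI-I": {"digital": True,  "analog": True},   # integrated: both
    }

    def carries(variant: str, signal: str) -> bool:
        """True if the given DVI variant carries the given signal type."""
        return DVI_VARIANTS[variant][signal]

    print(carries("DVI-I", "analog"))   # True: analog pins carry a VGA-style signal
    print(carries("DVI-D", "analog"))   # False: DVI-D has no analog pins

Which is really both sides of the argument at once: analog pins exist on DVI-A
and DVI-I connectors, but what they carry is electrically the VGA interface.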
  17. Eric Stevens

    Eric Stevens Guest

    On Tue, 25 Jan 2011 19:03:54 -0900, (Floyd L.
    Davidson) wrote:

    >Eric Stevens <> wrote:
    >>On Tue, 25 Jan 2011 01:30:52 -0900, (Floyd L.
    >>Davidson) wrote:
    >>
    >>>The VGA interface was designed for an all analog monitor,
    >>>where the data is *never* digitized. For such a system it
    >>>makes no difference if the data is converted to analog at
    >>>the graphics device or in the monitor. And of course analog
    >>>only monitors never had DVI interfaces because of that.

    >>
    >>DVI-A is specifically intended to carry only analog signals.
    >>
    >>Virtually all DVI connectors will handle the analog signals of DVI-A
    >>
    >>See http://en.wikipedia.org/wiki/Digital_Visual_Interface

    >
    >That is incidental and has no significance to the
    >discussion.


    Yeah? Well there are some "analog only monitors" which do have DVI
    interfaces.
    >
    >It is the digital interface available with DVI that we
    >are actually interested in, and the statement above
    >about all analog monitors is valid. When the DVI
    >interface was designed they made it backwards compatible
    >so that the monitor need not have two connectors.


    They had to have a DVI connector.

    >Our
    >discussion doesn't revolve around which connector is
    >used, but is about the distinction between using digital
    >or analog.




    Eric Stevens
    Eric Stevens, Jan 26, 2011
    #17
  18. Eric Stevens

    Eric Stevens Guest

    On Wed, 26 Jan 2011 00:17:48 -0900, (Floyd L.
    Davidson) wrote:

    >Eric Stevens <> wrote:
    >>On Tue, 25 Jan 2011 19:03:54 -0900, (Floyd L.
    >>Davidson) wrote:
    >>>
    >>>That is incidental and has no significance to the
    >>>discussion.

    >>
    >>Yeah? Well there are some "analog only monitors" which do have DVI
    >>interfaces.

    >
    >That too is incidental. And I'm not at all certain it is true, it makes
    >no difference anyway.
    >
    >The currently manufactured models have to be able to
    >work with currently manufactured video cards, so a cheap
    >analog only monitor might well have a DVI *connector*,
    >but it does *not* have the "DVI interface" that we are
    >talking about. The analog signals on the connector are
    >still a *VGA interface*.


    In the cases I had in mind, the analog signals were passing through a
    DVI connector.
    >
    >>>It is the digital interface available with DVI that we
    >>>are actually interested in, and the statement above
    >>>about all analog monitors is valid. When the DVI
    >>>interface was designed they made it backwards compatible
    >>>so that the monitor need not have two connectors.

    >>
    >>The had to have a DVI connector.

    >
    >With an embedded VGA interface. The connector does not
    >define the interface.
    >

    I was responding to your statement:

    " And of course analog only monitors never had DVI interfaces
    because of that."

    ... by pointing out that "DVI-A is specifically intended to carry
    only analog signals" and going on to say "Virtually all DVI connectors
    will handle the analog signals of DVI-A".

    >>>Our
    >>>discussion doesn't revolve around which connector is
    >>>used, but is about the distinction between using digital
    >>>or analog.

    >
    >That's what you need to get your head around, rather than
    >trying to argue frivolous points.


    DVI is not, as you seem to think, a purely digital interface, even
    though that is what it is principally used for.



    Eric Stevens
    Eric Stevens, Jan 26, 2011
    #18
  19. Eric Stevens

    Eric Stevens Guest

    On Wed, 26 Jan 2011 12:53:36 -0900, (Floyd L.
    Davidson) wrote:

    >Eric Stevens <> wrote:
    >>On Wed, 26 Jan 2011 00:17:48 -0900, (Floyd L.
    >>Davidson) wrote:
    >>
    >>>Eric Stevens <> wrote:
    >>>>On Tue, 25 Jan 2011 19:03:54 -0900, (Floyd L.
    >>>>Davidson) wrote:
    >>>>>
    >>>>>That is incidental and has no significance to the
    >>>>>discussion.
    >>>>
    >>>>Yeah? Well there are some "analog only monitors" which do have DVI
    >>>>interfaces.
    >>>
    >>>That too is incidental. And I'm not at all certain it is true, it makes
    >>>no difference anyway.
    >>>
    >>>The currently manufactured models have to be able to
    >>>work with currently manufactured video cards, so a cheap
    >>>analog only monitor might well have a DVI *connector*,
    >>>but it does *not* have the "DVI interface" that we are
    >>>talking about. The analog signals on the connector are
    >>>still a *VGA interface*.

    >>
    >>In the cases I had in mind, passing through a DVI connector.
    >>>
    >>>>>It is the digital interface available with DVI that we
    >>>>>are actually interested in, and the statement above
    >>>>>about all analog monitors is valid. When the DVI
    >>>>>interface was designed they made it backwards compatible
    >>>>>so that the monitor need not have two connectors.
    >>>>
    >>>>The had to have a DVI connector.
    >>>
    >>>With an embedded VGA interface. The connector does not
    >>>define the interface.
    >>>

    >>I was responding to your statement:
    >>
    >> " And of course analog only monitors never had DVI interfaces
    >> because of that."
    >>
    >> ... by pointing out that "DVI-A is specifically intended to carry
    >>only analog signals" and going on to say "Virtually all DVI connectors
    >>will handle the analog signals of DVI-A".
    >>
    >>>>>Our
    >>>>>discussion doesn't revolve around which connector is
    >>>>>used, but is about the distinction between using digital
    >>>>>or analog.
    >>>
    >>>That's what you need to get your head around, rather than
    >>>trying to argue frivolous points.

    >>
    >>DVI is not, as you seem to think, a purely digital interface, even
    >>though that is what it is principally used for.

    >
    >The analog signals on a DVI connector are a VGA interface.


    Yep.



    Eric Stevens
    Eric Stevens, Jan 27, 2011
    #19
  20. Eric Stevens

    Eric Stevens Guest

    On Wed, 26 Jan 2011 22:35:50 -0900, (Floyd L.
    Davidson) wrote:

    >Eric Stevens <> wrote:
    >>On Wed, 26 Jan 2011 12:53:36 -0900, (Floyd L.
    >>Davidson) wrote:
    >>
    >>>Eric Stevens <> wrote:
    >>>>On Wed, 26 Jan 2011 00:17:48 -0900, (Floyd L.
    >>>>Davidson) wrote:
    >>>>
    >>>>>Eric Stevens <> wrote:
    >>>>>>On Tue, 25 Jan 2011 19:03:54 -0900, (Floyd L.
    >>>>>>Davidson) wrote:
    >>>>>>>
    >>>>>>>That is incidental and has no significance to the
    >>>>>>>discussion.
    >>>>>>
    >>>>>>Yeah? Well there are some "analog only monitors" which do have DVI
    >>>>>>interfaces.
    >>>>>
    >>>>>That too is incidental. And I'm not at all certain it is true, it makes
    >>>>>no difference anyway.
    >>>>>
    >>>>>The currently manufactured models have to be able to
    >>>>>work with currently manufactured video cards, so a cheap
    >>>>>analog only monitor might well have a DVI *connector*,
    >>>>>but it does *not* have the "DVI interface" that we are
    >>>>>talking about. The analog signals on the connector are
    >>>>>still a *VGA interface*.
    >>>>
    >>>>In the cases I had in mind, passing through a DVI connector.
    >>>>>
    >>>>>>>It is the digital interface available with DVI that we
    >>>>>>>are actually interested in, and the statement above
    >>>>>>>about all analog monitors is valid. When the DVI
    >>>>>>>interface was designed they made it backwards compatible
    >>>>>>>so that the monitor need not have two connectors.
    >>>>>>
    >>>>>>The had to have a DVI connector.
    >>>>>
    >>>>>With an embedded VGA interface. The connector does not
    >>>>>define the interface.
    >>>>>
    >>>>I was responding to your statement:
    >>>>
    >>>> " And of course analog only monitors never had DVI interfaces
    >>>> because of that."
    >>>>
    >>>> ... by pointing out that "DVI-A is specifically intended to carry
    >>>>only analog signals" and going on to say "Virtually all DVI connectors
    >>>>will handle the analog signals of DVI-A".
    >>>>
    >>>>>>>Our
    >>>>>>>discussion doesn't revolve around which connector is
    >>>>>>>used, but is about the distinction between using digital
    >>>>>>>or analog.
    >>>>>
    >>>>>That's what you need to get your head around, rather than
    >>>>>trying to argue frivolous points.
    >>>>
    >>>>DVI is not, as you seem to think, a purely digital interface, even
    >>>>though that is what it is principally used for.
    >>>
    >>>The analog signals on a DVI connector are a VGA interface.

    >>
    >>Yep.

    >
    >And just in case you've missed the point again, the DVI
    >connector is not a DVI interface. The DVI interface is
    >a purely digital interface, and that is the distinction
    >that was being discussed.


    Your view seems to differ from that of the Digital Display Working
    Group - see http://en.wikipedia.org/wiki/Digital_Visual_Interface



    Eric Stevens
    Eric Stevens, Jan 27, 2011
    #20