• MorphOS Developer
    Krashan
    Posts: 1107 from 2003/6/11
    From: Białystok...
    On a more serious note, there is a very noticeable difference between VGA and DVI input when I use my monitor (with a Pegasos or any other machine, it does not matter).

    Two factors affect the difference between DVI and VGA output quality:

    1. The difference is more visible on bigger monitors. On a 15" panel the two may look the same; on a 19" or 20" panel the difference is easy to notice.

    2. VGA quality strongly depends on the graphics board. Of the 20 or so cards I have tested, ATi chipsets produce much better VGA output than nVidia ones. For example, an R9200SE gives an excellent VGA picture on a 19" panel (compared with the DVI output of the same card on the same monitor), while on the same monitor the GeForce 2 output is a pathetic blur. Closer investigation revealed that most nVidia cards generate VGA output whose pixel clock frequency changes slightly along a scanline. The pixel clock regenerator in the monitor then cannot phase-lock to the pixels precisely, and the result is blur in some parts of every horizontal line. For DVI this pixel clock detuning does not matter, as the pixel clock signal is delivered along with the RGB signals.
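    The pixel clock effect in point 2 can be illustrated with a quick toy model. The Python sketch below uses made-up drift and blur-threshold numbers (they are assumptions for illustration, not measurements of real cards): if the source pixel period drifts linearly along the line while the monitor's regenerated clock assumes a constant period, the sampling phase error grows roughly quadratically toward the end of each scanline, so part of every line blurs. With DVI the clock travels alongside the data, so this error is zero by construction.

# Toy model of why a drifting VGA pixel clock blurs parts of a scanline,
# while DVI (clock sent alongside the data) stays sharp.
# All numbers are illustrative assumptions, not measurements.

PIXELS = 1280   # pixels per scanline
DRIFT = 1e-3    # relative change of the pixel period across one line (assumption)

def phase_error(x):
    """Sampling phase error (in pixels) at horizontal position x when the
    monitor's regenerated clock assumes a constant pixel period but the
    source period grows linearly along the line."""
    # Arrival time of pixel x minus the time the fixed clock expects,
    # in units of one pixel period:
    # sum_{i<x} (1 + DRIFT*i/PIXELS) - x  ~=  DRIFT * x^2 / (2*PIXELS)
    return DRIFT * x * x / (2 * PIXELS)

for x in (0, PIXELS // 4, PIXELS // 2, 3 * PIXELS // 4, PIXELS - 1):
    err = phase_error(x)
    # Beyond ~1/4 pixel the sample lands visibly off-centre: blur (assumption).
    print(f"pixel {x:4d}: phase error {err:+.3f} px  {'blurry' if err > 0.25 else 'sharp'}")

    With these assumed values the start of the line samples cleanly and the last quarter or so crosses the threshold, matching the "blur in some parts of every horizontal line" observation. A real monitor's PLL would lock to the average line frequency rather than the start-of-line rate, which shifts where the error peaks but does not remove it.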
  • »18.07.09 - 08:18