What about the DVI interface of the graphics card?

Updated on 2024-04-05
6 answers
  1. Anonymous users, 2024-02-07

    Your graphics card doesn't support DVI output, so even if you buy an LCD with DVI it won't help (both the graphics card and the monitor have to support it). It's better to save some money and buy an LCD that only supports VGA. Take a look at your motherboard: if there is a 24-pin square connector, that's DVI.

    As for LCDs, some only have a D-Sub (VGA) interface, and some have both D-Sub and DVI. If a monitor has a DVI interface, it will generally also have VGA.

  2. Anonymous users, 2024-02-06

    1. Look at the graphics card's connectors: the blue 15-pin one is the D-Sub (VGA), and the white square one with 24 pins is the DVI.

    2. Not every card has both: cheaper cards only have the D-Sub port, while better ones have the dual interface, so pay attention when you buy.
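
    Just to illustrate that rule of thumb, here is a tiny sketch (Python; a made-up helper for illustration, not tied to any real driver API) that maps what you see on the card's bracket to an interface name:

    ```python
    # Sketch: guess a video connector from its look, using the rules of thumb
    # in this thread (blue 15-pin = D-Sub/VGA, white squarish 24+x pin = DVI).
    def guess_connector(color: str, pins: int) -> str:
        if color.lower() == "blue" and pins == 15:
            return "D-Sub (VGA)"
        # the thread calls DVI a "24-pin" connector; real plugs are 24+1 or 24+5
        if color.lower() == "white" and pins >= 24:
            return "DVI"
        return "unknown - check the card's spec sheet"

    print(guess_connector("blue", 15))   # D-Sub (VGA)
    print(guess_connector("white", 29))  # DVI (a 24+5 layout)
    ```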

  3. Anonymous users, 2024-02-05

    What do you mean?

    In fact, LCD monitors generally support both kinds of connectors.

    Three years ago, graphics cards with 64 MB of video memory generally did not have a DVI connector!

    In that case, you can just choose an LCD with a VGA connector!

    LCD monitors generally support both types of connectors!

    Only a very small number support just one or the other (and those are rare), so you can go and buy an LCD monitor with confidence!

    If your graphics card has a DVI connector, you should use DVI. If not, choose VGA.
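
    A short way to state that "prefer DVI, otherwise VGA" advice in code (just a sketch; the port names are plain labels invented for illustration):

    ```python
    # Sketch: pick the best interface shared by the card and the monitor,
    # preferring digital DVI over analog VGA as suggested above.
    PREFERENCE = ["DVI", "VGA"]

    def pick_interface(card_ports, monitor_ports):
        for port in PREFERENCE:
            if port in card_ports and port in monitor_ports:
                return port
        return None  # nothing in common - see the adapter note in the next answer

    print(pick_interface({"DVI", "VGA"}, {"DVI", "VGA"}))  # DVI
    print(pick_interface({"VGA"}, {"DVI", "VGA"}))         # VGA
    ```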

  4. Anonymous users, 2024-02-04

    This can be connected. Graphics cards generally have three interfaces: DVI, VGA, and HDMI; you just need to match one of them on each end.

    Of course, if your card only has DVI, you can buy a DVI-to-DP adapter cable and connect it that way.

    Just use an adapter cable and you're good to go.
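
    As a rough sketch of that "match a port, otherwise buy an adapter cable" logic (the port names and adapter suggestion here are illustrative only, not a product recommendation):

    ```python
    # Sketch: if the card and the monitor share a port, use a direct cable;
    # otherwise suggest an adapter cable between one port on each side.
    def plan_connection(card_ports, monitor_ports):
        common = sorted(set(card_ports) & set(monitor_ports))
        if common:
            return f"direct cable: {common[0]}"
        if card_ports and monitor_ports:
            return f"adapter cable: {sorted(card_ports)[0]} to {sorted(monitor_ports)[0]}"
        return "no usable ports"

    print(plan_connection({"DVI", "HDMI"}, {"HDMI"}))  # direct cable: HDMI
    print(plan_connection({"DVI"}, {"DP"}))            # adapter cable: DVI to DP
    ```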

  5. Anonymous users, 2024-02-03

    Old graphics cards generally use the DVI-I interface, which supports conversion to VGA output.

    New graphics cards (NVIDIA 10-series cards and above, e.g. GTX 1050; AMD RX 400 series cards and above, e.g. RX 460) use DVI-D and do not support conversion to VGA.

    Apart from this difference in VGA conversion, there is no difference in everyday use.

    In a discrete graphics card's specifications, you can tell whether the DVI interface is DVI-I or DVI-D from the pin count: 24+5 is DVI-I, 24+1 is DVI-D.
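
    That 24+5 / 24+1 rule is easy to put into a minimal sketch (the pin-layout strings are what you would read off a spec sheet; nothing here is card-specific):

    ```python
    # Sketch: infer the DVI variant from the pin layout listed in a card's
    # specifications, and whether a passive DVI-to-VGA adapter can work.
    def classify_dvi(pin_layout: str) -> str:
        if pin_layout == "24+5":
            return "DVI-I: also carries analog, DVI-to-VGA adapter works"
        if pin_layout == "24+1":
            return "DVI-D: digital only, no passive VGA adapter"
        # single-link layouts (18+1 digital, 18+5 integrated) also exist
        return "unknown layout - check the manufacturer's spec page"

    print(classify_dvi("24+5"))  # DVI-I ...
    print(classify_dvi("24+1"))  # DVI-D ...
    ```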


Related questions
4 answers, 2024-04-05

DVI is better than VGA and technically more advanced.

6 answers, 2024-04-05

Blue is the VGA interface and white is the DVI interface.

17 answers, 2024-04-05

Quite simply, DVI can transmit both analog and digital signals, while VGA can only transmit analog signals.

2 answers, 2024-04-05

DVI is subdivided into DVI-D and DVI-I. DVI-D is a pure digital interface, the best choice for a large-screen LCD. DVI-I supports digital output and is backward compatible with analog output, and the common DVI-to-D-Sub adapter must be used on DVI-I to achieve the conversion.

19 answers, 2024-04-05

Choose the graphics card first. Most graphics cards use a dual-interface design for connecting to the display: a 24-pin digital signal interface (DVI) and a 15-pin analog signal interface (VGA). Many also have an HDMI output. When connecting to your current display, you can buy an adapter as needed.