The DVI input on CRT 1080i HD Sony XBR800

Discussion in 'Off Topic Discussion' started by Fandangos, May 2, 2013.

  1. Fandangos

    Fandangos Site Supporter 2013

    Friends, I got this beast for free today:
    http://i.imgur.com/QlhG2g8.jpg

    It's a Sony KV-40XBR800, a 40-inch CRT display that has a DVI input.

    According to a FAQ on Sony's site, it's referred to as DVI-D. There's no mention of it in the manual.

    "[TABLE="width: 100%"]
    [TR]
    [TD="class: document_title, width: 100%, bgcolor: #FFFFFF, align: center"]Error: No signal detected appears when connecting a computer using a DVI-to-VGA adaptor.[/TD]
    [/TR]
    [TR]
    [TD][TABLE]
    [TR]
    [TD="class: tabbar"]Solution
    [​IMG][/TD]
    [TD="class: tabbar, width: 100%"][​IMG][/TD]
    [/TR]
    [/TABLE]
    [/TD]
    [/TR]
    [TR]
    [TD="class: body"]This error will occur because the video card does not support an analog signal for VGA. Most DVI-I outputs on a computer will require drivers for the display device being used. To resolve this issue, try installing the latest video card driver provided by the video card manufacturer or computer manufacturer.
    NOTES:

    • A DVI-D port [FIG.1] does not support VGA signal.
    • If available, try using a DVI-to-HDMI cable adapter instead of a DVI-to-HDMI cable."
    [/TD]
    [/TR]
    [/TABLE]

    It first says the fault is with the video card, but then it says the port isn't VGA capable at all.

    I'm wondering if this DVI input, wired to the right pins, would accept the R, G, B and sync lines from all those retro consoles at 15 kHz.
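
    For what it's worth, even setting the analogue-versus-digital question aside, the timings are far apart. A back-of-the-envelope sketch in Python, using ballpark NTSC numbers and assuming the usual ~25 MHz single-link DVI pixel-clock floor:

    ```python
    # Rough arithmetic: why a raw 15 kHz console signal can't ride DVI-D
    # as-is, even once digitised. Numbers are ballpark NTSC-era values,
    # not any one console's datasheet.

    H_FREQ = 15_734              # horizontal scan rate in Hz (~15.734 kHz)
    TOTAL_PIXELS_PER_LINE = 341  # typical total (active + blanking) per line

    pixel_clock = H_FREQ * TOTAL_PIXELS_PER_LINE  # ~5.4 MHz
    DVI_MIN_CLOCK = 25_175_000                    # assumed single-link DVI floor

    print(f"240p pixel clock : {pixel_clock / 1e6:.2f} MHz")
    print(f"DVI minimum      : {DVI_MIN_CLOCK / 1e6:.2f} MHz")
    print(f"pixel repetition : x{-(-DVI_MIN_CLOCK // pixel_clock)}")  # ceiling
    ```

    So a converter would have to repeat pixels or double lines just to reach DVI's floor; there's no "straight wire" mapping.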

    If not (and that's my first assumption), I would need to go RGB SCART to HDMI/DVI.
    I've been using RGB-to-component for some time now with my previous SD CRT TV, but I'm wondering whether taking analogue RGB to digital RGB, with no colour-space conversion along the way, wouldn't do a better job.

    The thing with HDMI is that it follows its own specifications: any converter box would try to stretch or upscale the signal, processing it, and that is not the aim here.

    So what do you guys think? What route should improve the final results?
     
    Last edited: May 2, 2013
  2. mooseblaster

    mooseblaster Bleep. Site Supporter 2012, 2014

    There are 3 kinds of DVI:

    * DVI-A is analogue-only (very uncommon)
    * DVI-D is digital-only (common)
    * DVI-I is both analogue and digital.

    As your TV only accepts DVI-D, this means you cannot hook up an analogue (i.e. VGA) signal without converting it using an analogue-to-digital converter.

    Technically, there should be no issues with stretching or upscaling if done properly, as IIRC some boxes allow you to output a converted signal in 240p and 480i at 60Hz without upscaling. However, you may find that your TV does an awful job at upscaling, so you may wish to go for a decent converter and upscaler (such as one of the XRGB line of upscalers).
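
    To make the line-doubling idea concrete, here's a toy sketch of what such a box does to a 240p frame (illustrative, not any particular device's implementation): each scanline is simply emitted twice, giving 480p with no scaling filter touching the pixels.

    ```python
    # Toy model of a line doubler: 240 scanlines in, 480 out, each source
    # line drawn twice. The image content is untouched; only the line
    # count changes. Real devices do this per-line in hardware.

    def line_double(frame_240p):
        """Duplicate every scanline of a frame (a list of rows of pixels)."""
        doubled = []
        for scanline in frame_240p:
            doubled.append(scanline)
            doubled.append(scanline)  # same pixels, drawn twice
        return doubled

    # toy 240p frame: 240 scanlines of 320 grey pixels each
    frame = [[(y, y, y)] * 320 for y in range(240)]
    out = line_double(frame)
    assert len(out) == 480 and out[0] is out[1]  # each line appears twice
    ```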
     
  3. Fandangos

    Fandangos Site Supporter 2013

    Yeah, I'm considering the xrgb mini but I'm also thinking about the analogue to digital process.

    I read a lot back when I first considered RGB-to-component, and it seems there's only a small equation between the two; RGB and Y'CbCr are almost the same (see the sketch below).
    So I'm going from analogue to analogue here, which would give a 100% fidelity image.
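
    For reference, that "small equation" is the standard-definition BT.601 transform; a quick sketch, assuming R, G, B normalised to 0..1:

    ```python
    # RGB -> Y'PbPr with BT.601 coefficients: the entire colour-space
    # conversion an RGB-to-component transcoder performs.

    def rgb_to_ypbpr(r, g, b):
        y  = 0.299 * r + 0.587 * g + 0.114 * b  # luma
        pb = 0.564 * (b - y)                    # blue colour difference
        pr = 0.713 * (r - y)                    # red colour difference
        return y, pb, pr

    print(rgb_to_ypbpr(1.0, 1.0, 1.0))  # white -> approx. (1.0, 0.0, 0.0)
    print(rgb_to_ypbpr(1.0, 0.0, 0.0))  # red   -> approx. (0.299, -0.169, 0.5)
    ```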

    Going with the XRGB Mini on a CRT display, I'm not sure it would improve on my method.

    Has this comparison been done on a CRT before: RGB vs. upscaled HDMI?
    Unfortunately I don't know anyone here who could lend me the Mini for a few tests.
     
  4. kungmidas

    kungmidas Site Supporter 2013, Site Benefactor

    Probably nothing you don't know, but I'll tell you what I know :)

    If the DVI input on the TV looks like the one in this image: http://www2.crutchfield.com.edgesui.../900/900/products/2002/158/x15840xb800-b.jpeg
    Then it's definitely DVI-D (not I or A) and thus won't accept an analog signal. A DVI cable/adapter carrying an analog signal would literally not fit, so you can't connect it to an analog source of any kind (like VGA or DVI-A), and if you connect it to DVI-I you must make sure the source is sending a digital signal, not an analog one. You should, however, be able to connect it to a DVI-D, HDMI or dual-mode DisplayPort source using simple adapters with no signal loss.

    As for the troubleshooting quote you posted, I reckon it's for connecting something to a VGA connector on the TV; in that case the message and the solution make more sense, I think. Imagine you have a video card with some sort of DVI output and connect it somehow to a TV with a VGA connector. Then that information would make sense and help you :)
     