HDMI and RGB mods for different consoles

Discussion in 'Modding and Hacking - Consoles and Electronics' started by OzOnE, Mar 17, 2015.

  1. MonkeyBoyJoey

    MonkeyBoyJoey 70's Robot Anime GEPPY-X (PS1) Fanatic

    Joined:
    Mar 1, 2015
    Messages:
    1,738
    Likes Received:
    312
Hey OzOnE, does your HDMI board squish the Dreamcast's resolution to 640x480 like most VGA monitors, or does it output it correctly? I've read online that the DC's resolution is closer to 720x480, but most monitors squish it to 640x480. I also read the maximum resolution it supports is 800x660.
     
  2. OzOnE

    OzOnE Site Supporter 2013

    Joined:
    Nov 10, 2011
    Messages:
    538
    Likes Received:
    173
    I'm still looking into that atm. I'm not convinced that the DC does use 720x480 timings in VGA mode.

It seems to use standard 640x480 timings, and it's all covered in the System Architecture manual (pages 135, 136, and 341-343)...

    https://www.dropbox.com/s/flbxsosi7vb6dov/DCDBSysArc990907E.pdf

    Due to the JTAG port on this proto board being zapped, and not being able to easily count the pixels / lines with my 'scope, I can't yet confirm the exact timings and resolution.

    It definitely outputs a 640x480 frame in VGA mode on most games, I just need to check the exact timings.

    From what I've seen, the image appears to be the perfect aspect ratio on my captures.
    The AverMedia software also shows it as 640x480.

    I can change the output mode and DE (Data Enable) generation to 720x480 as well, and that might cause squishing on some TVs / monitors.
    These are all things I need to test in the coming weeks. For now though, if the TV / monitor supports standard 640x480 VGA via HDMI, it should look fine.

    I'll also be adding horizontal expansion / squishing soon anyway, so any aspect ratio issues can be set as a preset.


    Yep - the DC does support up to 800x600 IIRC.
    I tried it once under KallistiOS, and it appeared to work fine.

    That may have been with slightly non-standard timings though, I'm not sure.


    Good news on the HDMI board - the final main board design is now finished. :D

    [RDC] will be checking things over then sending off for the PCBs + parts in the next few days.

    These will still be prototypes though, as we need to be double-sure they are working fine before paying out for a larger batch.

It will have the option of either Mini HDMI on the board itself, or either type of HDMI socket on the small daughter-board (Mini HDMI, or standard with / without a chassis-mount flange).
ie. the main board now has an FPC ribbon connector on it, so people can use that for the HDMI socket if it suits the specific console better.

    OzOnE

P.S. Does anyone know of any methods or software to fully patch a non-VGA game?
    I've tried setting the usual VGA flags in IP.BIN on Last Blade 2, but the game code itself still writes to the vid registers and forces "TV" mode?
     
  3. -=FamilyGuy=-

    -=FamilyGuy=- Site Supporter 2049

    Joined:
    Mar 3, 2007
    Messages:
    3,034
    Likes Received:
    891
    There's no generic method for that. Some games simply don't have the code bits required. Japanese-Cake is currently fixing some of those games manually though, see: http://japanese-cake.livejournal.com/.

    Good job!
     
    Last edited: Apr 9, 2015
  4. MonkeyBoyJoey

    MonkeyBoyJoey 70's Robot Anime GEPPY-X (PS1) Fanatic

    Joined:
    Mar 1, 2015
    Messages:
    1,738
    Likes Received:
    312
Ah ok. I tried my Dreamcast on a couple of HDTVs and a couple of PC monitors (one CRT and one 4:3 LCD) and the results were different. It looked great on the CRT and ok on the LCD monitor. The HDTVs, on the other hand, displayed the DC differently: the image was slightly off-center and a little stretched horizontally.

    I'm starting to think that the DC's normal resolution is probably something in between 640x480 and 720x480 but many monitors/TVs see it as 640x480 and try to display it as such. It could be a nonstandard resolution that confuses many monitors/TVs. I think the article I read said the true resolution is something like 720x480 in a 640x480 frame. I'll have to dig it up and see if I remembered it correctly.

    I don't have any Mini HDMI to standard HDMI cables so if you go that route, I'll need to get one (hopefully a 2.0 cable for dat 4K stuff I can't use).

If you need someone to test it on an NTSC-U/C console, I would be more than happy to volunteer as tribute.

    Do I throw my money at the screen now or later?
     
    Last edited: Apr 9, 2015
  5. bagheera

    bagheera Rising Member

    Joined:
    Aug 1, 2014
    Messages:
    65
    Likes Received:
    3
    If it is 720x480 timing then I guess some TVs/monitors do not like that over the VGA port. I am pretty sure the Dreamcast video output is kind of like the GC digital output. 480p timing, but only about 640 active video pixels horizontally.
     
    Last edited: Apr 9, 2015
  6. Unseen

    Unseen Spirited Member

    Joined:
    Sep 1, 2014
    Messages:
    126
    Likes Received:
    17
    The situation is a bit more complicated on the GC - the graphics chip has a horizontal scaling feature that can result in more than 640 active pixels on a line. One game that uses it is Mario Kart Double Dash with an active area of 666x448 pixels.

    And of course there are games that seem to have gotten it wrong: The title screen of Phantasy Star Online (running at 640x480i) has a large circle in the background that only looks like an actual circle when the image is displayed with square pixels, i.e. 640 instead of 720 horizontally.
     
  7. wombat

    wombat SEGA!

    Joined:
    Mar 14, 2004
    Messages:
    2,671
    Likes Received:
    319
Great development this is, looking forward to the final product. I'm definitely going to upgrade my Dreamcast with one of these boards once they become available! HDMI + GDEMU = next level :) ... Now if only someone could work out a wireless connection for the controllers and we'd have a Dreamcast 2.0 in our hands.

    Will it also be possible to add a scan line function to the HDMI board?

    Edit: for the 640x480 vs 720x480 debate, for reference see: http://junkerhq.net/xrgb/index.php/Dreamcast
     
    Last edited: Apr 9, 2015
  8. MonkeyBoyJoey

    MonkeyBoyJoey 70's Robot Anime GEPPY-X (PS1) Fanatic

    Joined:
    Mar 1, 2015
    Messages:
    1,738
    Likes Received:
    312
    I would buy a wireless Dreamcast controller if it was good.

    Scanlines would be a really nice touch too.

Thanks for the link. I wonder why Sega didn't just set it to output at 720x480p like normal. That must have been what was going on with my HDTVs. Hopefully OzOnE can fix this so it is true 720x480p instead of a 640x480 image inside a 720x480p frame.
     
    Last edited: Apr 9, 2015
  9. Lum

    Lum Officer at Arms

    Joined:
    Sep 30, 2010
    Messages:
    3,233
    Likes Received:
    42
    Don't most/all consoles flag themselves as supposedly 720? Anyone whose XRGB mini isn't broken knows what I'm talking about.
     
    Last edited: Apr 9, 2015
  10. bagheera

    bagheera Rising Member

    Joined:
    Aug 1, 2014
    Messages:
    65
    Likes Received:
    3
For HDMI there is not really a problem. It will be recognized as a 720x480p signal. HDMI carries a clock, so the sink knows exactly how many pixels are in a single line.

I think what most VGA monitors do is try to squeeze about 720 pixels into a 640-pixel space, because they think they are getting a 640x480 signal. 640x480 and 720x480 have the same number of lines in a frame; 720x480 just has a higher pixel clock. I guess this also means some pixels are dropped in the sampling process, although I don't really know too much about video reconstruction from VGA, so I could be entirely wrong. I wonder if there are any monitors that sample the VGA line correctly at 27 MHz (which is the pixel clock of 720x480).
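This point can be checked numerically. Using the standard published timing totals for each mode (a sketch based on the VESA and CEA-861 figures, not measurements from a DC), both modes land on essentially the same line and field rates, which is exactly why a monitor watching only the sync signals can't tell them apart:

```python
# Standard timing totals: both modes have 525 total lines, so their
# Hsync and Vsync rates come out nearly identical despite the
# different pixel clocks and active widths.

modes = {
    # name: (pixel_clock_hz, total_pixels_per_line, total_lines)
    "640x480@60 (VGA)":   (25_175_000, 800, 525),
    "720x480p (CEA-861)": (27_000_000, 858, 525),
}

for name, (clk, h_total, v_total) in modes.items():
    h_rate = clk / h_total      # line (Hsync) rate in Hz
    v_rate = h_rate / v_total   # field/frame (Vsync) rate in Hz
    print(f"{name}: Hsync = {h_rate/1e3:.3f} kHz, Vsync = {v_rate:.2f} Hz")
```

Both lines print Hsync ≈ 31.469 kHz and Vsync ≈ 59.94 Hz, matching the scope readings quoted later in the thread.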
     
    Last edited: Apr 9, 2015
  11. MonkeyBoyJoey

    MonkeyBoyJoey 70's Robot Anime GEPPY-X (PS1) Fanatic

    Joined:
    Mar 1, 2015
    Messages:
    1,738
    Likes Received:
    312
    Ok, that's good to hear the resolution will be accurate via HDMI.

I think one of my older HDTVs displays it as 720x480 via VGA. I can't check the resolution on that TV, so I'm not sure. It looks wider on it than on the VGA LCD PC monitor I normally use with the DC and my old Windows 98 rig from my childhood. My gaming HDTV doesn't have a VGA port, and it's the only one that shows what the resolution is.
     
  12. OzOnE

    OzOnE Site Supporter 2013

    Joined:
    Nov 10, 2011
    Messages:
    538
    Likes Received:
    173
    Thanks for the link, wombat.

    That confirms it then - it's actually 720x480p timings.

But I'm actually outputting the VIC (Video Identification Code) as 640x480 atm, so the capture card sees it as that anyway, and the aspect ratio is correct.

    I can easily switch to a 720x480p VIC, and I think that worked fine as well.
    I will check it again now to see if the aspect is still correct.

    It's definitely 640 pixels across a line though (actually, more like 642).
    I have the blanking almost perfect now. Just one or two extra pixels hidden on the right-hand side.

    Ahh, here we go. This is now in 720x480p mode on the HDMI chip...

    http://postimg.org/image/qti5mi2j7/
    http://postimg.org/image/od5r7owof/

    So, you can see it does have the extra "border" area on either side on the start-up logo, and then is only using 640 pixels in-game, but with a black border.

    That's great though, because 480p should now work on 99.9% of modern TVs. :)

The aspect ratio still looks fine on the Avermedia, since it handles that correctly (surprisingly, as it does have its other faults).

    I'll be doing the captures in 480p from now on. It simplifies a lot of stuff.

    On my current vids, the DE (Data Enable) generation was simply cutting off the borders and only showing the 640 pixels in the middle.
    It's nice to keep the border colour there now, and the timings should be correct.

I honestly thought it was plain 640x480 for a while there, because I measured the Vsync and Hsync with the 'scope the other day and they looked spot-on at 60 Hz and 31.468 kHz.
    (I realize you can have non-square pixels though, and have borders extending further into the blanking areas etc.)

The pixel clock for standard 640x480 is normally 25.175 MHz, and for 480p is normally 27.027 MHz (or exactly 27 MHz in the DC, which makes perfect sense).


    OzOnE.
     
  13. crans

    crans Rising Member

    Joined:
    Aug 16, 2014
    Messages:
    68
    Likes Received:
    1
Keep up the good work, I'm so ready to get this in all the systems it will work on!
     
  14. bagheera

    bagheera Rising Member

    Joined:
    Aug 1, 2014
    Messages:
    65
    Likes Received:
    3
    You did not have the borders before when using the 640x480 VIC? That is pretty interesting. I guess some receivers are still able to handle it as 640x480 even though the video timings are different. Perhaps some receivers will ignore the VIC and just handle it as 720x480 and perhaps some will not work. Not sure what the DE signal will do to the HDMI output.

    I think your new settings make the most sense.


My first thought here is that the Hsync and Vsync rates are the same as 640x480. I believe 720x480p has exactly the same number of lines in a picture, so it should have a similar line rate. I guess this is also what makes it hard for monitors, as they are not able to tell the difference between 480p and 640x480 based on Hsync and Vsync alone. And on top of that, I guess most monitors are not made for 480p.
     
    Last edited: Apr 10, 2015
  15. Calpis

    Calpis Champion of the Forum

    Joined:
    Mar 13, 2004
    Messages:
    5,906
    Likes Received:
    21
    I'm wrong, it's very much Altera. And he's just using 3.3V LVCMOS... (And I guess is relying on the default drive strength to limit current.)

    Right.

    Displays can't truly know the horizontal resolution over an analog connection, but they can guess based on standardized frame timing, sync polarity, or by attempting to determine pixels and measure their period.

    Doesn't matter. The DC is certainly SMPTE 480p (like the rest of the generation), with a 27 MHz pixel clock, it's evident from the oscillator.

    DC games probably don't acknowledge the 10:11 pixel aspect because: slight performance hit?, overscan, typical square-pixel tools and assets, square-pixel texture aliasing?

640x480 isn't a "real" consumer video format. Since it was first used in the computer-graphics wild west, there are many implementations with many timings. By the late 90s practically every card had simple frequency synthesizers that could only approximate IBM's common VGA timing, which itself approximates BT.601 480i timing. (The situation today is practically the same, except it's entirely based off 27 MHz seed clocks rather than 14.318 MHz.)

SMPTE timing is the standard 480p today. If you want square pixels, you should be using a pixel clock of 10/11 × 27 ≈ 24.545 MHz
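That square-pixel figure follows directly from the 10:11 pixel aspect ratio mentioned above. A quick back-of-the-envelope check (plain arithmetic, nothing DC-specific):

```python
# BT.601 480-line video at 27 MHz has a 10:11 pixel aspect ratio,
# i.e. each pixel is 10/11 as wide as a square pixel. Scaling the
# clock by the same factor spreads fewer, square pixels over the
# same active line time.
bt601_clock = 27_000_000
par = 10 / 11                               # pixel aspect ratio
square_pixel_clock = bt601_clock * par
print(f"{square_pixel_clock/1e6:.3f} MHz")  # ≈ 24.545 MHz
```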
     
    Last edited: Apr 10, 2015
  16. OzOnE

    OzOnE Site Supporter 2013

    Joined:
    Nov 10, 2011
    Messages:
    538
    Likes Received:
    173
    @bagheera - I'm using the DE Generation feature on the HDMI chip atm, as it's a necessary signal and a quick way of getting things running.

    The DE (Data Enable) signal simply tells the sink (TV / monitor) when there are active pixels in a signal.
    So, for 640x480, the DE signal is only enabled for 640 pixels in each line.

    The DC actually has some border pixels either side of the central 640 pixels, so really I was just cropping those before.
    The border can only be set to a fixed colour though by the looks of it (like the Mega Drive etc.)

    DE Generation does work fine too, and allows adjustment of the Hsync / Vsync delays, number of active pixels / lines etc.
    But of course, if the output from the console changes, you have to re-program those registers to match the input signal.

    The DE thing is a pain tbh.
    If it wasn't for that, you could just rely on the HDMI chip to detect the VIC in most cases, then simply input Hsync / Vsync / Pixel Data as normal.

    The HDMI chip can apparently detect a lot of the standard CEA-861 timings and set the VIC appropriately, but the datasheet says that it would likely have trouble properly detecting 240p / 288p signals.

    I also have to use pixel duplication for 480p, so the actual number of "active" pixels for DE Gen is 1440 rather than 720.

    As Calpis said - almost all SD video these days is using a 27MHz master clock (or a fraction of), and that's what the DC uses (for video) too.


    I could probably give more complete replies than this, but I've been busy talking to [RDC] about the new PCB layout, and answering tons of questions on the YouTube vids. lol

    OzOnE
     
  17. Lum

    Lum Officer at Arms

    Joined:
    Sep 30, 2010
    Messages:
    3,233
    Likes Received:
    42
    While we're talking about all these numbers... Is there any video diagnostic device which tests the incoming signal, then provides the user such information?
    Consumer TVs obviously always hide everything from curious eyes.

Especially cool if it identifies things like the color subcarrier frequency of NTSC/PAL/SECAM.
     
  18. bagheera

    bagheera Rising Member

    Joined:
    Aug 1, 2014
    Messages:
    65
    Likes Received:
    3

I don't think you need that? Your pixel clock is above 25 MHz. I have no trouble getting 640x480 to display with my ADV7513 using a 25 MHz clock. I just supply the DE signal from my FPGA - a lot simpler than using the DE generation of the ADV7513, imo. Although in your case you would need to implement an hcount and vcount to determine what to do with DE, so you might as well use the DE generator.
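The hcount/vcount approach can be sketched as a software model. This is Python standing in for what would be two free-running counters in the FPGA; the blanking numbers are the CEA-861 720x480p totals, not something measured from a console:

```python
# Software model of DE generation for 720x480p.
# CEA-861 totals: 858x525, of which 720x480 is active video.

H_TOTAL, V_TOTAL = 858, 525     # total pixels per line / lines per frame
H_ACTIVE, V_ACTIVE = 720, 480   # active video region
H_START, V_START = 138, 45      # blanking (sync + porches) before active

def de(hcount: int, vcount: int) -> bool:
    """Data Enable: high only during the active 720x480 window."""
    return (H_START <= hcount < H_START + H_ACTIVE and
            V_START <= vcount < V_START + V_ACTIVE)

# Walk one full frame and count how many samples DE is high for.
active = sum(de(h, v) for v in range(V_TOTAL) for h in range(H_TOTAL))
print(active, 720 * 480)  # prints "345600 345600": DE covers exactly the active area
```

In hardware the same comparisons would be registered off the pixel clock; the point is just that DE is a pure function of the two counters.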

    Edit: Just did some probing on the PS1 DAC with my scope. The DAC chip is an
    H7240AKV. My PS1 is PAL and it was in the system menu while doing the tests. On my CRT I can clearly see that the video signal is interlaced. I measured the video clock and my scope says it is 13.33 MHz, but I do not think my scope's frequency measurements are very reliable; my guess is that the clock is supposed to be about 13.5 MHz. I think this means it is very likely that the video output timing conforms to 576i (for NTSC it would be 480i). It is important that the timings conform, because otherwise I think some TVs will not accept the signal over HDMI.

Also probed a bit on the color lines. It seems that in the system menu, red, green, and blue each use only 5 bits. It would be interesting to see how many bits are used in games, because it would be nice if we had to solder fewer wires for the colors. Soldering to the DAC seems doable. There are also vias to a lot of the pins, so I guess those could be used too.

The DAC has a sync input. I think it is a combined (composite) sync signal (I do not know too much about how composite sync works). Not sure how to split the sync signals with an FPGA.

    I think there should be space above the shielding where the disc drive is placed for an HDMI board, just like with the Dreamcast.

    One thing that sucks: The PS1 seems to switch video modes. System menu is interlaced. Playstation logo when starting a game is interlaced too, then the game goes to progressive (tried THPS2 :D).
     
    Last edited: Apr 10, 2015
  19. MonkeyBoyJoey

    MonkeyBoyJoey 70's Robot Anime GEPPY-X (PS1) Fanatic

    Joined:
    Mar 1, 2015
    Messages:
    1,738
    Likes Received:
    312
    I would love to have one of these if they exist. A device like that would be really helpful.

    By the way, I don't know if I mentioned this in the thread where we were talking about the Genesis's true video format, but the reason the 240p component test failed with my Genesis was because it was still in 50Hz mode. Swapped to 60Hz and it worked fine.
     
  20. OzOnE

    OzOnE Site Supporter 2013

    Joined:
    Nov 10, 2011
    Messages:
    538
    Likes Received:
    173
    Yep - they most certainly do have analysers for all of this stuff, including HDMI...

    https://www.youtube.com/watch?v=MoFeCj1hF_8

    @Bagheera - Ooops, I'm probably getting mixed up with the other setting for using a 2x clock.

    I was using pixel dupe for the 480i / 576i tests on the GC before though (although my Samsung LCD didn't like it).

    Yep - DE Gen is a bit of a pain if you don't already have the signal on the DAC / console.
    The ADV handles it very well, so might as well use that.

    As I say, the main drawback is having to detect the timings though, so I'll have to update the VIC if the video mode changes.

    The only way of ensuring the weird input timings will work via HDMI is to use an SDRAM framebuffer really, but that can start to get complex.
    Rather than doing that, I'm looking to finish the Uber-scaler at some point (based on the Tvia TrueView chip), so you can handle the weird formats externally (and do a lot of extra clever stuff).


After doing some tests the other day to see how the GBS-8220 scaler handles the DC's RGB video, I found that most games (or the BIOS) actually disable Hsync and Vsync in "TV" mode.
    Only composite sync is then output from the AV port (pin 10?)...

    So, I wrote a bit of code to split H/Vsync from Comp sync yesterday. I haven't tried it out yet though.
    Once that's working, I'll just need to hook up another wire to the Comp sync pin on the DC.

    I'll then hook up two more wires to the "Mode" pins on the DC so I can force it to VGA mode for most games whenever the HDMI cable is plugged in.


Splitting a pure comp sync signal into H/V is actually simpler than I first feared. I'm doing it a bit like Miguel does here...
    http://www.eevblog.com/forum/beginners/converting-15khz-analog-rgb-to-digital/msg466315/#msg466315

    On a neg edge, I save the current "pos_count", then start incrementing a "neg_count" counter.
    On a pos edge, I save the current "neg_count", then start incrementing a "pos_count" counter. ;)

    Then, it should be just a case of checking for whether "last_neg" > "last_pos" to see if we're seeing the longer Vsync pulse or not.
    In fact, you could probably just use the same counter for pos / neg, as long as you save the current length before starting from 0 again.

    To re-create the Hsync pulses, it will be a case of seeing which count is shorter, then using that value as the delta.

    (there may be a tiny bit of a delay in the final Hsync / Vsync pulses, but that's not really a problem.)
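The pulse-length trick above can be modelled in software. This is a simplified sketch (Python standing in for the FPGA counters, with made-up sample counts, and ignoring the serration/equalization pulses a real composite sync carries around Vsync): measure how long each low pulse lasts, and classify anything much longer than an Hsync pulse as Vsync.

```python
# Toy model of splitting Hsync/Vsync out of a (negative-going)
# composite sync, by measuring the length of each low pulse.
# In the FPGA this would be a counter sampled at the pixel clock;
# here the signal is just a list of 0/1 samples.

HSYNC_LEN = 10    # samples per normal (short) Hsync pulse
VSYNC_LEN = 200   # samples per long Vsync-period pulse
LINE_LEN = 300    # samples per line

def make_csync(lines=5, vsync_lines=1):
    """Build a fake csync: a long low pulse on the first line(s), short ones after."""
    sig = []
    for i in range(lines):
        low = VSYNC_LEN if i < vsync_lines else HSYNC_LEN
        sig += [0] * low + [1] * (LINE_LEN - low)
    return sig

def split(csync, threshold=50):
    """Return (hsync_edges, vsync_edges): sample indices where each low
    pulse ended, classified by how long the signal stayed low."""
    h, v = [], []
    neg_count = 0   # length of the current low pulse
    prev = 1
    for i, s in enumerate(csync):
        if s == 0:
            neg_count += 1
        elif prev == 0:   # rising edge: a low pulse just ended
            (v if neg_count > threshold else h).append(i)
            neg_count = 0
        prev = s
    return h, v

h, v = split(make_csync())
print(len(h), len(v))   # prints "4 1": four short Hsync pulses, one long Vsync
```

As noted above, the reconstructed pulses trail the real ones by the measurement delay, which is usually harmless.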

    OzOnE.
     