Direct HDMI output for N64 (and other consoles)...

Discussion in 'Nintendo Game Development' started by OzOnE, Nov 13, 2012.

  1. OzOnE

    OzOnE Site Supporter 2013

    Joined:
    Nov 10, 2011
    Messages:
    538
    Likes Received:
    173
    Hi, all,

    I thought I'd start a new thread for this since it looks like I'll be rambling on about it for some time...

    OK, so at the bottom of the following post, I finally had success with generating HDMI directly from an FPGA...
    http://www.assemblergames.com/forum...B-amp-mod-help&p=627612&viewfull=1#post627612

    I already have the decoder block for extracting R/G/B/Sync from the digital video pins on the N64, so now I'm trying to see if I can get the N64 to work via the FPGA / HDMI interface...

    I'm hoping it will be possible to output standard 480i / 576i (interlaced) directly via HDMI, but I can't seem to figure out what control bits or sync pulses the TV is expecting?
    We know progressive VGA modes are working fine (well 640x480 atm), so if I can't get interlaced working directly, I'll need to convert N64's interlaced output to progressive.

    Marshall has done a great job with his VGA adapter. I'll have to ask him if he wants to help out with the scan doubler code...
    http://forums.benheck.com/viewtopic.php?f=5&t=43034

    I don't have any photos or video yet, as the HDMI code is only outputting a VGA test pattern atm, so it's not very interesting.

    When I try the N64 through the FPGA, my TV just says "Mode not supported", or no signal.
    Obviously this is due to the incorrect sync timing, so I'm after some help with the correct timings...

    The core protocol for DVI_D / HDMI is essentially the same, but HDMI has added support for lossy / lossless multichannel audio, CEC, Ethernet and a load of other junk we don't need right now.
    Here is the DVI spec...
    http://www.cs.unc.edu/Research/stc/FAQs/Video/dvi_spec-V1_0.pdf

    The encoding is very convoluted, but most of that has been taken care of. The problem is, I don't know how the hell HDMI handles an interlaced source?
    I know it's related to the pixel clock, and from that the DSR "Data Signalling Rate" is calculated?

    OK, so what do we know about the N64 video output?...

    Firstly, the encoding for the digital video bus has already been worked out (thanks to Tim)...
    http://members.optusnet.com.au/eviltim/n64rgb/n64rgb.html

    ...and I know that part of the code is working 'cos I used it a few years ago to generate analog RGB (SCART) directly from an FPGA.

    The next important thing is the actual pixel clock timings...

    On the NTSC N64, it uses a crystal which is four times the NTSC colour subcarrier frequency. ie. 3.579545 * 4 == 14.31818 MHz.
    On most REVs, you can see this crystal on the board, usually marked "X1".

    This is then multiplied by 17 and divided by 5 using the PLL chip "U7". (14.31818 * 17) / 5 == 48.681812 MHz.
    This roughly 50MHz "VCLK" clock is used as the master clock for the video DAC as well as the entire RCP.

    (The RCP internally multiplies VCLK by 1.25 to get 62.5MHz. The CPU internally multiplies the 62.5MHz by 1.5 to get 93.75 MHz).

    Interesting, so the NTSC consoles actually run their RCP at 60.852265 MHz, while the PAL consoles run the RCP at 62.0706625 MHz? hmmm

    The N64 video bus uses a clocking scheme where it outputs one full pixel every four clocks.
    So, the "pixel clock" on the NTSC console is effectively: 48.681812 / 4 == 12.170453 MHz ?
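    Those figures are easy to sanity-check. Here's the whole chain as a quick calculation (the numbers are the ones quoted in this post, not from any official Nintendo doc):

    ```python
    # Quick sanity check of the NTSC N64 clock chain described above.
    SUBCARRIER_HZ = 3.579545e6            # NTSC colour subcarrier

    crystal = 4 * SUBCARRIER_HZ           # "X1" crystal: 14.31818 MHz
    vclk    = crystal * 17 / 5            # PLL "U7" output: master video clock
    pixclk  = vclk / 4                    # one full pixel every four VCLKs
    rcp     = vclk * 1.25                 # actual RCP clock implied above

    print(f"VCLK  = {vclk / 1e6:.6f} MHz")    # 48.681812
    print(f"pixel = {pixclk / 1e6:.6f} MHz")  # 12.170453
    print(f"RCP   = {rcp / 1e6:.6f} MHz")     # 60.852265
    ```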

    I think that pixel clock is a bit too non-standard to be transferred via HDMI directly? (not that it matters for the analog RGB encoding)
    The best we could do is halve VCLK to give a doubled pixel clock of 24.340906 MHz, but it's the interlacing issue which is really screwing things up.

    Right, I think I'll have to do a scan doubler to get this to work easily. :stupid:
    In the mean time, I have a few tricks to try.

    All for now.
    OzOnE.

     
  2. Lum

    Lum Officer at Arms

    Joined:
    Sep 30, 2010
    Messages:
    3,233
    Likes Received:
    42
    Majority of N64 games are 240p. 480i had limited use.
     
  3. OzOnE

    OzOnE Site Supporter 2013

    Joined:
    Nov 10, 2011
    Messages:
    538
    Likes Received:
    173
    Yep, internally most games run at 240p (320x240), but I'm pretty sure it duplicates each pixel then always outputs at 480i.

    Some games, like Indy Jones and Pod Racer used a "hi-res" mode. Supposedly, this would be close to 480p internally? (640x480)

    The analog output is always interlaced AFAIK. It may be possible to force some progressive output modes, I've never seen it done though?

    OzOnE.
     
  4. APE

    APE Site Supporter 2015

    Joined:
    Dec 5, 2005
    Messages:
    6,416
    Likes Received:
    138
    I've always assumed that the games with a hi-res mode were using 640x480, since they needed the Expansion Pak to do so, implying they needed the extra RAM as a frame buffer.

    Turok 2, Hybrid Heaven and South Park 64 (Turok 2 engine) are games off the top of my head that will do it. A partial list:
    http://www.gamespot.com/forums/topi...hat-games-use-it-and-what-are-the-differences

    Problem is that the frame rate usually dropped to a varying extent. Turok 2 went from what was probably a smooth 30fps to 20-25fps. Hybrid Heaven goes from smooth to "dear lord is this even playable?". Overclocking helps Turok 2 out a bit but not Hybrid Heaven in the least.
     
  5. sanni

    sanni Intrepid Member

    Joined:
    May 30, 2008
    Messages:
    653
    Likes Received:
    77
    I don't think the N64 outputs steadily at 480i, because with the scaler I'm using you can kinda see the N64 switching resolutions. I had the scaler set up wrong and it would only display an image during the intro cutscenes of Perfect Dark, but not the actual game. It also didn't display the menu of the 64Drive (480 resolution), but did display the Neo Myth menu (240 resolution). So at least the scaler noticed a difference between the 240 and 480 output of the N64. Idk, I have no idea how these things work, just something I noticed.
    Everything works now though after I autocalibrated the scaler.

    Still a true digital hdmi output sounds nice. =)
     
  6. keropi

    keropi Familiar Face

    Joined:
    Feb 2, 2011
    Messages:
    1,068
    Likes Received:
    64
    This project is of great interest, I hope OzOnE gets the info/help he needs!
     
  7. Lum

    Lum Officer at Arms

    Joined:
    Sep 30, 2010
    Messages:
    3,233
    Likes Received:
    42
    Hmm. Barring info to the contrary I'd assume N64 in 240 mode would be the same as NES, SNES, PS1, and PS2.
     
  8. splith

    splith Resolute Member

    Joined:
    May 2, 2010
    Messages:
    997
    Likes Received:
    4
    Wow. OzOnE, you are doing an electronic engineering course, right? Because if not, why the hell not! :p Amazing stuff.
     
  9. HEX1GON

    HEX1GON FREEZE! Scumbag

    Joined:
    May 4, 2011
    Messages:
    9,916
    Likes Received:
    837
    Yep, the Expansion Pak did indeed allow some games to display "hi-res". I never really noticed the difference between them other than the frame rate...

    Vigilante 8, Vigilante 8 2nd Offense and Top Gear Overdrive are other games that have the feature that a few might not know about.

    Interesting work OzOnE, I have a question though - would HDMI increase the colour or make colours look a lot richer than what they do through Composite or S-Video?
     
    Last edited: Nov 14, 2012
  10. brendand

    brendand Fiery Member

    Joined:
    Jul 5, 2011
    Messages:
    885
    Likes Received:
    5
    The difference is very noticeable on certain games, like The World Is Not Enough. When you switch between the resolutions, there's a good difference in the texture quality; it's not so grainy or blurred. It's probably the same on most other games that support the pak: it usually makes the draw distance slightly better and the textures better, but sometimes that does make the framerate dip.
     
  11. APE

    APE Site Supporter 2015

    Joined:
    Dec 5, 2005
    Messages:
    6,416
    Likes Received:
    138
    Assuming it looks anything like RGB, the picture would end up sharper with better colors overall. The A/D converter will always take a bit off the top, and keeping everything digital the whole way to your screen will improve things, but it's always a matter of "how much, and will I be able to tell?". Side-by-side comparisons will yield improvements, but as with most things the improvement will not likely be so dramatic as to make you crap your pants.
     
  12. OzOnE

    OzOnE Site Supporter 2013

    Joined:
    Nov 10, 2011
    Messages:
    538
    Likes Received:
    173
    @splith - I did a bit of college many years ago (1998, showing my age. lol), but I was rubbish at the academic side of things and dropped out after about 5 months.

    As many people say, I learnt a LOT more in my own time after college. I could probably do the work now if I can motivate myself, but I'm not sure the qualifications are worth it these days?


    @HEX1GON - I would think HDMI would be a massive improvement over comp / S-Vid. Maybe not so much over a "good" RGB, but there will be none of the ghosting / x-hatch problems to worry about.

    You can't really "add" colours as such, but it should be possible to do the same type of processing like emulators do (TV scanline emulation, Eagle 2X modes etc.).

    Almost all of those effects would require a full framebuffer system using the SDRAM or whatever.

    At the moment, I can only catch a glimpse of what the N64 image looks like via HDMI 'cos I can't figure out the scan doubler.
    What I've done is to feed the HDMI output with the Hsync / Vsync from the VGA pattern gen. The decoded N64 RGB output then gets fed to the HDMI output.
    This of course means that the image rolls around like crazy, but at least I can see something to tell me the image itself is working (64DD boot-up logo).

    The N64 outputs 21-bit colour (7 bits per R/G/B). It was a shame the 'Big N' didn't make it output 24-bit colour, but then most games only use 16-bit colour anyway (5/6/5 bits).

    There may be methods for smoothing the colour somewhat, but it would likely look artificial and cause colour banding etc.
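    As an aside, if you did want to pad those values out to full 8 bits per channel for HDMI, the usual trick is bit replication rather than a plain shift (this is a generic sketch following the 5/6/5 layout mentioned above, not what the N64 hardware actually does):

    ```python
    def rgb565_to_rgb888(p):
        # Expand a 16-bit 5/6/5 pixel to 8 bits per channel by replicating the
        # top bits into the LSBs, so full-scale maps to full-scale (0x1F -> 0xFF).
        r = (p >> 11) & 0x1F
        g = (p >> 5) & 0x3F
        b = p & 0x1F
        return ((r << 3) | (r >> 2),
                (g << 2) | (g >> 4),
                (b << 3) | (b >> 2))

    print(rgb565_to_rgb888(0xFFFF))  # (255, 255, 255)
    print(rgb565_to_rgb888(0x0000))  # (0, 0, 0)
    ```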

    I can't seem to see exactly what type of sync the N64 outputs from Vsync / Hsync.
    It looks like straight sync to me, so none of the "serrated" sync pulses you get on the Csync signal.

    Basically, the N64 digital port outputs not only Csync, but separate Vsync + Hsync as well.
    (It also outputs a "clamp" signal, but that's not very interesting.)

    I need to figure out the timing relationship between the N64 and 480i, then I'll know where I am.
    The ~12MHz pixel clock might be causing a problem too. It's fine for the analog output - it just means that the pixels might not be exactly "square", but it doesn't matter.

    For HDMI, it expects a fairly strict pixel clock and timings so the TV / monitor can figure out which mode you're actually inputting.

    I found some scan doubler code last night, so it will take me a while to plug it all in.

    OzOnE.
     
    Last edited: Nov 14, 2012
  13. OzOnE

    OzOnE Site Supporter 2013

    Joined:
    Nov 10, 2011
    Messages:
    538
    Likes Received:
    173
    OK, bit of an update...
    (btw, a lot of what I type here is as much for my reference as for anyone else's)...

    After much Googling, I finally found that the spec I was after was the CEA-861-D pdf...
    http://blogimg.chinaunix.net/blog/upfile2/090903185737.pdf

    This outlines the actual timing standard of the digital video that you can send via HDMI.
    I've now managed to modify the VGA test pattern block to output different modes as per the spec.

    The maximum pixel clock is a tad limited on the FPGA because the DVI / HDMI block requires a clock that is 5 times the pixel clock (10 times actually, but we're using a DDR buffer to send two bits per clock).
    So, for standard VGA (640x480) with a pixel clock of 25MHz, the HDMI clock is 125MHz.

    I now have it outputting a 480p (720x480) test pattern, which is closer to the NTSC N64 output (although not interlaced).
    The pixel clock in this case is 27MHz, so HDMI clock is 135MHz.

    Other modes like 720p (1280x720) have a pixel clock of 74.25MHz, which gives an HDMI clock of 371.25MHz, and a final data rate of 742.5MHz !!! :nightmare:
    This is WAY too fast to output directly from a cheap FPGA, so don't expect any clever upscaling of the N64 to 720p with this method.
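    The clock budget works out like this (plain arithmetic, using the pixel clocks quoted in this thread):

    ```python
    def tmds_clocks(pixclk_mhz):
        # Serial bit rate per TMDS lane is 10x the pixel clock; with a DDR
        # output buffer the FPGA logic only needs to run at 5x.
        return {"pixel": pixclk_mhz,
                "fpga_ddr": pixclk_mhz * 5,
                "bit_rate": pixclk_mhz * 10}

    for mode, pc in [("640x480p", 25.0), ("720x480p", 27.0), ("1280x720p", 74.25)]:
        print(mode, tmds_clocks(pc))
    ```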

    Right, so I've tried outputting the N64 video at its native 480i via HDMI without much luck.
    The timings for 480i are an exact "digital" representation of a standard analog NTSC signal, but I think there is another problem...

    The DVI / HDMI spec has a minimum frequency for the pixel clock of around 25MHz, so if you input low-res interlaced stuff directly (like NTSC / PAL), the pixel clock would be far too low for the link to operate properly.

    So, what they do is to clock out each pixel TWICE (during each horizontal line) to effectively double the pixel clock.
    The resulting resolution is of course the same, it's just repeating the same pixel twice...

    For NTSC (480i in digital terms), this means the actual format transmitted is 1440x480 (instead of the original 720x480). Makes sense.:rolleyes-new:
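    The repetition itself is trivial. Modelling one line of it:

    ```python
    # 480i pixel repetition: a 720-pixel line goes out as 1440 clocks with every
    # pixel sent twice, doubling the 13.5 MHz native clock to a legal 27 MHz.
    line = list(range(720))                       # stand-in for one active line
    repeated = [p for p in line for _ in (0, 1)]  # 0,0,1,1,2,2,...

    print(len(repeated))   # 1440
    ```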

    Well, I think a true NTSC signal actually has a frame rate of 59.94Hz, which is 60Hz / 1.001 (don't ask. lol).
    Either way, a modern TV should lock on to either frame rate.

    If anyone's interested, the table on pages 24 and 25 of the above pdf shows the pixel clocks etc.
    The detailed timing for 480i @ 59.94Hz is on page 34. Due to the pixel repetition, the res is stated as being "720(1440)x480i".

    So, I'll try changing my VGA test pattern block to see if I can get 480i to work, then I'll try syncing the N64 block to it and see what the differences are.

    In the mean time, I have the VGA block outputting 480p (progressive!) while the colour outputs from the N64 block are going to the HDMI block.
    The result is that I can finally see the image, but it's rolling around the screen due to lack of sync. I think we're getting there though...

    http://www.youtube.com/watch?v=qjP4oRQWC3w&feature=youtu.be

    OzOnE.
     
  14. Lum

    Lum Officer at Arms

    Joined:
    Sep 30, 2010
    Messages:
    3,233
    Likes Received:
    42
    I'm still skeptical. To me it doesn't seem like there would've been enough benefit for N64 to waste processing resources by scaling 240p rendered games to 480i for output (as Dreamcast and Playstation 2 ports often did).
    HDTVs weren't in common use back then.
     
  15. OzOnE

    OzOnE Site Supporter 2013

    Joined:
    Nov 10, 2011
    Messages:
    538
    Likes Received:
    173
    It doesn't do any scaling, it just repeats each pixel / line.
    This effectively gives the same res internally, but the TV "sees" the correct timings for standard 480i (which in analog terms is standard NTSC).

    If you think about it, the hi-res games apparently give 480p (internally), and this is just scanned out as interlaced to give 480i.

    EDIT: btw, when I mention "480p" I'm not implying an HDTV mode in this context, just the res of the N64's internal framebuffer.
    Although those terms are generally used with HDTV, many of these modes have an analog equivalent (basically 480i == NTSC, 576i == PAL).

    There isn't any real image "processing" going on as such, there are just many different methods for outputting both low-res and hi-res as interlaced.
    This is what pretty much all consoles / micros do.

    For example, with interlaced displays, we know that each alternate line gets scanned.
    First are the even lines, then all the odd lines (I think it's opposite for PAL, but you get the idea)...

    So, the internal framebuffer may only hold 240 "lines" of pixels, but each line gets repeated when output like this...

    TV Line
    0 = framebuffer line 0
    1 = framebuffer line 0
    2 = framebuffer line 1
    3 = framebuffer line 1

    The same goes for the pixels per line too - most N64 games only have a 320x240 framebuffer, so they just repeat each pixel to fit the TV timing...

    Pixel
    001122334455667788

    In fact, it doesn't even need to update the output for every pixel, it just holds the value of the previous pixel until another "TV clock" comes along.

    For "480p" hi-res games, it just outputs the pixels directly (although the lines are scanned out in interlaced fashion)...

    012345678
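    In code terms, the low-res scan-out described above is just nested repetition (a toy model, nothing like the actual FPGA implementation):

    ```python
    def scan_doubled(frame):
        # frame: list of rows from a 320x240-style framebuffer.
        # Each pixel is emitted twice per line, and each line is emitted twice.
        out = []
        for row in frame:
            wide = [p for p in row for _ in (0, 1)]   # 0,0,1,1,2,2,...
            out.append(wide)
            out.append(list(wide))                    # same line again
        return out

    fb = [[1, 2, 3],
          [4, 5, 6]]
    print(scan_doubled(fb))   # 4 lines of 6 pixels each
    ```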

    OzOnE.
     
    Last edited: Nov 15, 2012
  16. OzOnE

    OzOnE Site Supporter 2013

    Joined:
    Nov 10, 2011
    Messages:
    538
    Likes Received:
    173
    Hi, all,

    OK - Finally got the framebuffer stuff working, looking quite nice already. :biggrin-new: ...
    http://www.youtube.com/watch?v=RmNN51xRgg8

    I REALLY need to buy a new camcorder and shoot in progressive or something. I know it's a pretty terrible vid, but you should get an idea of the quality. :redface-new:

    I had to drop some of the bits to fit the data into 16-bits for now, so it's only 5-6-5 RGB atm.
    It's outputting at 720x480p, but with no deinterlacing yet. The TV is deinterlacing it quite well though.

    Apart from the above issues, the quality is great!
    Obviously it's a rock-solid image with perfect colours, but it does expose the low-resolution a fair bit.
    Then again, I generally used to play mainly PAL games, so I'll give a PAL game a try very soon (might need to change the code for 576p).

    I've only tried the 64DD and Top Gear Rally on it so far, purely because it was easier just to plug into the 64DD.
    Also, TGR is the only USA cart I seem to have left?! lol

    (The V64 is in bits atm as I'm helping somebody to fault-find his V64.)

    Just after the one-minute mark in the vid, I press a button on the board to do a random sync. This shows the two separate fields of the original image.
    (I call this the "anti-troll" button, but I'm still waiting for the usual "fake!" comment from the YT kids. lol)

    You might have noticed the red border around the image. This is just to make it easier to see the outline of the 480p "frame", so I could line up the N64 image within that frame.
    Most of the games only output an image of around 626 pixels wide, hence the black bars on either side of the frame. This makes sense for the NTSC standard, as most TVs will overscan the image anyway.

    I can always change it to output 640x480p again, but I thought I'd try 720x480p instead.

    It was a pain to get the framebuffer stuff working at first. I think I've figured most of that out now.
    I've added "parameters" to the code to make it easier to change the image offsets / number of lines etc.

    There is a slight problem where it will often show four vertical lines down the screen when you switch the N64 on...
    The lines stay on the screen, so you have to switch on-and-off a couple of times until you hit the correct sync.
    I should be able to sort this out though.

    Ideally, I want to make some small boards for this, but I could always buy in some cheap pre-made boards from China.
    (they're just generic FPGA boards with SDRAM, so you wouldn't be losing any quality as such.)

    This project should apply to consoles like the Dreamcast and Gamecube too.
    I don't think it's worth using DVI / HDMI on really old machines unless it's just for the fun of trying it. :friendly_wink:

    That's all for now.
    Never enough hours in the day for all these projects, even though I'm "between jobs" at the moment. lol

    OzOnE.
    P.S. Sorry to all the Sega dudes, but I've shelved the GD Emu for the time being - I spent WAY too much of my life on it recently.
    I might try hooking up the HDMI board to the Dreamcast for a laugh though. It should be quite nice because the DC outputs native 640x480p anyway (in VGA mode).
     
    Last edited: Nov 25, 2012
  17. keropi

    keropi Familiar Face

    Joined:
    Feb 2, 2011
    Messages:
    1,068
    Likes Received:
    64
    this is GREAT progress OzOnE! awesome vid!!!
    Please consider making it work on other retro-machines too, 8-16bit consoles via hdmi?! WIN WIN!
     
  18. reprep

    reprep Gutsy Member

    Joined:
    Jun 8, 2012
    Messages:
    475
    Likes Received:
    1
    thanks OzOnE, this is awesome.

    about the "It's outputting at 720x480p, but with no deinterlacing yet. The TV is deinterlacing it quite well though." part, what does it mean exactly? you are sending 480i, right? as the tv does the deinterlacing.

    i think the same procedure will be compatible with gamecube too. but we should keep in mind that hdmi out will only increase the quality if it is possible to extract the digital video from the console itself. i don't think this is possible for 8-bit / 16-bit consoles.

    about that 626 pixels wide thing, this is true for ps2 games too, so i found it just natural. overscan takes care of that.

    edit: i rewatched the video and saw that your tv says it is taking 480p picture at 01:42. so the tv isn't doing any deinterlacing then.
     
    Last edited: Nov 25, 2012
  19. OzOnE

    OzOnE Site Supporter 2013

    Joined:
    Nov 10, 2011
    Messages:
    538
    Likes Received:
    173
    Hi,

    Thanks for the comments guys.

    @keropi - It should be possible to use HDMI on older consoles, and might improve the quality depending on the console. But, there's a big problem...
    Many older consoles output analog video directly from their chipset, so there's no simple way to tap into the digital video bus...

    On the Genesis for example, you can see the analog RGB is output directly from the VDP chip (YM-7101) and into the TV encoder / RGB buffer (CXA-1145)...
    http://emu-docs.org/Genesis/mega2.png

    So unless you can find a revision which has the digital video signals exposed, you might as well just use the normal RGB input on your TV (or an RGB-to-HDMI converter).

    The only way to do it is to emulate the entire VDP on the FPGA, then redirect the digital output via HDMI.
    You can actually do this for the Genesis because the whole console can be "emulated" in an FPGA anyway (the VDP core is available as a separate file, and I believe compatibility is very good).
    More and more retro machines are being ported to FPGA designs, so it's getting quite easy to run many of the popular consoles on a fairly cheap FPGA board.

    The Gamecube could be a good candidate. It outputs digital vid from a socket too, so that makes things a bit easier.

    There's some good info on the GC dig vid protocol on gamesx...
    http://gamesx.com/wiki/doku.php?id=av:nintendodigitalav

    It apparently outputs 4:2:2 format though, so it will take a while to write the code etc.
    (I already have some code blocks for doing the YCrCb-to-RGB colour conversion).
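    For what it's worth, the conversion itself is just the standard BT.601 matrix (a full-range approximation is shown here; the real 4:2:2 stream also needs the chroma upsampled across pixel pairs before this step):

    ```python
    def ycbcr_to_rgb(y, cb, cr):
        # Standard BT.601 YCbCr -> RGB (full-range approximation, 8-bit values).
        r = y + 1.402 * (cr - 128)
        g = y - 0.344136 * (cb - 128) - 0.714136 * (cr - 128)
        b = y + 1.772 * (cb - 128)
        clamp = lambda v: max(0, min(255, round(v)))
        return clamp(r), clamp(g), clamp(b)

    print(ycbcr_to_rgb(128, 128, 128))  # mid grey -> (128, 128, 128)
    ```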

    @reprep - Yep, it's outputting 480p, but the two original fields still exist in the "progressive" output image.
    Almost all modern TVs have a scaler / de-interlacer which automatically detects the type of incoming frame format and "cadence" of the fields (even if both of those fields exist within a single progressive frame).

    So, it's basically outputting an image which looks a bit like this left-hand image, and the TV itself is de-interlacing it to look like the right-hand image...
    [image: combed interlace fields (left) vs. the de-interlaced result (right)]

    On static images, the two different fields should "line up" horizontally, but obviously when you get movement in the image the fields tend to separate / serrate (like the left-hand image above).

    The reason for this is because the odd / even fields are usually from two different animation frames in the N64 framebuffer, and these are spaced 16.66ms apart in time (@60Hz "field rate").

    Ideally, I wanted it to output native 480i directly, but I'm finding it tricky to figure out the sync timings etc.

    Generally, if it was outputting native 480i, most TVs would do a better job at de-interlacing, as they wouldn't need to "detect" the type of image.

    What the TV would then normally do is "weave" the two fields together to produce a decent-looking progressive image at a 60Hz FRAME rate. :witless:
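    A weave is about the simplest de-interlace there is, just slotting the stored fields back into one frame (sketch):

    ```python
    # "Weave" de-interlacing: the even field supplies frame lines 0, 2, 4...
    # and the odd field supplies lines 1, 3, 5...
    even_field = [[10, 11], [12, 13]]   # two lines of a toy even field
    odd_field  = [[20, 21], [22, 23]]   # two lines of a toy odd field

    frame = [line for pair in zip(even_field, odd_field) for line in pair]
    print(frame)   # [[10, 11], [20, 21], [12, 13], [22, 23]]
    ```

    It only looks right when both fields come from the same moment in time, which is exactly the problem with the N64's output described above.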

    Actually, I can see the image "jiggling" by one line very slightly a few times a second - I think the TV is doing a "pulldown" on the fields.

    Here's a great site about de-interlacing and telecine from movies (the "Fifth Element" animation half-way down is a good example of 24fps-to-60Hz field NTSC pull-down)...
    http://www.plasma.com/hometheater-guides/proscanexplained.php

    If the N64 was outputting progressive 480p@60Hz from the start (with one N64 frame per one video frame), none of this would be an issue.
    I'm going to try patching the VI regs on a game ROM to disable interlacing and see what happens.

    I've always hated interlacing, it's really just a throwback to the old TV broadcast systems.
    Interlacing was a way of halving the required transmission bandwidth - on a CRT, your eyes then do the job of "de-interlacing" the image (up to a point. lol).

    Thankfully the situation is starting to improve with HDTV (although most TV camera broadcasts are still 480i / 576i / 1080i, and the receiver or TV still needs to de-interlace it).

    OzOnE.
     
    Last edited: Nov 25, 2012
  20. Calpis

    Calpis Champion of the Forum

    Joined:
    Mar 13, 2004
    Messages:
    5,906
    Likes Received:
    21
    Seems likely the 7-bit DACs dither to 8-bit in 320 mode, and drop the LSB in 640 mode since it's doubtful 640 games can afford a 24-bit FB anyway.
     