HDMI and RGB mods for different consoles

Discussion in 'Modding and Hacking - Consoles and Electronics' started by OzOnE, Mar 17, 2015.

  1. Fandangos

    Fandangos Site Supporter 2013

    Joined:
    Sep 19, 2012
    Messages:
    604
    Likes Received:
    23
    Ozone, do you think you could capture some 2D games so we can see the result?
    3D games looked amazing! I'm curious to see the result of some fighting games.

    About the GC, I'm wondering... why the GC?
    I'm trying to figure out the benefits of it over the Wii, but I can't.

    ISO loading is possible with DIOS MIOS, and just like any of the optical drive emulators that can be installed on a GC (WiiKey Fusion, WODE), it won't work with audio-streaming games.
    So compatibility is exactly the same, but it's much easier on the Wii.

    The Wii has the entire GameCube hardware inside; as far as I know, it has the controller ports, the memory card ports, everything.
    So why someone would prefer this board on the GameCube instead of the Wii is beyond me.
    Maybe just for the start-up logo, the system menu, the Game Boy Player, or maybe just nostalgia?
     
  2. MonkeyBoyJoey

    MonkeyBoyJoey 70's Robot Anime GEPPY-X (PS1) Fanatic

    Joined:
    Mar 1, 2015
    Messages:
    1,738
    Likes Received:
    312
    There is a Wii version on the way from Unseen. Some of us grew up with a Wii and don't want to modify our childhood consoles for fear of destroying something of sentimental value. Plus it's for the Game Boy Player, 8-player Mario Kart over LAN, the slowed-down FDS jingle in the system menu, the awesome start-up logo, the system menu, the cool design of the console, and because we can.

    It is much easier on the GameCube as the digital port has all of the required signals in a single spot. Well, at least on DOL-001 Revisions A and B. DOL-101s require soldering to random spots on the board to work with the HDMI boards and the official component cable.

    Hey OzOnE, if you do get to probing around in a Wii, can you find the signals required for the Game Boy Player and the Broadband Adapter? If the Wii really is a heavily modified GameCube (or two Cubes duct-taped together), the signals should still be there, right?
     
    Last edited: Apr 11, 2015
  3. OzOnE

    OzOnE Site Supporter 2013

    Joined:
    Nov 10, 2011
    Messages:
    538
    Likes Received:
    173
    @Fandangos - as MonkeyBoyJoey said, one of the main reasons for targeting the GC over the Wii is that the GC is so much easier to solder the wires to (for the ones which have the Dig AV port).

    The GC and Wii have a similar (/same?) video DAC, and the pin pitch is very narrow.
    It would be a bitch to solder to, especially for modders who haven't done much of this before.

    Even just to get Kynar wire to sit side-by-side would be a challenge, due to the insulation thickness.
    It can be done, but probably means stripping off a fair bit of insulation, then holding the wires down after using a blob of epoxy or hot glue etc.

    There aren't really any obvious solder points in the Wii, so it either means soldering to the DAC itself, or trying to solder to the ridiculously small vias.

    If Unseen is targeting the Wii, it's probably a good idea to find or design a QSB for it, or just an off-the-shelf ribbon adapter that could make soldering to the DAC easier.

    @MonkeyBoyJoey - I'm not sure about finding the GC signals in the Wii tbh...

    The Wii is actually a different console in that the chipset is new and re-designed, slightly faster, and has more security features etc.
    It uses a very similar PowerPC core and GPU, but I'm pretty sure the chipset is newer.

    So, there are no guarantees the same port signals even exist on the Wii. I haven't really looked into it though, so could be wrong.


    I'm wondering now about doing a small Kickstarter for this HDMI project, as I'm realizing just how expensive it will be to get a batch of say 50 boards made.
    It's looking to cost upwards of $2000 atm! :eek:

    OzOnE.
     
  4. MonkeyBoyJoey

    MonkeyBoyJoey 70's Robot Anime GEPPY-X (PS1) Fanatic

    Joined:
    Mar 1, 2015
    Messages:
    1,738
    Likes Received:
    312
    Would it be easier to solder to the points under the DAC? Something similar to what Kevtris does to the NES PPU and CPU sounds like a good idea. The only soldering needed for those adapters would be mounting the chip socket to the mobo.

    $2000? That's insane! I would definitely back the kickstarter if you started one. I could also help spread the word about it so it would get some more backers.

    If you do get the chance could you look for the High-Speed port signals? People have talked about it in the past but nobody took the time to look for them. It would be much easier if a schematic was available for the backwards-compatible Wiis.

    Also, when someone successfully recreates the Digital AV connector for the GC, would you be willing to make an adapter for your board that plugs into the port externally? It would then require little to no soldering.
     
  5. OzOnE

    OzOnE Site Supporter 2013

    Joined:
    Nov 10, 2011
    Messages:
    538
    Likes Received:
    173
    @Fandangos - I will try to get some footage of some 2D games on the DC soon.

    I need to test line-doubling on 240p modes as well, but one slight issue with the DC is that it disables the Hsync and Vsync outputs in "TV" mode, and only outputs Comp sync.
    So, in the next day or two I'll be testing the code which splits the Comp sync signal, then adding the line-doubler to generate 480p via HDMI (with or without scanlines).
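
    The usual approach to splitting composite sync is to classify each low-going pulse by its width: normal Hsync pulses are around 4.7 µs, while the pulses during the Vsync interval are much wider. A rough Python sketch of that idea (the threshold is illustrative, not taken from the actual FPGA code):

    ```python
    # Rough sketch of composite-sync splitting by pulse width.
    # Threshold values are illustrative, not from the real design.

    HSYNC_WIDTH_US = 4.7        # nominal NTSC Hsync pulse width
    VSYNC_THRESHOLD_US = 20.0   # pulses wider than this belong to Vsync

    def classify_pulse(width_us):
        """Classify a low-going composite sync pulse as 'hsync' or 'vsync'."""
        return "vsync" if width_us > VSYNC_THRESHOLD_US else "hsync"
    ```

    On an FPGA this is just a counter measuring how long sync stays low on each pulse.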

    I've still yet to get the on-screen display code running either, and I'm not even sure the board warrants that complexity tbh.
    If most TVs and capture cards work fine with 480p via HDMI, that was pretty much the whole aim of this project anyway (pixel-perfect pure digital output + audio).

    I'm not sure this FPGA can handle up-scaling to 720p or 1080p without SDRAM, and again that would make it too expensive anyway.

    I did try Bangai-O the other day, but it does force TV mode, even after I patched the VGA flag in IP.BIN.
    So, the game code itself would need to be patched to stop it switching modes when the game boots, or I'll just see if I can grab the Comp sync and output 240p / 288p as 480p / 576p.


    @MonkeyBoyJoey - the DAC on the Wii is surface-mount, and has really small pins.

    It would be very difficult to get or make an adapter that could be soldered in place with the DAC chip soldered on top, but we have been looking into that.

    kevtris can stack his "interposer" boards between the CPU and PPU on the NES because they are old-skool through-hole chips...

    De-soldering those with a proper station is easy, and they then just need standard IC sockets to be soldered in.
    I only wish the Wii was as easy as that. :p

    I've kind of decided against modding the Wii for those reasons, as well as the fact that the Wii U is available now, and I understand it (finally) has HDMI on it.
    (not that too many people are buying the Wii U tbh, but oh well. lol)

    Oh, I'll definitely be making an external version of this board if Buffalo or somebody makes some new Dig AV plugs. :)
    The current board design is only around 41.25mm x 56.62mm now, so still very small for non-BGA stuff.

    [RDC] has already sent off for the final proto boards to be made (by OSH Park), so we should hear back in the next few weeks.

    He's still putting together the parts list and working on tweaking the solder paste stencil design.
    I will then have to throw him a huge bundle of "money units" across the pond to the US to pay for the proto PCBs + parts etc.

    You don't even want to know how much the three proto boards are costing. :eek:

    OzOnE
     
  6. MonkeyBoyJoey

    MonkeyBoyJoey 70's Robot Anime GEPPY-X (PS1) Fanatic

    Joined:
    Mar 1, 2015
    Messages:
    1,738
    Likes Received:
    312
    Ah ok, I had a feeling it might be a surface mount chip. I wouldn't even dare open my Wii unless it stopped turning on because it was the first Nintendo home console I ever owned and it was a Christmas 2007 gift. That's why I haven't done any probing or modding in my Wii. I won't even soft mod it because I'm afraid I would break something.

    All fears and taboo aside, I hope an easy solution can be made that doesn't require insane soldering skills. I've only been soldering since late 2014 so I'm new to it but I have gotten better within the past couple months. Mostly because I now have a proper soldering iron and not a literal soldering gun (it looks like a brown revolver).

    If it makes it easier, you could drill holes in the pads so you could turn it into a through-hole chip. I've got a nice power drill that would do the job perfectly. :)

    I have a Wii U and I can say for a fact that Wii games on the Wii U look really good via HDMI. Due to the fact I'm unwilling to open my Wii or transfer my Wii's data to the Wii U, I'll be happy using component cables on my Wii. I will always use HDMI on the Wii U except when recording video via my HD PVR 1. Then I will use component.

    When you do make an external version, could you also make an internal version for DOL-101 users like me? I asked Buffalo if he would 3D print a panel mount version of the female Digital AV connector for use on the DOL-101s, but he hasn't given me a clear yes or no answer. It would be useful on the other consoles you are targeting too, as you could have a unified connector for everything, therefore keeping it all external and in a cable/adapter form.

    I couldn't even begin to imagine how much it costs but if I had to guess, it would be more than I make in a year. Everything is that much... Glad to hear that the boards are almost done! I will have to get one some day and try it out on the DC, GG, GCN, N64, and whatever else I have that it supports. Heck, while you are at it, why not make it support the 3DS XL or the Wii U gamepad? I would really like something that works with those right about now.
     
  7. Unseen

    Unseen Spirited Member

    Joined:
    Sep 1, 2014
    Messages:
    126
    Likes Received:
    17
    You should probably forget about it.

    Uhm... No.
     
  8. Lum

    Lum Officer at Arms

    Joined:
    Sep 30, 2010
    Messages:
    3,233
    Likes Received:
    42
    Thing is, next to nobody with funds/skills/equipment to perform these mods would intentionally buy a DOL-101 in the first place.
    Aside from bragging rights to say you've achieved it, or being dead set on the system's exclusive colors.
     
  9. OzOnE

    OzOnE Site Supporter 2013

    Joined:
    Nov 10, 2011
    Messages:
    538
    Likes Received:
    173
    Yeah, I probably would never entertain doing this mod on a GC without a Dig AV port, it's just not worth the effort, and there are plenty of older GCs out there which have the port.

    I think I could just about manage to solder to the DAC on the Wii, but it would definitely be a nightmare tbh.
    I couldn't expect the average modder / solderer to try the same really, and I'm already quite worried about people trying the same thing on their Dreamcasts (which isn't anywhere near as hard as the Wii DAC, but would be a challenge for the casual modder.)

    We might have to ask specific modders who are skilled in soldering to carry out the mods for people.

    Adding HDMI to the Wii U game pad is actually an interesting idea though, and might be worth the effort / cost to some people...

    I was watching a vid on Cinemassacre a few days ago (with James and Mike), and it was annoying that you can't really capture the output from the Wii U game pad directly.
    It would still be a tricky mod, but will at least likely have a standard FPC connector inside which goes to the LCD panel.

    Although again, there are other ways around that - I watched a vid last year from when they hacked the Wii U game pad, and they've already worked out how to stream the video from it and even "snoop" on the data...

    https://www.youtube.com/watch?v=WC1CqQK_beU

    An HDMI mod would still be fun to try, and I'm sure some people would like one, but there are alternative solutions for some of these consoles.

    OzOnE
     
    Last edited: Apr 12, 2015
  10. bagheera

    bagheera Rising Member

    Joined:
    Aug 1, 2014
    Messages:
    65
    Likes Received:
    3
    Take a look at this project. He is doing line tripling to get 720p and line quadrupling (I don't know the exact term for it) to get 1080p. The signals are kind of out of spec though, so not every TV accepts them. Furthermore, I think he is able to use the video decoder chip to derive a clock from the Hsync pulses, although it has been a while since I last read the thread. I think using that construction he is able to get the pixel clocks for 720p and 1080p. His scan doubler also seems able to keep the TV from losing sync when the input signal switches from interlaced to progressive (could be TV dependent). I think what changes when the input switches is that suddenly more or fewer lines are sent to the TV per frame.

    I still think that it should be possible to derive a different output video clock from the input video clock. Using that, we would be able to correct consoles with a weird pixel clock, if the refresh rate is close enough to 60 Hz. Take a look at this thread. They have implemented several arcade systems on an FPGA. They do, however, have some weird video timings (incorrect number of lines per frame, incorrect number of pixels per line); I guess their implementation is just that accurate to the original hardware. What they then do is define an output pixel clock based on the system clock, and they buffer enough lines for the output side to read at that clock.
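
    The line-buffering scheme described above can be sketched as a small FIFO: the input side writes scanlines at the console's rate, the output side reads them at the clean output pixel clock, and a few lines of slack absorb the rate mismatch. A Python illustration (the class name and depth are invented for the example, not taken from the arcade project):

    ```python
    # Sketch of the line-buffering idea: input writes scanlines at the
    # console's clock, output reads at a clean output clock, with a few
    # lines of slack to absorb the small rate mismatch.

    from collections import deque

    class LineBuffer:
        def __init__(self, depth=4):
            self.depth = depth
            self.lines = deque()
            self.last = None

        def write_line(self, pixels):
            if len(self.lines) >= self.depth:
                self.lines.popleft()      # input overran the output: drop oldest
            self.lines.append(list(pixels))

        def read_line(self):
            if self.lines:
                self.last = self.lines.popleft()
            return self.last              # repeat last line if output got ahead
    ```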

    I think their problem is very similar to ours, and I think we can use their solution to correct systems with an odd pixel clock, or to generate a higher output clock so we can get 720p/1080p. I do not really know what kind of PLL we would need, or if accurate enough ones exist. I already asked in the topic for their code in which they do the PLL value calculation. I tried to port it to C, but unfortunately I am not really getting anything useful out of it (and I have not looked at it very closely).

    I am really wondering how marshallh is handling progressive/interlaced switching with his HDMI adapter. It seems he buffers part of the input signal (one field (?), so he can at least combine it with the next field for interlaced?), but I wonder if his adapter ever skips frames. I think the vsync frequency changes slightly when you switch between interlaced/progressive, so it seems hard to lock onto the incoming framerate.
     
    Last edited: Apr 12, 2015
  11. MonkeyBoyJoey

    MonkeyBoyJoey 70's Robot Anime GEPPY-X (PS1) Fanatic

    Joined:
    Mar 1, 2015
    Messages:
    1,738
    Likes Received:
    312
    It's the only Cube I have and I really don't want to have two cubes lying around. The DOL-101 is my first working cube. I don't want to get rid of it. That's why I keep asking about it.
     
  12. Lum

    Lum Officer at Arms

    Joined:
    Sep 30, 2010
    Messages:
    3,233
    Likes Received:
    42
    Don't worry about that. Set it aside for when you get two broadband adapters, perhaps.
     
  13. MonkeyBoyJoey

    MonkeyBoyJoey 70's Robot Anime GEPPY-X (PS1) Fanatic

    Joined:
    Mar 1, 2015
    Messages:
    1,738
    Likes Received:
    312
    My closest friend and I are planning on moving in together one day, so she will have her DOL-001, which could use the cable version, and I would have my DOL-101, which could use a solder version. Two BBAs do sound good, but also expensive. Not as bad as DC BBAs though.

    If only I kept that DOL-001 that stopped reading discs... I hate my younger self sometimes.
     
  14. OzOnE

    OzOnE Site Supporter 2013

    Joined:
    Nov 10, 2011
    Messages:
    538
    Likes Received:
    173
    @bagheera - Line-doubling from 240p / 288p to 480p / 576p is easy enough, but I'm intrigued as to how that guy managed fairly decent "de-interlacing" while buffering only a couple of lines at a time from each field.

    The TVP7002 chip that marqs is using looks interesting though...
    http://www.ti.com/product/tvp7002

    I know a lot of these chips have already solved the issue of doing clever de-interlacing without needing a huge SDRAM or SRAM attached, so it may be an option on a future version of my board to use a similar off-the-shelf chip.

    I'm already looking at finishing the design of the Uber-scaler, and that will require SDRAM but massively simplifies all the timing and scaling stuff.

    The TrueView chip even handles decent motion-adaptive de-interlacing and all sorts of other stuff, like gamma / colour correction, black level adjustment, colour space conversion etc.
    (although it does work in the YUV colour-space internally anyway.)


    I've plugged some line-doubling code into my project now, and it's working OK...

    I've actually connected a Game Gear mobo to my Cyc V GX dev board, but when I connect the clock from the crystal, it kills the oscillation on the GG.
    So, I'll have to just generate the 32MHz clock on the FPGA for the time being.

    (strangely, it works fine when connecting the GG clock to the CPLD RGB board, so the Cyc V is probably adding too much capacitance. The on-board clamp diodes won't be helping either.)

    I think what marqs and kevtris are doing is re-generating the sync signals, but for different reasons...

    kevtris showed in a video that the NES / FPGA didn't always stay in sync on reset / power-up (I believe he's fixed that now), so what it appears to be doing is generating the HDMI sync on the FPGA, then waiting until the sync from the NES aligns closely. That way, he can output at 480p / 720p / 1080p, then do the line stretching within that output frame etc.

    When switching between 240p and 480i like the N64 often does, the main difference is the slight change in the Hsync pulse positions for each "field" (so-called "half-lines" etc.)
    AFAIK, the actual pixel output timing is the same between the two modes, but the Hsync offset in 480i means the second field is drawn above / below the previous field on the CRT.

    So, basically I'm now looking into doing a bit of sync re-gen on the FPGA for consoles like the N64.
    I also need to get all the timing standards sorted, and make a note of all the weird modes that different consoles and games might use (the Genesis is particularly fussy).

    The CEA-861 specs are obviously what I need to adhere to for the HDMI output timings, and I already have a test-pattern gen which works fine with those mode timings.
    One annoying thing about the CEA spec is that it doesn't seem to list the standard pixel clock freq next to the diagram of each mode, but it's a great document otherwise.

    Anywho, I hope to have the Game Gear working on HDMI later today, and I want to see what the main difference is between the timings when it's in "stretch" mode (when test pad T10 is tied High).

    This GG has already had its stock LCD removed anyway, as I was close to getting chroma encoding working for a cheap composite screen.
    I now have some source code for the common mini-LCD scaler chips too, so looking at either inputting digital video into one of those, or using a cheap LVDS-to-TTL adapter board for driving a new LCD more directly.

    Right... I now have to send some monies over to Mr [RDC] for the new prototype HDMI boards. :)

    OzOnE
     
  15. bagheera

    bagheera Rising Member

    Joined:
    Aug 1, 2014
    Messages:
    65
    Likes Received:
    3
    Pretty sure he is just line doubling everything, so half of the vertical resolution is thrown away for interlaced images. This is not a bad thing, if you go for a simple, cheap and low latency upscaler imo.

    Interesting.
    I think the hsync timing stays the same; only the position of the vsync pulse changes, afaik - it can be in the middle of a line or at the beginning of a line. That way you effectively have 262.5 lines per field. In a progressive image you always have a whole number of lines per field, so if you line-double your image and there is a switch between progressive and interlaced, the number of lines in your output image will suddenly become different. For instance, if a progressive image is 262 or 263 lines, then you get 524 or 526 lines in your output image, which is different from the 525 lines you get for the interlaced image. I am not sure how a TV handles this; I think some TVs will lose sync. Also, I am not entirely sure I am correct here.
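
    The arithmetic in the post above can be checked directly; the numbers below are the standard NTSC-style line counts, not measurements from any particular console:

    ```python
    # 480i: two fields of 262.5 lines -> 525 lines per frame.
    LINES_480I_FRAME = 525
    lines_per_field_480i = LINES_480I_FRAME / 2   # 262.5

    def doubled_lines(progressive_lines):
        """Output lines per frame after line-doubling a progressive source."""
        return progressive_lines * 2

    # A 262- or 263-line progressive source doubles to 524 or 526 lines,
    # neither of which matches the 525-line interlaced frame.
    ```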

    I am not sure if sync re-gen is enough. You need a new pixel clock as well if it is too far off. And it somehow needs to stay in sync with the input image.
     
    Last edited: Apr 13, 2015
  16. Calpis

    Calpis Champion of the Forum

    Joined:
    Mar 13, 2004
    Messages:
    5,906
    Likes Received:
    21
    Lines missing is very out of spec. IMO no TV should accept it, especially since it's so trivial to do things right with digital signals...

    Of course. This is what I mentioned on the first page of this thread; it's the only way to synchronize with incoming analog video in order to do things like overlay/"genlock"/"time base correction".

    It does depend on the TV. Due to Vsync rate changes between 240p/480i the output pixel clock must change to keep sync, and the TV must be able to track the changes very rapidly, to prevent buffer over/underflows.


    Again, this is what I mentioned on the first page. A fractional PLL can generate a secondary "output clock" governed by the input clock, which is itself governed by Hsync. This is the only way to get an output video synchronized to an input video, without crappy temporal interpolation methods.

    Of course they exist, you probably own a dozen fractional frequency synthesizers embedded into everyday electronics. (Again, you can even create them using ordinary PLLs to get arbitrary ratios through modulation. Without profiling the loop filter and understanding the divider implementation [not easy on FPGA] it's hard to say what type of modulation if any is acceptable though.)
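
    As a rough illustration of the fractional synthesis idea, a phase accumulator (NCO) adds a fractional increment every reference cycle and emits an output tick on each overflow. The frequencies below are illustrative, not taken from any specific mod:

    ```python
    # Phase-accumulator (NCO) sketch of fractional frequency synthesis:
    # each reference-clock cycle adds an increment; each accumulator
    # overflow is one output tick. Numbers are illustrative only.

    ACC_BITS = 32
    MOD = 1 << ACC_BITS

    def make_increment(f_out_hz, f_ref_hz):
        """Accumulator increment giving ~f_out output ticks per second of f_ref."""
        return round(f_out_hz / f_ref_hz * MOD)

    def count_output_ticks(increment, ref_cycles):
        """Simulate ref_cycles reference clocks; count accumulator overflows."""
        acc = 0
        ticks = 0
        for _ in range(ref_cycles):
            acc += increment
            if acc >= MOD:
                acc -= MOD
                ticks += 1
        return ticks
    ```

    Each output tick lands on a reference edge, so the instantaneous period jitters by up to one reference cycle even though the average rate is exact - which is why the loop filter and modulation questions above matter.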

    This is impossible (of course), at best it's "bob deinterlacing", which isn't deinterlacing at all. Think about it, you need to buffer the previous field to even do simple weave or blend field merges.

    They are standard TV frontend chips--they just handle digitization and YCbCr compression at most. Subsequent scaler chips usually provide the deinterlacing, scaling, signal processing algorithms.

    Deinterlacing requires buffering fields, there's no way around that.

    ??? There isn't any need. He's re-clocking the NES, almost certainly with a phase accumulator, to achieve exactly 60/59.94 Hz, which the output video buffer may easily synchronize to.

    Hsync should *always* be in the same position (hence equalization pulses) so the lines align correctly.

    Vsync occurs a half-line early every other field to provide interlace...

    The Genesis has very standard timing, the problem people have with its sync is electrical--the sync output is a high-impedance open-drain/logic-level signal rather than a 40 IRE into 75 ohm TV sync signal.
     
    Last edited: Apr 13, 2015
  17. OzOnE

    OzOnE Site Supporter 2013

    Joined:
    Nov 10, 2011
    Messages:
    538
    Likes Received:
    173
    Yep - I reckon most of these off-the-shelf chips with no external RAM are just doing a simple bob de-interlacing.

    I'm sure there are some chips which will buffer some lines to help with blending within the same field, or some that do some basic macroblock motion estimation for when the next field comes along though.

    kevtris did say that something in his initial code had to "wait until the NES and HDMI sync up" in one of the GameTechUS vids.
    I wouldn't have thought that a simple PLL would take any noticeable amount of time to sync, so I assumed he could possibly have been doing some of the timing generation on his FPGA too.


    Yep, I realize how the Vsync position affects the line position in interlaced modes.
    I of course misspoke when I said "Hsync"; I meant "Vsync".

    (btw, I don't want to get back into the discussion from a while ago about me saying the 240p / 288p modes aren't truly "progressive". I was clearly mistaken back then, and admitted as much. lol)


    I always assumed the Genesis has very slightly non-standard timings from what I've heard / read?
    I've seen it mentioned a few times online that it outputs only 234 lines or something, and not quite 240?

    I haven't tested this theory myself yet, but you could be right, because I would have expected most consoles to adhere quite closely to "standard" timing specs, especially in the 90s. (yes, I realize the Genesis was released in 1989, but close enough. hehe)

    I know older CRT TVs were quite tolerant of minor timing changes though, so you never know if the occasional console "abused" the timing framework slightly.
    I'm sure there are many examples of non-standard timings on older 70s-80s consoles at least.


    btw, I tested the HDMI DC on our newer 47" Celcus (LG panel) LCD TV earlier, and it works great.

    The image looks fantastic in straight 480p, and obviously the TV itself will be scaling to the panel, so I see no real benefit to try doing any expansion of lines nor scaling of the image on the FPGA itself.

    Up-scaling can only get you so far in most cases, and any half-decent modern TV will do a pretty good job anyway.

    Of course, some people may want to try to improve the image even further, in which case they can still use an external scaler if they wish (or my Uber-scaler, if I ever get time to finish it. :p)

    Only a handful of DC games force TV / 240p mode too, but I will be testing the Comp Sync splitter and line-doubling this week.
    I need to test all of that for most older consoles / computers anyway.


    OzOnE
     
    Last edited: Apr 13, 2015
  18. bagheera

    bagheera Rising Member

    Joined:
    Aug 1, 2014
    Messages:
    65
    Likes Received:
    3

    I will be giving this document a good read. Currently, I do not know enough about PLLs. Hopefully this will clear things up.


    Yes, I think he is doing the same thing as the NEO GEO HDMI mod. He already has an input pixel clock and from there he calculates the NEO GEO clock with Verilog. That obviously gives some clock jitter, but the system does not seem to mind it.

    I wonder if that would be possible with the N64 as the input clocks there go to a PLL and I think it will not lock anymore when there is too much jitter.
     
  19. OzOnE

    OzOnE Site Supporter 2013

    Joined:
    Nov 10, 2011
    Messages:
    538
    Likes Received:
    173
    There's no secret to simply using the console's pixel clock directly tbh.

    Most of them are stable enough as-is; it's just that the Vsync / Hsync timings and / or pixel clock are a tad too slow on some machines to meet the minimum requirement for HDMI (25MHz pixel clock IIRC?).
    The ADV chip can do auto pixel repetition for those formats, but the sync timings still need to be relatively close to one of the accepted standard modes (meaning modes that most TVs support, and not just any mode listed in the CEA-861 specs.)
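
    The pixel-repetition arithmetic is simple enough to sketch: repeat each pixel enough times that the effective pixel clock clears the HDMI floor. Illustrative Python (the 25 MHz figure is the commonly quoted HDMI minimum; the transmitter chip does this in hardware):

    ```python
    # Pixel-repetition arithmetic: repeat each pixel so the effective
    # pixel clock reaches the HDMI minimum (~25 MHz). Purely illustrative.

    HDMI_MIN_CLOCK_HZ = 25_000_000

    def repetition_factor(pixel_clock_hz):
        """Smallest integer repetition lifting the clock to the HDMI minimum."""
        factor = 1
        while pixel_clock_hz * factor < HDMI_MIN_CLOCK_HZ:
            factor += 1
        return factor
    ```

    e.g. a 13.5 MHz mode needs 2x repetition (27 MHz), while a 27 MHz mode needs none.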

    Or, it's sometimes the case that the Vsync is just a bit too far below 60Hz to allow some monitors to sync properly, whereas the same signal (in analog) would work fine on a CRT.

    Both my DC and N64 HDMI mods work fine directly on the video clock from the console itself, and they don't need any PLLs. ;)

    It was only from the way kevtris spoke about his mod that I think he's doing something slightly differently.
    It may be just a bit of de-glitching code or something that lets his HDMI pixel clock and syncs run asynchronously to the incoming NES video?

    I'll try to find the part of the vid again, and you'll see what I mean.
    Not that it's too important, it was just interesting to note.


    OzOnE.
     
  20. bagheera

    bagheera Rising Member

    Joined:
    Aug 1, 2014
    Messages:
    65
    Likes Received:
    3
    But a lot of old consoles use a different number of lines and pixels per line than any standard specifies. The Neo Geo HDMI mod did not work on the creator's TV initially (it did on his monitor), because it had 528 lines per frame when simply line-doubled. And this is something I am afraid of: that a lot of consoles will not work on certain TVs if the specs of the incoming video are too far off, and this does not necessarily mean only the pixel clock.

    Did you try the N64 with this board or do you mean your project from 2012? The N64 has some weird timings, such as 773.5 pixels per line for 240p. I don't think that will work with every TV over HDMI directly even if you line double it. The problem is, you end up with an output image that does not conform to the CEA spec.
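
    As a sanity check on whether a doubled timing lands near a standard mode, the refresh rate follows directly from the totals; the 858 x 525 / 27 MHz figures below are the standard CEA-861 480p totals:

    ```python
    # Refresh rate from pixel clock and total (not active) pixel/line counts.
    def refresh_hz(pixel_clock_hz, total_pixels_per_line, total_lines):
        return pixel_clock_hz / (total_pixels_per_line * total_lines)

    # CEA-861 480p: 858 x 525 totals at 27 MHz -> ~59.94 Hz.
    cea_480p_hz = refresh_hz(27_000_000, 858, 525)
    ```

    A timing whose totals differ from these (like a 528-line doubled frame) yields a rate a TV may or may not track.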

    I made a video of my N64 HDMI adapter by the way:


    The video kind of sucks. Currently it just writes the whole image to SDRAM and it is read out again to send it over HDMI as a 640x480 signal (sure I could make it 720x480, but that would only add bigger black borders at the moment :p). I have not tried yet to just line double the incoming image, but I am pretty sure it will not work with a lot of TVs.
     
    Last edited by a moderator: May 25, 2015