"Let's make GD ROM emulation happen" Facebook group.

Discussion in 'Sega Dreamcast Development and Research' started by sonicdude10, Jun 18, 2012.

  1. OzOnE

    OzOnE Site Supporter 2013

    Joined:
    Nov 10, 2011
    Messages:
    538
    Likes Received:
    173
    Hi,

    It was just the Gamecube code for reading in the drive commands that I'm having trouble with...

    The original code works fine, but it's in VHDL and I have problems reading some of it (and modifying it quickly).
    So, I tried converting it to Verilog, but it doesn't latch the GC commands correctly?

    I can force it to work in Verilog if I do it synchronously (by clocking the GC signals through some regs, then checking for rising or falling edges),
    but it's not ideal and causes extra delays which could make things unreliable.

    Here is the segment of Destop's original VHDL code; I haven't included any port declarations / assigns / regs etc...

    http://pastebin.com/pd6sbSRf

    Here is a conversion to Verilog by Veritak (free s/w)...

    http://pastebin.com/utjF8kYd

    And here is a conversion by X-HDL (demo)...

    http://pastebin.com/Ebyi2ckM


    Looking at the code, you would think the Verilog would work OK, but it's not seeing the async signal edges properly?
    It might be something else I've missed in my code though?

    I'm assuming it's because the VHDL uses the 'event attribute, which tells the synthesizer to look for an edge?
    In Verilog, I think you need to add "posedge" or "negedge" to the event list for it to work the same way, is this correct?
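    Just to check my understanding, I think the translation needs to end up something like this (a minimal sketch with made-up names, NOT Destop's actual signals)...

    module gc_edge_sketch (
        input  wire       gc_strobe,   // hypothetical async strobe from the GC
        input  wire [7:0] gc_data,
        output reg  [7:0] latched_cmd
    );
        // VHDL:    if (gc_strobe'event and gc_strobe = '1') then ...
        // Verilog: the edge has to be named in the sensitivity list instead...
        always @(posedge gc_strobe)
            latched_cmd <= gc_data;    // latch the command byte on the rising edge
    endmodule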


    (again, sorry this is in a Sega thread, so I'll keep it quick).


    Thanks in advance,
    OzOnE.
    P.S. Those 2232H chips look very nice. :cool-new: No wonder you can stream ISOs so well!

    btw, I've started playing with the DAC on my board - I tried to play a DC Audio track, but the SD Card is reading from a completely unexpected start position?
    I might have to throw out this crappy SD Card driver code and find another one.
     
    Last edited: Jul 1, 2012
  2. cybdyn

    cybdyn Embedded developer (MCU & FPGA)

    Joined:
    Jan 12, 2012
    Messages:
    551
    Likes Received:
    4
    OzOnE, so can you launch a game on the GC with the original code you used? Or does it need some fixing?

    I plan to use USB as an option for debugging and for those who want to stream data from the PC. It's also good if it has decent speed; you can just connect the console and PC with a USB cable, and you don't need to open up the box to reach the HDD or SD.
     
    Last edited: Jul 2, 2012
  3. OzOnE

    OzOnE Site Supporter 2013

    Joined:
    Nov 10, 2011
    Messages:
    538
    Likes Received:
    173
    No, it's not actually working yet. I'm just playing with the code until the new FPGA board arrives for the DC.

    I have my real Gamecube hooked up to the FPGA and it's latching in the commands OK (it's trying to read the Disk ID).

    I think it would be easy for you to stream images to the Gamecube once you've completed your new board etc. I'm not too good with PC software programming.
    The GC protocol is fairly simple and doesn't appear to have any major security as such? It just requests Words from the disk.

    It looks like the "Disk ID request" just reads the header from the disk (complete with Game Code/Maker Code/Game Name/file offsets etc)...

    http://hitmen.c02.at/files/yagcd/yagcd/chap13.html

    After that point, I'm assuming it just requests the boot files and starts main execution?
    I'm going to fake a disk header later on to see what happens afterwards (I don't have USB or SD streaming working yet, as usual :moody: ).

    Destop's original code has examples for USB streaming too. I can't confirm 100% if it ever worked as expected.
    There is a LOT of audio streaming code though, so basic game streaming was "probably" working OK.

    (I think a certain AG / Benheck forum member ended up with the original PCB and said it "doesn't work too well") :friendly_wink:

    @cybdyn - do you possibly have any simple PC code examples for streaming ISO images via USB (FT245 or FT2232)? I'm not sure how to go about building the command structures?
    As I say, I'm not experienced with PC programming, but I'm most comfortable with reading C/C++.

    Regards,
    OzOnE.
     
    Last edited: Jul 2, 2012
  4. cybdyn

    cybdyn Embedded developer (MCU & FPGA)

    Joined:
    Jan 12, 2012
    Messages:
    551
    Likes Received:
    4
    For the FT chips I use their library interface, with simple function calls like FT_Read, FT_Write and others, so I made my own code that just reads the ISO file and sends it (FT_Write) to the FPGA. I can share the code if you need it, but I'm on summer holiday for about a month..

    Also, I use C++ Builder 5 as my IDE for programming; not professional, as most people would say, but it's enough for USB 1.1 speed.

    What chip do you want to use for USB, the FT245? Or one from Cypress?
     
    Last edited: Jul 2, 2012
  5. n64coder

    n64coder Robust Member

    Joined:
    Mar 25, 2009
    Messages:
    248
    Likes Received:
    1
    Do you still need a gamecube controller? I have a few and would have no problems mailing you one. Just PM me your address.
     
  6. OzOnE

    OzOnE Site Supporter 2013

    Joined:
    Nov 10, 2011
    Messages:
    538
    Likes Received:
    173
    That's OK, don't worry too much. I'm just trying to get my head around the example code.

    I have an FT245 on this new board. (I don't really want to go back to Cypress again; their chips are quite over-complicated).

    I just have to figure out what sort of handshake protocol I need when streaming a file.
    I'm checking out the C++ Builder and MS Visual C++ examples atm.

    btw, I've made some progress with the GC Verilog code - the obvious mistake is that many of the VHDL-to-Verilog translators keep adding in the regs and initializing them to zero!

    So, with the GC "readcommands" process, it wasn't incrementing the index because it was being reset on every clock.
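    i.e. the index should only be cleared when a command actually (re)starts, not given a default value at the top of the process. Something like this (made-up names, just the gist of it)...

    module readcommands_sketch (
        input  wire       gc_strobe,   // strobe from the GC that clocks each byte in
        input  wire       cmd_start,   // high while a new command transfer is starting
        input  wire [7:0] gc_data,
        output reg  [3:0] cmd_index
    );
        reg [7:0] cmd_buf [0:15];      // latched command bytes

        always @(posedge gc_strobe) begin
            if (cmd_start)
                cmd_index <= 4'd0;               // clear ONLY at the start of a command...
            else begin
                cmd_buf[cmd_index] <= gc_data;   // ...otherwise latch and keep counting
                cmd_index          <= cmd_index + 4'd1;
            end
        end
    endmodule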

    I'm starting to learn more about "always" blocks in Verilog.
    The sensitivity list can either be all edge-triggered (only ONE edge per signal though, e.g. a clock plus a reset), or all level-sensitive (any change, and you then check the levels you want inside the block). Something like that anyway. lol

    Also, the main "synchronous" process was changed to async when translated to Verilog!

    So, it's latching in GC commands now (Verilog). I will try adding the SD Card driver now.



    OzOnE.
    P.S. I know I keep apologizing for mentioning the GC in this thread, but I've realized there will be a lot of crossover with these consoles if we are going to support all of them on a single FPGA board.
    @mods - Please let me know if each console needs to be split into separate threads?
     
  7. OzOnE

    OzOnE Site Supporter 2013

    Joined:
    Nov 10, 2011
    Messages:
    538
    Likes Received:
    173
    @n64coder - Thanks for the kind offer, but I just bought a cheap clone controller about an hour ago. I almost forgot about that actually.

    @runkthepunk - Thanks again, but I thought I'd grab a controller anyway. I was just being a bit cheap really. :redface-new:

    I only needed the one controller tbh. I should be able to get past the Language screen and try spoofing the Disk ID so I can at least see the game name on screen.
    (The code needs a lot of work first, but at least I can make sense of it now. VHDL == :nightmare:).

    OzOnE.
     
  8. _SD_

    _SD_ Resolute Member

    Joined:
    Oct 11, 2008
    Messages:
    947
    Likes Received:
    1
    Regarding the possible multi-console functionality: any chance of PC Engine support at some point down the line? And would it emulate the entire Super CD-ROM unit, or just the actual CD drive itself? Basically, do I go for a Core or a Duo?

    Anyway, fantastic work as always.
     
  9. OzOnE

    OzOnE Site Supporter 2013

    Joined:
    Nov 10, 2011
    Messages:
    538
    Likes Received:
    173
    Hmm, you've had me wondering about this...

    After your post, I've been searching for info about the PC Engine. What we basically need are the expansion port pinouts and memory map for the CD interface.

    I think I only saw a TG16 once or twice back in the early nineties (quite rare in the UK, they might have had a few in Tandy's in the late eighties?). It looked pretty advanced for the time.

    As I understand it, the PCE / TG16 uses a specific "CD-ROM System" HuCard when reading CDs, is this correct? Does the card just stay in the slot the whole time the CD-ROM is used?

    Almost any console's storage can be emulated via an FPGA if you have enough info on the system / enough spare pins / programming skill (I wish I had a tad more of that). :dejection:

    The 16-bit cart based systems like Megadrive and SNES look easy to implement, but of course with systems like NeoGeo and arcade boards, there are a huge number of pins and possible security devices.

    Ooh, just found some stuff while typing this...

    http://www.gamesx.com/misctech/pcebp.php

    (scroll down about three-quarters to section 8 for the CD-ROM info)...
    http://cgfm2.emuviews.com/txt/pcetech.txt

    It looks a bit more involved than I first imagined because of the ADPCM streaming stuff. (Interestingly, it uses similar SCSI / ATAPI type commands to the DC GD Drive).

    tbh, I'm not likely to be buying a TG16 any time soon - they fetch a fair amount on the auction sites, and I personally don't think I'd play the games too much.

    Without having the physical machine, it's near impossible to debug. Maybe @cybdyn will be up for the challenge in the future though? :smile-new: :onthego:

    OzOnE.
     
  10. veganx

    veganx Dauntless Member

    Joined:
    Jan 8, 2011
    Messages:
    743
    Likes Received:
    2
    Since we are throwing some questions around, I would like to ask out of curiosity: what about those security measures in the PS2, just to name an example?

    Is it easier to develop a CD-ROM replacement for a system without any security, like the 3DO or the Mega CD?
    Do all security measures need to be bypassed via some kind of exploit in the FPGA code?
     
  11. Calpis

    Calpis Champion of the Forum

    Joined:
    Mar 13, 2004
    Messages:
    5,906
    Likes Received:
    21
    Are you thinking about it in terms of hardware? Synchronous blocks resolve to combinatorial logic followed by an edge-triggered flip-flop; asynchronous blocks resolve to combinatorial logic followed by transparent latches (which may get optimized away if you don't have any latching action).

    With asynchronous logic *there are no edges* (unless you manually implement triggered FFs), you're really evaluating signal levels. Edges may only be evaluated via the sensitivity list with pos/negedge. Any code within such a block is going to be asynchronous boolean functions (evaluated as sum-of-products in a PLD, or LUTs in an FPGA, or whatever).

    For a good design each module should have separate synchronous and asynchronous blocks, and you should almost never use blocking assignment. Blocking assignment is only good for breaking a really long logic expression into multiple lines. If you don't use it properly, though, a latch will be synthesized and wreak havoc. These consoles are new enough to probably be fully synchronous, so you most likely don't need an asynchronous block at all. Also, when you're writing asynchronous blocks, forget adding signals to your sensitivity list, just use an asterisk (a Verilog-2001 feature, widely implemented).

    always @*
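    Something like this, side by side (hypothetical names, just to show the two styles)...

    module two_styles (
        input  wire       clk,
        input  wire [7:0] a, b,
        output reg  [7:0] sum_reg
    );
        reg [7:0] sum_comb;

        // Combinational: level-sensitive, @* sensitivity, blocking assignment.
        always @* begin
            sum_comb = a + b;
        end

        // Synchronous: edge-triggered, non-blocking assignment -> a flip-flop.
        always @(posedge clk) begin
            sum_reg <= sum_comb;
        end
    endmodule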

    I'm not familiar with the signal names used here, but I can tell that the sensitivity list is really F-ed up; almost certainly either the GC provides a clock signal, or, if it doesn't, the strobe is meant to clock the logic. Don't run the logic entirely in the FPGA's clock domain (it looks that way, maybe I'm wrong) or you'll run into really bad metastability issues that your logic must account for.

    In other words, the GC should clock commands into the FPGA on its own; the FPGA shouldn't be sampling the data. It should then be self-clocked into a FIFO. The FIFO should have some kind of clock-domain-crossing technique implemented. The FPGA/MCU should grab data from the FIFO and do stuff with it in its own domain. There should be another symmetrical FIFO back to the GC for response data that the GC can pull via its own domain as well, since just shoving data onto the bus can be problematic if a bit clock/strobe is also used as an output enable, as is the case with many an asynchronous bus (!).
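    A minimal sketch of the capture side of that idea (hypothetical names; a proper dual-clock FIFO with gray-coded pointers is the more robust version of the same thing, this only handles one word in flight)...

    module gc_capture_cdc (
        input  wire        gc_strobe,   // the console's own strobe clocks the data in
        input  wire [15:0] gc_data,
        input  wire        clk,         // FPGA system clock
        output reg  [15:0] word_out,
        output reg         word_valid   // one-clock pulse in the clk domain
    );
        reg        toggle = 1'b0;
        reg [15:0] capture;
        reg [2:0]  sync   = 3'b000;

        // Write side: clocked by the strobe, no sampling in the FPGA domain.
        always @(posedge gc_strobe) begin
            capture <= gc_data;
            toggle  <= ~toggle;                 // flips once per captured word
        end

        // Read side: two-FF synchroniser on the toggle, then a change detect.
        always @(posedge clk) begin
            sync       <= {sync[1:0], toggle};
            word_valid <= sync[2] ^ sync[1];    // toggle moved -> a new word arrived
            if (sync[2] ^ sync[1])
                word_out <= capture;            // capture has settled by the time this fires
        end
    endmodule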
     
    Last edited: Jul 3, 2012
  12. cybdyn

    cybdyn Embedded developer (MCU & FPGA)

    Joined:
    Jan 12, 2012
    Messages:
    551
    Likes Received:
    4
    Ozone: "I have an FT245 on this new board... I just have to figure out what sort of handshake protocol I need when streaming a file."

    I made my own protocol; it seems you can make your own however you wish. I can give you my example if you need it?


    About the GC, I plan to understand the general points of the original code and rewrite it.
     
  13. OzOnE

    OzOnE Site Supporter 2013

    Joined:
    Nov 10, 2011
    Messages:
    538
    Likes Received:
    173
    Hi,

    Sorry for not replying for a while - I've been very busy building a laser projector (among a lot of other things)...

    I also received the new FPGA board last week and soldered the SRAM...

    I've spent about 12 hours a day for the past 4 or 5 days to get it working, but sadly it has beaten me once again. :sorrow:

    It loads from SD into SRAM OK, and appears to load from SRAM to DC, then it just stops after that and gets stuck on a SET_MODE command?

    I still don't have an easy way of confirming that the data is correct 100%. I've had a scan through and it looks to be transferring OK?

    I can't understand it, and frankly I'm a bit p*ssed off with it now. I tried comparing the new code to the old CF code again, but the SD version just won't load to the Sega Logo?

    The only time it EVER seemed to work was when doing DMA directly from CF Card. (It's a lot of hassle to connect up the old board now too.)

    I can post the current crappy code if anyone wants to have a look?

    @calpis - Thanks for the info, it's been a lot of help with just trying to understand these always blocks etc. (I've abandoned the GC for now btw; yes that code looks evil).

    I tried many different ways of using the async /READ signal from the DC (and GC) to directly clock-in the Words, but it wasn't reliable (kept glitching)??

    I had to go back to clocking the async signals through two FF's, then checking for rising or falling edges. This is the only thing which seems to work.
    The only downside is the small delay from the falling edge of /READ to when the data is updated, but it looks to be in plenty of time before the rising edge latches the data.
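    For reference, this is roughly the pattern that works for me (simplified names, not the exact code)...

    module rd_edge_detect (
        input  wire clk,          // FPGA system clock
        input  wire gd_rd_n,      // async /READ strobe from the DC
        output wire gd_rd_rising,
        output wire gd_rd_falling
    );
        reg [2:0] rd_sync = 3'b111;

        // Clock the async strobe through FFs, then compare the last two taps.
        always @(posedge clk)
            rd_sync <= {rd_sync[1:0], gd_rd_n};

        assign gd_rd_rising  =  rd_sync[1] & ~rd_sync[2];   // 0 -> 1 on /READ
        assign gd_rd_falling = ~rd_sync[1] &  rd_sync[2];   // 1 -> 0 on /READ
    endmodule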

    So, with that in mind, I've been looking into modifying the BIOS instead. I thought it would be near impossible before because I couldn't find a decent debugger.
    Now that I've realized the MESS debugger does a great job, I've managed to work out quite a few of the BIOS GD access routines.

    Another idea is just to write a driver for Dreamshell. In theory, it should be able to take full direct control of a hard drive on the G1 bus.

    This will need a FLASH BIOS mod of course, so I've just asked @Bad_Ad84 for some chips.

    I'll also have a play with making my own low-level assembly routines for HDD access as a patch for the original BIOS.
    I've never done any programming on the DC, but I can understand the assembly code quite well.

    It's very long-winded working through the code, 'cos you have to step through with the debugger to work stuff out.

    Dreamshell is starting to look more promising. A lot of the core stuff is already in place, it "just" needs to be modded for HDD.
    I think maybe I should get in touch with Mr SWAT man? Also might be time for me to get familiar with LUA scripts.

    If you look at it this way - if this can be done in Dreamshell and burnt into a FLASH BIOS, all you would need is a ~£2 FLASH chip and a cheap adapter PCB from the HDD to GD socket.

    Sorry that I couldn't get the FPGA board working. It's firmly attached to the DC now though, so I'll carry on messing with it.
    (Unfortunately, the new board doesn't have a USB connection or its own SD Card socket, so it's harder to add things to it.)

    OzOnE.
    P.S. I need a way of displaying sectors on-screen under Dreamshell - if anyone is handy with LUA, please let me know.
     
  14. splith

    splith Resolute Member

    Joined:
    May 2, 2010
    Messages:
    997
    Likes Received:
    4
    I've got two of the DC bios chips sitting on my desk right now... Still haven't actually put any into a DC! You can have one if you want?
     
  15. cybdyn

    cybdyn Embedded developer (MCU & FPGA)

    Joined:
    Jan 12, 2012
    Messages:
    551
    Likes Received:
    4
    OzOnE, yes, DS modding is another way )) that's what I was talking about in another thread; now I see even you have arrived at this point ))


    http://www.assemblergames.com/forums/showthread.php?27608-Replace-GD-ROM-with-Flash-Card/page11

    ...emulation of original GD-ROM is tough enough.

    But let me talk about another way. Nowadays we have the SD mod, which uses a trick with a new BIOS (DreamShell) and basically works on new syscalls that replace the ones in the original BIOS...

    http://www.assemblergames.com/forums/showthread.php?34646-Dreamcast-GDRom-Diver-emulator!/page3

    ...the SD card mod works well. In the simplest case, we can plug an IDE device into G1 (or G2) and use a method similar to the SD mod that replaces the BIOS syscalls, so it doesn't need much knowledge about raw GDI and so on...



    But anyway, I'll try to replace the GD-ROM with an FPGA. A multi-BIOS mode can also help if I face the same problem, so I can get a BIOS mod and GD-ROM converter in one device without a flash mod and so on.
     
  16. OzOnE

    OzOnE Site Supporter 2013

    Joined:
    Nov 10, 2011
    Messages:
    538
    Likes Received:
    173
    Hi,

    Mammoth post incoming!!...

    @splith - Yep, how much did you want for them? Are you in the UK?
    The thing is, I only really needed the one chip and I'm waiting for Bad_Ad84 to get back to me about payment.

    @cybdyn - Yes, you said that BIOS modding would be a good idea, and I'm starting to realize it might be easier and cheaper...

    Especially now that Dreamshell already has functions for accessing a hard drive (PIO only atm).
    I'm assuming this is the "Navi" type IDE interface which connects to the G2 bus?

    I've been looking at some of the LUA scripts, and it seems simple enough. The problem is I'm not sure how to compile a new module to add the extra functions?

    But, does DS not work with GDI files? Did they explain exactly why this is? Is it a BIOS / security issue?

    I could probably write my own HDD access routines in DC assembly. Ideally, we want to take advantage of DMA via the G1 port.

    @cybdyn (and anyone) - Below is the latest code. Again, it's NOT working atm, but I think it's still a data corruption issue.
    If you can feed it the CORRECT data (first 7 sectors of your track3.iso), I'm hoping it will show the Sega logo, then request the next block of sectors like it used to.

    btw, your track3.bin MUST be converted to ISO unless you're handling the larger sectors - The DC only seems to request 2048-Byte (Mode 1) sectors from the GD area!

    I split the code into two parts in an attempt to simplify things... The "gd_emu" part handles the low-level stuff from the DC and then just requests a block of GD sectors from the "control" block.
    The control block handles the SD Card reading (or whatever source you want), then handles the DMA transfer to the DC...

    "gd_emu"...
    http://pastebin.com/6DbUNXUh

    "control"...
    http://pastebin.com/yw91xrnh
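    The handshake between the two is basically just a request/done pair, something like this (a simplified sketch, NOT the actual code in the pastebins above)...

    module gd_emu_request_sketch (
        input  wire        clk,
        input  wire        start_read,    // decoded CD_READ from the DC
        input  wire [31:0] read_fad,      // first FAD the DC asked for
        input  wire [15:0] read_count,    // number of sectors requested
        input  wire        sector_done,   // from "control": block is in SRAM
        output reg         sector_req,    // to "control": please fetch a block
        output reg  [31:0] req_fad,
        output reg  [15:0] req_count
    );
        always @(posedge clk) begin
            if (start_read) begin
                req_fad    <= read_fad;
                req_count  <= read_count;
                sector_req <= 1'b1;        // raise the request to the "control" block
            end else if (sector_done) begin
                sector_req <= 1'b0;        // "control" has filled SRAM; start the DMA to the DC
            end
        end
    endmodule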


    There are a few other things to keep in mind about the code. A lot of stuff is manually spoofed, or unfinished...

    I don't know if the "gd_emu" block is going to work exactly like the old CF Card version (which definitely DID start to boot to the logo),
    but I did a compare with the old code using Notepad++ to make the above copy as similar as possible (same status bits and responses etc.).

    There is some confusion with the SET_MODE command and how the status / IO bits are supposed to be set when receiving data FROM the DC?
    The Sega SPI manual has some mistakes concerning this. Although, it seemed to work OK before by just doing "gds_pio_send_data". This is the way the nullDC source did it.

    There are a few comments in the code which say "spy log" where I'm setting bits based on what I saw when a REAL GD was booting.
    I don't know if this is the correct thing to do? The same thing goes for "gd_state" 17 - The DC only wanted to boot if it saw a Busy bit ONCE after a SET_MODE command?

    I'm not using the Audio streaming junk atm, I was just messing with the DAC on my previous board.
    I'm not using SDRAM atm (even though this board has 8MB) - the new 512KB SRAM chip appears to be working fine (NOTE: My SRAM is 256K x 16, so in reality, address bit A18 is unused).

    It's not currently using the PLL for the main system clock, I'm just using the "CLK50M" signal directly from my board's crystal clock.
    It seems to be unreliable when running at 100MHz via the PLL (no surprise really with this chip), I was only using 100MHz to try to speed up SDRAM access.

    I strongly believe that the data from my SD Card is being corrupted (one or two words), or that it's missing the "/gd_rd_n" falling edges and not counting properly?
    Having said that, the counting seems reliable and finishes at the correct point?

    If anyone could please tell me how to directly use the async "/gd_rd_n" signal to clock data from the SRAM, that would be great. (without clocking "/gd_rd_n" through FF's first.)

    I realized last night that I can actually grab quite a bit of data using Signaltap, paste it into a text file, and keep repeating the process until I had a large chunk of the SD Card data.
    I then compared this to the original track3.iso (Taxi, PAL) and found that there were a couple of Words which were being skipped!
    So, even if the counting is working properly, it looks like the SD driver is missing a couple of Words at random, and shifting a whole chunk of data. :moody:

    It's difficult to tell what's happening though - you should have much better luck if you're streaming via USB!
    I haven't included the SD Card driver code but I can paste it if anyone wants?

    I added the "gd_dma_state" stuff near the bottom of the gd_emu block to try to make it similar to when the CF Card version was working...

    What the CF version did was to kick off the DMA state machine and let it run by itself (send the LBA stuff to CF Card then let the DC control the DMA transfer). BUT, the main "gd_state" was allowed to go directly back to 1 (idle)!

    The only way it seemed to get to the Sega logo is by letting "gd_state" go back to idle while the DMA was being set up?
    I don't know if this is because specific flags were being set which kept the DC happy, or if the DC needed to write some ATA commands during DMA??

    (If the DC wants to read a basic ATA register at any time, it shouldn't be a problem because they are directly mapped to the IDE bus.)

    I'm manually generating both TOCs and the "REQ_MODE" stuff to make it specific to the GDI I'm using.
    In theory, it should be able to write to the MODE registers, but I've commented that out for now.

    Erm, what else? Oh yeah, I'm not handling the CD Audio type commands properly yet. When the CF version booted (or the original GD disk), it ignored all that stuff anyway.
    The DC generally just does this...

    Sends the "SET_FEATURES" commands to set up the PIO and DMA modes,
    Does a lot of "SPI_TEST_UNIT" and "SPI_REQ_ERROR" commands until it finds the correct state(s) / results (eg. Lid closed, Disk present etc.),
    Does the 0x70 / 0x71 security requests,
    Reads both TOCs (Single and Double Density) one after the other,
    !! Starts reading IP.BIN from FAD 45150 via DMA !!

    So, it doesn't seem to bother with the CD Audio commands at all if it finds a standard 3-track game TOC (with other games, this will need fixing!).
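    In code terms, the dispatch boils down to a case on the first packet byte, roughly like this (the command values are from memory / the nullDC source, so double-check them against the SPI docs; the state numbers are made up)...

    module spi_dispatch_sketch (
        input  wire       clk,
        input  wire       packet_received,   // full 12-byte packet latched
        input  wire [7:0] packet_cmd,        // first byte of the packet
        output reg  [4:0] gd_state
    );
        // SPI command values as I remember them - please verify!
        localparam SPI_TEST_UNIT = 8'h00, SPI_SET_MODE  = 8'h12,
                   SPI_REQ_ERROR = 8'h13, SPI_GET_TOC   = 8'h14,
                   SPI_CD_READ   = 8'h30;

        // made-up state numbers, just to show the shape of the dispatch
        localparam ST_SEND_STATUS = 5'd1, ST_SEND_ERROR = 5'd2,
                   ST_SEND_TOC    = 5'd3, ST_RECV_MODE  = 5'd4,
                   ST_START_DMA   = 5'd5;

        always @(posedge clk) begin
            if (packet_received) begin
                case (packet_cmd)
                    SPI_TEST_UNIT: gd_state <= ST_SEND_STATUS;
                    SPI_REQ_ERROR: gd_state <= ST_SEND_ERROR;
                    SPI_GET_TOC:   gd_state <= ST_SEND_TOC;    // both TOCs get read back to back
                    SPI_SET_MODE:  gd_state <= ST_RECV_MODE;   // data flows FROM the DC here
                    SPI_CD_READ:   gd_state <= ST_START_DMA;   // then kick off the DMA from FAD 45150
                    default:       gd_state <= ST_SEND_STATUS; // spoof everything else for now
                endcase
            end
        end
    endmodule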

    Anyway, you can see what a headache this stuff is. :nightmare:

    OzOnE.
    P.S. Would it be very naughty to post parts of a DC BIOS R/E?
     
  17. Calpis

    Calpis Champion of the Forum

    Joined:
    Mar 13, 2004
    Messages:
    5,906
    Likes Received:
    21
    Are you sure there isn't a clock to synchronize with?

    Are you sure this is a dedicated (pre-decoded) read strobe specifically for the GDROM and not for any other peripherals on the bus?


    always @(posedge gd_rd_n)   // data is stable at the rising edge, most likely it isn't at the falling
    begin
        buffer[address] <= data;
        address <= address + 1;
    end

    Edit: wait, do you mean asynchronously writing this data to the SRAM? You can't. You need a fast bus arbiter in order to get dual-port access.
     
    Last edited: Jul 15, 2012
  18. OzOnE

    OzOnE Site Supporter 2013

    Joined:
    Nov 10, 2011
    Messages:
    538
    Likes Received:
    173
    The gd_rd_n signal is just the /READ strobe from the DC. Often called /DIOR on a regular IDE / ATA / ATAPI drive.

    There is no separate clock, this is an async strobe from the DC. On the falling edge, you prepare your data and put it onto the bus, on the rising edge, the DC should latch the data.

    The timing should be the same as for Multiword DMA Mode 2 (the mode the DC uses for sector transfers); see Table 6-4 / Figure 6-2...

    http://support.mdl.ru/PC_compl/firma/Quantum/products/manuals/bigfoot_at/chp6-4-2.html

    There is some conflicting info though. Many diagrams (including in the ATA/ATAPI official specs) show the /DIOR and /DIOW signals inverted?...

    http://www.docstoc.com/docs/4213798...agram-Proposal-Pete-McLean-Maxtor-Corporation

    This is definitely wrong because the /CS signals are going low (asserted) and /DMACK is definitely supposed to go low as well (in the diagram).

    From what I've seen on a REAL drive (GD or otherwise), the data is latched on the RISING edge of the /READ strobe.

    Do you mean that it might be best to increment on the rising edges anyway, so the next Word is prepared way in advance?
    I've tried changing this around many times, but I suppose it will actually give better results.

    Ideally, I just want the SRAM / DMA address to increment instantly on the falling edge of the /READ strobe.
    This is to ensure that the data is definitely stable before the DC latches it.

    OK, I've just changed it to a rising edge (I haven't tried the always block yet - I tried it the other day and it didn't work).
    The output is a lot cleaner because the data is prepared much earlier. You can see the dotted line (address inc) is quite a way after the rising edge due to clocking "gd_rd_n" through FF's...

    [Image: waveform capture showing the address increment landing a little after the rising edge of "gd_rd_n"]

    It's still not loading though, I think my SD driver is messed up?

    The problem with the always block is that I need a way of resetting the address to zero as well.
    If I just add a separate always block outside of the main block for incrementing, I get the dreaded "multiple driver" errors when compiling due to trying to reset the same reg to zero in the main block.

    So instead, I tried resetting like this...
    ("rising" and "falling" and clocked through FF's first. "cont_dma_rq" always goes high just before a DMA transfer)...

    always @(posedge cont_dma_rq_rising or posedge gd_rd_falling)
    begin
        if (cont_dma_rq_rising) begin
            // DMA_ADDR <= 19'b1111111111111111111; // Intentional wrap-around, so DMA_ADDR starts at zero on first falling edge of "gd_rd_n".
            dc_bytecount <= 32'hFFFFFFFE;           // Intentional wrap-around, so "dc_bytecount" starts at zero on first falling edge of "gd_rd_n".
        end
        else if (gd_rd_falling)
        begin
            // DMA_ADDR <= DMA_ADDR + 1;            // negedge of "gd_rd_n".
            dc_bytecount <= dc_bytecount + 2;       // Remember, a WORD is transferred from SRAM !!!
        end
    end

    I've just noticed that I was using "gd_rd_falling" when it should just be "gd_rd_n"!

    btw, DMA_ADDR is simply the SRAM address where the sectors from the SD Card have been stored.
    I created DMA_ADDR as well as SRAM_ADDR_REG so I could manually change SRAM_ADDR_REG while doing the SD loading stuff, then let DMA_ADDR auto increment during DMA.

    DMA_ADDR is simply assigned to the SRAM address pins while "cont_dma_rq" is high...

    output wire [18:0] SRAM_ADDR = (cont_dma_rq) ? DMA_ADDR : SRAM_ADDR_REG;


    I know this is a confusing way of doing things, but I'm trying everything to get it working more reliably.
    Once I'm confident the DMA code is working well, that points to data corruption being the more likely culprit.

    It would be great if it would just increment DMA_ADDR reliably on either edge, but I've yet to get it working?

    Thanks,
    OzOnE.
     
  19. angelwolf71885

    angelwolf71885 Dauntless Member

    Joined:
    Jun 5, 2010
    Messages:
    795
    Likes Received:
    6
    I think the issue could be with using an SD card instead of a CF card.
    From what I understand, the SD card only communicates on the USB bus,
    so the reads of the blocks have to be converted at some point,
    and a CF card is IDE all the way, so there's no conversion necessary.
     
  20. OzOnE

    OzOnE Site Supporter 2013

    Joined:
    Nov 10, 2011
    Messages:
    538
    Likes Received:
    173
    Hi,

    SD Card should be fine assuming the controller / driver is working correctly. The best thing of all is that you only need about six pins and a pull-up resistor to hook it up!
    btw, SD can work with a four-wire protocol, or via SPI (what I'm using) - not to be confused with Sega Packet Interface, although it would be nice if it was. lol

    There are some extra CRC bytes added to the end of each SD block, but the driver I'm using is "supposed" to discard them.
    I'll have to delve into the driver code a bit more, but it's seriously over-engineered.

    I agree, I do like CF cards, but they need quite a few commands to set them up.
    You'd still need an FPGA or something similar to process the custom Sega commands before grabbing the data from CF. This is what I was doing originally.

    Also, in IDE mode, a CF will only transfer up to 256 sectors at once AFAIK.
    I should have tried using a CF in Memory mode - I think this is what Marshallh started using while developing the 64drive?

    I'm tempted to try a CF card again but I'm running out of pins! It was the only time the DC booted to the license screen - the DMA transfer handled itself too.

    @calpis - is it normal to have so many always blocks in the code? This is a nightmare for debugging...

    http://www.freefilehosting.net/sdcardtest


    OK, I just tried this...


    always @(posedge gd_rd_n or posedge cont_dma_rq_rising)
    begin
        if (cont_dma_rq_rising) begin             // Only pulses for ONE clock on rising edge!
            DMA_ADDR     <= 19'd0;                // Zero the SRAM addr.
            dc_bytecount <= 32'd0;                // Zero the bytecount.
        end
        else if (cont_dma_rq && gd_rd_n)          // Only increment when "cont_dma_rq" is high (not rising)!
        begin
            DMA_ADDR     <= DMA_ADDR + 1;
            dc_bytecount <= dc_bytecount + 2;     // Remember, a WORD is transferred from SRAM !!!
        end
    end


    It's finally incrementing by itself now, but the above image was actually cleaner.
    I'm 99% sure the DC latches the data on the RISING edge of "gd_rd_n", I could be wrong.

    Does anyone have a simple SD Card code in Verilog?
    Ideally I just need to issue a Multiple Block Read command, read the bytes I want (at my leisure, or as fast as it wants), then issue a STOP MBR command.
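    The command framing itself is simple enough - it's the init, response and data-token handling that gets messy. Just the 6-byte frame part, as I understand it (dummy CRC, since the CRC is only enforced on CMD0/CMD8 in SPI mode)...

    module sd_cmd_frame (
        input  wire [5:0]  cmd_index,   // 18 = READ_MULTIPLE_BLOCK, 12 = STOP_TRANSMISSION
        input  wire [31:0] cmd_arg,     // block (or byte) address for CMD18, ignored for CMD12
        output wire [47:0] frame        // shift this out MSB-first on MOSI with CS held low
    );
        // {start bit 0, transmission bit 1, index, argument, dummy CRC7, end bit 1}
        assign frame = {2'b01, cmd_index, cmd_arg, 7'h7F, 1'b1};
    endmodule

    After the frame goes out, you wait for the R1 response byte, then each block arrives as a 0xFE data token followed by 512 bytes and a CRC16, until you send CMD12 to stop.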


    OzOnE.
     
    Last edited: Jul 15, 2012