Firefox website archiver?

Discussion in 'Computer Gaming Forum' started by ASSEMbler, Jul 4, 2008.

  1. ASSEMbler

    ASSEMbler Administrator Staff Member

    Joined:
    Mar 13, 2004
    Messages:
    19,394
    Likes Received:
    995
    Anyone know of a plugin that will let you archive an entire site,
    like you can on a Mac?

    That or a program that will spider and save an entire site.
     
  2. babu

    babu Mamihlapinatapai

    Joined:
    Apr 15, 2005
    Messages:
    2,945
    Likes Received:
    3
    Last edited: Jul 4, 2008
  3. babu

    babu Mamihlapinatapai

    Joined:
    Apr 15, 2005
    Messages:
    2,945
    Likes Received:
    3
  4. Aypok

    Aypok Spirited Member

    Joined:
    Jan 8, 2008
    Messages:
    189
    Likes Received:
    1
    "wget" is what I use. You'd do something like "wget -m http://example.org/" and it'd follow all links on the site, saving each page/image/whatever as it goes. It's quite powerful, but not everyone likes using the command line. There is/was a similar application called something like "httrack", which had a Mozilla front-end for controlling it.

    EDIT: It's a separate program, not a Firefox extension.
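
    Recursive mirroring like wget's works roughly like this: fetch a page, pull out the links that stay on the same host, queue them, and repeat. A minimal sketch of the link-extraction step in Python (illustrative code only, not part of wget or any extension mentioned here; the URLs are made up):

    ```python
    from html.parser import HTMLParser
    from urllib.parse import urljoin, urlparse

    class LinkExtractor(HTMLParser):
        """Collects href targets from <a> tags, keeping same-host links only."""
        def __init__(self, base_url):
            super().__init__()
            self.base_url = base_url
            self.links = set()

        def handle_starttag(self, tag, attrs):
            if tag != "a":
                return
            for name, value in attrs:
                if name == "href" and value:
                    absolute = urljoin(self.base_url, value)
                    # Stay within the starting host, like wget -m does by default.
                    if urlparse(absolute).netloc == urlparse(self.base_url).netloc:
                        self.links.add(absolute)

    def extract_links(html, base_url):
        """Return a sorted list of same-host links found in an HTML string."""
        parser = LinkExtractor(base_url)
        parser.feed(html)
        return sorted(parser.links)
    ```

    A real mirror tool also rewrites those links to point at the saved local copies, which is what wget's --convert-links flag handles.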
     
    Last edited: Jul 4, 2008
  5. babu

    babu Mamihlapinatapai

    Joined:
    Apr 15, 2005
    Messages:
    2,945
    Likes Received:
    3
    If we're just talking site rippers, I've used BlackWidow a few times with success.
     
  6. madhatter256

    madhatter256 Illustrious Member

    Joined:
    Mar 13, 2004
    Messages:
    6,578
    Likes Received:
    4
    Google: HTTrack Website Copier. It's free, I have it, and it works great. It lets you choose how deep you want it to go, and whether it stays within the domain or follows external links too (not really sure about that last part).

    It is a standalone program. Just type in the URL and you're set.
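
    The "how deep" setting is just a cap on a breadth-first crawl: pages linked from the start page are depth 1, pages linked from those are depth 2, and so on. A sketch of that idea in Python (hypothetical code, not HTTrack's actual implementation; `neighbors` is a stand-in for fetching a page and extracting its links):

    ```python
    from collections import deque

    def crawl_order(start, neighbors, max_depth):
        """Breadth-first visit order with a depth cap, like a mirror
        tool's 'depth' setting. neighbors(url) returns linked URLs."""
        seen = {start}
        queue = deque([(start, 0)])
        order = []
        while queue:
            url, depth = queue.popleft()
            order.append(url)
            if depth < max_depth:
                for nxt in neighbors(url):
                    if nxt not in seen:
                        seen.add(nxt)
                        queue.append((nxt, depth + 1))
        return order
    ```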
     
    Last edited: Jul 4, 2008
  7. mairsil

    mairsil Officer at Arms

    Joined:
    Apr 20, 2005
    Messages:
    3,425
    Likes Received:
    153
    I swear that I read the topic as "Firefox website achievement?"

    Anyway, depending on what you want to do with it, and if you already have the program, the full version of Adobe Acrobat can grab websites (X levels deep or the whole thing) and convert them to PDF form. It will give you a nice snapshot.
     
  8. MatthewCallis

    MatthewCallis Robust Member

    Joined:
    Nov 9, 2007
    Messages:
    205
    Likes Received:
    3
    ScrapBook is the way to go, for versatility and compatibility without proprietary compressed formats. Plus searching within all the saved pages is great.
     
  9. GaijinPunch

    GaijinPunch Lemon Party Organizer and Promoter

    Joined:
    Mar 13, 2004
    Messages:
    10,999
    Likes Received:
    75
    Will this get links which are embedded in an ActiveX object?

    EDIT: Cool, Linux Version.

    EDIT: Not cool... doesn't pick up links which are spit out by ActiveX. Damn.
     
    Last edited: Jul 7, 2008
  10. madhatter256

    madhatter256 Illustrious Member

    Joined:
    Mar 13, 2004
    Messages:
    6,578
    Likes Received:
    4

    That's probably for security reasons, with all the malicious ActiveX scripts out there. You don't want to pick up a bad one...
     