Anyone know of a plugin that will let you archive an entire site, like you can on a Mac? Either that, or a program that will spider and save an entire site.
After some searching I found this, which sounds like what you were asking for: https://addons.mozilla.org/en-US/firefox/addon/212 EDIT: Seems to only work on FF 1.x; I can't get it to install on FF 2.
"wget" is what I use. You'd do something like "wget -m http://example.org/" and it'd follow all links on the site, saving each page/image/whatever as it goes. It's quite powerful, but not everyone likes using the command line. There is/was a similar application called something like "httrack", which had a Mozilla front-end for controlling it. EDIT: It's a separate program, not a FireFerret extension.
Google: HTTrack Website Copier. It's free, I have it, and it works great. It lets you choose how deep you want it to go and whether to stay within the domain or follow links outside it (not really sure about that part). It is a standalone program. Just type in the URL and you're set.
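If you'd rather script it than click through the GUI, HTTrack also ships a command-line version. A rough example (URL, output folder, and depth are just placeholders):

httrack "http://example.org/" -O ./example-mirror "+*.example.org/*" -r3

That should copy the site into ./example-mirror, stay on that domain thanks to the "+*.example.org/*" filter, and stop three levels deep with -r3. Options can differ between versions, so run httrack --help to double-check.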
I swear that I read the topic as "Firefox website achievement?" Anyway, depending on what you want to do with it, and if you already have the program, the full version of Adobe Acrobat can grab websites (X levels deep or the whole thing) and convert them to PDF form. It will give you a nice snapshot.
ScrapBook is the way to go, for versatility and compatibility without proprietary compressed formats. Plus searching within all the saved pages is great.
Will this get links which are embedded in an ActiveX object? EDIT: Cool, Linux Version. EDIT: Not cool... doesn't pick up links which are spit out by ActiveX. Damn.
That's probably for security reasons. With all the corrupt ActiveX scripts out there, you don't want to get a bad one...