A very good server-side spider process (Rated 4 out of 5 stars)

Well, that's a... different way of going about it! Primarily I thought it would just be handy for Tumblr, but the concept goes far beyond that. Quite simply, the site does what spider/crawler programs do, then lets you download the result as one zip (per blog or gallery) within seconds! They ask that you not use it more than twenty times a day, as it's run at a loss on donations.
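For anyone curious what a crawl-then-zip service like this boils down to, here is a minimal sketch in Python. To be clear, this is not the site's actual code: the function name, the single-threaded fetch loop, the page limit, and the same-site check are all my own illustrative assumptions.

```python
import io
import urllib.parse
import zipfile

import requests
from bs4 import BeautifulSoup

def crawl_images_to_zip(start_url, max_pages=20):
    """Crawl pages under start_url, collect image URLs,
    and bundle the downloaded images into one in-memory zip.
    A toy stand-in for what a spider/zip service might do."""
    seen_pages, image_urls = set(), []
    queue = [start_url]
    while queue and len(seen_pages) < max_pages:
        url = queue.pop(0)
        if url in seen_pages:
            continue
        seen_pages.add(url)
        soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")
        # Collect every <img> on the page, resolved to an absolute URL.
        for img in soup.find_all("img", src=True):
            image_urls.append(urllib.parse.urljoin(url, img["src"]))
        # Follow links, but only within the one blog/gallery being archived.
        for a in soup.find_all("a", href=True):
            link = urllib.parse.urljoin(url, a["href"])
            if link.startswith(start_url):
                queue.append(link)
    # Write everything into a single zip, prefixing an index so
    # identically named files from different pages can't collide.
    buf = io.BytesIO()
    with zipfile.ZipFile(buf, "w", zipfile.ZIP_DEFLATED) as zf:
        for i, img_url in enumerate(image_urls):
            name = img_url.rsplit("/", 1)[-1] or "image"
            zf.writestr(f"{i:05d}_{name}", requests.get(img_url, timeout=10).content)
    return buf.getvalue()
```

A real service would add politeness delays, retries, and a hard size cap, which is presumably where the limits mentioned below come in.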
The add-on is not very useful to me, though. I always disable googoo scripts, so I have to load the rarchive page first, 'temporarily allow' the scripts it needs, refresh, confirm I'm over eighteen, and copy the URL in, which circumvents the need for this add-on anyway.
Then I found it produced invalid zips when it exceeded some limit while spidering Tumblr archives (understandable, as it sometimes found over four thousand images). The process does remove duplicate files (reblogs) automatically, but it doesn't tell you where it gave up.
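The site doesn't say how it removes reblog duplicates, but hashing each file's bytes is one plausible approach, sketched below; the function name and the (name, data) pairs are hypothetical.

```python
import hashlib

def dedupe_by_content(blobs):
    """Keep one copy of each distinct file body.
    blobs is a list of (name, data) pairs; reblogged images
    usually have identical bytes, so a content hash catches
    them even when the filenames differ."""
    seen, unique = set(), []
    for name, data in blobs:
        digest = hashlib.sha256(data).hexdigest()
        if digest not in seen:
            seen.add(digest)
            unique.append((name, data))
    return unique
```

A byte-for-byte hash only catches exact copies; a re-encoded or resized reblog would slip through, which may be why the counts don't always add up.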
Obviously I need some practice, and a workaround to keep the scripts enabled and ready. I may even give up on the add-on, but I would still like to use the process.