Topic: Downloading

Posted under General

Is there a quicker way of saving every image I have in my favorites to my HDD, without having to open each one and click save?

Updated by RriikTheDeinon

Fox2K9 said:
Is there a quicker way of saving every image I have in my favorites to my HDD, without having to open each one and click save?

I think Firefox has an add-on for that.

Updated by anonymous

You could try FlashGot? That might work...
I use it combined with Orbit Downloader to download off of fchan and paws.ru, and it works well.
I don't know how well it works on here though; I haven't tried it yet...

Updated by anonymous

Success!

The Firefox add-on DownThemAll works awesomely!

Updated by anonymous

Strike that, it saves them as tiny pics. Lemme tinker with it.

Updated by anonymous

I think the problem is unsolvable; it's downloading the thumbnails :/

Updated by anonymous

The Save Images add-on might work. Tell it to open all links on one page into a single tab (filtered so it only returns image pages), then save all images from that new tab (again, filtered for just the pictures).

It'll still take a few operations to do, so its benefit is questionable.

Updated by anonymous

bella said:
I think the problem is unsolvable; it's downloading the thumbnails :/

DTA is a great tool when you have access to a page with multiple links straight to images, like most chans. For it or most other tools like it to be useful on boorus, however, the index page would need to offer a direct link to the images.

I don't know if it's a common enough problem to justify a tool that acts like a search robot, automatically running through the hyperlink structure of a site to download every image that sits within X links of a starting point and is stored at a matching URL pattern. That's also assuming such a tool would be willing to ignore any robots.txt that forbids crawling the site in that fashion.

Solvable problem: yes. Common enough to see a tool besides DTA: not certain. Maybe it could be suggested to the DTA team, though.
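
If anyone does go down that road, Python's standard library can at least answer the robots.txt question before a single image gets fetched. A minimal sketch (the preview URL below is hypothetical, just to show the call):

```python
# Check robots.txt before letting any crawler-style tool loose on the site.
# urllib.robotparser ships with the Python standard library.
from urllib import robotparser

rp = robotparser.RobotFileParser()
rp.set_url("http://e621.net/robots.txt")
rp.read()  # fetch and parse the site's robots.txt

# Hypothetical URL, purely for illustration.
url = "http://e621.net/data/preview/12/34/0123456789abcdef0123456789abcdef.jpg"
print("allowed" if rp.can_fetch("*", url) else "forbidden by robots.txt")
```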

Updated by anonymous

Please don't do this. Using tools like this HAMMERS the site and makes it slower and such for all the other users.

Updated by anonymous

Arcturus said:
Please don't do this. Using tools like this HAMMERS the site and makes it slower and such for all the other users.

So the only way is one by one?

Updated by anonymous

Why do you need to download it at all? If the site were ever to go down permanently, we'd put up a torrent or something.

And if not, well... it's all HERE.

Updated by anonymous

Arcturus said:
Why do you need to download it at all? If the site were ever to go down permanently, we'd put up a torrent or something.

And if not, well... it's all HERE.

I just want to have it in case my net breaks or something (I pray to God it doesn't happen).

Updated by anonymous

Fox2K9 said:
I just want to have it in case my net breaks or something (I pray to God it doesn't happen).

Well, guess you'll have to do it like the rest of us, one image at a time.

@Arcturus: Sorry, I geeked out on whether it COULD be done, and forgot about whether it SHOULD be done. Please don't hate me! :'(

Updated by anonymous

Arcturus said:
Why do you need to download it at all? If the site were ever to go down permanently, we'd put up a torrent or something.
And if not, well... it's all HERE.

No disrespect, but some users here (including me) are collectors. =^_^=

Just work on downloading one or two pages a day or something (using that "one-by-one" method...).
Depending on how many pages you have, it could take a couple of weeks, or a month or two...

Updated by anonymous

Kald

Former Staff

Thumbnail file names are formatted as follows:
http://main.e621.inthesewer.net/data/preview/XX/YY/[MD5].jpg

Corresponding picture file names are formatted as follows:
http://e621.net/data/XX/YY/[MD5].zzz

Where XX and YY vary for each pic.

Assuming you can get a list of thumbnails, you can build a list of picture file names, but you will have to do one pass per file extension type (zzz -> jpg, gif, png).

I guess mass-downloading tools can import lists of URLs to download.
The only hard part would be to extract the thumbnail URLs from the pages' source code.
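
In Python, the rewriting pass might look something like this. A minimal sketch: thumbs.txt is a hypothetical file with one thumbnail URL per line, and the XX/YY/MD5 shapes (two hex characters each, 32 hex characters) are assumed from the examples above:

```python
# Rewrite thumbnail URLs into candidate full-image URLs, one candidate per
# extension, since the real extension (zzz) isn't known from the thumbnail.
import re

# Assumes XX and YY are two-character hex directories and MD5 is 32 hex chars.
THUMB_RE = re.compile(
    r"^http://main\.e621\.inthesewer\.net/data/preview/"
    r"(?P<xx>[0-9a-f]{2})/(?P<yy>[0-9a-f]{2})/(?P<md5>[0-9a-f]{32})\.jpg$"
)
EXTENSIONS = ("jpg", "gif", "png")  # one pass per file extension type

def full_image_candidates(thumb_url):
    m = THUMB_RE.match(thumb_url)
    if not m:
        return []
    return [
        "http://e621.net/data/{xx}/{yy}/{md5}.{ext}".format(ext=ext, **m.groupdict())
        for ext in EXTENSIONS
    ]

with open("thumbs.txt") as f:       # hypothetical: one thumbnail URL per line
    for line in f:
        for candidate in full_image_candidates(line.strip()):
            print(candidate)        # feed this list to your downloader
```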

Updated by anonymous

Kald said:
The only hard part would be to extract the thumbnail URLs from the pages' source code.

And doing it in a way that won't cripple the functionality of the site. Those things can be a serious drain on bandwidth.
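
One way to keep the extraction gentle is to run it against a copy of the page you've already saved from the browser, so the parsing itself costs the site nothing. A rough sketch (favorites.html is hypothetical):

```python
# Pull thumbnail URLs out of a locally saved favorites page
# (File > Save Page As...), entirely offline.
import re

# Same assumed URL shape as the sketch above.
THUMB_RE = re.compile(
    r"http://main\.e621\.inthesewer\.net/data/preview/"
    r"[0-9a-f]{2}/[0-9a-f]{2}/[0-9a-f]{32}\.jpg"
)

with open("favorites.html", encoding="utf-8") as f:
    page_source = f.read()

for url in sorted(set(THUMB_RE.findall(page_source))):
    print(url)  # redirect to thumbs.txt for the rewriting step above
```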

Updated by anonymous

wget + scripting language of your choice

Maybe just wget with really well-designed parameters. Just choose the parameters carefully or you'll end up downloading the whole site and getting banned.

That is how I would do it if I needed to download all my favourites, but I don't see any point in that. It's much more convenient to have the pictures on this site than on my computer. Now I can let anybody use my computer without any chance that they'll run into weird furry porn (private browsing FTW), and I have access to my weird furry porn collection from any internet device. It's cloud computing, the current big thing in IT, man.
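
For the scripting half, here's a minimal Python sketch of that approach, assuming you already have a plain-text list of direct image URLs (urls.txt is hypothetical). Note the delay between requests, per the warnings above about hammering the site; with wget itself, something like wget -i urls.txt --wait=2 --random-wait covers similar ground.

```python
# A throttled, resumable bulk download over a pre-built URL list.
# urls.txt (hypothetical) holds one direct image URL per line.
import os
import time
import urllib.error
import urllib.request

DELAY_SECONDS = 2  # be polite: wait between requests

os.makedirs("favorites", exist_ok=True)

with open("urls.txt") as f:
    urls = [line.strip() for line in f if line.strip()]

for url in urls:
    filename = os.path.join("favorites", url.rsplit("/", 1)[-1])
    if os.path.exists(filename):
        continue  # already downloaded on a previous run
    try:
        urllib.request.urlretrieve(url, filename)
    except urllib.error.HTTPError:
        pass  # e.g. a wrong extension guess coming back as a 404
    time.sleep(DELAY_SECONDS)
```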

Updated by anonymous

IgnisWolf said:
http://xiaxtal.github.io/e621batch/index.html

This is a mass downloader; the only issue is it maxes out at 10 pages. Since I have 19 pages of favorites (about 1000+ images), I used this to download the first 10 pages. And since it seems I can't tell it which pages to do, I slowly went through and dragged and dropped the last 9 pages into my folder one at a time. In the future, when I fave stuff on here, I'll just transfer all the new stuff over ASAP. If this method is a bad idea, someone please inform me and I will cease.

Necrobumped

Updated by anonymous

IgnisWolf said:
http://xiaxtal.github.io/e621batch/index.html

This is a mass downloader; the only issue is it maxes out at 10 pages. Since I have 19 pages of favorites (about 1000+ images), I used this to download the first 10 pages. And since it seems I can't tell it which pages to do, I slowly went through and dragged and dropped the last 9 pages into my folder one at a time. In the future, when I fave stuff on here, I'll just transfer all the new stuff over ASAP. If this method is a bad idea, someone please inform me and I will cease.

Ignoring your pointless six-year necrobump, at 20 pages you could just sort it in reverse order.

Updated by anonymous

Just wondering... do people even see the timestamps of posts/threads at all when necroposting, or is it tunnel vision all the way down to the "reply" button? 'Cause you must've been digging a good ways to hit the six-year mark.

Updated by anonymous

I was going to drop a link to my downloader tool here, but I realize that not everyone is running Linux... T.T

Updated by anonymous

Faux-Pa said:
I was going to drop a link to my downloader tool here, but I realize that not everyone is running Linux... T.T

Let them run Linux under VirtualBox under Windows.

Updated by anonymous

Munkelzahn said:
Let them run Linux under VirtualBox under Windows.

I'm pretty sure a large share of users wouldn't want to go through the hassle of doing that... Hell, I'd install "Bash on Ubuntu on Windows" through the Programs and Features tool on Windows 10 before I did that.

Updated by anonymous

treos said:
Just wondering... do people even see the timestamps of posts/threads at all when necroposting, or is it tunnel vision all the way down to the "reply" button? 'Cause you must've been digging a good ways to hit the six-year mark.

Or they used the search bar.

Updated by anonymous

treos said:
Just wondering... do people even see the timestamps of posts/threads at all when necroposting, or is it tunnel vision all the way down to the "reply" button? 'Cause you must've been digging a good ways to hit the six-year mark.

It's better than creating a new post for something that already exists.

Updated by anonymous