mcsib said:
Thanks for understanding the situation. There are a ton of bugs that have popped up recently, new feature requests I need to look through, and those "wonderful" dependency security bugs (have to love them) that keep appearing. I can't give time estimates for when I can work on the project, but I will be able to soonish.

As for the question, I have considered making a toggle for the post limiting in the downloader. The only reason I haven't is due to scrapes. I don't know how good the servers on e621 are, but I'm reluctant to make that functionality easier, so as not to anger the e621 staff into emailing me about it (I mean that as a light joke). It more or less depends: if an admin or staff member gives me a thumbs up that I'm good to implement that toggle, I can add it. I just remember back when other downloaders did the same thing with other sites and immediately DDOSed the servers with hundreds of users downloading 20k+ images. That's something I'd rather not repeat. The main worry in this case wouldn't be a DDOS situation (most modern servers have gotten better about that) but rather throttling.
I obviously can't answer for the staff, but in the API FAQ there's no mention I've seen of a maximum number of downloads, only that the request rate should be limited to 1/sec or less. If you decide to implement the toggle, maybe a good compromise would be adding a throttle to it that runs slower than the current rate, and letting the user know that the trade-off on large scrapes is slower speed?
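To illustrate what I mean, here's a minimal sketch of that kind of client-side throttle (the class and parameter names are just illustrative, not from the actual downloader's code):

```python
import time


class RequestThrottle:
    """Enforce a minimum delay between consecutive requests.

    The e621 API FAQ asks for a rate of 1 request/sec or less; a large-scrape
    mode could pass a bigger min_interval to be extra polite.
    """

    def __init__(self, min_interval=1.0):
        self.min_interval = min_interval  # seconds required between requests
        self._last = 0.0                  # monotonic timestamp of last request

    def wait(self):
        """Block just long enough so calls end up >= min_interval apart."""
        now = time.monotonic()
        remaining = self.min_interval - (now - self._last)
        if remaining > 0:
            time.sleep(remaining)
        self._last = time.monotonic()


# Hypothetical usage: call throttle.wait() before each API request.
# For the toggle, the downloader could use min_interval=2.0 (or more) in
# unlimited mode and tell the user up front that big scrapes will be slow.
throttle = RequestThrottle(min_interval=2.0)
```

That way the toggle trades speed for server friendliness explicitly, instead of letting hundreds of users hammer the site at full rate.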