Topic: blacklist serious question (not involving how to)

Posted under General

OK, admittedly, my blacklist is fairly long. But I was wondering: is there a way to make it so that when I browse images, I get more than one or two pics per page? (Not counting the option of removing things from my blacklist.) And if there isn't a way now, could one be added for the new e621? Just a thought, since I'm sure I'm not the only one with a long blacklist.

Updated by ikdind

Unfortunately there is no option to do that currently, and I doubt there will be in 2.2, since the blacklist is applied client-side to ease server load.

Updated by anonymous

You should clean up that list and start searching through tags instead.

Updated by anonymous

Hat said:
You should clean up that list and start searching through tags instead.

Aye. If you use "-" tags (-guro, etc.), you can usually eliminate unwanted content without having to blacklist it, which keeps your pages full, since those posts are excluded from the search entirely rather than merely hidden after the fact.

Updated by anonymous
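To illustrate the point above, here's a toy sketch (all names and data invented, not e621's actual code) of why "-" search tags keep pages full while the blacklist thins them: "-" exclusion happens on the server *before* a page is cut, while the blacklist hides posts *after* a fixed-size page has already been fetched.

```javascript
const PAGE_SIZE = 5;

// Toy post database: each post is just a list of tags.
const posts = [
  ["dragon"], ["guro"], ["dragon", "guro"], ["wolf"], ["guro"],
  ["fox"], ["dragon"], ["wolf"], ["guro"], ["fox"],
];

// Server-side "-tag" exclusion: filter first, then take a page.
function searchWithMinusTag(posts, excluded) {
  return posts.filter(tags => !tags.includes(excluded)).slice(0, PAGE_SIZE);
}

// Client-side blacklist: the server pages first, the client hides afterward.
function searchWithBlacklist(posts, blacklisted) {
  const page = posts.slice(0, PAGE_SIZE); // server returns a full page
  return page.filter(tags => !tags.includes(blacklisted)); // client hides some
}

console.log(searchWithMinusTag(posts, "guro").length);  // 5 — a full page
console.log(searchWithBlacklist(posts, "guro").length); // 2 — a sparse page
```

Same data, same excluded tag, but only the server-side version guarantees a full page of results.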

tony311 said:
Unfortunately there is no option to do that currently, and I doubt there will be in 2.2, since the blacklist is applied client-side to ease server load.

Well, it's not client-side, but it's still a lightweight means of avoiding the tags you don't want to see, because they're only being applied to a single page of results at a time.

If the blacklist were used in search queries just like normal search tags are, then e621 would have all the same performance problems as if users had simply been allowed to input as many tags as they wanted. There is a good reason accounts are limited to searching a precious few tags at a time.

Updated by anonymous

ikdind said:
Well, it's not client-side, but it's still a lightweight means of avoiding the tags you don't want to see, because they're only being applied to a single page of results at a time.

If the blacklist were used in search queries just like normal search tags are, then e621 would have all the same performance problems as if users had simply been allowed to input as many tags as they wanted. There is a good reason accounts are limited to searching a precious few tags at a time.

No, I'm pretty sure it's client-side. The main JavaScript grabs your blacklisted tags from cookies and uses them to hide certain posts from the post list. It's done this way because then there's no server load at all beyond the initial request that sets the blacklisted tags, which shouldn't happen very often. Doing it server-side would mean adding, for some users, tens of tags to every single search query, greatly increasing server strain.

Updated by anonymous

tony311 said:
No, I'm pretty sure it's client-side. The main JavaScript grabs your blacklisted tags from cookies and uses them to hide certain posts from the post list. It's done this way because then there's no server load at all beyond the initial request that sets the blacklisted tags, which shouldn't happen very often. Doing it server-side would mean adding, for some users, tens of tags to every single search query, greatly increasing server strain.

Huh. Taking a closer look at the page source of returned search results, I think you're right. I get 20 <span class="thumb"> sections back from any given query, but only a subset is actually shown. Neat.

Updated by anonymous
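The mechanism described in this thread can be sketched roughly as follows. This is a hypothetical reconstruction, not e621's actual script: the cookie name `blacklist`, the cookie format, and all function names are assumptions. The idea is just that the blacklist is read from a cookie and each thumb the server sent back is hidden if its tags match an entry.

```javascript
// Parse a space-separated blacklist out of a cookie string,
// e.g. "blacklist=guro%20scat" -> ["guro", "scat"] (format assumed).
function parseBlacklistCookie(cookie) {
  const match = cookie.match(/(?:^|;\s*)blacklist=([^;]*)/);
  return match ? decodeURIComponent(match[1]).split(/\s+/).filter(Boolean) : [];
}

// A post is hidden if any of its tags appears in the blacklist.
function shouldHide(postTags, blacklist) {
  return blacklist.some(tag => postTags.includes(tag));
}

// In the real page this would walk the <span class="thumb"> elements and
// toggle their display style; here we just compute visibility.
const blacklist = parseBlacklistCookie("login=me; blacklist=guro%20scat");
const thumbs = [
  { id: 1, tags: ["dragon", "male"] },
  { id: 2, tags: ["guro", "wolf"] },
  { id: 3, tags: ["fox", "scat"] },
];
const visible = thumbs.filter(t => !shouldHide(t.tags, blacklist));
console.log(visible.map(t => t.id)); // [ 1 ]
```

This matches the observation above: the server always returns the full set of thumbs, and only the ones that survive the client-side check are displayed.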