Topic: [Feature] Make the blacklist prevent, not just hide.

Posted under Site Bug Reports & Feature Requests

Requested feature overview description.

Even with Extend and a faster internet connection, blacklisted images still appear (and are still clickable) for a short moment when loading a new page. One workaround is to bookmark a search that already includes minus tags (-tag), so results on that page never include the tags you don’t want to see, and then stick with that bookmark’s set of search tags.

The problem is that the moment you leave for a page without your bookmarked search terms, the posts those -tags would have excluded still appear for that short moment.

You could just stick with the bookmarked page with -prevented tags and never leave, but that’s no fun.

The feature I’d suggest is making the blacklist function like the minus key in a search, so blacklisted images don’t appear in the first place. And you'd be able to turn it on and off if you prefer.

Why would it be useful?

It would make the blacklist really DO what it’s meant to: make sure you don’t see anything on that list at all. For more casual users or people with slower internet, it would help them avoid seeing things they don’t even want a glance at. For more compulsive users, it would help them stop wasting time and break bad habits more successfully.

What part(s) of the site page(s) are affected?

All post/index pages, depending on what you put in your blacklist.

Updated by Pup

Human-Shaped said:
The feature I’d suggest is making the blacklist function like the minus key in a search, so blacklisted images don’t appear in the first place.

I suggested this 6 years ago:
forum #78429

Updated by anonymous

Relevant information can be found in this forum thread, which discusses why blacklisted images can appear for brief moments: forum #257610

Updated by anonymous

I tried asking in the Extend thread months ago, but they must be busy, so I thought I'd ask here in the meantime. I don't know how often the site itself makes changes based on user feature suggestions, but I figured it was worth a try.

The Extend dude is awesome though~ I wonder if there's some kind of script magic he could do to make every post/index page's search bar always have whatever tags (or minus tags) you always want active.

That would eliminate the need to just stick to a bookmark that has whatever six tags you're limited to searching. You could bounce around to different artist pages, and the -tag you want always blacklisted won't ever pop in.

Then if they ever raise the limit from six, it could prevent even more from appearing~

Sorry, just dreaming :P

Updated by anonymous

This would be a really annoying feature. I turn off elements of my blacklist all the time, for various reasons. Even if it were entirely technically feasible to do this, it would remove functionality.

Updated by anonymous

Clawdragons said:
This would be a really annoying feature. I turn off elements of my blacklist all the time, for various reasons. Even if it were entirely technically feasible to do this, it would remove functionality.

If it's entirely technically feasible, it should also be something you can turn on and off like most features~

Updated by anonymous

Violet_Rose said:
I'm kinda confused why sending more images, potentially a lot more images, is less strain on the server than filtering them out. But I dunno what the back end is like.

I was hoping an admin would comment with more detail than I know, but since they haven't, I'll just point out that the server which serves query results is not the same one which serves files.

Updated by anonymous

Violet_Rose said:
I'm kinda confused why sending more images, potentially a lot more images, is less strain on the server than filtering them out. But I dunno what the back end is like.

Because filtering and sorting the information on the server is more expensive than serving static files. That's why there is a limit on the number of tags you can have in a search. Because the search system operates on a much larger data set than the client does, the methods that work well on the client don't work well on the server side.

As for why the images are placed on the page before filtering occurs? Because people disable javascript, and it's better to have safe defaults where things work than to fail in a mode of "no images load, but the rest of the site appears to function."

Updated by anonymous

This has already been suggested multiple times...

Updated by anonymous

Jacob said:
This has already been suggested multiple times...

That's good news~ One more vote in favor of it becoming a thing, then ;[

Updated by anonymous

KiraNoot said:
As for why the images are placed on the page before filtering occurs? Because people disable javascript, and it's better to have safe defaults where things work than to fail in a mode of "no images load, but the rest of the site appears to function."

This is entirely false. You can filter out the posts before they appear AND have the site still function fine if the user disabled javascript.

For instance:

<img data-src="foo.jpg"/>
<noscript>
<img src="foo.jpg"/>
</noscript>

The above HTML relies on javascript (not included above) to change the data-src attribute into src. If javascript is disabled, the browser shows the contents inside the <noscript> tag instead, that is, a visible image.

As a bonus, this approach would save server bandwidth because if the post is blacklisted, data-src would never get changed into src and the thumbnail would never even get downloaded.
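For example, the swap script could be as simple as something like this (just a sketch, not the site's actual code; isBlacklisted stands in for whatever the real blacklist check would be):

Code

// Sketch only: isBlacklisted() is a hypothetical stand-in
// for the site's real blacklist check.
document.querySelectorAll("img[data-src]").forEach(img => {
  if (!isBlacklisted(img)) {
    img.src = img.dataset.src; // only now does the thumbnail download
  }
  img.removeAttribute("data-src");
});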

Updated by anonymous

Delian said:
This is entirely false. You can filter out the posts before they appear AND have the site still function fine if the user disabled javascript.

For instance:

<img data-src="foo.jpg"/>
<noscript>
<img src="foo.jpg"/>
</noscript>

The above HTML relies on javascript (not included above) to change the data-src attribute into src. If javascript is disabled, the browser shows the contents inside the <noscript> tag instead, that is, a visible image.

In fact, this would save server bandwidth because if the post is blacklisted, data-src would never get changed into src and the thumbnail would never even get downloaded.

I'll take it into consideration for future iterations. I have no plans for revising the current version of the blacklist. It's too fragile as it is.

The downside I see for this is that it requires sending quite a lot more data to the clients upfront, but at least it would compress well.

I feel the accusatory tone of your post was unneeded. I appreciate it when people offer improvements and better ways to do things, but the way you phrased it sours the introduction.

Updated by anonymous

Data duplication and usage: all things considered, given how bloated the pages already are, it's probably negligible in the long run; I was mostly thinking out loud about possible concerns.

Violet_Rose said:
That could be avoided by just having the entire image set wrapped in a noscript and a div; then, if your script runs, you do the normal blacklist logic and move the div out of the noscript into its parent. A user without JS just sees all the images. A user with JS enabled sees non-blacklisted images appear shortly after page load.

Random musing, not related to any actual testing: It makes me wonder if this would result in really bad render stalling on pages with lots of thumbnails, since it would move a large number of DOM elements out of a hidden area into a visible one, forcing a layout and reflow on the contents.

Updated by anonymous

KiraNoot said:
the accusatory tone

I'm sorry, that was not my intention.
 

The downside I see for this is that it requires sending quite a lot more data to the clients upfront, but at least it would compress well.

That depends on the implementation, I suppose. I mean, if javascript is turned on, then technically you can move the contents out of the <noscript> tag to have them rendered.

Code

<noscript>
<img src="foo.jpg"/>
</noscript>

// When scripting is enabled, a <noscript> element's children are
// kept as raw text, so they can be re-parsed and injected as real nodes:
document.querySelectorAll("noscript").forEach(el => {
  el.insertAdjacentHTML("beforebegin", el.textContent);
});

That way you wouldn't need to duplicate the elements.

Alternatively, if you don't care about lazy-loading the images, then perhaps a simpler solution would be one that uses CSS inside a noscript tag.

Code

<!-- Thumbnails start out hidden; a script removes this style element
     once the blacklist has been applied. -->
<style id="remove-with-javascript"> span.thumb { display: none; } </style>
<noscript>
<style> span.thumb { display: inline-block; } </style>
</noscript>
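Here, once the blacklist script has finished hiding things, it would simply remove the blocking style element, e.g.:

Code

// run after the blacklist has been applied:
document.getElementById("remove-with-javascript").remove();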

 

I have no plans for revising the current version of the blacklist. It's too fragile as it is.

I have looked at the blacklist javascript code that you use, and it seems quite robust. Although I tend to agree that this feature, in general, isn't high priority.
 

It makes me wonder if this would result in really bad render stalling on pages with lots of thumbnails

I have plenty of experience with this, and I can tell you that page rendering would actually be faster (from a styling/reflow performance perspective), because it's faster to show elements once than to show them first and then hide some of them.

The problem that I see is that images wouldn't show while the HTML is still downloading. In that respect, the DOMContentLoaded event fires too late to trigger the filtering, because the user would have to wait for the whole HTML to finish downloading before seeing the first image, which is why the above CSS solution isn't good. I see two solutions to this problem. One is to use a MutationObserver to apply the blacklist as nodes are added to the DOM.

Code

// Watch the whole document and apply the blacklist to each
// thumbnail as soon as the parser inserts it into the DOM.
var observer = new MutationObserver((mutations) => {
  mutations.forEach(mutation => {
    mutation.addedNodes.forEach(addedNode => {
      if (addedNode.nodeType === Node.ELEMENT_NODE && addedNode.classList.contains("thumb")) {
        /* apply blacklist on the addedNode */
      }
    });
  });
});
observer.observe(document.documentElement, {subtree: true, childList: true});

The other is to add an inline script after every thumbnail, which does the processing.

Code

<span class="thumb" id="p1234567" onclick="return PostModeMenu.click(event,1234567)">
  <!-- existing content -->
</span>
<script>
  /* apply blacklist on the node with id="p1234567" */
</script>

This works because, as you know, the browser executes scripts as soon as it encounters them during parsing, before parsing the nodes that come after them.

Updated by anonymous

I don’t know how any of the code would work so I can only come up with ideas. I’ve got Chrome and use Tampermonkey & Stylus just for this site. Is there anything I could put in those extensions to do what I’m suggesting?

So far I’ve just got my old idea and a new one I thought of recently:

1.) Make every post/index page's search bar always have whatever tags (or minus tags) you always want active. You could bounce around to different artist pages, and the -tag you want always blacklisted won't ever pop in because it's default on the search bar.

Or...

2.) Only show the page when it's done loading. Until then it could just be an image of a blank e621 page (or whatever image you want), and by the time it's done loading the blacklisted thumbnails are already gone.

I’ve searched for how to do that sort of thing (making ‘loading screens’ to make sure a page is fully ready each time), but I’ve only found people discussing it in the context of building new websites, not modifying existing ones.

Updated by anonymous

Pup

Privileged

Human-Shaped said:
I don’t know how any of the code would work so I can only come up with ideas. I’ve got Chrome and use Tampermonkey & Stylus just for this site. Is there anything I could put in those extensions to do what I’m suggesting?

Have you tested to see if this is still a problem on the beta site?

I had a quick look, but it loads too fast for me to tell whether the original image shows or not.

Updated by anonymous

Pupslut said:
Have you tested to see if this is still a problem on the beta site?

I had a quick look, but it loads too fast for me to tell whether the original image shows or not.

The images are still there, I haven't reworked this yet on the beta site.

Updated by anonymous

I didn't know there was a beta site. Well, when it's done, I hope it has the feature myself and others are talking about. Honestly, there are things on this site I'd pay to make real (this being one of them), but all I can do is ask @.@

Updated by anonymous

Pup

Privileged

Human-Shaped said:
I didn't know there was a beta site. Well, when it's done, I hope it has the feature myself and others are talking about. Honestly, there are things on this site I'd pay to make real, but all I can do is ask @.@

There's a sticky thread: forum #279936, "Introducing the beta site"

Feedback, input and bug reports are appreciated, and that's essentially what the forum on the beta site is being used for currently.

Updated by anonymous

Pupslut said:
There's a sticky thread: forum #279936, "Introducing the beta site"

Feedback, input and bug reports are appreciated, and that's essentially what the forum on the beta site is being used for currently.

Thanks for the link, I just posted this idea there as a suggested feature :]

Updated by anonymous

It could still be done client-side by using CSS instead of JS to handle the blacklist, though it would be kind of a hack. Basically, you'd have to turn blacklist rules into CSS selectors (the blacklist has a simplified syntax, so that should be doable) matching against a tag attribute on each post/thumbnail.

For instance (I'm using the thumbnail alt text, since no such dedicated attribute currently exists):

/* Blacklist: canid -wolf */
img[alt~="canid"]:not([alt~="wolf"]) {
  display: none !important;
}

/*Blacklist: female dragon*/
img[alt~="female"][alt~="dragon"] {
  display: none !important;
}

CSS can be applied before the document has finished loading (which is the point at which JS is usually made to run), so it would have the desired effect (though I'm not sure whether it would stop the thumbnails from downloading) without putting additional load on the server. It would, however, restrict any future blacklist improvements to what's possible to do with those CSS selectors.

Browser support shouldn't be an issue: [attr~="value"] is an old CSS 2.1 selector for space-separated lists that's supported even by IE7, and the :not() pseudo-class is supported from IE9, so this should work for pretty much everyone unless they're browsing e6 at work on Windows XP machines.

It's not a perfect solution (selectively disabling each rule might prove tricky), but I figured I'd throw it out there.
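Generating those selectors from the blacklist could look something like this (a rough sketch; it assumes each blacklist line is just space-separated tags with optional minus prefixes, which covers the two examples above but not the full syntax):

/* blacklistToCss is a hypothetical helper, not existing site code. */
function blacklistToCss(lines) {
  return lines.map(line => {
    const selector = line.trim().split(/\s+/).map(tag =>
      tag.startsWith("-")
        ? `:not([alt~="${tag.slice(1)}"])`
        : `[alt~="${tag}"]`
    ).join("");
    return `img${selector} { display: none !important; }`;
  }).join("\n");
}

// blacklistToCss(["canid -wolf", "female dragon"])
// produces exactly the two rules above.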

Updated by anonymous

not as thorough but simpler idea:
the beta lets users search up to 40 tags at once. use some of those to give users an additional "auto-search blacklist" option that would always automatically add tags or (-)tags to their searches in the background.

Example:
  • auto-search blacklist: pants, shirt, -hat
  • blacklist: socks
  • search: clothed

Returned posts: the search results for clothed hat -pants -shirt with posts tagged socks hidden

it's not as complex as a preventative version of the current blacklist, but it's more flexible, could remove a big chunk of the hidden posts from a search, and probably wouldn't be too hard to implement

Updated by anonymous

Pup

Privileged

sneezer22 said:
not as thorough but simpler idea:
the beta lets users search up to 40 tags at once. use some of those to give users an additional "auto-search blacklist" option that would always automatically add tags or (-)tags to their searches in the background.

The problem I see with that is that it wouldn't be easy to toggle the blacklist on and off without searching again, like it currently is.

Also you can have multiple things on the blacklist such as:

group male -female
solo female

Which would block groups with only males unless there's a female, and solo females.

It'd be hard to add search terms for things like that, with each line meant to be handled separately.

Updated by anonymous

Pupslut said:
The problem I see with that is that it wouldn't be easy to toggle the blacklist on and off without searching again, like it currently is.

the site would probably have to serve a new set of posts, yes, but as you said, that would only be as server-intensive as requesting a new search and toggling the use of the blacklist. it would be more of an all-or-nothing kind of thing than the current blacklist, so mostly used for tags you either always or never wanted to see.

Also you can have multiple things on the blacklist such as:

group male -female
solo female

Which would block groups with only males unless there's a female, and solo females.

It'd be hard to add search terms for things like that, with each line meant to be handled separately.

yep, that's accurate. this blacklist option would only be able to handle simpler formatted input. the option I had thought of for this would be the ~ operator. that would give you a bit of leeway to allow for at least one higher-level concept, so to block out solo female you would put ~intersex ~male ~ambiguous_gender into the "auto-search blacklist" (ASB)

the way I saw it working is users would get an input section in the settings that looks like the one for the current blacklist. they would then either:

  • add 1 tag per line they just always or never wanted to see.

Or

  • just enter a search string like they would on the posts page (probably the better option):
    • hat -pants -shirt ~intersex ~male ~ambiguous_gender

the list would then be added like normal tags to any search. if a user is viewing the posts page, they would actually be seeing a search for the tags in their ASB

when a search happens, the site would see the tags the user wants to search for just like normal, add the tags from their ASB, then provide the results for a search of the combined tags, then apply the current blacklist as normal
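as a rough sketch (hypothetical names, obviously not the site's actual code), the combining step would just be:

Code

// hypothetical sketch: merge a user's search with their
// "auto-search blacklist" (ASB) before the query runs
function combineSearch(userQuery, asb) {
  const tags = (userQuery + " " + asb).trim().split(/\s+/);
  return [...new Set(tags)].join(" "); // drop exact duplicates
}

// combineSearch("clothed", "hat -pants -shirt")
//   -> "clothed hat -pants -shirt"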

Updated by anonymous

"Oh no, the images I wanted blocked existed for a couple nanoseconds and disappeared too fast for me to even process what they were, clearly this is the website's fault and something must be done about it."

Updated by anonymous

Pup

Privileged

LoneWolf343 said:
"Oh no, the images I wanted blocked existed for a couple nanoseconds and disappeared too fast for me to even process what they were, clearly this is the website's fault and something must be done about it."

I'd agree if it were just the main desktop version, but the delay is longer on mobile, and the tab preview images are taken before the blacklist runs, so they stay visible there. Then every other day E6's image server loads painfully slowly for a while, and the blacklist isn't applied until it's all loaded.

Hiding the images before the page loads would reduce the number of thumbnails a user needs to download, which would be good for slow connections, server bandwidth, the times when the E6 image server is running slow, and mobile data usage. It'd also fix the images staying in tab previews on mobile.

It's obviously nice to have the blacklist feature at all, but with the beta site in development, it's a good time to swap things around and request features.

Updated by anonymous

LoneWolf343 said:
"Oh no, the images I wanted blocked existed for a couple nanoseconds and disappeared too fast for me to even process what they were, clearly this is the website's fault and something must be done about it."

not sure if this is about what I posted or more aimed at the stuff above, but here are 3 points I can come up with that would be a plus for this sort of thing:

1. more posts per page when searching
2. in the past I've tried using the blacklist as an advanced search for tagging purposes... it works, in a way, but a lot of the time you only get 1 post every 1000+ images. a blacklist that "prevents" would give users crazy abilities to find really specific posts instead of dozens of blank pages of hidden, unwanted results

And then, what got ME here in the first place:

3. in the beta you can search and then navigate with a/d from post to post after clicking a thumbnail on the search page. blacklisted posts are still part of the search results, though, which means if you're blacklisting 2/3rds of the posts that came up in your search, the site may give you like 8 blank pages in a row while you're trying to get to the next non-hidden post in your search. a "preventative" blacklist or blacklist alternative would fix that and make for a better experience

I've personally never really had a problem with the current blacklist. this is a free site; to not be grateful for what you're given would just be foolish/rude. on the other hand, it's also a community, and if we can make a better experience for everyone with a reasonable new feature, why not? tbh I'm still just stoked the beta has the increased tag search capacity; if that was the only new feature I'd still be really happy ^_^

Updated by anonymous

alt idea to the "preventing instead of hiding" mindset: would there maybe be some way to sort the posts in a search so that all (or more) of the non-hidden posts are at the front of the queue and all (or more) of the hidden ones are at the back? ... hmm, I probably don't understand the inner workings of the blacklist well enough to come up with a good way of doing that :/

Updated by anonymous

sneezer22 said:
in the beta you can search and then navigate with a/d from post to post after clicking a thumbnail on the search page. blacklisted posts are still part of the search results, though, which means if you're blacklisting 2/3rds of the posts that came up in your search, the site may give you like 8 blank pages in a row while you're trying to get to the next non-hidden post in your search.

The most realistic incremental improvement on this behaviour, IMO, would be to detect that the current image is blacklisted and automatically skip to the next one (after a small delay, say 2 seconds, during which you can cancel by clicking on anything). This is probably doable with a userscript, since all it needs to do is run after the blacklisting code, check whether the image is hidden, and trigger a redirect.

However, there are certainly issues with this. Suppose that you have intentionally clicked into a series of posts that fall under your blacklist. You would want to be able to toggle that auto-skip off or on with a single click.

A similar auto-skip feature could be applied to pages of results (in terms of the current site -- I don't know how the beta site handles this), since it's possible for an entire page of results to be blacklisted, especially with a small '# results/page' setting.
This latter idea would (AFAICS) enable "crazy abilities to find really specific posts instead of dozens of blank pages of hidden, unwanted results" without necessitating any changes to the blacklisting system.
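A bare-bones sketch of such a userscript (untested; the #image and a#next selectors are guesses at the markup, not verified against the site):

Code

// Untested sketch. Assumes the blacklist script has already hidden the
// post (display:none) by the time DOMContentLoaded fires, and that a
// "next post" link exists; both selectors below are guesses.
window.addEventListener("DOMContentLoaded", () => {
  const image = document.querySelector("#image");
  const next = document.querySelector("a#next");
  if (!image || !next) return;
  if (image.offsetParent !== null) return; // image is visible, keep it
  const timer = setTimeout(() => { location.href = next.href; }, 2000);
  // clicking anything cancels the auto-skip
  document.addEventListener("click", () => clearTimeout(timer), { once: true });
});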

Updated by anonymous

Pup

Privileged

Just wanted to post that the beta now hides images until the blacklist code has run.

Thanks again, Kira.

-----

As for not returning blacklisted posts vs. only hiding them: if they're hidden, they can still be toggled, whereas that would be much more awkward if the posts weren't returned at all. Secondly, it would put more work on the server to block posts.

Currently, I presume, it works similarly to the API: the site gets a list of posts from the database, then displays or hides them. If, say, the first 5 pages are all blacklisted, the database would need to be searched 6 times to fill a full page of posts, both slowing down loading times and putting a lot of extra strain on the site.

This could also be exploited: blacklist every tag, then do a search, and the database would have to keep fetching page after page, using a lot of resources, until it found an image that wasn't blacklisted.
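In sketch form, blocking posts server-side would mean something like this on every request (illustrative pseudocode, not the site's actual code; fetchPage stands in for one database query):

Code

// Illustration of why server-side filtering multiplies database work.
// fetchPage() stands in for a single database query.
const PAGE_SIZE = 75; // e.g.

async function getVisiblePage(query, blacklist) {
  const visible = [];
  let page = 1;
  while (visible.length < PAGE_SIZE) {
    const posts = await fetchPage(query, page++); // one DB round-trip each
    if (posts.length === 0) break;                // ran out of results
    visible.push(...posts.filter(p => !blacklist.matches(p)));
  }
  return visible.slice(0, PAGE_SIZE);
}

// A blacklist that matches nearly everything forces page after page
// of queries before a single result page can be returned.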

Overall I feel just hiding them is better.

Updated by anonymous
