Topic: Should Images Using Glaze/Adversarial Noise Be Allowed?

Posted under General

Like the title says, should images that add a bunch of weird AI-tricking noise and artifacts be allowed on e621? I feel like they shouldn't be, as that seems to go against the quality standards of this site (e.g. no excessive artifacts).

Isn't it ironic that we have an adversarial_noise tag that would completely negate the purpose of adding noise in the first place?
I guess it would succeed in preventing the AI from sampling your artwork, but it also makes it easier for developers to filter out the data poisoning.

If these reactionary "protest filters" can operate without degrading the image to a human eye then there is no reason to care. But if it looks like a compression artifact, and quacks like a compression artifact, it should be deleted like one.

In my opinion, it's not really any worse than something like a distracting_watermark. If there was another version available without it, that would probably be deemed to be the superior version.

thegreatwolfgang said:
Isn't it ironic that we have an adversarial_noise tag that would completely negate the purpose of adding noise in the first place?
I guess it would succeed in preventing the AI from sampling your artwork, but it also makes it easier for developers to filter out the data poisoning.

I mean, it is TWYS. I saw adversarial noise. And I added it.
(Also, Glaze had its effects mitigated with a few lines of Python code within, like, three days of its announcement anyway?)

yetanothertemp said:
I mean, it is TWYS. I saw adversarial noise. And I added it.
(Also, Glaze had its effects mitigated with a few lines of Python code within, like, three days of its announcement anyway?)

It was, yes. So, here's the fun thing about using A.I. to fight A.I.... you can then use A.I. to fight the A.I., and in the end you wind up with the piracy-DRM cycle of "neat protection, challenge accepted".

Mdf

Member

I don't think it should be allowed. E6 is meant to be an archival site (curious how many terabytes y'all host?) and the quality standards enforced here exist for a reason. If a better, publicly available (read: not paywalled) image exists, that version should always be used. Permitting this only lowers the quality bar and can set a bad precedent for the future, since everyone will start doing it.

Intentionally shittifying your work is also something I don't think E6 should showcase. As a writer, I wouldn't introduce errors to my final product just because someone can copy-paste. It's surprising that even after all this time, people still don't understand that when you post something to the internet, you lose all control over it. You can delete it, but it still exists on someone else's hard drive.

We want artists to put their best work forward, inspire others, and a bunch of other cliché things. Allowing this might as well relax the standards and permit deliberate artifacting.

I really don't like it. It's ugly, and copy protection has always been fighting a losing battle. That said, I think the best policy here is to tag the noise and favor versions without the noise if we can get them. Removing images just because they have noise is cutting off our nose to spite our face as much as the artists using noise are.

Mdf

Member

hexen said:
Check this out: https://e621.net/stats

Holy cra-poli... this is... this is an interesting find!

off topic:

Hmm, lets see here...

post #3278492

Total posts... okay. Active posts... okay. Deleted posts... kinda lame because a lot of stuff I liked over the years is in there, but okay...

Destroyed posts... Pffft, what exactly are the mods doing, "destroyed posts"? Is the content so cringe that the staff needs to bleach the site to remove all traces of that upload's existence? Are there post IDs that are just straight-up missing with a 404 web page?

Ah, here we go: 7.5 terabytes of content. That's... wow, that's a lot less than I thought. Could fit that on two of my computer's drives. I wonder if they're running some kind of compression and serving up decompressed stuff as it's requested? Guess e6's storage side of things is much smaller than I thought, but then again it's running 24/7, so that's something. I thought we were in the realm of 15-20 TB, but I guess a lot of higher-quality stuff is paywalled, so most of the stuff here is standard resolution and thus smaller on disk. 1.6 MB, yeah, that's not much.

One-fifth of the entire user base is inactive accounts? You'd think those would get culled after some time? What are those people doing I wonder... Also, I thought there were hundreds more 'privileged' users than 67, that's shockingly low.

Congrats on 1M dmails, you go king! I wonder how many of those are because of records (keep reading).

You know, maybe that'd be something interesting to track. How many records have gone out total, yesterday, last month, last year? ...MODS!

Looking at the user feedback page, it appears there are 70-75 records per page, and there are 432 pages of negatives, so that would be around 32k negative records handed out.

334 pages of neutral is around 25k. Only seven pages of positives, so some 525.

That's around 57,500 messages specifically for records, or about 5.75% of all dmails.
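The back-of-envelope math above, sketched out (assuming the upper end of the ~70-75 entries-per-page range):

```python
# Back-of-envelope estimate of record dmails from the feedback-page
# counts quoted above, assuming the upper bound of ~75 entries/page.
PER_PAGE = 75
PAGES = {"negative": 432, "neutral": 334, "positive": 7}

estimates = {kind: pages * PER_PAGE for kind, pages in PAGES.items()}
total = sum(estimates.values())

print(estimates)  # {'negative': 32400, 'neutral': 25050, 'positive': 525}
print(total, f"~{total / 1_000_000:.1%} of 1M dmails")
```

With the 70-per-page lower bound instead, the total drops to around 54k, so the ~57.5k figure sits at the optimistic end of the range.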

Staff must be hella busy. We are such a naughty community!

mdf said:
Total posts... okay. Active posts... okay. Deleted posts... kinda lame because a lot of stuff I liked over the years is in there, but okay...

Destroyed posts... Pffft, what exactly are the mods doing, "destroyed posts"? Is the content so cringe that the staff needs to bleach the site to remove all traces of that upload's existence? Are there post IDs that are just straight-up missing with a 404 web page?

It's content that has been fully purged from the site; unlike deletions, it can't be undone or even viewed by staff. Covers IRL bestiality, CSAM, and the like.

mdf said:
Destroyed posts... Pffft, what exactly are the mods doing, "destroyed posts"? Is the content so cringe that the staff needs to bleach the site to remove all traces of that upload's existence? Are there post IDs that are just straight-up missing with a 404 web page?

Yes, they are missing entirely; post #1 through post #12 flat out don't exist, for instance.
Not all of them were illegal or illegal-adjacent; ID clashes can happen (as well as other shenanigans, like post #3000000)

Artists do a lot of things that sabotage the quality of their posts for dubious reasons. This is just one more thing. Perhaps the anti-AI noise posts can be deleted if their added distortion equals that of posts deleted for terrible compression artifacts.

As individuals, we don't have to lift a finger to support these sabotaged posts. Don't upload or tag them if you find the practice offensive. I refuse to upload certain posts or artists due to their quality sabotage, in addition to skipping posts for their content, or at least I find myself heavily disincentivized from uploading them. Everyone is free to do the same.

Song

Janitor

A user described glazing in its current form as "Denuvo for art", which is a pretty apt comparison. It's generally more harmful for viewers than it is for machine learning programs, which can implement consistent de-glazing algorithms.

We tolerate glazing currently. If glazing is so severe that it's heavily distracting or is indistinguishable from deep frying, we'll delete the post with an explanation. Still, I personally wouldn't recommend using it due to its ineffectiveness against using your art in AI and due to harming the people who view or pay for your work. A better strategy would be to prioritize websites that don't use your work for machine learning datasets. This doesn't prevent people from scraping artwork, but it at least means the site maintainers respect you enough to not undermine your wishes.

song said:
A user described glazing in its current form as "Denuvo for art", which is a pretty apt comparison. It's generally more harmful for viewers than it is for machine learning programs, which can implement consistent de-glazing algorithms.

We tolerate glazing currently. If glazing is so severe that it's heavily distracting or is indistinguishable from deep frying, we'll delete the post with an explanation. Still, I personally wouldn't recommend using it due to its ineffectiveness against using your art in AI and due to harming the people who view or pay for your work. A better strategy would be to prioritize websites that don't use your work for machine learning datasets. This doesn't prevent people from scraping artwork, but it at least means the site maintainers respect you enough to not undermine your wishes.

Out of morbid curiosity I checked to see JUST how trivial it was to remove the less-severe glazes (Gaussian), and GIMP seemed to be able to do it reliably with Antialias + Denoise 4-6 with minimal loss. I believe Waifu2x has a better denoising algorithm, though, so it's... baffling and laughable. Bafflaughable. This is like maps in the dark ages having fake landmasses and rivers.
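For the curious, the principle can be sketched in plain Python. This is a toy median filter run over synthetic noise, not the actual GIMP Antialias + Denoise pipeline and not Glaze's real perturbation; it just shows why low-amplitude per-pixel noise is cheap to flatten:

```python
# Toy sketch: a 3x3 median filter flattens low-amplitude per-pixel
# noise (a stand-in for a light glaze) on a flat region. Real tools
# (GIMP's denoise, Waifu2x) do this job far better.
import random
from statistics import median, pstdev

random.seed(0)
W = H = 32

# Flat mid-gray "artwork" plus a uniform per-pixel perturbation.
noisy = [[128 + random.randint(-12, 12) for _ in range(W)] for _ in range(H)]

def median3x3(img):
    """Replace each interior pixel with the median of its 3x3 window."""
    h, w = len(img), len(img[0])
    out = [row[:] for row in img]
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            window = [img[y + dy][x + dx]
                      for dy in (-1, 0, 1) for dx in (-1, 0, 1)]
            out[y][x] = median(window)
    return out

def interior(img):
    """Flatten the interior pixels (the filtered region) into one list."""
    return [v for row in img[1:-1] for v in row[1:-1]]

filtered = median3x3(noisy)

# The perturbation's spread shrinks sharply after a single median pass.
print(round(pstdev(interior(noisy)), 2), round(pstdev(interior(filtered)), 2))
```

A perturbation robust to this kind of smoothing would have to be large enough that humans see it too, which is exactly the trade-off being argued about in this thread.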

alphamule

Privileged

votp said:
Out of morbid curiosity I checked to see JUST how trivial it was to remove the less-severe glazes (Gaussian), and GIMP seemed to be able to do it reliably with Antialias + Denoise 4-6 with minimal loss. I believe Waifu2x has a better denoising algorithm, though, so it's... baffling and laughable. Bafflaughable. This is like maps in the dark ages having fake landmasses and rivers.

Or captchas that make it hard for humans because of horrible contrast but are trivial for a filter to make easier for the bots.

Isn't this breaking the watermarks rule, though? We're not supposed to be posting the intentionally vandalized versions of images, unless it is the ONLY version available?

alphamule said:
Or captchas that make it hard for humans because of horrible contrast but are trivial for a filter to make easier for the bots.

Isn't this breaking the watermarks rule, though? We're not supposed to be posting the intentionally vandalized versions of images, unless it is the ONLY version available?

Technically, it might violate the A.I. content rule, given it's artwork that's been fed through an A.I. to do that.

alphamule said:

Isn't this breaking the watermarks rule, though? We're not supposed to be posting the intentionally vandalized versions of images, unless it is the ONLY version available?

Unless it's an "only the artist can post" CDNP.

I find this question curious.

Who does it hurt? Surely not you. Unless you're mad you can't scrape e6 for your LoRA without thinking.

demesejha said:
I find this question curious.

Who does it hurt? Surely not you. Unless you're mad you can't scrape e6 for your LoRA without thinking.

See the comments on post #4000384 for who it hurts. No one there is trying to use some random comic for an AI model.

demesejha said:
I find this question curious.

Who does it hurt? Surely not you. Unless you're mad you can't scrape e6 for your LoRA without thinking.

Images with glaze look like they've been dipped in oil.

demesejha said:
I find this question curious.

Who does it hurt? Surely not you. Unless you're mad you can't scrape e6 for your LoRA without thinking.

Adversarial noise affects everyone, not just AI bros trying to scrape data. It is the rough equivalent of lathering a nice car in diarrhea to ward off thieves.
It looks like crap, to say nothing of the various tools that exist to remove it.

lafcadio said:
Adversarial noise affects everyone, not just AI bros trying to scrape data. It is the rough equivalent of lathering a nice car in diarrhea to ward off thieves.
It looks like crap, to say nothing of the various tools that exist to remove it.

A bakery grows concerned that people are using machines to chemically break down and copy the recipe of their famous chocolate cake.
The bakery starts pouring hot sauce on their cakes to foil the machines.
The customers stop buying cakes from that bakery because they are now inedible, and go to purchase copycat recipes that are not punishing them for paying.

votp said:
A bakery grows concerned that people are using machines to chemically break down and copy the recipe of their famous chocolate cake.
The bakery starts pouring hot sauce on their cakes to foil the machines.
The customers stop buying cakes from that bakery because they are now inedible, and go to purchase copycat recipes that are not punishing them for paying.

Gimme a slice of that king-cobra-chili-infused, 95% cacao, darkest chocolate cake with a solid layer of Saigon cinnamon dusted over the anise frosting then, because that malice sounds exquisite.

junco said:
I feel honored my comic gets to be used as the example here. So I get to be one of thousands of folks whose work Midjourney intentionally found and scraped. https://twitter.com/ALRadeck/status/1741625808450392216. So God forbid I try to do something to protect myself.

I mean, I don't begrudge you acting on your values, but you made the image look like it was converted to .gif and back. If someone uploaded a non-glazed image that looked like that, it would be deleted for not meeting the quality standards. I don't think glaze should be treated differently.

Genjar

Former Staff

Thinking that these harm AI learning in any way shows that the anti-AI activists have no idea of how training works, and still think that it just 'copies style'.
If anything, this is beneficial for training, since it's another thing for the model to learn to actively avoid — just like JPG and scan artifacts. Not to mention that it's simple to make a filter to remove these stains, if someone wants to.

genjar said:
Thinking that these harm AI learning in any way shows that the anti-AI activists have no idea of how training works, and still think that it just 'copies style'.
If anything, this is beneficial for training, since it's another thing for the model to learn to actively avoid — just like JPG and scan artifacts. Not to mention that it's simple to make a filter to remove these stains, if someone wants to.

I think all the examples of it poisoning the data are shown on extremely small sets of images. The dataset that trained SD contained billions of images and trained for a very long time, not just 20 images, 10 of which were "poisoned". It is slightly effective at poisoning fine-tuning, but fine-tunes only take 10-20 images anyway, and a human can very easily edit those by hand in a few hours, potentially faster since they're so obvious.

Genjar

Former Staff

definitelynotafurry4 said:
I think all the examples of it poisoning the data are shown on extremely small sets of images.

Right. And that is, of course, on a dataset where they've done nothing to avoid bad data. Which — again — shows complete ignorance of how AI training actually works. Nobody, and I mean nobody, just tosses everything in without first teaching the AI to sort data into good and bad quality.
If a human can see the Glaze/Adversarial Noise, so can an AI. It's as pointless as watermarks, which modern AIs can easily remove: just like any human, an AI can learn to draw in similar style while choosing to leave out the watermark or Glaze.

genjar said:
Right. And that is, of course, on a dataset where they've done nothing to avoid bad data. Which — again — shows complete ignorance of how AI training actually works. Nobody, and I mean nobody, just tosses everything in without first teaching the AI to sort data into good and bad quality.
If a human can see the Glaze/Adversarial Noise, so can an AI. It's as pointless as watermarks, which modern AIs can easily remove: just like any human, an AI can learn to draw in similar style while choosing to leave out the watermark or Glaze.

In one of the examples I saw, the AI actually started to glaze images itself to a degree, once the images were preprocessed.

genjar said:
Right. And that is, of course, on a dataset where they've done nothing to avoid bad data. Which — again — shows complete ignorance of how AI training actually works. Nobody, and I mean nobody, just tosses everything in without first teaching the AI to sort data into good and bad quality.
If a human can see the Glaze/Adversarial Noise, so can an AI. It's as pointless as watermarks, which modern AIs can easily remove: just like any human, an AI can learn to draw in similar style while choosing to leave out the watermark or Glaze.

I don't get this "just roll over and die" mentality you guys have. I'm trying. I want to try. And if that requires folks to take time out of their day to deal with my pain in the ass, then that's *good*. The only people that have had any issue with this are on this site, for some reason, where you guys just link extensions and bots in the comments, or endlessly follow and harass each page telling me to roll over and die. It's my comic. It's my time and effort. Never has this extremely selfish notion of "I deserve what you have" been more apparent than on this site. Funnily enough, my efforts seem to work quite well, because the outputs folks show me, when it's intentionally trained to show my work, look fuck all like it. https://www.deviantart.com/comments/1/937003071/5126426901 So it can't be all useless, can it?

And to the dude I've had blocked for ages, spamming each page with a tag he made just for my comic and like 4 icons: I hope you have a great time adding it yourself, you absolute tick.

arrow189 said:
I mean I don't begrudge you acting on your values but you made the image look like it was converted to .gif and back. If someone uploaded a non-glazed image that looked like that it would be deleted for not meeting the quality standards. I don't think glaze should be treated differently.

This is reactionary noise, not unintentional. I think it should be treated differently. You hate it so much? Well, maybe fucking over millions of artists' livelihoods has the tiniest bit of noise to it. Christ almighty, you guys act like it actively shoots laser beams into your sockets.

junco said:
Funnily enough, my efforts seem to work quite well, because the outputs folks show me when it's intentionally trained to show my work, look fuck all like it. https://www.deviantart.com/comments/1/937003071/5126426901 So it can't be all useless, can it?

What do you mean, "intentionally trained to show my work"? That's just base DALL-E 3 given a prompt with a username in it. Also, it's totally anecdotal and there's no control to compare against; as far as we know, you could type someone with several thousand drawn pieces, like fuf, in there and get similarly unrecognizable results.

junco said:
This is reactionary noise, not unintentional. I think it should be treated differently. You hate it so much? Well, maybe fucking over millions of artists' livelihoods has the tiniest bit of noise to it. Christ almighty, you guys act like it actively shoots laser beams into your sockets.

Because images artifacted to the point of being an eyesore might as well be.

junco said:
This is reactionary noise, not unintentional. I think it should be treated differently. You hate it so much? Well, maybe fucking over millions of artists' livelihoods has the tiniest bit of noise to it. Christ almighty, you guys act like it actively shoots laser beams into your sockets.

At the end of the day it will probably end up hurting artists more than it helps them. The glaze won't prevent AI from training on it, anyways. It just makes the quality of the product lower for actual people and isn't even an inconvenience to the actual problem at hand.

alphamule

Privileged

junco said:
This is reactionary noise, not unintentional. I think it should be treated differently. You hate it so much? Well, maybe fucking over millions of artists' livelihoods has the tiniest bit of noise to it. Christ almighty, you guys act like it actively shoots laser beams into your sockets.

May I suggest a compromise? DNP the majority of images you created, and then allow a minority unedited to get people to go to your own site or subscription/purchase/tip site account? I don't know. Just trying to be helpful. :(

junco said:
This is reactionary noise, not unintentional. I think it should be treated differently. You hate it so much? Well, maybe fucking over millions of artists' livelihoods has the tiniest bit of noise to it. Christ almighty, you guys act like it actively shoots laser beams into your sockets.

I do agree that intentional artifacting shouldn't be a reason for deletion, as long as the image meets quality standards. I would really appreciate it, however, if you could tag adversarial noise on upload for the purpose of blacklisting. For myself at least, it does tend to cause eye fatigue, so I'd rather avoid such images when possible.

I don't think AI art is the threat to artists that it's often made out to be. It blindly copies the patterns it was trained on; it doesn't understand what it's actually doing. It's not that far from an image equivalent of the Library of Babel. Even when it produces something competent, the knowledge that there was no thought behind it makes its creations inherently less valuable. Even when AI art depicts something cute or creative, deep down viewers know that it's an aberration with no meaning beyond what they themselves project onto it. There is no point in trying to engage with its characters or speculate on the world or action it shows, just like there is no point in trying to find truth in the meaning of a string of random characters.

I could confidently stake my life on the fact that the image-generating AIs we have right now will never create images more beloved than those of artists who put any level of care or creativity into their work. For that to change, we would have to get AI that actually understands and cares about what it's doing, which would qualify as its own person by most standards.

alphamule

Privileged

oozeenthusiast said:
I don't think AI art is the threat to artists that it's often made out to be. It blindly copies the patterns it was trained on; it doesn't understand what it's actually doing. It's not that far from an image equivalent of the Library of Babel. Even when it produces something competent, the knowledge that there was no thought behind it makes its creations inherently less valuable. Even when AI art depicts something cute or creative, deep down viewers know that it's an aberration with no meaning beyond what they themselves project onto it. There is no point in trying to engage with its characters or speculate on the world or action it shows, just like there is no point in trying to find truth in the meaning of a string of random characters.

I could confidently stake my life on the fact that the image-generating AIs we have right now will never create images more beloved than those of artists who put any level of care or creativity into their work. For that to change, we would have to get AI that actually understands and cares about what it's doing, which would qualify as its own person by most standards.

Well, there's this, too. Had a discussion in the #AGNPH channel on IRC, and it amounted to: given a year to face the fears, we realized that the kinds of people paying for commissions wouldn't want AI art anyway! XD We've had stuff like 3D rendering and Flash for years, but people still prefer hand-drawn images over 3D renders and sprites/templates a lot of the time. It just sucks when you know people are going to try scamming or plagiarizing. :(

I've seen low-effort stuff that is being used like the memepic generators à la iFunny, and it always has some uncanny effect. It's kind of hilarious that if I intentionally imitated that quality, I'd be accused of using Stable Diffusion or the like. I mean, it's not even a question that no human drew it, so I'm not even sure how to imitate it convincingly by hand.

alphamule said:
I've seen low-effort stuff that is being used like the memepic generators à la iFunny, and it always has some uncanny effect. It's kind of hilarious that if I intentionally imitated that quality, I'd be accused of using Stable Diffusion or the like. I mean, it's not even a question that no human drew it, so I'm not even sure how to imitate it convincingly by hand.

I'd actually really like seeing art drawn with that kind of effect that SD spits out. It's oddly pleasing to me.
Also, the trolling opportunities would be *chef's kiss*.

So as someone who actually trains AI models on data from E6, I can say that most model trainers I know do exclude data tagged as adversarial_noise. Based on what I've seen of its effectiveness in my testing, having that tag (when it is truthfully applicable, of course) is probably more effective at preventing an image's use in training than Glaze/Mist/Nightshade are on their own. I can't confidently say that this will last, because I don't think it's very certain that currently poisoned images will actually be effective against newer models -- I'm not in a hurry to remove that from the blacklist, personally. However, it should still be in everyone's best interest to have posts containing adversarial noise consistently and accurately tagged, if the people applying it don't want their images used for training.
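The exclusion step itself is trivial. A sketch of the kind of pre-filter trainers use; the post dicts and tag names here are illustrative, not e621's actual API:

```python
# Hypothetical sketch of the dataset-side filter described above:
# drop any post carrying a blacklisted tag before it reaches a
# training set. The post structure is illustrative only.
BLACKLIST = {"adversarial_noise"}

posts = [
    {"id": 1, "tags": {"canine", "solo"}},
    {"id": 2, "tags": {"feline", "adversarial_noise"}},
    {"id": 3, "tags": {"dragon", "comic"}},
]

def usable(post):
    """A post is usable if it shares no tags with the blacklist."""
    return not (post["tags"] & BLACKLIST)

training_ids = [p["id"] for p in posts if usable(p)]
print(training_ids)  # [1, 3]
```

Which is exactly why accurate tagging does more to keep a post out of training sets than the perturbation itself: the filter only works when the tag is actually applied.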

thegreatwolfgang said:
Isn't it ironic that we have an adversarial_noise tag that would completely negate the purpose of adding noise in the first place?
I guess it would be successful in preventing the AI from sampling your artwork in the first place, but it also makes it easier for developers to limit the amount of data poisoning from happening.

Only Nightshade is designed explicitly for poisoning data; the others are intended to directly make a model misread a style (and only the style; it doesn't target anything else). Nightshade, as far as I can tell, depended upon a fairly large portion of the dataset being poisoned when it was tested under very ideal lab conditions. The production/release version is a lot more subtle, both in how heavily it is applied and in what it tries to make an image "look like" to a model (something much more orthogonal to the image's real content), and honestly it is probably too subtle to have a significant effect given how many images will realistically have it applied. We had to try really, really hard under very favorable conditions to get Nightshade to look like it was maybe doing something to a model, with half of the dataset poisoned. The chances that it would have any effect on a model trained on e621 posts, with the number of posts that have it now or might realistically have it in the future, are near zero.

Hate to necro this, but my eyes aren't quite what they used to be and it seemed... neater to simply drop this in here.

Can we have some folks check the most recent works from Butterchalk and B-ern to see if this is applicable? I genuinely can't tell if they're glazed, my eyes are acting up, or my terminal needs a new screen. I'm extremely reluctant to add it to what could be stylistic texturing that I'm misreading (comic/crayon replication) or just artifacting from bad conversions from jpeg to png.

votp said:
Hate to necro this, but my eyes aren't quite what they used to be and it seemed... neater to simply drop this in here.

Can we have some folks check the most recent works from Butterchalk and B-ern to see if this is applicable? I genuinely can't tell if they're glazed, my eyes are acting up, or my terminal needs a new screen. I'm extremely reluctant to add it to what could be stylistic texturing that I'm misreading (comic/crayon replication) or just artifacting from bad conversions from jpeg to png.

I think Ern's current comic project has a filter over it for stylistic purposes, to make it mimic analog TV broadcast or VHS footage thereof.

votp said:
Hate to necro this, but my eyes aren't quite what they used to be and it seemed... neater to simply drop this in here.

Can we have some folks check the most recent works from Butterchalk and B-ern to see if this is applicable? I genuinely can't tell if they're glazed, my eyes are acting up, or my terminal needs a new screen. I'm extremely reluctant to add it to what could be stylistic texturing that I'm misreading (comic/crayon replication) or just artifacting from bad conversions from jpeg to png.

Both have a static/noise filter.

regsmutt said:
Both have a static/noise filter.

Might be worth making a tag for those kinds of filters as well, if there isn't already one.
I actually really like that kind of effect.
