Topic: Feature Request: Automatic detection of AI-generated content

Posted under Site Bug Reports & Feature Requests

What it says in the title. I'm not sure relying on manual detection is going to remain viable for the foreseeable future, and I think some images are already being smuggled through. I don't know if there is an off-the-shelf solution or if something would need to be made. I'm thinking the solution may also need to be AI-based. Thoughts?

Edit: Upon further inspection of some of the images in question, I think I may have just been a bit paranoid, but still.

There's some basic AI art detection already (based on the image metadata, I believe), but actually automatically detecting AI art is just going to be an arms race that I don't think the developers here have the time to waste competing in.

In theory, it wouldn't be too hard to train an AI to detect other AI art, but then it would also be trivial for the person generating it to train their AI to bypass it.
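For the curious, here's a rough sketch of what that kind of metadata-based screening could look like, as illustrative Python using Pillow; the hint strings and the looks_ai_generated helper are assumptions of mine, not a description of what the site actually runs:

```python
# Illustrative only: flag uploads whose embedded metadata mentions a known
# generator. This assumes the uploader left the file's metadata intact.
from PIL import Image

# Strings some generation front-ends are known to leave behind, e.g. in the
# "parameters" PNG text chunk. This list is a guess, not an exhaustive one.
GENERATOR_HINTS = ("stable diffusion", "novelai", "negative prompt", "sampler")

def looks_ai_generated(path: str) -> bool:
    """Return True if the file's metadata contains a known generator hint."""
    with Image.open(path) as img:
        # PNG text chunks and similar key/value metadata land in img.info
        blobs = [f"{key} {value}" for key, value in img.info.items()
                 if isinstance(value, str)]
        # The EXIF Software tag (305), if present, can also name the tool
        exif = img.getexif()
        if 305 in exif:
            blobs.append(str(exif[305]))
    text = " ".join(blobs).lower()
    return any(hint in text for hint in GENERATOR_HINTS)
```

Which is also why it only counts as basic detection: re-saving the image with its metadata stripped defeats it entirely.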

faucet said:
There's some basic AI art detection already (based on the image metadata, I believe), but actually automatically detecting AI art is just going to be an arms race that I don't think the developers here have the time to waste competing in.

In theory, it wouldn't be too hard to train an AI to detect other AI art, but then it would also be trivial for the person generating it to train their AI to bypass it.

I might be naive, but I was kind of assuming that most people wouldn't bother building their own AI just to fool this website's detection.

I feel like AI art is at a similar level to stuff like screencaps from game mods, traced artwork and such: they are disallowed, but sweating over whether something sneaks by will just drain everyone's energy and time.
If some get by and are accidentally approved, they can be deleted later down the line if they are found out. We should now have a flag option for content that's not according to guidelines, to cover these scenarios as well.

mairo said:
I feel like AI art is at a similar level to stuff like screencaps from game mods, traced artwork and such: they are disallowed, but sweating over whether something sneaks by will just drain everyone's energy and time.
If some get by and are accidentally approved, they can be deleted later down the line if they are found out. We should now have a flag option for content that's not according to guidelines, to cover these scenarios as well.

Fair enough. I was mostly just worried about a nightmare scenario where the site gets overwhelmed by the stuff.

Stable Diffusion, at least, embeds an invisible watermark in generated images by default. While you can turn it off, most people don't know it's there to begin with. E6 could probably be set up to detect that. It won't solve everything, but it should pick up at least some of it.
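For illustration, a minimal sketch of that check, assuming the image was made with the reference Stable Diffusion scripts, which embed the "StableDiffusionV1" payload via the dwtDct method of the invisible-watermark package; the has_sd_watermark helper is hypothetical, not something E6 is known to run:

```python
# Hypothetical check for Stable Diffusion's default invisible watermark.
# Requires: pip install invisible-watermark opencv-python
import cv2
from imwatermark import WatermarkDecoder

def has_sd_watermark(path: str) -> bool:
    """Return True if the default SD watermark payload decodes from the image."""
    bgr = cv2.imread(path)  # OpenCV reads the image as a BGR array
    if bgr is None:
        return False
    # "StableDiffusionV1" is 17 bytes, i.e. 136 bits
    decoder = WatermarkDecoder('bytes', 136)
    payload = decoder.decode(bgr, 'dwtDct')
    try:
        return payload.decode('utf-8') == 'StableDiffusionV1'
    except UnicodeDecodeError:
        return False  # random noise that isn't valid UTF-8 is a miss
```

Keep in mind the watermark is fragile: resizing, cropping or heavy recompression can destroy it, so a negative result here doesn't prove anything.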

mairo said:
We should now have a flag option for content that's not according to guidelines, to cover these scenarios as well.

Man, I've always wondered why we didn't have this.

I know my old wording was "irrelevant to site", because my scenarios were unrelated human-only uploads, but this sounds way better.

benjiboyo said:
Man, I've always wondered why we didn't have this.

I know my old wording was "irrelevant to site", because my scenarios were unrelated human-only uploads, but this sounds way better.

At the very beginning, the flag reason was just a text box, but that basically just made people flag for reasons like "I don't like this" and such, so it was changed into a list of things to select from. That isn't too dissimilar from basically any other website or social media with a flagging option, where you choose why you are flagging and maybe provide some additional information, and it made flags more proper and consistent.
The reason why "not according to guidelines" hasn't been a flag is that we still have manual curation and an approval queue. If something isn't according to guidelines, a janitor will get to the post in due time regardless, and if it's e.g. a human-only upload, it's not going to hurt anyone in any manner, whereas real-life pornography, gore and pirated material should be deleted ASAP.

rattyboi said:
What it says in the title. I'm not sure relying on manual detection is going to remain viable for the foreseeable future, and I think some images are already being smuggled through. I don't know if there is an off-the-shelf solution or if something would need to be made. I'm thinking the solution may also need to be AI-based. Thoughts?

Edit: Upon further inspection of some of the images in question, I think I may have just been a bit paranoid, but still.

Dude, just ban them, since this is intractable. It's equivalent to the Halting Problem: asking "does it halt?" about an algorithm that's deliberately built not to complete in a predictable time.
