Topic: Are we having DDOS-ing problems?

Posted under General

It seems like the site keeps going down for maintenance more and more often and I keep getting the annoying "Are you a robot?" check any time I click on an image past a certain time of day.

On another note, why the hell does confirming you're a human kick you to the home page rather than whatever page you were navigating to, especially since the URL for the page you were heading to is right there in the address bar during the robot check? It forces me to navigate back two pages, click on the image again, and risk getting the robot check again in a loop until I finally get lucky.

glitchbun said:
It seems like the site keeps going down for maintenance more and more often and I keep getting the annoying "Are you a robot?" check any time I click on an image past a certain time of day.

On another note, why the hell does confirming you're a human kick you to the home page rather than whatever page you were navigating to, especially since the URL for the page you were heading to is right there in the address bar during the robot check? It forces me to navigate back two pages, click on the image again, and risk getting the robot check again in a loop until I finally get lucky.

It started saying "Your browser is out of date! Update your browser to view this website correctly." on a browser with overly strict security settings.

I actually started having problems today using my usual Android app, "The Wolf's Stash", due to Cloudflare. It seems stuck in a loop: after I click to verify I'm human, it shows a loading circle for a while and then just opens the verify-I'm-human box again, never getting past it.

invelios said:
I actually started having problems today using my usual Android app, "The Wolf's Stash", due to Cloudflare. It seems stuck in a loop: after I click to verify I'm human, it shows a loading circle for a while and then just opens the verify-I'm-human box again, never getting past it.

I just hopped on the forums to see if anyone else was talking about that. Glad to see I'm not alone. I don't know if the staff have control over what's behind Cloudflare, but I really wish the API wasn't. Cloudflare sucks at its current job as-is, and "protecting" APIs is well outside its purpose.

The maintenance periods are unrelated, and should mostly be done for a while. Several were changes/maintenance by our provider that we had to accommodate, so it was on their schedule more than ours. Others were upgrades on our end that needed to happen. So yes, there have been several recently, but it was just work that needed to get done. A lot of that is done now, though, and hopefully it will stay done for a while.

The bot checks were noted as happening too often and were recently adjusted, so they should happen less often now. Hopefully.

The redirect adjustment for after a bot check is already on the list for "features I would love to see", but it is a long list and there's always something the developers need to look at. But you are not alone in wishing for that one. Hopefully they'll be able to fit that change in somewhere.

Cloudflare does what Cloudflare needs to. And sometimes that's beyond our control.

I'm typing on my phone because Cloudflare won't let me use the site in my regular browser on my computer. Each time I click the checkbox, it forces me to do the captcha over and over again without ever letting me in. This is worse than Google's reCAPTCHA, since that one at least works and I can complete it just fine, even if it's an annoying nuisance. Thankfully it doesn't affect my phone, but I hardly use my phone to browse or post images to e6 in the first place.

I don't know what Cloudflare did, but RE621, RSS, and the API are all returning 403. Why the hell would you ask me to confirm I'm not a robot on services that are primarily used by robots? Is this increased DDoS protection?

In case anyone didn't know, after clicking the "I'm not a robot" button and getting dumped to the home page, you can usually get to the page you were trying to load by pressing the Back button only once and hitting Refresh on the robot check.

Refresh also works if the site decides it hates you while you were in the middle of opening a bunch of tabs at once. You only have to click the button in one of them; all the other checks that loaded at the same time can simply be reloaded.

Just adding that requests to the API are returning 403 for me as well. Haven't made any changes to my authentication info or headers. :<


invelios said:
I actually started having problems today using my usual Android app, "The Wolf's Stash", due to Cloudflare. It seems stuck in a loop: after I click to verify I'm human, it shows a loading circle for a while and then just opens the verify-I'm-human box again, never getting past it.

This is by design. It's to prevent anything but approved browsers from viewing it.

wolfmanfur said:
I'm typing on my phone because Cloudflare won't let me use the site in my regular browser on my computer. Each time I click the checkbox, it forces me to do the captcha over and over again without ever letting me in. This is worse than Google's reCAPTCHA, since that one at least works and I can complete it just fine, even if it's an annoying nuisance. Thankfully it doesn't affect my phone, but I hardly use my phone to browse or post images to e6 in the first place.

Ironically, CF on Verizon mobile is a pain.

m-machina said:
I don't know what Cloudflare did, but RE621, RSS, and the API are all returning 403. Why the hell would you ask me to confirm I'm not a robot on services that are primarily used by robots? Is this increased DDoS protection?

I wonder if it's possible to get some RSA key/cert-based alternative API. I've used SSH this way to keep a port from appearing publicly on my remote server: bind it strictly to localhost and tunnel in. But then you're dealing with rate-limiting on THAT publicly visible port instead (the encrypted one that is public-facing).
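
Rough sketch of what I mean, in Python; the hostname, port, and endpoint below are made up for illustration and aren't anything e621 actually offers:

    # Hypothetical setup: the API listens only on 127.0.0.1:8080 on the remote
    # box, so it never appears on a public interface; only SSH is exposed.
    import subprocess
    import time

    import requests  # third-party: pip install requests

    tunnel = subprocess.Popen(
        ["ssh", "-N", "-L", "8080:127.0.0.1:8080", "user@example-server.net"]
    )
    time.sleep(3)  # crude wait for the tunnel; a real script would poll the port

    try:
        # Requests to localhost now travel through the encrypted tunnel.
        resp = requests.get("http://127.0.0.1:8080/posts.json", timeout=10)
        print(resp.status_code)
    finally:
        tunnel.terminate()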


If blocking the browser in apps like The Wolf's Stash is by design, then that is bad design. How are we supposed to verify on an app like that, then? The Wolf's Stash is by far a much better experience than using a normal mobile browser and has great features like tag following; getting locked out of it because of this change is very aggravating.

kora_viridian said:
User Donovan DMC runs an unofficial status page for the e621 API - the part of the site that tools like RE621 and Wolf's Stash use. It queries e621 periodically and reports whether or not the query worked.

The status page is at https://status.e621.ws/ ; try loading that page in your regular browser. If that page reports that the e621 API is down, then it probably really is down.

I don't work for or volunteer for e621.

Seems to say it is down in every browser I've opened, including one where I have gotten past the human verification. I'm curious if anyone has gotten a call through to the API at all.

I've never set anything up with Cloudflare; is there a way to exclude certain endpoints from this verification process? That sounds like something that should be an option, especially for something as widespread as Cloudflare. I would think a feature as simple as that would have been requested before.

my api-curling shellscript is failing as well. the return code is HTTP/2 403, and the data served is clownflare's challenge page.
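
for comparison, here's roughly what the script does, translated to python (the username, api key, and user-agent string are placeholders) - the giveaway is the 403 coming back with an html body instead of json:

    # rough python equivalent of the curl call; credentials and UA are placeholders
    import requests  # third-party: pip install requests

    resp = requests.get(
        "https://e621.net/posts.json",
        params={"limit": 5, "tags": "canine"},
        auth=("my_username", "my_api_key"),  # HTTP Basic auth with an API key
        headers={"User-Agent": "my-script/1.0 (by my_username)"},
        timeout=15,
    )

    # right now this returns 403 with cloudflare's HTML challenge page instead
    # of JSON, so checking the content type is the easiest tell
    if resp.status_code == 403 and "text/html" in resp.headers.get("Content-Type", ""):
        print("got the challenge page, not the api")
    else:
        print(len(resp.json()["posts"]), "posts returned")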

invelios said:
If blocking the browser in apps like The Wolf's Stash is by design, then that is bad design. How are we supposed to verify on an app like that, then? The Wolf's Stash is by far a much better experience than using a normal mobile browser and has great features like tag following; getting locked out of it because of this change is very aggravating.

Didn't say it was good. Bad specs can lead to bad design. They assume that you're using it to block scrapers, login painters, spammers, and other stuff like that, not DDoS. And if someone's scraping without asking permission, then we need to fix that, as well. :/

invelios said:
Seems to say it is down in every browser I've opened, including one where I have gotten past the human verification. I'm curious if anyone has gotten a call through to the API at all.

I've never set anything up with Cloudflare; is there a way to exclude certain endpoints from this verification process? That sounds like something that should be an option, especially for something as widespread as Cloudflare. I would think a feature as simple as that would have been requested before.

There's a token you can get but I think it's only for testing purposes.

cinnamoncrunch said:
my api-curling shellscript is failing as well. the return code is HTTP/2 403, and the data served is clownflare's challenge page.

BTW: You will find that the exact header and request order will be detected by things like PerimeterX and, you guessed it, blocked. This is entirely about dealing with scraping, which makes perfect sense if you're a site selling user info, e.g. Twitter and Facebook, as well as LinkedIn.

kora_viridian said:
I'd guess not.

If you would like to have five nines on your furry pr0n, I'm sure NMNY, or their boss, would be happy to talk to you about an SLA. Bring your checkbook. :D

Somewhere between 5 and 100 milliseconds after you do that, the bad-guy bot finds that endpoint, and starts making requests at 1.21 jiggahertz. Such is the internets in these latter days.

LOL, wonder how they'd do that if it's whitelisted. Sadly, IP spoofing is still a thing in 2023, but meh.

Hmm, you can have your porn fast, reliable, cheap, convenient, or diverse; choose three.

Earlier I had problems voting or adding posts to favorites, but something changed with Cloudflare and now I can no longer even connect to e621 over Tor. It just gets stuck at the "checking if the site connection is secure" page, and no matter how many times I try to get past the "verify you are human" box it just loops the same page continuously. e926 still works, though.

It is depressing how bad actors keep making the internet worse.

I'm not up to date on how e621 is implementing Cloudflare, but currently it's also breaking my scripts that use the API. Instead of enabling bot protection across the entire website, API endpoints included, how about excluding them and adding rate limits? That would still protect the website against application-level DDoS, but without breaking the API entirely.
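
To be concrete about the kind of rate limiting I mean, something like a per-key token bucket at the application layer; the numbers and key handling below are just an illustration, not anything e621 actually uses:

    import time
    from collections import defaultdict

    RATE = 2.0    # tokens refilled per second, per API key (made-up number)
    BURST = 10.0  # maximum bucket size, i.e. the allowed burst (made-up number)

    _buckets = defaultdict(lambda: {"tokens": BURST, "last": time.monotonic()})

    def allow_request(api_key: str) -> bool:
        """Return True if this key is allowed to make a request right now."""
        b = _buckets[api_key]
        now = time.monotonic()
        # Refill based on the time elapsed since the last request, capped at BURST.
        b["tokens"] = min(BURST, b["tokens"] + (now - b["last"]) * RATE)
        b["last"] = now
        if b["tokens"] >= 1.0:
            b["tokens"] -= 1.0
            return True
        return False  # the server would answer with HTTP 429 here

    # A key hammering the endpoint burns through its burst and gets throttled.
    results = [allow_request("some_key") for _ in range(15)]
    print(results.count(True), "allowed,", results.count(False), "throttled")

Cloudflare could still sit in front for the volumetric stuff; this just avoids serving interactive challenges to clients that can't complete them.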

Also having trouble in animeboxes on Android (set up with the API key and validated the client, but it errors when trying to connect using the key).

I hate to be the annoying "I'm having this issue too" noob on GitHub issue reports, but... well, I'm having this issue too. The Wolf's Stash prompts me to pass the CF interactive challenge, but when I complete it, the box spins for a couple seconds and re-prompts me to complete another.

Hopefully this gets fixed soon, because while the mobile mode is better than pre-e621ng, it's still rather annoying to use one-handed...

If you're going to break the API so mobile apps can't work, then make the site more mobile-friendly to begin with.

remember when the site coder didn't know what cron was? null remembers

If it's of any use, for The Wolf's Stash in particular, this Reddit thread has some suggestions, though I'm doubtful most of them work. Something they mentioned that I've confirmed does work is turning off explicit mode/using e926. Not that it's particularly helpful, but it's seemingly using an API key just like the other mode (I tested favoriting a post through the app and checking in the browser; it went through just fine). Just guessing from trying to reach their landing pages in a new browser session, it looks like Cloudflare simply doesn't protect e9 the way it does e6 (I guess?). Whatever it is, it's deeply frustrating.
https://www.reddit.com/r/e621/comments/12mfxi9/anyone_else_having_trouble_with_the_wolfs_stash/?utm_source=share&utm_medium=android_app&utm_name=androidcss&utm_term=1&utm_content=share_button

Wolf's Stash just started working for me again. Seems someone fixed the issue with the API. Thank you to anyone involved with the fix.

I can confirm as well that TFS is working again; however, I keep encountering "1200" errors. Looks like e621's cache server isn't keeping up with the requests, so CF is rate-limiting until it does.

For anyone saying "don't serve challenges on the API" or "increase the rate limits": this changes nothing. Those requests would still need to hit SOME server, which defeats the purpose of mitigating an attack. Any hole can and WILL be exploited. Even if requests are just hitting a load balancer, that's still a system they're overloading.

donovan_dmc said:
For anyone saying "don't serve challenges on the API" or "increase the rate limits": this changes nothing. Those requests would still need to hit SOME server, which defeats the purpose of mitigating an attack. Any hole can and WILL be exploited. Even if requests are just hitting a load balancer, that's still a system they're overloading.

Yeah, the proper solution is to not have it facing the public Internet in the first place (VPN and/or CDN *cough*)?
