Topic: [Feature] Auto-https for sites that support it

Posted under Site Bug Reports & Feature Requests

Requested feature: for a given list of domain patterns, substitute http with https in source links.

.deviantart.com
.furaffinity.net
.weasyl.com
(maybe others too, but notably not tumblr)

Why would it be useful: no point in letting your ISP (and everyone else on the line) know too much about your hobbies :tinfoilhat:

What parts of the site pages are affected: /post/show/*

Substitution can be done on post-submit, post-edit, or post-show. I suggest only doing it on post-show, if at all feasible, to avoid changing user-submitted data in the db. So if an http url is submitted, it is stored as http and is still http in the edit form, but when the post is shown it's https. No data will be lost if things go wrong, and there's no need to change old posts.
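
Rough Python sketch of the post-show-only idea, just to make it concrete (the function name and whitelist below are made up for illustration, not anything the site actually has):

```python
from urllib.parse import urlsplit, urlunsplit

# Hypothetical whitelist: domain suffixes believed to serve everything over TLS.
HTTPS_DOMAINS = (".deviantart.com", ".furaffinity.net", ".weasyl.com")

def display_source_url(stored_url):
    """Rewrite http -> https for whitelisted domains, at display time only."""
    parts = urlsplit(stored_url)
    host = parts.hostname or ""
    if parts.scheme == "http" and any(
        host == suffix.lstrip(".") or host.endswith(suffix)
        for suffix in HTTPS_DOMAINS
    ):
        return urlunsplit(("https",) + parts[1:])
    return stored_url  # db value and edit form keep whatever was submitted
```

Since nothing is ever written back, a bad whitelist entry can be undone by simply removing it.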

Updated by savageorange

Bumping, because this is definitely needed.

Sites like FA do not actually enforce this, and I have been manually adding the "s" to source links if/when I edit tags and stuff.

Updated by anonymous

leomole

Former Staff

+1. My sources are http because I copy-paste, but https is better.

Updated by anonymous

DeviantArt only supports partial TLS encryption. I strongly advocate against enabling it by default on that site, but the following have full TLS support when browsing profiles and image galleries: FurAffinity, InkBunny, SoFurry, Weasyl, FurryNetwork.

+1 for implementing. If it does not pass, I recommend picking up HTTPS Everywhere from the Electronic Frontier Foundation and using it to force websites onto TLS when they fully support it but have not switched it on by default.

Updated by anonymous

People who care should already have HTTPS Everywhere. If the uploader cares, he will use an HTTPS link.
This seems unnecessary.

Updated by anonymous

I don't agree at all that this is unnecessary -- secure comms, insofar as possible, should be a default, not an opt-in that you have to research to even realize exists. That said, I can see that maintaining the whitelist is potentially redundant with work already done on the HTTPS Everywhere rulesets.

Has anyone run any histogram analysis on unique domains in sources? I was just thinking that a blacklist approach might turn out to be more practical, depending on the results.
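
Something like this would answer that (a quick sketch, assuming sources can be pulled out as a flat list of URL strings; the real schema is probably different):

```python
from collections import Counter
from urllib.parse import urlsplit

def domain_histogram(source_urls):
    """Tally how often each domain shows up among post sources."""
    counts = Counter()
    for url in source_urls:
        host = urlsplit(url).hostname
        if host:
            counts[host] += 1
    return counts

# domain_histogram(all_sources).most_common(20) would show whether a short
# whitelist covers most links, or whether a blacklist would end up shorter.
```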

Updated by anonymous
