Topic: [Feature] WebP support

Posted under Site Bug Reports & Feature Requests

aaronfranke said:
No, that's horrible. Quantizing the image down to a smaller color palette makes the image look way worse than using JPEG.

Look at this: https://i.imgur.com/zQ0wYcL.png

The quantized reduced palette version looks bad and is larger than the JPEG. You probably are finding it hard to tell which one is the JPEG, so here are the labels: https://i.imgur.com/Y8RxEGh.png

Anyone who thinks that reducing the color palette is an acceptable solution is just wrong. If you just do the test as I did, it's painfully obvious.

Lmao, I immediately spotted the jpeg vomit artifacts next to the nose, and when exporting with an indexed color palette you must have picked one of the worst dithering options, like Floyd-Steinberg. Here's what it looks like when you select none, and what it looks like when you select positioned.

Indexed palettes do better depending on what type of image you're dealing with, but this is a "digital painting"; I don't think any kind of lossy compression will ever look good on it. The best you can do is convert it into a lossy webp and then back into a png, and then the difference is minimal.

And for full disclosure, you were right when you said "You probably are finding it hard to tell which one is the JPEG", but not for the reason you think. You posted an image with the same picture repeated 3 times, and like an idiot I was trying to play a game of "find the difference" between the 3 images; if you don't zoom in, the jpeg artifacts and the quantization are not visible. With that said, jpeg has a history of doing a lot more damage to images than quantization does, to the point where it is visible without zooming in.

In fact, take a look at this: compression_artifacts. This will never happen with indexed colors unless you're aiming very low, like 1-, 2- or 4-bit color.

Lastly, the reason why converting to jpeg got you such a small image is most likely because:
1. jpeg does not support an alpha channel. Now, to answer savageorange's point, removing the alpha channel alone brings the png's size down to 77.7 MB.
2. jpeg only supports 8 bits per channel. https://www.izitru.com/how-many-colors-does-jpeg-have.php https://community.adobe.com/t5/lightroom-classic-discussions/is-it-possible-to-export-jpeg-with-more-than-8-bits-per-channel/td-p/8854374 Your image has 16 bits per channel.
If you had converted your image into a jpeg at 100% quality, it wouldn't have made a difference; it would still be about as small as before.
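The bpc mismatch in point 2 can be sketched numerically. A minimal Python illustration (not from the thread, just the arithmetic) of why a 16-bits-per-channel PNG cannot round-trip through an 8-bits-per-channel format like JPEG:

```python
# Sketch: why a 16bpc PNG can't survive a trip through JPEG.
# JPEG stores at most 8 bits per channel, so each 16-bit sample must be
# reduced to 8 bits; the extra precision is simply rounded away.

def to_8bpc(sample16: int) -> int:
    """Reduce a 16-bit channel sample (0..65535) to 8 bits (0..255)."""
    return (sample16 * 255 + 32767) // 65535  # round to nearest

def to_16bpc(sample8: int) -> int:
    """Re-expand an 8-bit sample to 16 bits (what a later PNG save does)."""
    return sample8 * 257  # 0xAB -> 0xABAB

# Two distinct 16-bit values that collapse to the same 8-bit value:
a, b = 30000, 30100
print(to_8bpc(a), to_8bpc(b))  # 117 117 - the difference is gone
```

Any two 16-bit samples closer together than roughly 1/256 of the range collapse to the same 8-bit value, so the re-expanded file can never recover the original data.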

Now, we're trying to fit this huge, colorful image into less than 75 MB of space while making the fewest modifications possible, and to make that happen folks had the discussion here where we could figure out why the image was so massive to start with.

If you wanna help, here's the command I used to remove the alpha channel via ImageMagick:

convert 'corbin and panda.png' -alpha off output.png

If you don't want to, fine, I'll post the image myself once I'm done with it; if I let you do whatever you want, this'll result in a worse image.

Also, anyone who thinks GIF is a good format is also plain wrong and has no idea what they're talking about; it's orders of magnitude worse than animated WebP, animated PNG, and video formats like WebM, MP4, etc.

Counterpoint: https://giphy.com/gifs/i-love-you-chippythedog-are-beautiful-0usUCcbhxd4Voog4zP

It is better to post an animated png, a webm or anything else besides a gif, but that doesn't mean a gif can't be cleaned up and look good. The worst gifs I've seen are on old websites, or the ones that use a clip from a film; they tend to have too many colors for the gif format.

alphamule

Privileged

wolfmanfur said:
It's funny how they wanted to re-compress the image into a jpeg at 90% quality, while all it took to compress the image was to remove the alpha channel and change the png compression settings. This hyperfocused obsession with webp is starting to become unhealthy, aaronfranke.

Speaking of lossy compression, forcing a 255-color indexed palette (like gif uses) would have reduced the file size by at least 50%, and the image would still look as good as it did before, simply because there ain't no artifact vomit on the image. It's possible in GIMP too!

It bears repeating, but there is never a day where using jpeg is justified, and the day webp will be needed is when a png image is too big and impossible to compress further. Something nasty like 100000x100000 that uses as many colors as possible and weighs 200 MB. Only on that day will this discussion be worth having.

Wait, the other problem was that it still had the alpha channel? I never got that far into messing with it before I got tired of it, so thanks for reminding me. The drop to 24-bit color is why the size dropped that much, though that destroys visible information.

The only reason I support WebP is because it's there, and supported by damn near everything. And it's smaller downloads for the same quality.

Kind of reiterating things savageorange and aaronfranke said... A lot of that stuff made no sense. I mean, 1-frame GIF files compared to JPEG? Hahaha, no.

votp said:
I'll take gifs over WebPs any day, especially given the recent buffer overflow issue. You basically have two options when it comes to making an image fit within an upload limit: reduce the data in the image itself (scrubbing metadata, removing unused transparency, reducing the palette, other compression or optimisations as format-appropriate), or simply scaling it down to a more reasonable size.

All you've really done is make an argument in favour of JPEG over WebP to me from what you've shown. Mozjpeg time?

TBF, GIF and other formats were infamous back in the IE 4-5 days for exploits. A lot of code blindly trusted pointers given to it by the (malformed) file. Even Java had that bytecode verifier failure exploit where you could have objects with a negative size. The fact that it wasn't designed to explicitly treat that as an error was a big oversight. Then again, seeing how the bytecode was documented, you can see why no one noticed this issue (until it got used in the wild).

Whether it's trivial depends on the picture IMO (and on whether you have the ability to see the extra information, which is determined by your monitor and possibly other things, like whether your monitor is correctly calibrated, plus some 'color management is just generally kind of a mess' issues like the ones mentioned here).

Realistically, 10 bits per channel (30 bits per RGB pixel) is as good as is currently relevant for wide-gamut monitors; but only newer formats like AVIF support 10bpc, while most other formats support some subset of 8/16/32 bpc. I think there are some formats that special-case the alpha channel so it can use fewer bits than the RGB channels. People not using wide-gamut monitors will only be able to see the extra information by using some kind of image manipulation.
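For reference, the level counts behind those bit-depth figures work out as follows; this is plain arithmetic, nothing format-specific:

```python
# Distinct values per channel, and distinct RGB triples, at common bit depths.
for bpc in (8, 10, 12, 16):
    levels = 2 ** bpc      # values a single channel can take
    colors = levels ** 3   # distinct RGB triples at this depth
    print(f"{bpc:>2} bpc: {levels:>6} levels/channel, {colors:.3e} colors")
```

8 bpc gives 256 levels per channel and the roughly 16 million RGB colors usually quoted; 10 bpc already quadruples the per-channel resolution.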

That's 'possible display reproduction fidelity'. But not being able to normally see the information doesn't mean it isn't meaningful. The typical gamut for a non-wide-gamut monitor is nominal sRGB, which notably represents a subset of the color range we can see. So yes, we cannot distinguish all of the 16 million colors in standard 8bpc sRGB; but we still need more colors, because it would be good to be able to express ALL the colors that we can see, and to do so in a way which incorporates a good-quality approximation of the standard sRGB gamut.
The other major need for high bit depth is because each time you mix things (audio or visual), you almost always degrade information, so it's good to start with higher definition information than you are going to deliver at the end. But that one is more of a workflow thing than anything that should be uploaded directly to the web.

One simple test for 'genuine information loss' would be to feed both the original version and the 16->8bpc-reduced version to ImageMagick's identify -verbose; it will report an entropy value for each channel. If the entropy values for each channel are similar between versions, then not much information is being lost (or, IOW, there wasn't much actual worthwhile use of the added depth).

In the case of this image which has many flat colors, I suspect it would fail the entropy test.
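The entropy test described above can be approximated in a few lines of Python. This is a hedged sketch: ImageMagick computes its entropy statistic internally, but the underlying idea is plain Shannon entropy over a channel's samples (loading the pixels is left out here).

```python
# Sketch of a per-channel entropy comparison, in place of ImageMagick's
# `identify -verbose`. If the 16bpc and 8bpc versions of a channel score
# similarly, the extra bit depth carried little genuine information.
import math
from collections import Counter

def shannon_entropy(samples) -> float:
    """Shannon entropy, in bits per sample, of a sequence of channel values."""
    counts = Counter(samples)
    n = len(samples)
    return sum((c / n) * math.log2(n / c) for c in counts.values())

# A flat-color channel has zero entropy regardless of bit depth:
flat = [12000] * 1000
print(round(shannon_entropy(flat), 3))   # 0.0

# A channel that truly uses its range scores much higher:
busy = list(range(256)) * 4
print(round(shannon_entropy(busy), 3))   # 8.0
```

Note that a 16bpc channel whose samples are just 8bpc values shifted left scores exactly the same as the 8bpc original (entropy is unchanged by relabeling values), which is the "no genuine information in the extra depth" case.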


alphamule

Privileged

savageorange said:
In the case of this image which has many flat colors, I suspect it would fail the entropy test.

Yeah, that was what I was thinking. No way there's legitimately more than 24 bits of actual information in each pixel.

EDIT: I got yet another request in DMs to convert a WebP file to something the site accepts, because more and more websites are defaulting to WebP nowadays. We really are starting to need that support.

wolfmanfur said:
According to Mairo, this is trivial information.
https://e621.net/posts/4301756

I'd like to hear from them why

I do not have a 10-bit panel and I'm not versed enough in over-8-bit color depth inspection, so I was trying to check for additional color detail when altering colors, e.g. banding on gradients, but I failed to see any by quick inspection, which made me believe that it was someone just being stupid with the file and resaving 8bpc as 16bpc for whatever reason (which has happened before). Also, it felt like this massively high quality file dropping out of nowhere was a bit sus.
Doing further inspection, it seems I was wrong and was mostly looking at random detail where it was harder to determine whether there was a change or not.

Also yes, removing the alpha channel was the right call; that's also what most optimizers immediately do, as it pads the file a surprising amount for no benefit if no pixel is transparent to begin with. I use Pingo, but that refuses to do lossless optimization on 16bpc files at all.

To be fair, even now basically everyone has just 8-bit monitors and displays; over-8bpc files are generally for production, where you might still need to edit something, so you won't lose any detail if you decide to adjust colors, contrast or other things before the final render.


It's probably worth mentioning that Newgrounds has started to convert larger images to webp with their new gallery update: https://www.newgrounds.com/bbs/topic/1528343

It looks like you can still change the file extension to png to get the original, but this is probably going to start tripping up the 5 people that use the site

plsignore said:
It's probably worth mentioning that Newgrounds has started to convert larger images to webp with their new gallery update: https://www.newgrounds.com/bbs/topic/1528343

It looks like you can still change the file extension to png to get the original, but this is probably going to start tripping up the 5 people that use the site

e621 strips the extension, so if webp was allowed that wouldn't be a problem, but from this discussion I am left pondering whether it is ever gonna happen, or whether it should even happen.

I could imagine that if any admin saw this debate they'd now be more hesitant than ever to add this as a feature; I would be. At one point I was thinking it would be better to disallow webp and have webps converted to png than to allow the format at all.

The only reason webp should be available as an option during upload is its prevalence, and nothing else. Each time I come across a webp (and that has occurred more than a couple of times) I was forced to convert it into png, which I'm sure bloats the filesize significantly for no real benefit.

It's too bad the discussion went from "webp option would be neat and quicken things up" to "compression, compress this, compress that". This insistence on compressing files and converting all png files to webp for the sole purpose of compressing them "losslessly" still baffles me. This is an art archive, not a blog; this serves no purpose, and by default files are already served compressed, so I have to manually download them or select "Original" to see the full picture. I am not too fond of compression in general; I had to download zopfli and optipng just for this specific artwork that exceeded the 75 MB limit.

And this soured my and other folks' view of webp as an available option, because imagine this: if e621 allows webp, then some people will intentionally convert pngs into webps to get the smallest file possible. Worst part is, it is not lossless like AaronFranke claimed; after doing some research, webp is actually a very restrictive file format. I don't remember the exact numbers, but I know webp does NOT support 16 bits per channel; I heard it supports 10 and 12 bpc, but not 16. That's on top of not supporting files that are too tall or too large.

In short, I reckon it is important to allow this format for its prevalence, but new guidelines should be formed around it, e.g.: if the source file is a png and the file uploaded is a webp, we should treat it the same way as if someone posted a jpg over a png, and it should get swiftly taken down.


wolfmanfur said:
It's too bad the discussion went from "webp option would be neat and quicken things up" to "compression, compress this, compress that". This insistence on compressing files and converting all png files to webp for the sole purpose of compressing them "losslessly" still baffles me.

Some people have data limits on their mobile plan. I don't think people should be using mobile data to browse if they can help it, but empirically it's obvious that they do.
Also, servers generally have large amounts of storage available, but this doesn't mean the people downloading the images do. A 10-25% saving adds up fairly quickly.
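Back-of-envelope on how that saving adds up; the per-image average and session size below are made-up illustrative numbers, not site statistics:

```python
# Hypothetical browsing session: how much a 10-25% per-file saving amounts to.
AVG_IMAGE_MB = 2.5        # assumed average file size (illustrative)
IMAGES_PER_SESSION = 200  # assumed images viewed per session (illustrative)

session_mb = AVG_IMAGE_MB * IMAGES_PER_SESSION
for saving in (0.10, 0.25):
    print(f"{saving:.0%} saving: {session_mb * saving:.0f} MB off a {session_mb:.0f} MB session")
```

On a capped mobile plan, 50-125 MB per session under these assumptions is a real difference, which is the point being made above.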

E621 discourages transcoding, but IMO artists should definitely consider providing WebPs themselves.

by default files are already served compressed, I have to manually download them or select "Original" to see the full picture

Sample images are "compressed" in some sense, but not in the 'lossless/lossy compression' sense. They are downsampled.

And this soured my and other folks' view of webp as an available option, because imagine this: if e621 allows webp, then some people will intentionally convert pngs into webps to get the smallest file possible.

This is true.

It also applies to the jpeg file format, except in that case loss is guaranteed, whereas to my knowledge loss is simple to avoid for 99% of the images you might choose to transcode to WebP -- just choose lossless. Any question of loss would mainly apply to metadata (but this applies generally; any conversion from one file format to another is prone to subtle metadata loss).

Worst part is, it is not lossless like AaronFranke claimed

[Citation required] -- This is an official spec which is implemented by the reference encoder (libwebp) and has been for some time. This is the main WebP encoder in use (eg. if you save as WebP in GIMP, the encoding is done by libwebp).

AFAICS WebP can encode losslessly the pixels of any 8bpc image, either RGB or RGBA. Images on the web that don't fit into that category are rare.

That's on top not supporting files that are too tall or too large.

With the reference encoder, this is a non-issue; attempting to encode an image with excessive width or height does not produce any image at all, just a 0-byte file. There is no chance of this being mistaken for a legitimate image.
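A sketch of what a pre-flight check could look like, assuming libwebp's documented per-axis cap of 16383 pixels (its WEBP_MAX_DIMENSION constant); this is an illustration, not e621 or libwebp code:

```python
# Check an image's dimensions against libwebp's per-axis limit before
# attempting an encode, instead of discovering the failure afterwards
# via an empty output file.
WEBP_MAX_DIMENSION = 16383  # libwebp's maximum width/height in pixels

def fits_in_webp(width: int, height: int) -> bool:
    """True if libwebp can encode an image of this size at all."""
    return 0 < width <= WEBP_MAX_DIMENSION and 0 < height <= WEBP_MAX_DIMENSION

print(fits_in_webp(4000, 3000))      # True  - typical artwork
print(fits_in_webp(100000, 100000))  # False - the hypothetical monster image
```

An upload pipeline doing transcoding would presumably run a check like this and fall back to PNG/JPEG for the rare oversized image.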

The most optimistic interpretation I can make of this particular argument is that 'we shouldn't support WebP as it will be relevant for only a limited time (as resolutions continue to increase or the size of displays continues to increase)'.


alphamule

Privileged

wolfmanfur said:
It's too bad the discussion went from "webp option would be neat and quicken things up" to "compression, compress this, compress that". This insistence on compressing files and converting all png files to webp for the sole purpose of compressing them "losslessly" still baffles me. This is an art archive, not a blog; this serves no purpose, and by default files are already served compressed, so I have to manually download them or select "Original" to see the full picture. I am not too fond of compression in general; I had to download zopfli and optipng just for this specific artwork that exceeded the 75 MB limit.

What? Oh, for thumbnails and previews it's not bad. An archive site is still gonna be an archive site with the originals. Can't you set it to just show original images, anyway? I use Raccoony, and that automatically saves the original. It even lets me sort by artist name, and it saves the description.

And this soured my and other folks' view of webp as an available option, because imagine this: if e621 allows webp, then some people will intentionally convert pngs into webps to get the smallest file possible. Worst part is, it is not lossless like AaronFranke claimed; after doing some research, webp is actually a very restrictive file format. I don't remember the exact numbers, but I know webp does NOT support 16 bits per channel; I heard it supports 10 and 12 bpc, but not 16. That's on top of not supporting files that are too tall or too large.

In short, I reckon it is important to allow this format for its prevalence, but new guidelines should be formed around it, e.g.: if the source file is a png and the file uploaded is a webp, it should be treated the same way as if someone posted a jpg over a png, and get swiftly taken down.

I agree that if it's being used blindly, it should get treated like the mutt conversions that people call videos. Hmm, so I just assumed that since VP9 had a lossless option, there was a lossless WebP; I definitely need to research this. However, if someone's outputting 10 or 12 bits per channel instead of just 8 or 16, isn't that already an advantage, balance-wise, for lossy images/video?

savageorange said:
[Citation required] -- This is an official spec which is implemented by the reference encoder (libwebp) and has been for some time. This is the main WebP encoder in use (eg. if you save as WebP in GIMP, the encoding is done by libwebp).

AFAICS WebP can encode losslessly the pixels of any 8bpc image, either RGB or RGBA. Images on the web that don't fit into that category are rare.

With the reference encoder, this is a non-issue; attempting to encode an image with excessive width or height does not produce any image at all, just a 0-byte file. There is no chance of this being mistaken for a legitimate image.

The most optimistic interpretation I can make of this particular argument is that 'we shouldn't support WebP as it will be relevant for only a limited time (as resolutions continue to increase or the size of displays continues to increase)'.

Ah, in next reply. ;)
Yeah, and obviously, if that became a thing, they'd make a revision to the format, like with anything else. I'd be shocked to see a 1-gigapixel display in someone's living room, though. That's less a display than a poor attempt at a microlithography projector. ;)

https://developers.google.com/speed/webp/docs/riff_container Reading this part right now. It seems to have 24-bit fields for width and height, and a file is sadly limited to 4 GiB minus 10 bytes of actual image data. Oh noes, what will we ever do? How will I have my 281-trillion-pixel display show it full screen?! Drat, the lossless image chunk itself is limited to 14 bits per axis, though. :(


alphamule said:

https://developers.google.com/speed/webp/docs/riff_container Reading this part right now. It seems to have 24-bit fields for width and height, and a file is sadly limited to 4 GiB minus 10 bytes of actual image data. Oh noes, what will we ever do? How will I have my 281-trillion-pixel display show it full screen?! Drat, the lossless image chunk itself is limited to 14 bits per axis, though. :(

They used to have a system for tiling 16383x16383 'tiles' into a larger image; it may be related to that. (The tiling system is currently disabled, IIRC.)

alphamule

Privileged

savageorange said:
They used to have a system for tiling 16383x16383 'tiles' into a larger image; it may be related to that. (The tiling system is currently disabled, IIRC.)

That is apparently a bug, too. The format itself specifies that the value 16383 in that field stores a dimension of 16384. This sort of thing is very likely why they had a buffer overrun exploit found this month. The compression library can't create files with a value above 16382 in that 14-bit unsigned integer, right?

Tiles would probably have to work like animation frames. That is, they'd fill in gaps one at a time. Essentially like a multi-part image.

Actually, the bug was in the decompression code: https://www.helpnetsecurity.com/2023/09/27/cve-2023-5129/
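The minus-one field encoding under discussion can be shown in a few lines; this is a sketch of the VP8L convention (dimension minus one stored in a 14-bit field), not actual libwebp code:

```python
# VP8L stores (dimension - 1) in a 14-bit field, so the stored value
# 16383 decodes to 16384 pixels, 0 decodes to 1, and a zero-size image
# is unrepresentable by construction.
FIELD_BITS = 14

def encode_dim(pixels: int) -> int:
    """Pixels (1..16384) -> 14-bit stored value (0..16383)."""
    assert 1 <= pixels <= 2 ** FIELD_BITS
    return pixels - 1

def decode_dim(stored: int) -> int:
    """14-bit stored value -> pixels."""
    return stored + 1

print(decode_dim(16383))  # 16384 - the format's true per-axis maximum
print(decode_dim(0))      # 1
```

The mismatch alphamule points out is exactly here: the bitstream can express 16384, while the reference library caps actual dimensions below that, so a decoder that trusts the field blindly handles sizes the encoder never produces.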

savageorange said:
[Citation required] -- This is an official spec which is implemented by the reference encoder (libwebp) and has been for some time. This is the main WebP encoder in use (eg. if you save as WebP in GIMP, the encoding is done by libwebp).

AFAICS WebP can encode losslessly the pixels of any 8bpc image, either RGB or RGBA. Images on the web that don't fit into that category are rare.

Your own link says it uses 8 bits per channel. So information from the png would have been lost if it was converted to webp.

Here's where I heard of 10 and 12 bits per channel: https://avif.io/blog/comparisons/avif-vs-webp2/ Reading it again, that might not have had anything to do with color depth, and it's AVIF that supports 12 bits.

If I had let AaronFranke post the 8-bits-per-channel png, it would have only been 12 gb in size. That's 60 gb of difference; same story if the png had been converted to jpeg or webp. In effect, all modes of webp are lossy, because of its restrictions, if you convert from png to it.

wolfmanfur said:
In effect, all modes of webp are lossy because of its restrictions if you convert from png to it.

For this exact png, yes. For almost all other pngs on the web, no.


alphamule

Privileged

wolfmanfur said:
Your own link says it uses 8 bits per channel. So information from the png would have been lost if it was converted to webp.

Here's where I heard of 10 and 12 bits per channel: https://avif.io/blog/comparisons/avif-vs-webp2/ Reading it again, that might not have had anything to do with color depth, and it's AVIF that supports 12 bits.

If I had let AaronFranke post the 8-bits-per-channel png, it would have only been 12 gb in size. That's 60 gb of difference; same story if the png had been converted to jpeg or webp. In effect, all modes of webp are lossy, because of its restrictions, if you convert from png to it.

MB?

Anyways, this historic post's comments on WEBM seem relevant: post #511153

I've been seeing a lot of PNGs that are censored where I wonder if messing with the alpha mask would unhide the pixels. The same issue happened with PDFs and other formats.
