What makes JPEG files so special? Discover the technical magic that keeps them at the forefront of digital photography.
A version of this post originally appeared on Tedium, Ernie Smith’s newsletter, which hunts for the end of the long tail.
For roughly three decades, the JPEG has been the World Wide Web’s primary image format. But it wasn’t the one the Web started with. In fact, the first mainstream graphical browser, NCSA Mosaic, didn’t initially support inline JPEG files—just inline GIFs, along with a couple of other formats now lost to history. However, the JPEG had many advantages over the format it quickly usurped.
Despite not appearing together right away—inline JPEG support first arrived in Netscape in 1995, three years after the image standard was officially published—the JPEG and the web browser fit together naturally. JPEG files degraded more gracefully than GIFs, retaining more of the picture’s initial form—and that allowed the format to scale to greater levels of success. While it wasn’t capable of animation, it progressively expanded from something a modem could pokily render to a format good enough for high-end professional photography.
For the internet’s purposes, the degradation was the important part. But it wasn’t the only thing that made the JPEG immensely valuable to the digital world. An essential part was that it was a documented standard built by numerous stakeholders.
How important is it that JPEG was a standard? Let me tell you a story.
During a 2013 New York Times interview conducted just before he received an award honoring his creation, GIF creator Steve Wilhite stepped into a debate he unwittingly created. Simply put, nobody knew how to pronounce the acronym for the image format he had fostered, the Graphics Interchange Format. He used the moment to attempt to set the record straight—it was pronounced like the peanut butter brand: “It is a soft ‘G,’ pronounced ‘jif.’ End of story,” he said.
I posted a quote from Wilhite on my popular Tumblr around that time, a period when the social media site was the center of the GIF universe. And soon afterward, my post got thousands of reblogs—nearly all of them disagreeing with Wilhite. Soon, Wilhite’s quote became a meme.
The situation illustrates how Wilhite, who died in 2022, did not develop his format by committee. He could say it sounded like “JIF” because he built it himself. He was handed the project as a CompuServe employee in 1987; he produced the thing, and that was that. The initial document describing how it works? Dead simple. Thirty-eight years later, we’re still using the GIF—but it never rose to the same prevalence as the JPEG.
The JPEG, which formally emerged about five years later, was very much not that situation. Far from it, in fact—it’s the difference between a de facto standard and an actual one. And that proved essential to its eventual ubiquity.
We’re going to degrade the quality of this image throughout this article. At its full image size, it’s 13.7 megabytes. Irina Iriser
Built with input from dozens of stakeholders, the Joint Photographic Experts Group ultimately aimed to create a format that fit everyone’s needs. (Reflecting its committee-led roots, there would be no confusion about the format’s name—an acronym of the organization that designed it.) And when the format was finally unleashed on the world, it was the subject of a more than 600-page book.
JPEG: Still Image Data Compression Standard, written by IBM employees and JPEG organization stakeholders William B. Pennebaker and Joan L. Mitchell, describes a multimedia landscape held back by the lack of a way to balance the need for photorealistic images against the need for immediacy. Standardization, they believed, could fix this.
“The problem was not so much the lack of algorithms for image compression (as there is a long history of technical work in this area),” the authors wrote, “but, rather, the lack of a standard algorithm—one which would allow an interchange of images between diverse applications.”
And they were absolutely right. For more than 30 years, JPEG has made high-quality, high-resolution photography accessible in operating systems far and wide. Although we no longer need to compress JPEGs to within an inch of their life, having that capability helped enable the modern internet.
As the book notes, Mitchell and Pennebaker were given IBM’s support to pursue this research and work with the JPEG committee, and that support led them to develop many of the JPEG format’s foundational patents. As described in patents Mitchell and Pennebaker filed in 1988, IBM and other members of the JPEG standards committee, such as AT&T and Canon, were developing ways to use compression to make high-quality images easier to deliver in constrained settings.
Each member brought their own needs to the process. Canon, obviously, was more focused on printers and photography, while AT&T’s interests were tied to data transmission. Together, the companies left behind a standard that has stood the test of time.
All this means, funnily enough, that the first place a program capable of using JPEG compression appeared was not the Mac or Windows, but OS/2—a fascinating-but-failed graphical operating system created by Pennebaker and Mitchell’s employer, IBM. As early as 1990, OS/2 supported the format through the OS/2 Image Support application.
At 50 percent of its initial quality, the image is down to about 2.6 MB. By dropping half of the image’s quality, we brought it down to one-fifth of the original file size. Original image: Irina Iriser
The thing that differentiates a JPEG file from a PNG or a GIF is how the data degrades as you compress it. The goal for a JPEG image is to still look like a photo when all is said and done, even if some compression is necessary to make it all work at a reasonable size. That way, you can display something that looks close to the original image in fewer bytes.
Or, as Pennebaker and Mitchell put it, “the most effective compression is achieved by approximating the original image (rather than reproducing it exactly).”
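If you want to reproduce the kind of degradation shown in this article’s images, a few lines of Python with the Pillow library will do it. (This is a rough sketch; the filename is a placeholder, and this is not the tooling used for these figures.)

    from PIL import Image

    img = Image.open("forest.jpg")  # placeholder input file
    for quality in (50, 5, 1):
        # A lower quality setting tells the encoder to discard more
        # detail in each block, shrinking the file accordingly.
        img.save(f"forest_q{quality}.jpg", quality=quality)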
Central to this is a compression process called the discrete cosine transform (DCT), a lossy form of compression encoding heavily used in all sorts of compressed formats, most notably in digital audio and signal processing. Essentially, it delivers a lower-quality product by removing details, while still keeping the heart of the original product through approximation. The transform itself discards almost nothing; the loss comes from quantizing the frequency coefficients it produces, and the coarser that quantization, the more compressed the final result.
The algorithm, developed by researchers in the 1970s, essentially takes a grid of pixels and re-expresses it as a stack of cosine-wave frequencies, as if you’re controlling its level of detail with a knob. The data rate is controlled like water from a faucet: The more data you want, the higher the setting. DCT allows a trickle of data to still come out in highly compressed situations, even if it means a slightly compromised result. In other words, you may not keep all the data when you compress it, but DCT allows you to keep the heart of it.
(See this video for a more technical but still somewhat easy-to-follow description of DCT.)
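To make the knob-and-faucet metaphor concrete, here is a minimal sketch in Python with NumPy and SciPy. It illustrates the idea on a single 8×8 block; a real JPEG encoder layers per-frequency quantization tables, zigzag ordering, and entropy coding on top of this:

    import numpy as np
    from scipy.fft import dctn, idctn

    # A smooth 8x8 block of pixel values, like one tile of a photo.
    block = np.add.outer(np.arange(8.0), np.arange(8.0)) * 15.0

    coeffs = dctn(block, norm="ortho")  # pixels -> 64 frequency terms

    # "Turning the knob": a bigger quantization step throws away more
    # of the small, high-frequency coefficients.
    step = 40.0
    quantized = np.round(coeffs / step) * step

    approx = idctn(quantized, norm="ortho")  # approximate the original
    print("coefficients kept:", np.count_nonzero(quantized), "of 64")
    print("max pixel error:", float(np.abs(block - approx).max()))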
DCT is everywhere. If you have ever seen a streaming video or an online radio stream that degraded in quality because your bandwidth suddenly declined, you’ve witnessed DCT being utilized in real time.
A JPEG file doesn’t have to leverage the DCT with just one method, as JPEG: Still Image Data Compression Standard explains:
The JPEG standard describes a family of large image compression techniques, rather than a single compression technique. It provides a “tool kit” of compression techniques from which applications can select elements that satisfy their particular requirements.
The toolkit has four modes:

- Sequential DCT, which encodes the image in a single top-to-bottom pass
- Progressive DCT, which encodes a coarse version of the full image first, then sharpens it over successive passes
- Lossless, which uses prediction rather than the DCT to preserve the image exactly
- Hierarchical, which encodes the image at multiple resolutions

At the time the JPEG was being created, modems were extremely common. That meant images loaded slowly, making progressive DCT the most fitting mode for the early internet. Over time, the progressive DCT mode has become less common, as many computers can simply load the sequential DCT in one fell swoop.
That same forest, saved at 5 percent quality. Down to about 419 kilobytes. Original image: Irina Iriser
When an image is compressed with DCT, the change tends to be less noticeable in busier, more textured areas of the picture, like hair or foliage. Those areas are harder to compress, which means they keep their integrity longer. It tends to be more noticeable, however, with solid colors or in areas where the image sharply changes from one color to another—like text on a page. Ever screenshot a social media post, only for it to look noisy? Congratulations, you just made a JPEG file.
Other formats, like PNG, do better with text, because their compression is lossless. (Side note: PNG’s compression format, DEFLATE, was designed by Phil Katz, who also created the ZIP format. The PNG format uses it in part because it was a license-free compression format. So it turns out the brilliant coder with the sad life story improved the internet in multiple ways before his untimely passing.)
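You can see both effects with a synthetic test image standing in for text, sketched here with Pillow and NumPy (the image and sizes are arbitrary):

    import io
    import numpy as np
    from PIL import Image

    # A hard-edged, text-like image: black bars on a white field.
    arr = np.full((256, 256), 255, dtype=np.uint8)
    arr[:, ::8] = 0  # a sharp edge every 8 pixels
    img = Image.fromarray(arr)

    jpg, png = io.BytesIO(), io.BytesIO()
    img.save(jpg, "JPEG", quality=75)  # rings around every sharp edge
    img.save(png, "PNG")  # lossless, and tiny: identical rows compress well
    print("JPEG:", jpg.tell(), "bytes / PNG:", png.tell(), "bytes")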
In many ways, the JPEG is one tool in our image-making toolkit. Despite its age and maturity, it remains one of our best options for sharing photos on the internet. But it is not a tool for every setting—despite the fact that, like a wrench sometimes used as a hammer, we often leverage it that way.
The JPEG format gained popularity in the ’90s for reasons beyond the quality of the format. Patents also played a role: Starting in 1994, the tech company Unisys attempted to bill individual users of GIF files, whose LZW compression was covered by a patent the company owned. This made the free-to-use JPEG more popular. (This situation also led to the creation of the patent-free PNG format.)
While the JPEG was standards-based, it could still have faced the same fate as the GIF, thanks to the quirks of the patent system. A few years before the file format came to life, a pair of Compression Labs employees filed a patent application that dealt with the compression of motion graphics. By the time anyone noticed its similarity to JPEG compression, the format was ubiquitous.
Our forest, saved at 1 percent quality. This image is only about 239 KB in size, yet it’s still easily recognizable as the same photo. That’s the power of the JPEG. Original image: Irina Iriser
Then in 1997, a company named Forgent Networks acquired Compression Labs. The company eventually spotted the patent and began filing lawsuits over it, a series of events it saw as a stroke of good luck.
“The patent, in some respects, is a lottery ticket,” Forgent Chief Financial Officer Jay Peterson told CNET in 2005. “If you told me five years ago that ‘You have the patent for JPEG,’ I wouldn’t have believed it.”
While Forgent’s claim of ownership of the JPEG compression algorithm was tenuous, it ultimately saw more success with its legal battles than Unisys did. The company earned more than $100 million from digital camera makers before the patent finally ran out of steam around 2007. The company also attempted to extract licensing fees from the PC industry. Eventually, Forgent agreed to a modest $8 million settlement.
As the company took an increasingly aggressive approach to its acquired patent, it began to lose battles both in the court of public opinion and in actual courtrooms. Critics pounced on examples of prior art, while courts limited the patent’s use to motion-based uses like video.
By 2007, Forgent’s compression patent had expired—and its litigation-heavy approach to business went away with it. That year, the company became Asure Software, which now specializes in payroll and HR software. Talk about a reboot.
The JPEG file format has served us well, and it’s been difficult to remove from its perch. The JPEG 2000 format, for example, was intended to supplant it by offering more lossless options and better performance. That format is widely used by the Library of Congress and specialized sites like the Internet Archive; however, it is less popular as an end-user format.
See the forest JPEG degrade from its full resolution to 1 percent quality in this GIF. Original image: Irina Iriser
Other image technologies have had somewhat more luck getting past the JPEG format. The Google-supported WebP is popular with website developers (and controversial with end users). Meanwhile, the formats AVIF and HEIC, each developed by standards bodies, have largely outpaced JPEG 2000.
Still, the JPEG will be difficult to kill at this juncture. These days, the format is similar to MP3 or ZIP files—two legacy formats too popular and widely used to kill. Other formats that compress the files better and do the same things more efficiently are out there, but it’s difficult to topple a format with a 30-year head start.
Shaking off the JPEG is easier said than done. I think most people will be fine to keep it around.
Ernie Smith is the editor of Tedium, a long-running newsletter that hunts for the end of the long tail.
How in the world do people store images / photos nowadays?
Just as there is a clear winner for video - av1 - there seems to be nothing in the way of "this is clearly the future, at least for the next few years" when it comes to encoding images.
JPEG is... old, and it shows. The filesizes are a bit bloated, which isn't really a huge problem with modern storage, but the quality isn't great.
JPEG-XL seemed like the next logical step until Google took their toys and killed it despite already having the support in Chrome, which pretty much makes it dead in the water (don't you just love monopolies making decisions for you?)
HEIC is good, as long as you pinky promise to never ever leave Apple's ecosystem, ie HEIC sucks.
AVIF seems computationally expensive and the support is pretty spotty - 8bit yuv420 might work, but 10b or yuv444 often doesn't. Windows 10 also chokes pretty hard on it.
Alternatives like WebP might be good for browsers but are nigh-unworkable on desktops; support is very spotty.
PNG is cheap and support is ubiquitous but filesizes become sky-high very quick.
So what's left? I have a whole bunch of .HEIC photos and I'd really like if Windows Explorer didn't freeze for literal minutes when I open a folder with them. Is jpeg still the only good option? Or is encoding everything in jpeg-xl or avif + praying things get better in the future a reasonable bet?
> JPEG-XL seemed like the next logical step until Google took their toys and killed it despite already having the support in Chrome, which pretty much makes it dead in the water (don't you just love monopolies making decisions for you?)
It's worth noting that Firefox is willing to adopt JPEG-XL[1] as soon as the rust implementation[2] is mature. And that rust impl is a direct port of the reference C++ implementation[3]. Mac OS and Safari already support JPEG-XL[4]. And recently Windows picked up JPEG-XL support[5]. The only blockers at this point are Firefox, Chromium, and Android. If/when Firefox adopts JPEG-XL, we'll probably see google follow suit, if only out of pressure from downstream Chromium platforms wanting to adopt it to maintain parity.
So really if you want to see JPEG-XL get adopted, go throw some engineering hours at the rust implementation [2] to help get it up to feature parity with the reference impl.
-----
1. https://github.com/mozilla/standards-positions/pull/1064
2. https://github.com/libjxl/jxl-rs
3. https://github.com/libjxl/libjxl
4. https://www.theregister.com/2023/06/07/apple_safari_jpeg_xl/
5. https://www.windowslatest.com/2025/03/05/turn-on-jpeg-xl-jxl...
g**gle is hellbent on killing JPEG-XL support in favor of WebP. assuming they'll capitulate to downstream pressure is a stretch. this article [0] sums it up nicely:
What this [removal of support for JPEG-XL in Chromium] really translates to is, “We’ve created WebP, a competing standard, and want to kill anything that might genuinely compete with it”. This would also partly explain why they adopted AVIF but not JPEG XL. AVIF wasn’t superior in every way and, as such, didn’t threaten to dethrone WebP.
[0] https://vale.rocks/posts/jpeg-xl-and-googles-war-against-it
I'm not assuming they'll capitulate under just pressure. Rather, I'm assuming they'll capitulate if a majority of, or even all of, the big third-party chromium browsers push for adding it to mainline chromium.
This is less about blind pressure and more about the risk that google becomes seen as an untrustworthy custodian of chromium, and that downstreams start supporting an alternate upstream outside of google's control.
Jxl is certainly a hill that google seems intent to stand on but I doubt it's one they'd choose to die on. Doubly so given the ammo it'd give in the ongoing chrome anti-trust lawsuits.
How is Google so intent on webp winning? They don't even support it in their own products besides Chrome.
You either die a hero or live long enough to become the MS Office team.
But the MS Office team is easier to understand: they want money.
Chrome is a "free funnel" into Google services, just like Android.
That $20B a year that Google pays to Apple? Think of that, but with a bigger traffic stream. That is Chrome, so Chrome is worth MORE than $20B a year to Google.
https://www.theverge.com/2024/5/2/24147007/google-paid-apple...
Well not directly
The risks and downsides of exposing an image decoder to the entire web are very real, especially a relatively new/untested one written in a language like C++. There have been vulnerabilities in pretty much every other image decoder and I fully expect jpeg-xl to be no different. You can't just brush that aside. Hell, the article doesn't even acknowledge it. Google has no real stake in webp vs. jpeg-xl either. You may disagree with the decision, but this kind of stuff doesn't make much sense.
> HEIC is good, as long as you pinky promise to never ever leave Apple's ecosystem, ie HEIC sucks.
HEIC was developed by the MPEG folks and is an ISO standard, ISO/IEC 23008-12:2022:
* https://www.iso.org/standard/83650.html
* https://en.wikipedia.org/wiki/High_Efficiency_Image_File_For...
An HEIC image is generally a still frame from ITU-T H.265† (HEVC):
* https://www.geeky-gadgets.com/heif-avc-h-264-h-265-heic-and-...
OS support includes Windows 10 v1803, Android 10+, Ubuntu 20.04, Debian 10, Fedora 36. Lots of cameras and smartphones support it as well.
There's nothing Apple-specific about it. Apple went through the process of licensing H.265, so they got HEIC 'for free' and use it as the default image format because, compared to JPEG, it supports HDR, >8-bit colour, etc.
†Like WebP was similar to an image/frame from a VP8 video.
> OS support includes Windows 10 v1803
Ba-ha-ha... ha-ha... no.
Support is virtually non-existent. Every year or so, I try to use my Windows PC to convert a RAW photo taken with a high-end Nikon mirrorless camera to a proper HDR photo (in any format) and send it to my friends and family that use iDevices.
This has been literally impossible for the last decade, and will remain impossible until the heat death of the universe.
Read-only support is totally broken in a wide range of apps, including Microsoft-only apps. There are many Windows imaging APIs, and I would be very surprised if more than one gained HEIC support. Which is probably broken.
Microsoft will never support an Apple format, and vice versa.
Every single new photo or video format in the last 25 years has been pushed by one megacorp, and adoption outside of their own ecosystem is close to zero.
JPEG-XL is the only non-megacorp format that is any good and got multi-vendor traction, which then turned into "sliding backwards on oiled ice". (Google removed support from Chromium, which is the end of that sad story.)
> Every year or so, I try to use my Windows PC to convert a RAW photo taken with a high-end Nikon mirrorless camera to a proper HDR photo (in any format) and send it to my friends and family that use iDevices.
> This has been literally impossible for the last decade, and will remain impossible until the heat death of the universe.
It's possible right now with gainmap jpegs. Adobe can create them, Android captures in them now, and Apple devices can even view them. Or if they can't yet, they will very soon; Apple announced support at the recent WWDC (Apple brands it "adaptive HDR")
There's something kinda hilariously ironic in the fact that, out of all these fancy new image codecs, be it HEIC, AVIF, or JPEG-XL, it's humble ol' JPEG that's the first to deliver not just portable HDR, but the best-quality HDR of any format of any kind
My Samsung Galaxy outputs HEIC by default afaik. It's configurable, and I can turn that off, but still HEIC is not apple specific.
>> OS support includes Windows 10 v1803
> Ba-ha-ha... ha-ha... no. […]
Feel free to hit "Edit" on the Wikipedia page and correct it then:
* https://en.wikipedia.org/wiki/High_Efficiency_Image_File_For...
> Microsoft will never support an Apple format, and vice versa.
Once again, it's not an Apple format: it was developed by MPEG and is published by ISO/IEC, just like H.264 and H.265.
Or do you think H.264 and H.265 are an "Apple format" as well?
> It's not an Apple format
Create a HDR HEIC file on anything other than an Apple Device.
Upload it to an Apple Device.
Now use it any way: Forward it, attach it to a message, etc...
This won't work.
It won't ever work because the "standard" is not what Apple implements. They implement a few very specific subsets that their specific apps produce, and nothing else.
Nobody else implements these specific Apple versions of HEIC. Nobody.
For example, Adobe Lightroom can only produce a HEIC file on an Apple device.
My Nikon camera can produce a HDR HEIC file in-body, but it is useless on an Apple device because it's too dark and if forwarded in an iMessage... too bright!
It's a shit-show, comparable to "IPv6 support" which isn't.
That's not an argument. HEIC is to HEVC what WebP is to WebM. The lack of support in other products is due to developers not picking up the pace and sticking with "GIF, JPEG and PNG is good enough".
> HEIC was developed by the MPEG folks
And the MPEG folks were so cool with video, all that licensing BS. Sounds great. No thanks!
Confusingly, there are two different MPEGs in this context.
MPEG the standards group is organized by ISO and IEC, along with JPEG.
The one you’re thinking of - MPEG LA, the licensing company - is a patent pool (which has since been subsumed by a different one[1]) that’s unaffiliated with MPEG the standards group.
So what good is it to have a separate entity doing the standard when the standard is unaffordable outside the top 500?
No, they’re in a patent pool. There’s what looks like a relatively up-to-date list at https://en.wikipedia.org/wiki/Via-LA#H.265/HEVC_licensors
5203 active patents for HEVC, 1480 for H.264. That's just plain insane! I get that video formats are complex, but complex enough to consist of 5000+ distinct, nontrivial inventions?
Many of those are just marginally related, and might not apply to the actual standard.
> And the MPEG folks were so cool with video, all that licensing BS. Sounds great. No thanks!
Not wrong, but this is a different topic/objection than the GP's 'being locked into Apple's ecosystem'.
And as the Wikipedia article for HEIC shows, there's plenty of support for the format, even in open source OSes.
* https://en.wikipedia.org/wiki/High_Efficiency_Image_File_For...
As far as I know, that's only support for the container format. You can't actually decode HEIC without also installing libde265, which you are supposed to have a license for. I'm not even sure how you'd go about getting an individual license.
> You can't actually decode HEIC without also installing libde265, which you are supposed to have a license for. I'm not even sure how you'd go about getting an individual license.
Debian doesn't seem to have a problem with it:
> Just as there is a clear winner for video - av1 - there seems to be nothing in the way of "this is clearly the future, at least for the next few years" when it comes to encoding images.
Say what? A random scan across the internet will reveal more videos in MP4 and H.264 format than av1. Perhaps streaming services have switched, but that is not what regular consumers usually use to make and store movies.
New compressed media formats always travel a decade-long path from either (1) obscurity → contender → universal support or (2) obscurity → contender → forgotten curiosity. AV1 is on one path, WebP is on another.
WebP remains fairly esoteric after 15 years, has always been a solution in search of a problem, and isn’t even universally supported in products by its owner.
AV1 was created and is backed by many companies via a non-profit industry consortium, solves real problems, and its momentum continues to grow. https://bitmovin.com/blog/av1-playback-support/
Funnily enough, JPEG 2000 support was eventually removed everywhere. I assume the only reason this didn't happen with WebP as well is Google pushing it and keeping it in Chrome.
AV1 is on the path to universal support and WebP is on the path to obscurity.
Apple CPUs have AV1 support in hardware.
Only support for decoding and from A17 Pro and M3 onwards, I believe? Going to be a few years before that's commonly available (he says from the work M1 Pro.)
[edit: clarify that it's decoding only]
I think you're arguing the same point—that there's plenty of support and it's arguably growing.
Yeah, I think I only just found out about av1 a few weeks ago with a video file that wouldn't play. Thought it was corrupted at first it's been so long since I saw something like that.
And H.264 is about to be patent free this year in many places.
I suspect there are even more H.265 than av1.
From what I've seen WebP is probably the strongest contender for a JPEG replacement. It's pretty common in the indie game scene, for example, to re-encode a game's JPEG assets to WebP for better image quality and often a significant (25% or more) savings on installer size. Support is coming, albeit somewhat slowly. It was pretty bad in Ubuntu 22, but several apps have added support in Ubuntu 24. Windows 11 supports WebP in Photos and Paint for another example.
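The re-encode itself is trivial; for example, with Pillow (placeholder paths, and assuming your Pillow build includes WebP support):

    from pathlib import Path
    from PIL import Image

    # Re-encode a folder of JPEG assets to WebP at quality 80.
    for src in Path("assets").glob("*.jpg"):
        Image.open(src).save(src.with_suffix(".webp"), quality=80)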
I hate webp. Not for any legitimate technical reason, but I often just want to download an image from the web for an image board, or drop it in a diagram or ppt or for a joke, and nothing works with that format. Nothing. Google image search is useless because of it.
Cmd+shift+4 is now the only way to grab an image out of a browser. Which is annoying.
It has made my life needlessly more complicated. I wish it would go away.
Maybe if browsers auto-converted when you dragged an image out of the browser window I wouldn’t care, but when I see webp… I hate.
Often (in my experience) WebP is served as a bait-and-switch even if the link ends with .jpg. So I use Curl to fetch the file, and since Curl doesn't send "Accept: image/webp" unless you tell it to, the server just gives you what you ask for.
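The same trick works outside curl; here's a sketch in plain Python, assuming a server that actually negotiates on the Accept header (the URL is a placeholder):

    import urllib.request

    url = "https://example.com/photo.jpg"  # placeholder

    # No Accept header advertising WebP, like curl's default:
    plain = urllib.request.urlopen(urllib.request.Request(url)).read()

    # Advertise WebP the way a browser does, and a negotiating
    # server may serve WebP bytes for the same .jpg URL:
    req = urllib.request.Request(url, headers={"Accept": "image/webp,*/*"})
    negotiated = urllib.request.urlopen(req).read()

    # JPEG files start with the FF D8 start-of-image marker.
    print("plain request got a real JPEG:", plain[:2] == b"\xff\xd8")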
I once edited Firefox config to make it pretend to not support WebP, and the only site that broke was YouTube.
lol I installed the firefox extension "Don't Accept image/webp" but I assume a lot of sites just ignore it
Webp images are right up there with the fake transparent PNGs you come across in Google Images.
My working model is that WebP images are generally a lossy copy of a PNG or a generation-loss transcoding of a JPG image. I know that lossless WebP technically exists but nobody uses it when they're trying to save bandwidth at the cost of the user.
Even if webp got better support later, I want it deprecated just as revenge for previously wasting my time.
That's true of any new format. Until everything supports it, it's not so great. iPhone saves .HEIC, which I have to convert to something else to be useful. It's not everywhere (not sure it ever will be).
Windows didn't use to show .jpgs in Windows Explorer. I know because I wrote a tool to generate thumbnail HTML pages to include on archive CDs of photos.
To solve this problem, some format has to "win" and get adopted everywhere. That format could be webp, but it will take 3-10 years before everything supports it. It's not just the OS showing it in its file viewer. It's its preview app supporting it. It's every web site that lets you upload an image (gmail/gmaps/gchat/facebook/discord/messenger/slack/your bank/apartment-rental-companies, etc..etc..etc..) It just takes forever to get everyone to upgrade.
When does a format stop being new? WebP was introduced fifteen years ago.
When it's widely adopted.
WebP gets pushed into your series of tubes without your consent, and the browser that you're most likely to use to view them just happens to be made by the same company that invented the codec. It's DivX and Real Media all over again.
Worst case you can open it up in Paint and save as JPEG.
Also, I just checked and Powerpoint has no problem dropping in a webp image. Gimp opens it just fine. You are right that web forums are often well behind the times on this.
> Worst case you can open it up in Paint and save as JPEG
If he's using ⌘⇧4 to take a screenshot, he probably isn't going to open it in Microsoft Paint.
Total agreement from me, I use this:
bin/webp2png:
#!/bin/bash
# Convert a WebP image to a PNG alongside it, using dwebp from libwebp;
# ${1%%.webp} strips the .webp suffix from the input filename.
dwebp "$1" -o "${1%%.webp}".png
I use ThumbsUp, a free utility from https://www.devontechnologies.com/apps/freeware, to convert webp/heic or whatever other inconvenient format.
Just drop the offending image onto the icon in the dock.
On the classic Macs there was a program called DropDisk. You could drag a disk image to it and it auto-mounted it. That suggests a tool for you: make a desktop app that you can drag and drop images onto, which converts them to jpeg and saves them in a folder.
If you want to do that so badly and hate webp so much, why not screenshot it? Then you don't have to care what format it's in on the browser
Pretty sure I've managed to configure my Firefoxes to act as if webp does not exist...
It's a constant battle though to keep those browser extensions updated, especially since Google decided that extensions cut into their profits and they essentially made them useless.
Photoshop has native WebP support now too!
Exactly, part of being a "superior format" is adoption. Until then, it's just another in a sea of potential.
> It's pretty common in the indie game scene
That's such a weak argument. If I was an indie game developer, I would use whatever obscure format would offer me the most benefit, since I control the pipeline from the beginning (raw TIFF/TGA/PNG/... files) to the end (the game that needs to have a decoder and will uncompress it into GPU memory). 20 minutes extra build-time on the dev machine is irrelevant when I can save hundreds of MBs.
However, that is not the benchmark for a format widely used on the internet. Encoding times multiply, as does the need to search for specialized software, and literally everyone else needs to support the format to be able to view those files.
Also webp support in browsers is looking pretty good these days: https://caniuse.com/webp
The last major browser to add support was Safari 16 and that was released on September 12, 2022. I see pretty much no one on browsers older than Safari 16.4 in metrics on websites I run.
People want to use images outside of browsers too.
What apps are you using in 2025 that handle images but doesn't support webp?
I can't think of any on my Fedora desktop for instance.
Lots of websites that expect me to upload images only accept jpeg and png.
Another one I recently interacted with are video backgrounds for zoom. Those apparently can only be jpeg, not even png
Luminar Neo, for example, doesn't handle webp. And there's more than just Fedora, IIRC.
Yeah, after seeing the logs I made the switch to webp earlier this year. As much as I hate to admit it (not a fan of Google), it’s a pretty big bandwidth savings for the same (or better) quality.
I switched to webp on my forum for avatars and other user image uploads.
With one format you get decent filesize, transparency, and animation which makes things much simpler than doing things like conditionally producing gifs vs jpegs.
.. or you can go directly to avif - https://caniuse.com/avif (93%) instead of webp (95% support).
I'd love to see a comparison of the computational expense of avif compared to webp. I know av1, which avif is based off, is pretty hard on the hardware.
https://cloudinary.com/blog/jpeg-xl-and-the-pareto-front
See various charts. For example, this table:
https://res.cloudinary.com/cloudinary-marketing/images/f_aut...
For decoding, the rough numbers from that table:

  WebP:                  ~70 Mpx/s
  JPEG:                  100-300 Mpx/s
  JPEG XL:               115-163 Mpx/s
  AVIF (single thread):  32-37 Mpx/s
  AVIF (multithreaded):  90-110 Mpx/s
FWIW, the day I discovered jpegli[1] I left WebP behind. Similar sizes to WebP while maintaining JPEG compatibility.
Normal people use jpeg. It's good enough, much like mpeg-2 was good enough for DVDs. Compatibility always beats storage efficiency.
Photography nerds will religiously store raw images that they then never touch. They're like compliance records.
I think most photography nerds who want to save edited images to a lossless format will use TIFF, which is very different from the proprietary "raw" files that come out straight out of the camera.
Most raw files are TIFF with proprietary tags.
You'd be wrong in my experience.
No photog nerd wants EVEN MORE POSTPROCESSING.
I don't understand. You've got to save the edited result in a file somehow. What format do you use?
The file as it comes out of the camera, so-called raw, is a family of formats. Usually such files are kept untouched and any edits are saved in a lightweight settings file (in the format of your editing app) alongside the original.
And a lot of RAW formats are adopting, or considering adopting, JPEG Lossless as a codec.
Is that like how javascript was named so as to imply a connection with java, in spite of there being none?
JPEG is the ur-example of lossy compression. JPEG Lossless can't have any connection with that.
I'm not even really a hobbyist photographer anymore, but when I was, the full lossless edit was a .psd and that was generally exported to (high quality) jpg for distribution. I have folders full of carefully curated raws. For the relatively few that were ever edited they have an accompanying psd. The jpgs are ephemeral and don't get saved long term.
Normal people just use whatever the default on their phone is. Which for iPhone is HEIC, not sure about Android, AVIF?
> not sure about Android, AVIF?
JPEG, or fancier jpeg: https://developer.android.com/media/platform/hdr-image-forma...
> How in the world do people store images / photos nowadays?
Well, as JPEGs? Why not? Quality is just fine if you don't touch the quality slider in Photoshop or other software.
For "more" there's still lossless camera RAW formats and large image formats like PSD and whatnot.
JPEG is just fine.
I wonder how much of JPEG good quality is that we are quite accustomed to its artefacts.
I've never seen JPEG artifacts on images modified/saved 5 or fewer times. Viewing on a monitor including at 100%, printing photos, whatever - in practice the artifacts don't matter.
jpeg artifacts mainly show up on drawings. where they seriously degrade masking operations. which is a hobby of mine. so I always appreciate it when a drawing is a png. rather than a bunch of jpeg goop.
At high quality, the artifacts are not visible unless you take a magnifying glass to the pixels, which is a practice anathema to enjoying the photo.
I am referring to highly compressed images or low resolution ones, at high bitrates mostly all formats look the same.
what i mean is that jpeg squarish artifacts look ok while av1 angular artifacts look distorted
JPEG artifacts are less disturbing because they're so obviously artificial. WEBP and similar artifacts look more natural, which makes them harder to ignore.
I think I agree, low quality JPEGs give the idea of looking through slightly imperfect glass, WEBP and AV1 look a bit more like bad AI
For non-photographic images, I'm horribly sensitive to the ringing artifacts. Thankfully, there's waifu2x (in denoise mode only) to remove those when textures don't confuse it too much, and I use MozJPEG to encode, which really improves the result.
There's something to be said about this. A high quality JPEG after cleanup can sometimes be larger than an ARW (sony RAW) on export and it makes no sense to me.
For my extensive collection of photography, I export to JPEG-XL and then convert to JPEG for use online. Most online services, like Flickr, Instagram, et al., don't support JPEG-XL, but there's almost no quality loss converting from JPEG-XL to JPEG vs exporting to JPEG directly from your digital asset management system, and storing locally in JPEG-XL works very well. Almost all desktop tools I use support JPEG-XL natively already; conversely, almost nothing supports WEBP.
There is NO quality loss when converting from JPEG XL to JPEG and vice versa. It was done by design. Not an accident.
You're confusing jpg>jxl>jpg, which can be done losslessly via a special mode, and jxl > jpg, which can't (even ignoring all the extra features of jxl that jpg doesn't support)
this isn't true. there's no loss from jpeg to jpeg-xl (if you use the right mode), but the reverse is not true
I'm sorry to say that you are wrong about this.
> Key features of the JPEG XL codec are:
> lossless JPEG transcoding,
> Moreover, JPEG XL includes several features that help transition from the legacy JPEG coding format. Existing JPEG files can be losslessly transcoded to JPEG XL files, significantly reducing their size (Fig. 1). These can be reconstructed to the exact same JPEG file, ensuring backward compatibility with legacy applications. Both transcoding and reconstruction are computationally efficient. Migrating to JPEG XL reduces storage costs because servers can store a single JPEG XL file to serve both JPEG and JPEG XL clients. This provides a smooth transition path from legacy JPEG platforms to the modern JPEG XL.
https://ds.jpeg.org/whitepapers/jpeg-xl-whitepaper.pdf
If you need more proof, you can transcode a JPEG to JPEG XL and convert it back to JPEG. The resulting image will be BINARY IDENTICAL to the original image.
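You can check this end to end with the libjxl command-line tools (filenames are placeholders), sketched here via Python's subprocess:

    import filecmp
    import subprocess

    # For JPEG input, cjxl defaults to lossless transcoding and stores
    # the data needed to reconstruct the original JPEG bitstream.
    subprocess.run(["cjxl", "photo.jpg", "photo.jxl"], check=True)
    subprocess.run(["djxl", "photo.jxl", "roundtrip.jpg"], check=True)

    # True: byte-for-byte identical to the original JPEG.
    print(filecmp.cmp("photo.jpg", "roundtrip.jpg", shallow=False))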
However, perhaps you are talking about an image in JPEG XL, using features only in JPEG XL (24 bit, HDR, etc...) that obviously couldn't be converted in a lossless way to a JPEG.
Yes, JPG to JPEG XL and back is lossless, but the reverse is nowhere mentioned.
Playing around with some jpg and jxl files, I cannot convert jxl losslessly to jpg files even if they are only 8bit. The jxl files transcoded from jpg files show "JPEG bitstream reconstruction data available" with jxlinfo, so I think some extra metadata is stored when going from jpg to jxl to make the lossless transcoding possible. I can imagine not supporting the reverse (which is pretty useless anyway) allowed for more optimizations.
JPG to JXL is lossless, and will save around 20%
JXL to JPG is lossless as in a bit-for-bit identical file can be generated
> JXL to JPG is lossless
only if you got the JXL from JPG.
> However, perhaps are you talking about an image on JPEG XL, using features only in JPEG XL (24 bit, HDR, etc...) that obviously couldn't be converted in a lossless way to a JPEG.
A lot of those features (non-8×8 DCTs, Gaborish and EPF filters, XYB) are enabled by default when you compress a non-JPEG image to a lossy JXL. At the moment, you really do need to compress to JPEG first and then transcode that to JXL if you want the JXL→JPEG direction to be lossless.
> However, perhaps are you talking about an image on JPEG XL, using features only in JPEG XL (24 bit, HDR, etc...) that obviously couldn't be converted in a lossless way to a JPEG.
So he was not wrong about this. You have perfect JPEG -> JPEG XL conversion, but not the other way around.
default jpeg-xl uses a different color space (XYB), bigger transforms (up to 256x256), rectangular transforms, etc. if you go from jpg to jxl, you can go back (but your jxl file will be less efficient), but if you compress directly to jxl, you can't losslessly go to jpg
That's good to know. I'm not an image format expert, but I couldn't see any loss that was visually discernible at any rate.
People often forget that PNG images can be compressed in a lossy manner to keep the filesize down, not quite as well as jpegs but still quite substantially.
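The usual lossy step is palette quantization; here's a minimal sketch with Pillow (pngquant does the same thing with much smarter dithering; filenames are placeholders):

    from PIL import Image

    # The PNG encoding itself stays lossless; cutting the image down
    # to a 256-colour palette is the lossy part that shrinks the file.
    Image.open("in.png").quantize(colors=256).save("out.png")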
Tiff if you want to archive them and they started as raw or tiff, jpeg for everything else. If the file is already jpeg, there is no point in converting it to a new better-quality format; the quality won't get better than it already is.
It may be obsolete, but it is ubiquitous. I care less about cutting edge tech than I do about the probability of being able to open it in 20+ years. Storage is cheap.
Presentation is a different matter and often should be a different format than whatever you store the original files as.
And jpg isn’t that bad when encoded at high quality, and not saved repeatedly.
I took a tiff and saved it as a high-quality jpg. Loaded both into photoshop and “diffed” them (basically subtracted both layers). After some level adjustment you could see some difference, but it was quite small.
AV1 is not the clear winner for video. Currently-existing encoders are worse than x265 for high-bitrate encodes.
AV1's advantage narrows to ~5% over H.265 at very high data rates, in the same way that MP3 at 320 kbps is competitive with AAC at 320 kbps. But AV1 is never worse than H.265 from a VMAF/PSNR perspective at any bitrate, and of course H.265 is heavily patent encumbered in comparison. https://chipsandcheese.com/p/codecs-for-the-4k-era-hevc-av1-...
>AV1's advantage narrows to ~5% over H.265 at very high data rates.... But AV1 is never worse than H.265 from a VMAF/PSNR perspective at any bitrate,
There are whole discussions about how modern codecs, and especially AV1, simply don't care about psy image quality. That's why most torrents are still using x265: AV1 simply doesn't match the quality offered by x265. Nor does the AOM camp care about it, since their primary usage is YouTube.
>in the same way that MP3 at 320 kbps is competitive with AAC at 320 kbps.
It is not. And never will be. MP3 has an inherent disadvantage: it needs a substantially higher bitrate for quite a lot of samples, even at 320kbps. We went through this war for 10 years at Hydrogenaudio with data to back it up. I don't know why in the past 2-3 years the topic has popped up once again.
MP3 is not better than AAC-LC in any shape or form, even at 25% higher bitrate. Just use AAC-LC, or specifically Apple's QuickTime AAC-LC encoder.
> There is a whole discussions that modern codec, or especially AV1 simply doesn't care about PSY image quality.
In early AV1 encoders, psychovisual tuning was minimal and so AV1 encodes often looked soft or "plastic-y". Today's AV1 encoders are really good at this when told to prioritize psy quality (SVT-AV1 with `--tune 3`, libaom with `--tune=psy`). I'd guess that there's still lots of headroom for improvements to AV1 encoding.
> And hence how most torrents are still using x265 because…
Today most torrents still use H.264, I assume because of its ubiquitous support and modest decode requirements. Over time, I'd expect H.265 (and then AV1) to become the dominant compressed format for video sharing. It seems like that community is pretty slow to adopt advancements — most lossy-compressed music <finger quotes>sharing</finger quotes> is still MP3, even though AAC is a far better (as you note!) and ubiquitous choice.
My point about MP3 vs. AAC was simply: As you reduce the amount of compression, the perceived quality advantages of better compressed media formats is reduced. My personal music library is AAC (not MP3), encoded from CD rips using afconvert.
>Today most torrents still use H.264
That's not what I'm seeing for anything recent. x265 seems to be the dominant codec now. There's still a lot of support for h.264, but it's fading.
svt-av1 has come a long way recently; I tried it a few weeks ago.
It still (maddeningly!) defaults to PSNR, but you can change that. There are some sources where I find it can now significantly improve over H.265 at higher data rates, and, while my testing was limited, I couldn't find any sources where H.265 clearly won, based on my mark-1 eyeball. This is in contrast to when I tried multiple av1 encoders 2-ish years ago and they, at best, matched H.265 at higher bitrates.
I don't care about VMAF or PSNR, I care about looking with my eyes. With x265 on veryslow and AV1 on preset 0/1, and the source being a UHD BD I was downscaling to 1080p, AV1 looked worse even while using a higher bitrate than x265. Current AV1 encoders have issues with small details and have issues with dark scenes. People are trying to fix them (see svt-av1-psy, being merged into SVT-AV1 itself) but the problems aren't fixed yet.
>see svt-av1-psy, being merged into SVT-AV1 itself
Part of it being merged for now.
It is unfortunate this narrative hasn't caught on. Actual quality over VMAF and PSNR. And we haven't had further quality improvement since x265.
I do get frustrated every time the topic of codecs comes up on HN. But then the other day I came to realise I've spent ~20 years on Doom9 and Hydrogenaudio, so I guess I've accumulated more knowledge than most.
Well, did your "eyes" care more about fidelity or appeal?
https://cloudinary.com/blog/what_to_focus_on_in_image_compre...
Yup, have had the same experience.
> How in the world do people store images / photos nowadays?
I had some high resolution graphic works in TIFF (BMP + LZW). To save space, I archived them using JPEG-2000 (lossless mode), using the J2k Photoshop plug-in ( https://www.fnord.com/ ). Saved tons of GBs. It has wide multi-platform support and is a recognized archival format, so its longevity is guaranteed for some time on our digital platforms. Recently explored using HEIF or even JPEG-XL for these but these formats still don't handle CMYK colour modes well.
>are nigh-unworkable on desktops, support is very spotty
I use .webp often and I don't understand this. At least on Windows 10 I can go to a .webp and see a preview and double-click and it opens in my image editor. Is it not like this elsewhere?
Try uploading one to any web service. Like imgur.
I recognize it as beating a dead horse now, but JPEG XL did what was needed to be actually adopted. AVIF has not been widely adopted given the difficulty of a leap to a new format in general and the computational cost of encoding AVIF specifically.
One of JPEG XL's best ideas was incorporating Brunsli, lossless recompression for existing JPEGs (like Dropbox's Lepton which I think might've been talked about earlier). It's not as much of a space win as a whole new format, but it's computationally cheap and much easier to just roll out today. There was even an idea of supporting it as a Content-Encoding, so a right-click and save would get you an OG .jpg avoiding the whole "what the heck is a WebP?" problem. (You might still be able to do something like this in a ServiceWorker, but capped at wasm speeds of course.) Combine it with improved JPEG encoders like mozjpeg and you're not in a terrible place. There's also work that could potentially be done with deblocking/debanding/deringing in decoders to stretch the old approach even further.
And JXL's other modes also had their advantages. VarDCT was still faster than libaom AVIF, and was reasonable in its own way (AVIFs look smoother, JXL tended more to preserve traces of low-contrast detail). There was a progressive mode, which made less sense in AVIF because it was a format for video keyframes first. The lossless mode was the evolution of FUIF and put up good numbers.
At this point I have no particular predictions. JPEG never stopped being usable despite a series of more technically sophisticated successors. (MP3 too, though its successors seemed to get better adoption.) Perhaps it means things continue not to change for a while, or at least that I needn't rush to move to $other_format or get left behind. Doesn't mean I don't complain about the situation in comments on the Internet, though.
> So what's left? I have a whole bunch of .HEIC photos and I'd really like if Windows Explorer didn't freeze for literal minutes when I open a folder with them.
Yea, how is this still the case in 2025?
My family has a mix of Apple and Samsung devices and they move/backup their pictures to their Windows machines whenever they run out of space but once they move them, they can't easily view or browse them.
I had to download a 3rd party app and teach them to view them from there.
Maybe they’re not using hardware acceleration? H265 in software is really slow.
I store Raw + PSD with edits/history + whatever edited output format(s) I used.
I see no reason not to use JPEG-XL for archival storage. It is (IMO) the best all-rounder of the current formats, and 20 years from now imagemagick will still be able to convert it to whatever you want.
> JPEG-XL seemed like the next logical step until Google took their toys and killed it despite already having the support in Chrome, which pretty much makes it dead in the water (don't you just love monopolies making decisions for you?)
I've done several tests where I lowered the quality settings (and thus, the resulting file size) of JPEG-XL and AVIF encoders over a variety of images. In almost every image, JPEG-XL subjective quality fell faster than AVIF, which seemed mostly OK for web use at similar file sizes. Due to that last fact, I concede that Chrome's choice to drop JPEG-XL support is correct. If things change (JPEG-XL becomes more efficient at low file sizes, gains Chrome support), I have lossless PNG originals to re-encode from.
At least there's JPEG-XL support in recent Windows 11 updates.
I've found that sometimes WebP with lossless compression (-lossless) results in smaller file sizes for graphics than JPEG-XL and sometimes it's the other way around.
I've done a few shootouts at various times in the last 10 years. I finally decided WebP was good for the web maybe two years ago, that is, I have 'set it or forget it' settings and get a good quality/size result consistently. (JPEG has the problem that you really need to turn the knob yourself since a quality level good for one image may not be good for another one)
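One way to automate that knob-turning is to binary-search the quality setting against a byte budget; a sketch with Pillow (the function name and budget are my own invention):

    import io
    from PIL import Image

    def encode_under_budget(img, max_bytes):
        # Find the highest JPEG quality whose output fits the budget.
        lo, hi, best = 1, 95, b""
        while lo <= hi:
            q = (lo + hi) // 2
            buf = io.BytesIO()
            img.save(buf, "JPEG", quality=q)
            if buf.tell() <= max_bytes:
                best, lo = buf.getvalue(), q + 1  # fits; try higher quality
            else:
                hi = q - 1  # too big; back off
        return best

    data = encode_under_budget(Image.open("photo.jpg"), 200_000)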
I don't like AVIF, at least not for photos I want to share. I think AVIF is great for "a huge splash image for a web page that nobody is going to look at closely" but if you want something that looks like a pro photo I don't think it's better than WebP. People point out this example as "AVIF is great"
https://jakearchibald.com/2020/avif-has-landed/demos/compare...
but I think it badly mangles the reflection on the left wing of the car and... it's those reflections that make sports cars look sexy. (I'll grant that the 'acceptable' JPEG has obvious artifacts whereas the 'acceptable' AVIF replaced a sexy reflection with a plausible but slightly dull replacement)
JPEG is old, and it works.
Images are sorted in folders, per year and some group description based on why they were taken, vacations, event, whatever.
Enable indexing on the folders, and usually there are no freezes to complain about.
For video it's not as easy as it takes way more compute and requires hardware support.
You can take any random device and it will be able to decode h264 at 4k. h265 not so much.
As for av1 - my Ryzen 5500GT released in 2024 does not support it.
I think the only cpus with av1 support right now, whether encode, decode or both, are tile-era Meteor/Lunar/Arrow Lake cpus from Intel.
Actually, it goes back to Tiger Lake (2020; 2021 for desktop). [0]
Addendum: AMD since RDNA2 (2020-2021-ish) [1], NVIDIA since 30 series (2020) [2], Apple since M3? (2023).
Note: GP's processor released in 2024 but is based on an architecture from 2020.
[0] https://en.wikipedia.org/wiki/Intel_Quick_Sync_Video#Hardwar...
[1] https://en.wikipedia.org/wiki/Video_Core_Next#Feature_set
[2] https://developer.nvidia.com/video-encode-and-decode-gpu-sup...
Apple M3 and newer CPUs and A17Pro and newer mobile CPUs also have hardware AV1 decode.
> HEIC is good, as long as you pinky promise to never ever leave Apple's ecosystem, ie HEIC sucks.
Not really true in my experience, I have no problems using it in Windows 11, Linux, or with my non-Apple non-Google cloud photos app.
The iPhone using it in an incredibly widespread way has made it a de facto standard.
If you're having trouble in Windows, I wonder if you're running Windows 11 or 10? Because 11 seems a lot better at supporting "modern things" considering that Microsoft has been neglecting Windows 10 for 3 years and is deprecating it this year.
My problem with HEIC is that if you convert it to another format, it looks different from the original, for reasons that I don't understand. I switched my iPhone back to JPEG to avoid that.
[dead]
RAW? Storage is becoming cheaper, why discard the originals?
When looking for a format to display HQ photos on my website I settled on a combination of AVIF + JPG. Most photos are AVIF, but if AVIF is too magical compared to JPG (like 3x-10x smaller) I use a larger JPG instead. "Magic" means that fine details are discarded.
WebP discards gradients (like sunset, night sky or water) even at the highest quality, so I consider it useless for photography.
not every storage is created equal. 1TB hdd is dirt cheap, 1TB of cloud storage is expensive af
just use jpegxl. works great on linux. Pressure software you use to use the proper formats
> HEIC is good,
It's not. Support is still surprisingly patchy, and it takes a second or so to decode and show the image even on a modern M* Mac. Compared to instant PNG.
> I have a whole bunch of .HEIC photos and I'd really like if Windows Explorer didn't freeze for literal minutes when I open a folder with them.
Indeed.
I recently reencoded my photography archive to webp. It's a static site hosted from S3. I was pretty happy with the size reduction.
HEIC for photos taken by my iPhone. Apple stuff seems to do a mostly ok job auto-converting to JPG when needed (I assume, since I haven’t manually converted one in ages).
And JPG for photos taken on a “real” camera (including scanned negatives). Sometimes RAW, but they’re pretty large so not often.
I found that if you plug an iPhone into a Windows PC and copy the photos off, it will convert to jpg. However it makes copying very slow, and the quality is worse, so I'd advise turning off the setting on the phone (I think it's called compatibility mode or similar)
Assuming you are asking about archiving: Use the original format it came in. If you're going to transcode it should be to something lossless like J2K or PNG.
>How in the world do people store images / photos nowadays?
With codecs built for that purpose, I hope. Intra-frame "formats" ripped out of video codecs are a misconception and should stay that way: a curiosity.
I have zero issues with macOS and Linux while using modern image formats. Don’t use Windows, I guess.
> How in the world do people store images / photos nowadays?
PNG where quality matters, JPG where size matters.
You’re right for a lot of scenarios, which is exactly what a standard is there to do: encapsulate the broad strokes.
> Alternatives like WebP might be good for browsers but are nigh-unworkable on desktops, support is very spotty.
Right again, and WebP is the enrichment that goes with the backend when dealing with the web. I wouldn’t knock it for not being compatible with local tools; it was designed for the web first and foremost, I think it’s in the name.
>Just as there is a clear winner for video - av1
What?? Maybe I'm too much in aarrrgh circles but it's all H.264 / 265...
>JPEG-XL seemed like the next logical step until Google took their toys and killed it despite already having the support in Chrome,
Why would I ever care about Chrome? I can't use adblockers on Chrome, which makes the internet even less usable than it currently is. I only start up chrome to bypass cross-origin restrictions when I need to monkey-patch javascript to download things websites try to keep me from downloading (or, like when I need to scrape from a Google website... javascript scraper bots seem to evade their bot detection perfectly, just finished downloading a few hundred gigabytes of magazines off of Google Books).
Seriously, fuck Chrome. We're less than 2 years away from things being as bad as they were in the IE6 years.
I think we're there, not 2 years away.
I have software that won't work quite right in Safari or Firefox through a VPN every single day. Maybe it's the VPN and maybe it's the browser but it doesn't matter. We're at IE levels it's just ever so slightly more subtle this time. I'm still using alternatives but it's a battle.
VPNs are layer 2... I suppose it could be resizing packets in such a way as to make it glitch out, but that just seems improbable.
Some of the warez sites I download torrents from have captchas and other javascripticles that only work on Chrome, but I've yet to see it with mainstream sites.
Fight the good fight.
The VPN could be on an IP address with a bad reputation that's getting soft-blocked by some stuff
> I can't use adblockers on Chrome
Why does this myth persist?
uBlock Origin Lite works perfectly fine on Chrome, with the new Manifest v3. Blocks basically all the ads uBlock Origin did previously, including YouTube. But it uses less resources so pages load even faster.
There's an argument that adblocking could theoretically become less effective in the future but we haven't seen any evidence of that yet.
So you can very much use adblockers on Chrome.
If uBO Lite is really better, why does uBO exist?
Because uBO Lite uses a newer Chrome function call (declarativeNetRequest) that didn't exist previously (original uBO was based on webRequest).
webRequest is slower because it has to evaluate JavaScript for each request (as well as the overhead of interprocess communication), instead of the blocking being done by compiled C++ code in the same process like declarativeNetRequest does.
uBO also has a bunch of extra features like zapping that the creator explicitly chose not to include in uBO Lite, in the interests of making the Lite version as fast and resource-light as possible. For zapping, there are other extensions you can install instead if you need that.
They're two different products with two different philosophies, based on two different underlying architectures. The older architecture has now gone away in Chrome, but the new one supports uBlock Origin Lite just fine.
I think you're overstating it a bit. There were definitely features that couldn't be implemented due to MV3 limitations rather than because the developer chose to leave them out.
https://github.com/uBlockOrigin/uBOL-home/wiki/Frequently-as...
Hm, I can't tell how serious these limitations are, but there are a lot, and this one stands out:
> replace=, can't modify the response body (full support is only possible with Firefox MV2)
Thanks, I'll try it out then.
We keep hearing different variants of "webp should take over because now it has good browser support". But that is not nearly enough for an image format to reach widespread adoption.
People want to be able to open the images anywhere (what about viewing photos on a smart TV? or an old tablet? what about digital picture frames?). They want to edit the images in any program (what about this or that FOSS editor? what about the many people stuck on pre-subscription Photoshop versions?).
They also want to ensure far future access to their precious photos, when formats like JPEG2000 or even WebP might be long gone. I mean, webp was made by Google and we know how many of their heavily promoted creations are dead already...
It also doesn't help that most people's experience with webp is force-recompressed versions of images that were originally jpeg. With relatively low quality settings.
I'm not sure people consciously make that association, but you know which association they absolutely do make? The one where Google Images started using WebP at the same time as it was being locked down. At the time, ecosystem support was practically nonexistent, so it functioned as "soft DRM", and among people who know the term "webp" at all, that's by far the #1 association.
TIL webp is more than just potato shots that I have to delete off my hard drive once in a while
I don't like working with WebP, especially since Chrome pushes it on everything while Google's other tools, like Slides, don't support it. Super annoying, especially since Google developed it.
JPEG XL is better than webp
This reminds me of Rust: even if Rust is widespread in 10 or 20 years, it will not really displace C++.
Although I need an engineering explanation as to why COBOL is still alive after all these years, because no tech can live forever.
Can’t live forever?
Latin is still going strong, and so are water pipes (the oldest being several millennia old).
Hard to predict which innovations remain resilient. The longer they stick around, the more "Lindy-proof" they are.
The explanation for COBOL is not an engineering one, but an economics one: “It’s cheaper to train a programmer to use COBOL than it is to rewrite the codebase in <language>.” (Perhaps LLMs might change the economics here.)
"If it ain't broke, don't fix it" also applies here.
There is also no guarantee that whatever new language you port the COBOL code to won't also be seen as obsolete in a few years. Software developers are a fickle bunch.
> COBOL
Was popular in the '60s in fintech, so banks, ATMs, and related systems went digital using it.
Those systems are still running.
Although you can do a bit of Ship-of-Theseus philosophy on those COBOL systems. After every software component has been rewritten multiple times and every piece of hardware has died and been replaced, all that's left is the decision to stick with COBOL, not the fact that it's a legacy system built in the '60s.
Anywhere? I would be super happy if I could open them all in all contexts on my desktop computer.
Last I checked you couldn't even upload .webp images to GitHub or Telegram. Well, for GitHub you can cheat by renaming it to .png and GitHub's content detection will make it work regardless, but meh.
> But that is not nearly enough for an image format to reach widespread adoption
I use WEBP extensively but WEBP has a major flaw: it can do both lossy and lossless.
That's the most boneheaded thing to ever do in an image compression format. I don't understand the level of confusion and cluelessness that had to happen for such a dumb choice to have been made.
I've got an entire process around determining and classifying WEBP files depending on whether they're lossless or lossy. In the past we had JPG or PNG: life was good. Simple. Easy.
Then someone decided it made sense to cram both lossy and lossless under the same umbrella.
> They also want to ensure far future access to their precious photos, when formats like JPEG2000 or even WebP might be long gone.
That, however, shall never be an issue. You can still open, today, old obscure formats from the DOS days. Even custom ones used by only a select few programs back then.
It's not as if we didn't have emulators, converters, etc. and it's all open source.
Opening old WEBP files in the future shall never ever be a problem.
Determining if it's a lossy or lossless WEBP for non-technical users, however... ; )
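For what it's worth, that classification can be automated: the RIFF container header tells you which flavor you have ("VP8 " = lossy, "VP8L" = lossless, "VP8X" = extended, which needs a look at the inner chunks). A rough sketch:

    def webp_flavor(path: str) -> str:
        """Classify a WebP file as 'lossy', 'lossless', or 'unknown'."""
        with open(path, "rb") as f:
            header = f.read(16)
            if header[:4] != b"RIFF" or header[8:12] != b"WEBP":
                raise ValueError("not a WebP file")
            fourcc = header[12:16]
            if fourcc == b"VP8 ":   # note the trailing space in the fourcc
                return "lossy"
            if fourcc == b"VP8L":
                return "lossless"
            if fourcc == b"VP8X":
                # extended container: crude byte scan for the image chunk;
                # a robust version would walk the chunk headers properly
                rest = f.read()
                if b"VP8L" in rest:
                    return "lossless"
                if b"VP8 " in rest:
                    return "lossy"
        return "unknown"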
> In the past we had JPG or PNG: life was good. Simple. Easy.
Except if you wanted a compressed image with transparency for the web, in which case you had to sacrifice one of those two things or use a different format besides those two.
> Then someone decided it made sense to cram both lossy and lossless under the same umbrella.
> I don't understand the level of confusion and cluelessness that had to happen for such a dumb choice to have been made.
Besides many end users not caring which one it is as long as they recognize the file type and can open it, I found a few interesting reasons for having both in the same format from a simple web search. One was the possibility of having a lossy background layer with a lossless foreground layer (particularly in an animation).
JPEG XL also decided to support either lossless or lossy compression, so it wasn't just WebP that decided it made sense.
> That, however, shall never be an issue. You can still open, today, old obscure formats from the DOS days. Even custom ones used by only a select few programs back then.
Certainly not true.
One example: I have many thousands of photos from my Sony digital camera that cannot be opened by any current operating system without installing third-party software.
I'm lucky that the camera also saved JPEG versions as it went, so I'm able to view the JPEG thumbnails, then drag the Sony version into my photo editor of choice.
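In case it's useful: the third-party route can at least be scripted. Assuming the files are Sony .ARW raws, rawpy (Python bindings for LibRaw) can decode them; a sketch with a hypothetical filename:

    import rawpy              # pip install rawpy
    import imageio.v3 as iio  # pip install imageio

    # hypothetical filename; .ARW is Sony's raw extension
    with rawpy.imread("DSC00001.ARW") as raw:
        rgb = raw.postprocess()  # demosaic with LibRaw's default settings

    iio.imwrite("DSC00001.jpg", rgb)  # hand off to a viewer-friendly format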
I am pretty sure webp is supported everywhere nowadays. I think it is just inertia.
It isn't universal: my phone's gallery doesn't support WebP at all by default, and the Windows gallery only supports non-animated WebP, from what I can tell.
My smartphone camera does not output WebP, and neither does my professional Nikon.
As long as these two major sources of pictures stay on JPEG, I will too, for admittedly subjective and completely debatable reasons.
To me, what a camera outputs is irrelevant. On a pro camera you generally shoot raw as the default format. What matters is the formats you can export to and manipulate afterwards for publication and exchange.
> To me
Not everyone is you.
When I download a photo to send to my family, WebP always causes some kind of problem, so I end up screenshotting it.
Always, always, always -- and it's often multiple problems: the filesystem preview generators don't support it, or don't support it over a network; or the social media the other person uses doesn't support it (often egregiously so, where an unrecognized drop bubbles up to browser scope and blasts away the page you were on); or there's a weird problem with a site/app that is supposed to support it, such as the image turning into a black box.
Support for WebP is still so rough that I have to wonder what one's ecosystem must look like for it to be seamless. Maybe if you are a Googler and your phone/computer/browser run entirely Google software, and ditto for your friends and your friends' friends and your spouse? Maybe?
To my knowledge, not even every Google product supports it, but I have not verified support myself.
I blame Google for pushing it, but I also blame every third-party product for not supporting it, when it is mostly free to do so (I'm sure all of them internally use libraries to decode images instead of rolling their own code).
I think it works better in a totally open-source ecosystem. I can share WebP pics with my daughters via XMPP, for example, regardless of whether I am on my smartphone (Conversations) or my desktop (Gajim).
> I mean, webp was made by Google and we know how many of their heavily promoted creations are dead already...
I don't understand this argument. WebP is an algorithm, not a service. You cannot kill it once it's published.
JPEG XL is similarly an algorithm that's been published, but Google removed it from their browser and Mozilla followed suit, which effectively killed its usefulness as a web-friendly (and, more generally, usable-anywhere) format.
Some nuance there — it's not dead yet: https://github.com/mozilla/standards-positions/pull/1064
> the reference decoder, which weighs in at more than 100,000 lines of multithreaded C++.
Wow! I have never written a compression codec implementation, but that's kind of staggering.
I do not, sorry.
Fair enough. What I meant by this is that, in the end, most software that decides to add webp support is doing it because of the huge push by Google to do so. But if they suddenly change that push to something else then webp might find itself growing more irrelevant.
I didn't know webp was pushed by Google. They should publicize that fact more so people know to avoid the format entirely.
What Google pushes is in their self-interest and has nothing to do with the good of the unwashed masses.
WebP is basically a single intra-frame from VP8, the video codec behind WebM, which Google literally developed to avoid paying licensing costs for H.264. For which they had great incentive.
WebP is to WebM what HEIC is to HEVC.
You can argue that using free codecs is a collateral benefit here, even though Google did it for selfish reasons. It is not detrimental to the public or the internet.
Beyond the compression (which is amazing), JPEG is also extremely fast when implemented well. I'm not aware of any other image format that can encode at 60fps+ @ 1080p on a single CPU core. Only mild SIMD usage is required to achieve this. With dedicated hardware, the encode/decode cost quickly goes to zero.
I struggle to understand the justification for other lossy image formats as our networks continue to get faster. From a computation standpoint, it is really hard to beat JPEG. I don't know if extra steps like intra-block spatial prediction are really worth it when we are now getting 100 Mbps to our smartphones on a typical day.
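A crude way to sanity-check the encode-speed claim with Pillow (whatever libjpeg build it wraps does the work; numbers vary wildly by machine, so treat this as a sketch, not a benchmark):

    import time
    from io import BytesIO

    import numpy as np
    from PIL import Image

    # synthetic 1080p frame (noise is a pathological case for compression,
    # but encode time is roughly content-independent for baseline JPEG)
    frame = Image.fromarray(
        np.random.randint(0, 256, (1080, 1920, 3), dtype=np.uint8)
    )

    N = 120
    start = time.perf_counter()
    for _ in range(N):
        frame.save(BytesIO(), "JPEG", quality=85)
    elapsed = time.perf_counter() - start
    print(f"~{N / elapsed:.0f} 1080p JPEG encodes per second on one core")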
https://news.ycombinator.com/item?id=44298656
You might be getting 100 Mbps to your smartphone; many people – yes, even within the United States – struggle to attain a quarter of that.
What is the likelihood of experiencing precisely marginal network conditions wherein webp improves the user experience so dramatically over jpeg that the user is able to notice?
If jpeg is loading like ass, webp probably isn't going to arrive much faster.
I'm sorry, I misunderstood your doubt of the usefulness of other lossy formats as criticism of using lossy formats in general in the face of higher bandwidth. Reading too fast, never mind me... :)
If you have slow internet on your smartphone, chances are you also have a slow smartphone, and therefore decoding performance matters. It may also save you a bit of battery life for the same reason, which can be important in places with little coverage.
You have to find a balance, and unless (still) pictures are at the center of what you are doing, it is typically only a fraction of the bandwidth (and a fraction of the processing power too).
We are not talking about 100 Mbps; we downloaded JPEGs over dial-up connections, you know. You don't even need to get into the Mbps range unless you are streaming MJPEG (and why would you do that?).
25Mbps is extremely fast in relation to the benefits when browsing the web of better image compression options than JPEG.
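Back-of-envelope numbers, assuming a hypothetical 500 KB JPEG and a WebP roughly 30% smaller:

    def seconds_to_load(size_bytes: float, mbps: float) -> float:
        """Transfer time, ignoring latency, handshakes, and congestion."""
        return size_bytes * 8 / (mbps * 1_000_000)

    jpeg_bytes, webp_bytes = 500_000, 350_000  # hypothetical sizes
    for mbps in (0.056, 5, 25):  # dial-up, slow mobile, "extremely fast"
        print(f"{mbps:>6} Mbps: JPEG {seconds_to_load(jpeg_bytes, mbps):7.2f}s, "
              f"WebP {seconds_to_load(webp_bytes, mbps):7.2f}s")

At 25 Mbps both land well under a quarter of a second; over dial-up both are painful. The formats only meaningfully diverge in a fairly narrow band in between.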
> I struggle to understand the justification for other lossy image formats as our networks continue to get faster.
Because Google's PageSpeed and Lighthouse both tell people to use WebP, and a large percentage of devs will do anything Google say in the hopes of winning better SERP placement:
- https://web.dev/articles/serve-images-webp
- https://developer.chrome.com/docs/lighthouse/performance/use...
That's why I am confident LLMs won't change as much as some may think: after 20+ years of search engines, some people still can't be bothered to do a simple search. (Either that or you're trolling; I can't decide, I have to say.) Hence, we can wait another 20 years and some will still not use LLMs for everyday questions.
To answer your (false?) question, there's a long list of benefits, but I'd say HDR and storage efficiency are the two big ones. The storage savings in particular are massive, especially with large images.
Exactly! It’s like asking why we still use wheels when hovercrafts exist.
If humans are still around in a thousand years, they'll be using JPEGs, and they'll still be using them a thousand years after that. When things work, they have a pernicious tendency to stick around.
Wheels continue to support a load without power.
Wheels are vastly superior to hover technologies in the crucial areas of steering and controlled braking. (For uncontrolled braking, you just cut the power to your hover fans and lift the skirts...)
It turns out to be remarkably difficult to get a hovercraft to go up an incline...
Wheels are both suspension and traction in one system.
There's no particular physical advantage to JPEG over the others mentioned; it's just currently ubiquitous.
Can JPEG do 3D somehow (I'm thinking VR/AR)? DVDs lasted well, until the medium itself moved to cheap NAND flash and then various SSD technologies.
When/if simple screens get usurped then we'll likely move on from JPEG.
I'm sure you were being a little flippant but your last sentence shows good insight. Someone said "we just need it to work" to me the other day and the "if it works there will be little impetus to improve it"-flag went off in my brain.
Thanks, that's a great insight!
Idk about 3D, but I'll assume someone probably will tape something together out of necessity if they haven't already.
…and yes, very flippant! But not without good reason. If we are to extrapolate: the popularity of JPEG, love it or hate it, will invariably necessitate its continued compatibility, which feeds my previous statement. That compatibility will in turn create plausible scenarios where future developers, out of laziness, ignorance, or plain conformity to norms, keep choosing it, perpetuating the cycle. Short of a mass-extinction-level event brought about by widespread adoption of some new technology like what you describe, I don't see it going away anytime soon. Not to say it couldn't happen; I just feel it's highly improbable because of the human factors involved.
That JPEG gets so many complaints is, I feel, down to two things: one, its ubiquity, and two, that we actually see it! Some similar situations that don't get nearly as much attention but are far more pervasive are TCP/IP, bash, ntpd, ad nauseam. All old, pervasive protocols so embedded as to be taken for granted, and also invisible.
I'll leave with an engineering truism that I feel should be more widely adhered to in software development, especially by UI designers: if it ain't broke, don't fix it!
depends what you mean by 3d. jpeg-xl does let you add arbitrary channels, so you could add a depth channel, but it's not going to do a good job for full 3d (e.g. light field/point cloud).
one place I think JXL will really shine is PBR and game textures. for cases like that, it's very common to have color + transparency + bump map + normal map, and potentially even more. bundling all of those into a single file allows for way better compression
JP3D can do 3D, but it is not well supported. It is an extension to the JPEG 2000 specification, IIRC.
Besides the awful wheel comparison: there are dozens of formats that worked and stuck around until they got replaced, so this tells us nothing on such a huge timescale.
transparency? HDR? proper support for lossless? there are many things lacking in JPEG