Why JPEGs still rule the web (2024)

2025-06-17, spectrum.ieee.org

What makes JPEG files so special? Discover the technical magic that keeps them at the forefront of digital photography.

A version of this post originally appeared on Tedium, Ernie Smith’s newsletter, which hunts for the end of the long tail.

For roughly three decades, the JPEG has been the World Wide Web’s primary image format. But it wasn’t the one the Web started with. In fact, the first mainstream graphical browser, NCSA Mosaic, didn’t initially support inline JPEG files—just inline GIFs, along with a couple of other formats forgotten to history. However, the JPEG had many advantages over the format it quickly usurped.


Despite not appearing together right away—inline JPEG support first arrived in Netscape in 1995, three years after the image standard was officially published—the JPEG and the web browser fit together naturally. JPEG files degraded more gracefully than GIFs, retaining more of the picture’s initial form—and that allowed the format to scale to greater levels of success. While it wasn’t capable of animation, it progressively expanded from something a modem could pokily render to a format that was good enough for high-end professional photography.

For the internet’s purposes, the degradation was the important part. But it wasn’t the only thing that made the JPEG immensely valuable to the digital world. An essential part was that it was a documented standard built by numerous stakeholders.

The GIF was a de facto standard. The JPEG was an actual one

How important is it that JPEG was a standard? Let me tell you a story.

During a 2013 New York Times interview conducted just before he received an award honoring his creation, GIF creator Steve Wilhite stepped into a debate he unwittingly created. Simply put, nobody knew how to pronounce the acronym for the image format he had fostered, the Graphics Interchange Format. He used the moment to attempt to set the record straight—it was pronounced like the peanut butter brand: “It is a soft ‘G,’ pronounced ‘jif.’ End of story,” he said.

I posted a quote from Wilhite on my popular Tumblr around that time, a period when the social media site was the center of the GIF universe. And soon afterward, my post got thousands of reblogs—nearly all of them disagreeing with Wilhite. Soon, Wilhite’s quote became a meme.

The situation illustrates how Wilhite, who died in 2022, did not develop his format by committee. He could say it sounded like “JIF” because he built it himself. He was handed the project as a CompuServe employee in 1987; he produced the format, and that was that. The initial document describing how it works? Dead simple. 38 years later, we’re still using the GIF—but it never rose to the same prevalence as the JPEG.

The JPEG, which formally emerged about five years later, was very much not that situation. Far from it, in fact—it’s the difference between a de facto standard and an actual one. And that proved essential to its eventual ubiquity.

[Image: A full-resolution photo of a sunlit pine forest with a narrow trail winding through the trees and grassy undergrowth.] We’re going to degrade the quality of this image throughout this article. At its full image size, it’s 13.7 megabytes. Irina Iriser

How the JPEG format came to life

Built with input from dozens of stakeholders, the Joint Photographic Experts Group ultimately aimed to create a format that fit everyone’s needs. (Reflecting its committee-led roots, there would be no confusion about the format’s name—an acronym of the organization that designed it.) And when the format was finally unleashed on the world, it was the subject of a more than 600-page book.

JPEG: Still Image Data Compression Standard, written by IBM employees and JPEG organization stakeholders William B. Pennebaker and Joan L. Mitchell, describes a landscape of multimedia imagery held back by the lack of a way to balance the need for photorealistic images against the need for immediacy. Standardization, they believed, could fix this.

“The problem was not so much the lack of algorithms for image compression (as there is a long history of technical work in this area),” the authors wrote, “but, rather, the lack of a standard algorithm—one which would allow an interchange of images between diverse applications.”

And they were absolutely right. For more than 30 years, JPEG has made high-quality, high-resolution photography accessible in operating systems far and wide. Although we no longer need to compress JPEGs to within an inch of their life, having that capability helped enable the modern internet.

As the book notes, Mitchell and Pennebaker were given IBM’s support to pursue this research and work with the JPEG committee, and that support led them to develop many of the JPEG format’s foundational patents. As described in patents Mitchell and Pennebaker filed in 1988, IBM and other members of the JPEG standards committee, such as AT&T and Canon, were developing ways to use compression to make high-quality images easier to deliver in constrained settings.

Each member brought their own needs to the process. Canon, obviously, was more focused on printers and photography, while AT&T’s interests were tied to data transmission. Together, the companies left behind a standard that has stood the test of time.

All this means, funnily enough, that the first place that a program capable of using JPEG compression appeared was not MacOS or Windows, but OS/2—a fascinating-but-failed graphical operating system created by Pennebaker and Mitchell’s employer, IBM. As early as 1990, OS/2 supported the format through the OS/2 Image Support application.

[Image: A nearly identical photo of the sunlit pine forest.] At 50 percent of its initial quality, the image is down to about 2.6 MB. By dropping half of the image’s quality, we brought it down to one-fifth of the original file size. Original image: Irina Iriser

What a JPEG does when you heavily compress it

The thing that differentiates a JPEG file from a PNG or a GIF is how the data degrades as you compress it. The goal for a JPEG image is to still look like a photo when all is said and done, even if some compression is necessary to make it all work at a reasonable size. That way, you can display something that looks close to the original image in fewer bytes.

Or, as Pennebaker and Mitchell put it, “the most effective compression is achieved by approximating the original image (rather than reproducing it exactly).”

Central to this is a compression process called the discrete cosine transform (DCT), a lossy form of compression used heavily in all sorts of compressed formats, most notably in digital audio and signal processing. Essentially, it delivers a lower-quality product by removing details while still keeping the heart of the original through approximation. The more of the transformed detail you throw away, the more compressed the final result.

The algorithm, developed by researchers in the 1970s, essentially takes a grid of data and treats it as if you’re controlling its frequency with a knob. The data rate is controlled like water from a faucet: The more data you want, the higher the setting. DCT allows a trickle of data to still come out in highly compressed situations, even if it means a slightly compromised result. In other words, you may not keep all the data when you compress it, but DCT allows you to keep the heart of it.
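
To make that concrete, here is a minimal sketch in Python of what happens to a single 8×8 block: transform it into frequency coefficients, round them to a coarse step, and transform back. It assumes the NumPy and SciPy libraries; the sample block and the single quantization step are made up for illustration, whereas a real JPEG encoder uses standardized 8×8 quantization tables.

    import numpy as np
    from scipy.fft import dctn, idctn

    # A stand-in 8x8 block of pixel values (a real encoder works on tiles of an image).
    block = np.arange(64, dtype=float).reshape(8, 8)

    # Forward DCT: express the block as frequency coefficients.
    coeffs = dctn(block, norm="ortho")

    # "Turn the knob": the bigger the step, the more detail gets rounded away.
    step = 20.0
    quantized = np.round(coeffs / step)

    # Inverse DCT: the result is close to, but not exactly, the original block.
    approx = idctn(quantized * step, norm="ortho")
    print(float(np.abs(block - approx).max()))  # worst-case difference from the original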

(See this video for a more technical but still somewhat easy-to-follow description of DCT.)

DCT is everywhere. If you have ever seen a streaming video or an online radio stream that degraded in quality because your bandwidth suddenly declined, you’ve witnessed DCT being utilized in real time.

A JPEG file doesn’t have to leverage the DCT with just one method, as JPEG: Still Image Data Compression Standard explains:

The JPEG standard describes a family of large image compression techniques, rather than a single compression technique. It provides a “tool kit” of compression techniques from which applications can select elements that satisfy their particular requirements.

The toolkit has four modes:

  • Sequential DCT, which displays the compressed image in order, like a window shade slowly being rolled down
  • Progressive DCT, which displays the full image in the lowest-resolution format, then adds detail as more information rolls in
  • Sequential lossless, which uses the window-shade approach but compresses the image without discarding any data
  • Hierarchical mode, which combines the prior three modes—for example, starting with a low-resolution progressive pass, refining it with DCT-coded detail, and ending with a lossless final result

At the time the JPEG was being created, modems were extremely common. That meant images loaded slowly, making Progressive DCT the most fitting format for the early internet. Over time, the progressive DCT mode has become less common, as many computers can simply load the sequential DCT in one fell swoop.
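
The sequential-versus-progressive choice is just an encoder option. Here’s a minimal sketch using the Pillow library in Python; the library choice and the file name forest.jpg are illustrative assumptions, not anything the standard prescribes.

    from PIL import Image

    img = Image.open("forest.jpg")

    # Sequential (baseline) JPEG: decodes top to bottom, like a window shade.
    img.save("forest_baseline.jpg", "JPEG", quality=75)

    # Progressive JPEG: stored as several scans, so a partial download shows a
    # coarse version of the whole picture that sharpens as more data arrives.
    img.save("forest_progressive.jpg", "JPEG", quality=75, progressive=True)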

[Image: The same photo of the sunlit pine forest, with very slight degradation visible.] That same forest, saved at 5 percent quality, is down to about 419 kilobytes. Original image: Irina Iriser

When an image is compressed with DCT, the change tends to be less noticeable in busier, more textured areas of the picture, like hair or foliage. Those areas are harder to compress, which means they keep their integrity longer. It tends to be more noticeable, however, with solid colors or in areas where the image sharply changes from one color to another—like text on a page. Ever screenshot a social media post, only for it to look noisy? Congratulations, you just made a JPEG file.
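
If you want to reproduce this article’s degradation experiment yourself, a rough sketch with Pillow looks like the following. Exact file sizes will differ from the figures quoted in the captions, since they depend on the encoder and its settings, and forest.jpg is again a stand-in name for your own photo.

    import os
    from PIL import Image

    img = Image.open("forest.jpg")
    for q in (50, 5, 1):
        out = f"forest_q{q}.jpg"
        img.save(out, "JPEG", quality=q)  # lower quality means coarser quantization
        print(f"quality {q}: {os.path.getsize(out) / 1024:.0f} KB")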

Other formats, like PNG, do better with text, because their compression is lossless. (Side note: PNG’s compression format, DEFLATE, was designed by Phil Katz, who also created the ZIP format. The PNG format uses it in part because it was a license-free compression format. So it turns out the brilliant coder with the sad life story improved the internet in multiple ways before his untimely passing.)
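
As a quick demonstration that DEFLATE gives back every byte exactly, here’s a sketch using Python’s built-in zlib module, which implements DEFLATE; the sample data is arbitrary.

    import zlib

    data = b"flat colors and sharp-edged text compress well and come back exactly " * 200
    compressed = zlib.compress(data, level=9)

    # Unlike JPEG's lossy path, every byte survives the round trip.
    assert zlib.decompress(compressed) == data
    print(len(data), "->", len(compressed), "bytes")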

In many ways, the JPEG is one tool in our image-making toolkit. Despite its age and maturity, it remains one of our best options for sharing photos on the internet. But it is not a tool for every setting—despite the fact that, like a wrench sometimes used as a hammer, we often leverage it that way.

Forgent Networks claimed to own the JPEG’s defining algorithm

The JPEG format gained popularity in the ’90s for reasons beyond the quality of the format. Patents also played a role: Starting in 1994, the tech company Unisys attempted to bill individual users who relied on GIF files, which depended on the LZW compression algorithm covered by a patent the company owned. This made the free-to-use JPEG more popular. (This situation also led to the creation of the patent-free PNG format.)

While the JPEG was standards-based, it could still have faced the same fate as the GIF, thanks to the quirks of the patent system. A few years before the file format came to life, a pair of Compression Labs employees filed a patent application that dealt with the compression of motion graphics. By the time anyone noticed its similarity to JPEG compression, the format was ubiquitous.

[Image: The same photo with more noticeable color degradation; areas with previously subtle color gradients now appear more like blocks of color.] Our forest, saved at 1 percent quality. This image is only about 239 KB in size, yet it’s still easily recognizable as the same photo. That’s the power of the JPEG. Original image: Irina Iriser

Then in 1997, a company named Forgent Networks acquired Compression Labs. The company eventually spotted the patent and began filing lawsuits over it, a series of events it saw as a stroke of good luck.

“The patent, in some respects, is a lottery ticket,” Forgent Chief Financial Officer Jay Peterson told CNET in 2005. “If you told me five years ago that ‘You have the patent for JPEG,’ I wouldn’t have believed it.”

While Forgent’s claim of ownership of the JPEG compression algorithm was tenuous, it ultimately saw more success with its legal battles than Unisys did. The company earned more than $100 million from digital camera makers before the patent finally ran out of steam around 2007. The company also attempted to extract licensing fees from the PC industry. Eventually, Forgent agreed to a modest $8 million settlement.

As the company took an increasingly aggressive approach to its acquired patent, it began to lose battles both in the court of public opinion and in actual courtrooms. Critics pounced on examples of prior art, while courts limited the patent’s use to motion-based uses like video.

By 2007, Forgent’s compression patent had expired—and its litigation-heavy approach to business went away with it. That year, the company became Asure Software, which now specializes in payroll and HR solutions. Talk about a reboot.

Why the JPEG won’t die

The JPEG file format has served us well, and it’s been difficult to knock it from its perch. The JPEG 2000 format, for example, was intended to supplant it by offering more lossless options and better performance. JPEG 2000 is widely used by the Library of Congress and specialized sites like the Internet Archive; however, it is less popular as an end-user format.

[Animated GIF: the forest image, starting at full resolution and progressing through increasingly degraded versions.] See the forest JPEG degrade from its full resolution to 1 percent quality in this GIF. Original image: Irina Iriser

Other image technologies have had somewhat more luck getting past the JPEG format. The Google-supported WebP is popular with website developers (and controversial with end users). Meanwhile, the formats AVIF and HEIC, each developed by standards bodies, have largely outpaced both JPEG and JPEG 2000.

Still, the JPEG will be difficult to kill at this juncture. These days, the format is similar to MP3 or ZIP files—two legacy formats too popular and widely used to kill. Other formats that compress the files better and do the same things more efficiently are out there, but it’s difficult to topple a format with a 30-year head start.

Shaking off the JPEG is easier said than done. I think most people will be fine to keep it around.

Ernie Smith is the editor of Tedium, a long-running newsletter that hunts for the end of the long tail.




Comments

  • By imageformatssux 2025-06-1715:4934 reply

    How in the world do people store images / photos nowadays?

    Just as there is a clear winner for video - av1 - there seems to be nothing in the way of "this is clearly the future, at least for the next few years" when it comes to encoding images.

    JPEG is... old, and it shows. The filesizes are a bit bloated, which isn't really a huge problem with modern storage, but the quality isn't great.

    JPEG-XL seemed like the next logical step until Google took their toys and killed it despite already having the support in Chrome, which pretty much makes it dead in the water (don't you just love monopolies making decisions for you?)

    HEIC is good, as long as you pinky promise to never ever leave Apple's ecosystem, ie HEIC sucks.

    AVIF seems computationally expensive and the support is pretty spotty - 8bit yuv420 might work, but 10b or yuv444 often doesn't. Windows 10 also chokes pretty hard on it.

    Alternatives like WebP might be good for browsers but are nigh-unworkable on desktops, support is very spotty.

    PNG is cheap and support is ubiquitous but filesizes become sky-high very quick.

    So what's left? I have a whole bunch of .HEIC photos and I'd really like if Windows Explorer didn't freeze for literal minutes when I open a folder with them. Is jpeg still the only good option? Or is encoding everything in jpeg-xl or avif + praying things get better in the future a reasonable bet?

    • By OneDeuxTriSeiGo 2025-06-1720:551 reply

      > JPEG-XL seemed like the next logical step until Google took their toys and killed it despite already having the support in Chrome, which pretty much makes it dead in the water (don't you just love monopolies making decisions for you?)

      It's worth noting that Firefox is willing to adopt JPEG-XL[1] as soon as the rust implementation[2] is mature. And that rust impl is a direct port from the reference C++ implementation[3]. Mac OS and Safari already support JPEG-XL[4]. And recently Windows picked up JPEG-XL support[5]. The only blockers at this point are Firefox, Chromium, and Android. If/when Firefox adopts JPEG-XL, we'll probably see google follow suit, if only out of pressure from downstream Chromium platforms wanting to adopt it to maintain parity.

      So really if you want to see JPEG-XL get adopted, go throw some engineering hours at the rust implementation [2] to help get it up to feature parity with the reference impl.

      -----

      1. https://github.com/mozilla/standards-positions/pull/1064

      2. https://github.com/libjxl/jxl-rs

      3. https://github.com/libjxl/libjxl

      4. https://www.theregister.com/2023/06/07/apple_safari_jpeg_xl/

      5. https://www.windowslatest.com/2025/03/05/turn-on-jpeg-xl-jxl...

      • By Liquix 2025-06-1721:283 reply

        g**gle is hellbent on killing JPEG-XL support in favor of WebP. assuming they'll capitulate to downstream pressure is a stretch. this article [0] sums it up nicely:

        What this [removal of support for JPEG-XL in Chromium] really translates to is, “We’ve created WebP, a competing standard, and want to kill anything that might genuinely compete with it”. This would also partly explain why they adopted AVIF but not JPEG XL. AVIF wasn’t superior in every way and, as such, didn’t threaten to dethrone WebP.

        [0] https://vale.rocks/posts/jpeg-xl-and-googles-war-against-it

        • By OneDeuxTriSeiGo 2025-06-1722:37

          I'm not assuming they capitulate under just pressure. Rather I'm assuming they'll capitulate if a majority of or even all of the big third party chromium browsers push for adding it to mainline chromium.

          This is less about blind pressure and more about the risk that google becomes seen as an untrustworthy custodian of chromium and that downstreams start supporting an alternate upstream outside of google's control.

          Jxl is certainly a hill that google seems intent to stand on but I doubt it's one they'd choose to die on. Doubly so given the ammo it'd give in the ongoing chrome anti-trust lawsuits.

        • By frollogaston 2025-06-1722:521 reply

          How is Google so intent on webp winning? They don't even support it in their own products besides Chrome.

        • By arp242 2025-06-1813:20

          The risks and downsides of exposing an image decoder to the entire web are very real, especially a relatively new/untested one written in a language like C++. There's been vulnerabilities in pretty much every other image decoder and I fully expect jpeg-xl to be no different. You can't just brush that aside. Hell, the article doesn't even acknowledge it. Google has no real stake in webp vs. jpeg-xl either. You may disagree with the decision, but this kind of stuff doesn't make much sense.

    • By throw0101c 2025-06-180:372 reply

      > HEIC is good, as long as you pinky promise to never ever leave Apple's ecosystem, ie HEIC sucks.

      HEIC was developed by the MPEG folks and is an ISO standard, ISO/IEC 23008-12:2022:

      * https://www.iso.org/standard/83650.html

      * https://en.wikipedia.org/wiki/High_Efficiency_Image_File_For...

      An HEIC image is generally a still frame from ITU-T H.265† (HEVC):

      * https://www.geeky-gadgets.com/heif-avc-h-264-h-265-heic-and-...

      OS support includes Windows 10 v1083, Android 10+, Ubuntu 20.04, Debian 10, Fedora 36. Lots of cameras and smartphones support it as well.

      There's nothing Apple-specific about it. Apple went through the process of licensing H.265, so they got HEIC 'for free' and use it as the default image format because, unlike JPEG, it supports HDR, >8-bit colour, etc.

      †Much as WebP was essentially an image/frame from a VP8 video.

      • By jiggawatts 2025-06-188:373 reply

        > OS support includes Windows 10 v1083

        Ba-ha-ha... ha-ha... no.

        Support is virtually non-existent. Every year or so, I try to use my Windows PC to convert a RAW photo taken with a high-end Nikon mirrorless camera to a proper HDR photo (in any format) and send it to my friends and family that use iDevices.

        This has been literally impossible for the last decade, and will remain impossible until the heat death of the universe.

        Read-only support is totally broken in a wide range of apps, including Microsoft-only apps. There are many Windows imaging APIs, and I would be very surprised if more than one gained HEIC support. Which is probably broken.

        Microsoft will never support an Apple format, and vice versa.

        Every single new photo or video format in the last 25 years has been pushed by one megacorp, and adoption outside of their own ecosystem is close to zero.

        JPEG-XL is the only non-megacorp format that is any good and got multi-vendor traction, which then turned into "sliding backwards on oiled ice". (Google removed support from Chromium, which is the end of that sad story.)

        • By kllrnohj 2025-06-1923:55

          > Every year or so, I try to use my Windows PC to convert a RAW photo taken with a high-end Nikon mirrorless camera to a proper HDR photo (in any format) and send it to my friends and family that use iDevices.

          > This has been literally impossible for the last decade, and will remain impossible until the heat death of the universe.

          It's possible right now with gainmap jpegs. Adobe can create them, Android captures in them now, and Apple devices can view them even. Or if they can't yet they can very soon, Apple announced support at the recent WWDC (Apple brands it "adaptive HDR")

          There's something kinda hilariously ironic that out of all these new fancy image codecs be it HEIC, AVIF, or JPEG-XL, it's humble ol' JPEG that's the first to deliver not just portable HDR, but the best quality HDR of any format of any kind

        • By mardifoufs 2025-06-1815:55

          My Samsung Galaxy outputs HEIC by default afaik. It's configurable, and I can turn that off, but still HEIC is not apple specific.

        • By throw0101c 2025-06-1810:371 reply

          >> OS support includes Windows 10 v1083

          > Ba-ha-ha... ha-ha... no. […]

          Feel free to hit "Edit" on the Wikipedia page and correct it then:

          * https://en.wikipedia.org/wiki/High_Efficiency_Image_File_For...

          > Microsoft will never support an Apple format, and vice versa.

          Once again, it's not an Apple format: it was developed by MPEG and is published by ISO/IEC, just like H.264 and H.265.

          Or do you think H.264 and H.265 are an "Apple format" as well?

          • By jiggawatts 2025-06-1812:151 reply

            > It's not an Apple format

            Create a HDR HEIC file on anything other than an Apple Device.

            Upload it to an Apple Device.

            Now use it any way: Forward it, attach it to a message, etc...

            This won't work.

            It won't ever work because the "standard" is not what Apple implements. They implement a few very specific subsets that their specific apps produce, and nothing else.

            Nobody else implements these specific Apple versions of HEIC. Nobody.

            For example, Adobe Lightroom can only produce a HEIC file on an Apple device.

            My Nikon camera can produce a HDR HEIC file in-body, but it is useless on an Apple device because it's too dark and if forwarded in an iMessage... too bright!

            It's a shit-show, comparable to "IPv6 support" which isn't.

            • By graealex 2025-06-1814:14

              That's not an argument. HEIC is to HEVC what WebP is to WebM. The lack of support in other products is due to developers not picking up the pace and sticking with "GIF, JPEG and PNG is good enough".

      • By socalgal2 2025-06-182:423 reply

        > HEIC was developed by the MPEG folks

        And the MPEG folks were so cool with video, all that licensing BS. Sounds great. No thanks!

        • By culturestate 2025-06-188:071 reply

          Confusingly, there are two different MPEGs in this context.

          MPEG the standards group is organized by ISO and IEC, along with JPEG.

          The one you’re thinking of - MPEG LA, the licensing company - is a patent pool (which has since been subsumed by a different one[1]) that’s unaffiliated with MPEG the standards group.

          1. https://en.wikipedia.org/wiki/Via-LA

          • By nottorp 2025-06-1810:39

            So what good is it to have a separate entity doing the standard when the standard is unaffordable outside the top 500?

        • By ralfd 2025-06-187:061 reply

          Aren't all MPEG patents expired?

        • By throw0101c 2025-06-1810:361 reply

          > And the MPEG folks were so cool with video, all that licensing BS. Sounds great. No thanks!

          Not wrong, but this is a different topic/objection than the GP's 'being locked into Apple's ecosystem'.

          And as the Wikipedia article for HEIC shows, there's plenty of support for the format, even in open source OSes.

          * https://en.wikipedia.org/wiki/High_Efficiency_Image_File_For...

    • By glitchc 2025-06-1719:004 reply

      > Just as there is a clear winner for video - av1 - there seems to be nothing in the way of "this is clearly the future, at least for the next few years" when it comes to encoding images.

      Say what? A random scan across the internet will reveal more videos in MP4 and H.264 format than av1. Perhaps streaming services have switched, but that is not what regular consumers usually use to make and store movies.

      • By CharlesW 2025-06-1719:101 reply

        New compressed media formats always travel a decade-long path from either (1) obscurity → contender → universal support or (2) obscurity → contender → forgotten curiosity. AV1 is on one path, WebP is on another.

        • By Andrex 2025-06-1719:343 reply

          As someone who doesn't follow this stuff, which is on which path?

          • By CharlesW 2025-06-1720:031 reply

            WebP remains fairly esoteric after 15 years, has always been a solution in search of a problem, and isn’t even universally supported in products by its owner.

            AV1 was created and is backed by many companies via a non-profit industry consortium, solves real problems, and its momentum continues to grow. https://bitmovin.com/blog/av1-playback-support/

            • By graealex 2025-06-1812:531 reply

              Funnily enough, JPEG2000-support was eventually removed everywhere. I assume the only reason this didn't happen with WebP as well is Google pushing and keeping it in Chrome.

              • By wongarsu 2025-06-1813:571 reply

                Also Google's Lighthouse benchmark pushing webp recommendations, and people listening to it because of SEO concerns

                • By graealex 2025-06-1814:16

                  True. I mentioned JPEG2000 because it had a similar fate, in particular no real reason to use it in the first place.

          • By modeless 2025-06-1719:49

            AV1 is on the path to universal support and WebP is on the path to obscurity.

          • By tedunangst 2025-06-1720:372 reply

            Apple CPUs have AV1 support in hardware.

            • By zimpenfish 2025-06-185:25

              Only support for decoding and from A17 Pro and M3 onwards, I believe? Going to be a few years before that's commonly available (he says from the work M1 Pro.)

              [edit: clarify that it's decoding only]

            • By consp 2025-06-1722:441 reply

              So does every modern GPU. This is nothing special.

              • By airstrike 2025-06-182:26

                I think you're arguing the same point—that there's plenty of support and it's arguably growing.

      • By Izkata 2025-06-1719:06

        Yeah, I think I only just found out about av1 a few weeks ago with a video file that wouldn't play. Thought it was corrupted at first it's been so long since I saw something like that.

      • By ksec 2025-06-180:09

        And H.264 is about to be patent free this year in many places.

      • By userbinator 2025-06-184:01

        I suspect there are even more H.265 than av1.

    • By jandrese 2025-06-1719:414 reply

      From what I've seen WebP is probably the strongest contender for a JPEG replacement. It's pretty common in the indie game scene, for example, to re-encode a game's JPEG assets to WebP for better image quality and often a significant (25% or more) savings on installer size. Support is coming, albeit somewhat slowly. It was pretty bad in Ubuntu 22, but several apps have added support in Ubuntu 24. Windows 11 supports WebP in Photos and Paint, for another example.

      • By 0x20cowboy 2025-06-1720:1713 reply

        I hate webp. Not for any legitimate technical reason, but I often just want to download an image from the web for an image board or drop it in a diagram or ppt or for a joke, and nothing works with that format. Nothing. Google image search is useless because of it.

        Cmd+shift+4 is now the only way to grab an image out of a browser. Which is annoying.

        It has made my life needlessly more complicated. I wish it would go away.

        Maybe if browsers auto converted when you dragged an image out of the browser window I wouldn’t care, but when I see webp… I hate.

        • By encom 2025-06-1721:381 reply

          Often (in my experience) WebP is served as a bait-and-switch even if the link ends with .jpg. So I use Curl to fetch the file, and since Curl doesn't send "Accept: image/webp" unless you tell it to, the server just gives you what you ask for.

          I once edited Firefox config to make it pretend to not support WebP, and the only site that broke was YouTube.
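
          For what it's worth, the same Accept-header idea works from a script too. A sketch using only Python's standard library (the URL is a placeholder): request the file without advertising WebP support, so content-negotiating servers fall back to JPEG/PNG.

              import urllib.request

              # Ask for the image while only advertising JPEG/PNG support.
              req = urllib.request.Request(
                  "https://example.com/photo.jpg",  # placeholder URL
                  headers={"Accept": "image/jpeg,image/png;q=0.9,*/*;q=0.8"},
              )
              with urllib.request.urlopen(req) as resp:
                  print(resp.headers.get("Content-Type"))  # check what the server actually sent
                  with open("photo.jpg", "wb") as out:
                      out.write(resp.read())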

          • By MiddleEndian 2025-06-1723:59

            lol I installed the firefox extension "Don't Accept image/webp" but I assume a lot of sites just ignore it

        • By cosmic_cheese 2025-06-1720:40

          Webp images are right up there with the fake transparent PNGs you come across in Google Images.

        • By nyanpasu64 2025-06-1721:10

          My working model is that WebP images are generally a lossy copy of a PNG or a generation-loss transcoding of a JPG image. I know that lossless WebP technically exists but nobody uses it when they're trying to save bandwidth at the cost of the user.

        • By frollogaston 2025-06-1722:54

          Even if webp got better support later, I want it deprecated just as revenge for previously wasting my time.

        • By socalgal2 2025-06-182:391 reply

          That's true of any new format. Until everything supports it it's not so great. iPhone saves .HEIC which I have to convert to something else to be useful. It's not everywhere (not sure it ever will be).

          Windows didn't use to show .jpgs in Windows Explorer. I know because I wrote a tool to generate thumbnail HTML pages to include on archive CDs of photos.

          To solve this problem, some format has to "win" and get adopted everywhere. That format could be webp, but it will take 3-10 years before everything supports it. It's not just the OS showing it in its file viewer. It's its preview app supporting it. It's every web site that lets you upload an image (gmail/gmaps/gchat/facebook/discord/messenger/slack/your bank/apartment-rental-companies, etc..etc..etc..) It just takes forever to get everyone to upgrade.

          • By bapak 2025-06-183:241 reply

            When does a format stop being new? WebP was introduced fifteen years ago.

            • By graealex 2025-06-1814:19

              When it's widely adopted.

              WebP gets pushed into your series of tubes without your consent, and the browser that you're most likely to use to view them just happens to be made by the same company that invented the codec. It's DivX and Real Media all over again.

        • By jandrese 2025-06-1721:361 reply

          Worst case you can open it up in Paint and save as JPEG.

          Also, I just checked and Powerpoint has no problem dropping in a webp image. Gimp opens it just fine. You are right that web forums are often well behind the times on this.

          • By reaperducer 2025-06-1820:15

            > Worst case you can open it up in Paint and save as JPEG

            If he's using ⌘⇧4 to take a screenshot, he probably isn't going to open it in Microsoft Paint.

        • By harry8 2025-06-181:25

          Total agreement from me, I use this:

          bin/webp2png:

              #!/bin/bash
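              # decode the given .webp file to a .png with the same base name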
              dwebp "$1" -o  "${1%%.webp}".png

        • By nntwozz 2025-06-1720:49

          I use ThumbsUp a free utility from https://www.devontechnologies.com/apps/freeware to convert webp/heic or whatever inconvenient format.

          Just drop the offending image onto the icon in the dock.

        • By wang_li 2025-06-1720:293 reply

          On the classic Macs there was a program called DropDisk. You could drag a disk image to it and it auto mounted it. That suggests a tool for you: make a desktop app that you can drag and drop images on that converts them to jpeg and saves them in a folder.

          • By msephton 2025-06-1723:21

            You can create this using Automator in a minute.

          • By jamiek88 2025-06-180:38

            thumbsup app does exactly this.

        • By voidUpdate 2025-06-187:54

          If you want to do that so badly and hate webp so much, why not screenshot it? Then you don't have to care what format it's in on the browser

        • By nottorp 2025-06-1810:401 reply

          Pretty sure I've managed to configure my Firefoxes to act as if webp does not exist...

          • By graealex 2025-06-1814:21

            It's a constant battle though to keep those browser extensions updated, especially since Google decided that extensions cut into their profits and they essentially made them useless.

        • By adzm 2025-06-1722:27

          Photoshop has native WebP support now too!

        • By Salgat 2025-06-1720:50

          Exactly, part of being a "superior format" is adoption. Until then, it's just another in a sea of potential.

      • By graealex 2025-06-1813:42

        > It's pretty common in the indie game scene

        That's such a weak argument. If I was an indie game developer, I would use whatever obscure format would offer me the most benefit, since I control the pipeline from the beginning (raw TIFF/TGA/PNG/... files) to the end (the game that needs to have a decoder and will uncompress it into GPU memory). 20 minutes extra build-time on the dev machine is irrelevant when I can save hundreds of MBs.

        However, that is not the benchmark for a format widely used on the internet. Encoding times multiply, as does the need to search for specialized software, and literally everyone else needs to support the format to be able to view those files.

      • By Sammi 2025-06-1719:523 reply

        Also webp support in browsers is looking pretty good these days: https://caniuse.com/webp

        The last major browser to add support was Safari 16 and that was released on September 12, 2022. I see pretty much no one on browsers older than Safari 16.4 in metrics on websites I run.

        • By userbinator 2025-06-184:001 reply

          People want to use images outside of browsers too.

          • By prmoustache 2025-06-1810:122 reply

            What apps are you using in 2025 that handle images but don't support webp?

            I can't think of any on my Fedora desktop for instance.

            • By wongarsu 2025-06-1813:59

              Lots of websites that expect me to upload images only accept jpeg and png.

              Another one I recently interacted with are video backgrounds for zoom. Those apparently can only be jpeg, not even png

            • By jcynix 2025-06-1813:07

              Luminar Neo, for example, doesn't handle webp. And there's more than just Fedora, IIRC.

        • By turnsout 2025-06-1722:441 reply

          Yeah, after seeing the logs I made the switch to webp earlier this year. As much as I hate to admit it (not a fan of Google), it’s a pretty big bandwidth savings for the same (or better) quality.

          • By hombre_fatal 2025-06-1723:13

            I switched to webp on my forum for avatars and other user image uploads.

            With one format you get decent filesize, transparency, and animation which makes things much simpler than doing things like conditionally producing gifs vs jpegs.

        • By out_of_protocol 2025-06-1810:191 reply

          .. or you can go directly to avif - https://caniuse.com/avif (93%) instead of webp (95% support).

      • By aidenn0 2025-06-1816:54

        FWIW, the day I discovered jpegli[1] I left WebP behind. Similar sizes to WebP while maintaining JPEG compatibility.

        1: https://github.com/google/jpegli

    • By hengheng 2025-06-1718:462 reply

      Normal people use jpeg. It's good enough, much like mpeg-2 was good enough for DVDs. Compatibility always beats storage efficiency.

      Photography nerds will religiously store raw images that they then never touch. They're like compliance records.

      • By munchler 2025-06-1718:592 reply

        I think most photography nerds who want to save edited images to a lossless format will use TIFF, which is very different from the proprietary "raw" files that come out straight out of the camera.

        • By inferiorhuman 2025-06-183:39

          Most raw files are TIFF with proprietary tags.

        • By IAmBroom 2025-06-1719:581 reply

          You'd be wrong in my experience.

          No photog nerd wants EVEN MORE POSTPROCESSING.

          • By munchler 2025-06-1721:573 reply

            I don't understand. You've got to save the edited result in a file somehow. What format do you use?

            • By msephton 2025-06-1723:291 reply

              The file as it comes out of the camera, so-called raw, is a family of formats. Usually such files are kept untouched and any edits are saved in a lightweight settings file (in the format of your editing app) alongside the original.

              • By ksec 2025-06-180:101 reply

                And a lot of RAW formats are adopting or considering adopting JPEG Lossless as a codec.

                • By nottorp 2025-06-1810:54

                  Is that like how javascript was named so as to imply a connection with java, in spite of there being none?

                  JPEG is the ur-example of lossy compression. JPEG Lossless can't have any connection with that.

            • By 0xffff2 2025-06-1817:49

              I'm not even really a hobbyist photographer anymore, but when I was, the full lossless edit was a .psd and that was generally exported to (high quality) jpg for distribution. I have folders full of carefully curated raws. For the relatively few that were ever edited they have an accompanying psd. The jpgs are ephemeral and don't get saved long term.

      • By Gigachad 2025-06-1722:201 reply

        Normal people just use whatever the default on their phone is. Which for iPhone is HEIC, not sure about Android, AVIF?

    • By martin_a 2025-06-1716:211 reply

      > How in the world do people store images / photos nowadays?

      Well, as JPEGs? Why not? Quality is just fine if you don't touch the quality slider in Photoshop or other software.

      For "more" there's still lossless camera RAW formats and large image formats like PSD and whatnot.

      JPEG is just fine.

      • By afiori 2025-06-1718:325 reply

        I wonder how much of JPEG's good quality is that we are quite accustomed to its artefacts.

        • By Arainach 2025-06-1720:001 reply

          I've never seen JPEG artifacts on images modified/saved 5 or fewer times. Viewing on a monitor including at 100%, printing photos, whatever - in practice the artifacts don't matter.

          • By somat 2025-06-1811:15

            jpeg artifacts mainly show up on drawings. where they seriously degrade masking operations. which is a hobby of mine. so I always appreciate it when a drawing is a png. rather than a bunch of jpeg goop.

        • By BugsJustFindMe 2025-06-1719:001 reply

          At high quality, the artifacts are not visible unless you take a magnifying glass to the pixels, which is a practice anathema to enjoying the photo.

          • By afiori 2025-06-1720:22

            I am referring to highly compressed images or low resolution ones, at high bitrates mostly all formats look the same.

            what i mean is that jpeg squarish artifacts look ok while av1 angular artifacts look distorted

        • By mrob 2025-06-1811:101 reply

          JPEG artifacts are less disturbing because they're so obviously artificial. WEBP and similar artifacts look more natural, which makes them harder to ignore.

          • By afiori 2025-06-196:57

            I think I agree, low quality JPEGs give the idea of looking through slightly imperfect glass, WEBP and AV1 look a bit more like bad AI

        • By a-french-anon 2025-06-188:23

          For non-photographic images, I'm horribly sensitive to the ringing artifacts. Thankfully, there's waifu2x (in denoise mode only) to remove those when textures don't confuse it too much, and I use MozJPEG to encode, which really improves the result.

        • By whaleofatw2022 2025-06-1719:02

          There's something to be said about this. A high quality JPEG after cleanup can sometimes be larger than an ARW (sony RAW) on export and it makes no sense to me.

    • By tristor 2025-06-1716:091 reply

      For my extensive collection of photography, I export to JPEG-XL and then convert to JPEG for use online. Most online services, like Flickr, Instagram, et al don't support JPEG-XL, but there's almost no quality loss converting from JPEG-XL to JPEG vs exporting to JPEG directly from your digital asset management system, and storing locally in JPEG-XL works very well. Almost all desktop tools I use support JPEG-XL natively already, conversely almost nothing support WEBP.

      • By Zardoz84 2025-06-1716:253 reply

        There is NO quality loss when converting from JPEG XL to JPEG and vice versa. It was done by design. Not an accident.

        • By eviks 2025-06-186:11

          You're confusing jpg>jxl>jpg, which can be done losslessly via a special mode, and jxl > jpg, which can't (even ignoring all the extra features of jxl that jpg doesn't support)

        • By adgjlsfhk1 2025-06-1718:331 reply

          this isn't true. there's no loss from jpeg to jpeg-xl (if you use the right mode), but the reverse is not true

          • By Zardoz84 2025-06-1718:474 reply

            I'm sorry to say that you are wrong about this.

            > Key features of the JPEG XL codec are: > lossless JPEG transcoding,

            > Moreover, JPEG XL includes several features that help transition from the legacy JPEG coding format. Existing JPEG files can be losslessly transcoded to JPEG XL files, significantly reducing their size (Fig. 1). These can be reconstructed to the exact same JPEG file, ensuring backward compatibility with legacy applications. Both transcoding and reconstruction are computationally efficient. Migrating to JPEG XL reduces storage costs because servers can store a single JPEG XL file to serve both JPEG and JPEG XL clients. This provides a smooth transition path from legacy JPEG platforms to the modern JPEG XL.

            https://ds.jpeg.org/whitepapers/jpeg-xl-whitepaper.pdf

            If you need more proof, you could transcode a JPEG to JPEG XL and convert it back to JPEG. The resulting image would be BINARY IDENTICAL to the original image.

            However, perhaps you are talking about an image in JPEG XL, using features only in JPEG XL (24 bit, HDR, etc...) that obviously couldn't be converted in a lossless way to a JPEG.

            • By CorrectHorseBat 2025-06-1719:221 reply

              Yes, JPG to JPEG XL and back is lossless, but the reverse is nowhere mentioned.

              Trying around with some jpg and jxl files I cannot convert jxl losslessly to jpg files even if they are only 8bit. The jxl files transcoded from jpg files show "JPEG bitstream reconstruction data available" with jxlinfo, so I think some extra metadata is stored when going from jpg to jxl to make the lossless transcoding possible. I can imagine not supporting the reverse (which is pretty useless anyway) allowed for more optimizations.

              • By jbverschoor 2025-06-1816:091 reply

                JPG to JXL is lossless, and will save around 20%

                JXL to JPG is lossless as in a bit-for-bit identical file can be generated

                • By adgjlsfhk1 2025-06-1818:18

                  > JXL to JPG is lossless

                  only if you got the JXL from JPG.

            • By spider-mario 2025-06-1719:32

              > However, perhaps are you talking about an image on JPEG XL, using features only in JPEG XL (24 bit, HDR, etc...) that obviously couldn't be converted in a lossless way to a JPEG.

              A lot of those features (non-8×8 DCTs, Gaborish and EPF filters, XYB) are enabled by default when you compress a non-JPEG image to a lossy JXL. At the moment, you really do need to compress to JPEG first and then transcode that to JXL if you want the JXL→JPEG direction to be lossless.

            • By gabrielhidasy 2025-06-1719:19

              > However, perhaps are you talking about an image on JPEG XL, using features only in JPEG XL (24 bit, HDR, etc...) that obviously couldn't be converted in a lossless way to a JPEG.

              So he was not wrong about this. You have perfect JPEG -> JPEG XL conversion, but not the other way around.

            • By adgjlsfhk1 2025-06-1721:00

              default jpeg-xl uses a different color space (XYB), bigger transforms (up to 256x256), rectangular transforms, etc. if you go from jpg to jxl, you can go back (but your jxl file will be less efficient), but if you compress directly to jxl, you can't losslessly go to jpg

        • By tristor 2025-06-1717:30

          That's good to know. I'm not an image format expert, but I couldn't see any loss that was visually discernible at any rate.

    • By zuminator 2025-06-185:49

      People often forget that PNG images can be compressed in a lossy manner to keep the filesize down, not quite as well as jpegs but still quite substantially.

      https://pngmini.com/lossypng.html

      https://pngquant.org/

      https://css-ig.net/pinga

    • By asielen 2025-06-182:281 reply

      Tiff if you want to archive them and they started as raw or tiff, jpeg for everything else. If the file is already jpeg, there is no point in converting it to a new better-quality format; the quality won't get better than it already is.

      It may be obsolete, but it is ubiquitous. I care less about cutting edge tech than I do about the probability of being able to open it in 20+ years. Storage is cheap.

      Presentation is a different matter and often should be a different format than whatever your store the original files as.

      • By acomjean 2025-06-183:22

        And jpg isn’t that bad when encoded at high quality, and not saved repeatedly.

        I took a tiff and saved it high quality jpg. Loaded both into photoshop and “diffed” them (basically subtracted both layers). After some level adjustment you could see some difference but it was quite small.

    • By MallocVoidstar 2025-06-1716:501 reply

      AV1 is not the clear winner for video. Currently-existing encoders are worse than x265 for high-bitrate encodes.

      • By CharlesW 2025-06-1718:572 reply

        AV1's advantage narrows to ~5% over H.265 at very high data rates, in the same way that MP3 at 320 kbps is competitive with AAC at 320 kbps. But AV1 is never worse than H.265 from a VMAF/PSNR perspective at any bitrate, and of course H.265 is heavily patent encumbered in comparison. https://chipsandcheese.com/p/codecs-for-the-4k-era-hevc-av1-...

        • By ksec 2025-06-180:172 reply

          >AV1's advantage narrows to ~5% over H.265 at very high data rates.... But AV1 is never worse than H.265 from a VMAF/PSNR perspective at any bitrate,

          There are whole discussions about how modern codecs, and especially AV1, simply don't care about PSY image quality. And hence most torrents are still using x265, because AV1 simply doesn't match the quality offered by other encoders like x265. Nor does the AOM camp care about it, since their primary usage is YouTube.

          >in the same way that MP3 at 320 kbps is competitive with AAC at 320 kbps.

          It is not. And never will be. MP3 has an inherent disadvantage that requires a substantially higher bitrate for quite a lot of samples, even at 320kbps. We have been through this war for 10 years at Hydrogenaudio with data to back this up; I don't know why in the past 2-3 years the topic has popped up once again.

          MP3 is not better than AAC-LC in any shape or form even at 25% higher bitrate. Just use AAC-LC, or specifically Apple's Quick Time AAC-LC Encoder.

          • By CharlesW 2025-06-180:501 reply

            > There is a whole discussions that modern codec, or especially AV1 simply doesn't care about PSY image quality.

            In early AV1 encoders, psychovisual tuning was minimal and so AV1 encodes often looked soft or "plastic-y". Today's AV1 encoders are really good at this when told to prioritize psy quality (SVT-AV1 with `--tune 3`, libaom with `--tune=psy`). I'd guess that there's still lots of headroom for improvements to AV1 encoding.

            > And hence how most torrents are still using x265 because…

            Today most torrents still use H.264, I assume because of its ubiquitous support and modest decode requirements. Over time, I'd expect H.265 (and then AV1) to become the dominant compressed format for video sharing. It seems like that community is pretty slow to adopt advancements — most lossy-compressed music <finger quotes>sharing</finger quotes> is still MP3, even though AAC is a far better (as you note!) and ubiquitous choice.

            My point about MP3 vs. AAC was simply: As you reduce the amount of compression, the perceived quality advantages of better compressed media formats is reduced. My personal music library is AAC (not MP3), encoded from CD rips using afconvert.

            • By shiroiuma 2025-06-181:17

              >Today most torrents still use H.264

              That's not what I'm seeing for anything recent. x265 seems to be the dominant codec now. There's still a lot of support for h.264, but it's fading.

          • By aidenn0 2025-06-1817:05

            svt-av1 has come a long ways recently; I tried it a few weeks ago.

            It still (maddeningly!) defaults to PSNR, but you can change that. There are some sources where I find it now can significantly improve over H.265 at higher data rates, and, while my testing was limited, I couldn't find any sources where H.265 clearly won based on my mark-1 eyeball. This is in contrast to when I tried multiple av1 encoders 2-ish years ago and they, at best, matched H.265 at higher bitrates.

        • By MallocVoidstar 2025-06-1720:193 reply

          I don't care about VMAF or PSNR, I care about looking with my eyes. With x265 on veryslow and AV1 on preset 0/1, and the source being a UHD BD I was downscaling to 1080p, AV1 looked worse even while using a higher bitrate than x265. Current AV1 encoders have issues with small details and have issues with dark scenes. People are trying to fix them (see svt-av1-psy, being merged into SVT-AV1 itself) but the problems aren't fixed yet.

          • By ksec 2025-06-180:23

            >see svt-av1-psy, being merged into SVT-AV1 itself

            Part of it being merged for now.

            It is unfortunate this narrative hasn't caught on. Actual quality over VMAF and PSNR. And we haven't had further quality improvement since x265.

            I do get frustrated every time the topic of codec comes up on HN. But then the other day I only came to realise I did spend ~20 years on Doom9 and Hydrogenaudio I guess I accumulated more knowledge than most.

          • By eviks 2025-06-185:59

            Well, did your "eyes" care more about fidelity or appeal?

            https://cloudinary.com/blog/what_to_focus_on_in_image_compre...

          • By spookie 2025-06-1723:54

            Yup, have had the same experience.

    • By thisislife2 2025-06-1720:27

      > How in the world do people store images / photos nowadays?

      I had some high resolution graphic works in TIFF (BMP + LZW). To save space, I archived them using JPEG-2000 (lossless mode), using the J2k Photoshop plug-in ( https://www.fnord.com/ ). Saved tons of GBs. It has wide multi-platform support and is a recognized archival format, so its longevity is guaranteed for some time on our digital platforms. Recently explored using HEIF or even JPEG-XL for these but these formats still don't handle CMYK colour modes well.

    • By djeastm 2025-06-1722:261 reply

      >are nigh-unworkable on desktops, support is very spotty

      I use .webp often and I don't understand this. At least on Windows 10 I can go to a .webp and see a preview and double-click and it opens in my image editor. Is it not like this elsewhere?

      • By hadlock 2025-06-1722:47

        Try uploading one to any web service. Like imgur.

    • By twotwotwo 2025-06-180:07

      I recognize it as beating a dead horse now, but JPEG XL did what was needed to be actually adopted. AVIF has not been widely adopted given the difficulty of a leap to a new format in general and the computational cost of encoding AVIF specifically.

      One of JPEG XL's best ideas was incorporating Brunsli, lossless recompression for existing JPEGs (like Dropbox's Lepton which I think might've been talked about earlier). It's not as much of a space win as a whole new format, but it's computationally cheap and much easier to just roll out today. There was even an idea of supporting it as a Content-Encoding, so a right-click and save would get you an OG .jpg avoiding the whole "what the heck is a WebP?" problem. (You might still be able to do something like this in a ServiceWorker, but capped at wasm speeds of course.) Combine it with improved JPEG encoders like mozjpeg and you're not in a terrible place. There's also work that could potentially be done with deblocking/debanding/deringing in decoders to stretch the old approach even further.

      And JXL's other modes also had their advantages. VarDCT was still faster than libaom AVIF, and was reasonable in its own way (AVIFs look smoother, JXL tended more to preserve traces of low-contrast detail). There was a progressive mode, which made less sense in AVIF because it was a format for video keyframes first. The lossless mode was the evolution of FUIF and put up good numbers.

      At this point I have no particular predictions. JPEG never stopped being usable despite a series of more technically sophisticated successors. (MP3 too, though its successors seemed to get better adoption.) Perhaps it means things continue not to change for a while, or at least that I needn't rush to move to $other_format or get left behind. Doesn't mean I don't complain about the situation in comments on the Internet, though.

    • By SwamyM 2025-06-1817:011 reply

      > So what's left? I have a whole bunch of .HEIC photos and I'd really like if Windows Explorer didn't freeze for literal minutes when I open a folder with them.

      Yea, how is this still the case in 2025?

      My family has a mix of Apple and Samsung devices and they move/backup their pictures to their Windows machines whenever they run out of space but once they move them, they can't easily view or browse them.

      I had to download a 3rd party app and teach them to view them from there.

      • By mr_toad 2025-06-1818:19

        Maybe they’re not using hardware acceleration? H265 in software is really slow.

    • By SAI_Peregrinus 2025-06-1716:51

      I store Raw + PSD with edits/history + whatever edited output format(s) I used.

    • By aidenn0 2025-06-1816:57

      I see no reason not to use JPEG-XL for archival storage. It is (IMO) the best all-rounder of the current formats, and 20 years from now imagemagick will still be able to convert it to whatever you want.

    • By theandrewbailey 2025-06-1723:501 reply

      > JPEG-XL seemed like the next logical step until Google took their toys and killed it despite already having the support in Chrome, which pretty much makes it dead in the water (don't you just love monopolies making decisions for you?)

      I've done several tests where I lowered the quality settings (and thus, the resulting file size) of JPEG-XL and AVIF encoders over a variety of images. In almost every image, JPEG-XL subjective quality fell faster than AVIF, which seemed mostly OK for web use at similar file sizes. Due to that last fact, I concede that Chrome's choice to drop JPEG-XL support is correct. If things change (JPEG-XL becomes more efficient at low file sizes, gains Chrome support), I have lossless PNG originals to re-encode from.

    • By todotask2 2025-06-182:57

      At least there's JPEG-XL support in recent Windows 11 updates.

      I've found that sometimes WebP with lossless compression (-lossless) results in smaller file sizes for graphics than JPEG-XL and sometimes it's the other way around.

    • By PaulHoule 2025-06-1719:21

      I've done a few shootouts at various times in the last 10 years. I finally decided WebP was good for the web maybe two years ago, that is, I have 'set it and forget it' settings and get a good quality/size result consistently. (JPEG has the problem that you really need to turn the knob yourself, since a quality level good for one image may not be good for another; see the sketch at the end of this comment.)

      I don't like AVIF, at least not for photos I want to share. I think AVIF is great for "a huge splash image for a web page that nobody is going to look at closely" but if you want something that looks like a pro photo I don't think it's better than WebP. People point out this example as "AVIF is great"

      https://jakearchibald.com/2020/avif-has-landed/demos/compare...

      but I think it badly mangles the reflection on the left wing of the car and... it's those reflections that make sports cars look sexy. (I'll grant that the 'acceptable' JPEG has obvious artifacts whereas the 'acceptable' AVIF replaced a sexy reflection with a plausible but slightly dull replacement)
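
      About the knob-turning parenthetical above, here's the kind of per-image search I mean, as a rough Node/TypeScript sketch using the sharp library. The file name and the 200 kB budget are made up, and a real pipeline would more likely target a perceptual metric (SSIM, butteraugli) than a byte budget:

        // Binary-search the JPEG quality setting so each image just fits a size budget.
        import sharp from "sharp";

        async function jpegQualityForBudget(input: string, maxBytes: number): Promise<number> {
          let lo = 1, hi = 95, best = 1;
          while (lo <= hi) {
            const q = (lo + hi) >> 1;
            const encoded = await sharp(input).jpeg({ quality: q }).toBuffer();
            if (encoded.length <= maxBytes) {
              best = q;     // fits: try a higher quality
              lo = q + 1;
            } else {
              hi = q - 1;   // too big: lower the quality
            }
          }
          return best;
        }

        jpegQualityForBudget("photo.png", 200_000).then((q) =>
          console.log(`quality ${q} fits a 200 kB budget`));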

    • By pjmlp 2025-06-186:38

      JPEG is old, and it works.

      Images are sorted in folders, per year and some group description based on why they were taken, vacations, event, whatever.

      Enable indexing on the folders, and usually there are no freezes to complain about.

    • By zeroq 2025-06-1719:461 reply

      For video it's not as easy, as it takes way more compute and requires hardware support.

      You can take any random device and it will be able to decode h264 at 4k. h265 not so much.

      As for av1 - my Ryzen 5500GT, released in 2024, does not support it.

    • By dangus 2025-06-1718:592 reply

      > HEIC is good, as long as you pinky promise to never ever leave Apple's ecosystem, ie HEIC sucks.

      Not really true in my experience, I have no problems using it in Windows 11, Linux, or with my non-Apple non-Google cloud photos app.

      The iPhone using it in an incredibly widespread way has made it a de facto standard.

      If you're having trouble in Windows, I wonder if you're running Windows 11 or 10? Because 11 seems a lot better at supporting "modern things" considering that Microsoft has been neglecting Windows 10 for 3 years and is deprecating it this year.

      • By munchler 2025-06-1719:013 reply

        My problem with HEIC is that if you convert it to another format, it looks different from the original, for reasons that I don't understand. I switched my iPhone back to JPEG to avoid that.

        • By jorl17 2025-06-1719:18

          Perhaps due to HDR handling?

        • By spookie 2025-06-1723:57

          Only Gwenview on the Linux side is able to render them properly somehow

        • By dangus 2025-06-1918:12

          I have increasingly noticed very little reason to convert them to JPEG.

    • By nvch 2025-06-1723:491 reply

      RAW? Storage is becoming cheaper, why discard the originals?

      When looking for a format to display HQ photos on my website I settled on a combination of AVIF + JPG. Most photos are AVIF, but if AVIF is too magical compared to JPG (like 3x-10x smaller) I use a larger JPG instead. "Magic" means that fine details are discarded.

      WebP discards gradients (like sunset, night sky or water) even at the highest quality, so I consider it useless for photography.

      • By mrheosuper 2025-06-183:13

        not every storage is created equal. 1TB hdd is dirt cheap, 1TB of cloud storage is expensive af

    • By redeeman 2025-06-1719:46

      just use jpegxl. works great on linux. Pressure software you use to use the proper formats

    • By troupo 2025-06-1811:12

      > HEIC is good,

      It's not. Support is still surprisingly patchy, and it takes a second or so to decode and show the image even on a modern M* Mac. Compared to instant PNG.

      > I have a whole bunch of .HEIC photos and I'd really like if Windows Explorer didn't freeze for literal minutes when I open a folder with them.

      Indeed.

    • By zmj 2025-06-1721:11

      I recently reencoded my photography archive to webp. It's a static site hosted from S3. I was pretty happy with the size reduction.

    • By alistairSH 2025-06-181:011 reply

      HEIC for photos taken by my iPhone. Apple stuff seems to do a mostly ok job auto-converting to JPG when needed (I assume, since I haven’t manually converted one in ages).

      And JPG for photos taken on a “real” camera (including scanned negatives). Sometimes RAW, but they’re pretty large so not often.

      • By rr808 2025-06-181:47

        I found that if you plug an iPhone into a Windows PC and copy the photos off, it will convert them to JPG. However it makes copying very slow, and the quality is worse, so I'd advise turning off the setting on the phone (I think it's compatibility mode or similar).

    • By esafak 2025-06-184:50

      Assuming you are asking about archiving: use the original format it came in. If you're going to transcode, it should be to something lossless like J2K or PNG.

    • By gsich 2025-06-1721:48

      >How in the world do people store images / photos nowadays?

      With codecs built for that purpose, I hope. Intra-frame "formats" carved out of video codecs are a misconception and should stay that way: a curiosity.

    • By wltr 2025-06-1815:58

      I have zero issues with macOS and Linux while using modern image formats. Don’t use Windows, I guess.

    • By esperent 2025-06-188:59

      > How in the world do people store images / photos nowadays?

      PNG where quality matters, JPG where size matters.

    • By bilekas 2025-06-1720:09

      You’re right, for a lot of scenarios, which is exactly what a standard is there to do: encapsulate the broad strokes.

      > Alternatives like WebP might be good for browsers but are nigh-unworkable on desktops, support is very spotty.

      Right again, and WebP is the enrichment that goes with the backend when dealing with the web. I wouldn’t knock it for not being compatible locally; it was designed for the web first and foremost. I think it’s in the name.

    • By heraldgeezer 2025-06-1720:09

      >Just as there is a clear winner for video - av1

      What?? Maybe I'm too much in aarrrgh circles but it's all H.264 / 265...

    • By NoMoreNicksLeft 2025-06-1716:272 reply

      >JPEG-XL seemed like the next logical step until Google took their toys and killed it despite already having the support in Chrome,

      Why would I ever care about Chrome? I can't use adblockers on Chrome, which makes the internet even less usable than it currently is. I only start up chrome to bypass cross-origin restrictions when I need to monkey-patch javascript to download things websites try to keep me from downloading (or, like when I need to scrape from a Google website... javascript scraper bots seem to evade their bot detection perfectly, just finished downloading a few hundred gigabytes of magazines off of Google Books).

      Seriously, fuck Chrome. We're less than 2 years away from things being as bad as they were in the IE6 years.

      • By codazoda 2025-06-1716:561 reply

        I think we're there, not 2 years away.

        I have software that won't work quite right in Safari or Firefox through a VPN every single day. Maybe it's the VPN and maybe it's the browser but it doesn't matter. We're at IE levels it's just ever so slightly more subtle this time. I'm still using alternatives but it's a battle.

        • By NoMoreNicksLeft 2025-06-1717:591 reply

          VPNs are layer 2... I suppose it could be resizing packets in such a way as to make it glitch out, but that just seems improbable.

          Some of the warez sites I download torrents from have captchas and other javascripticles that only work on Chrome, but I've yet to see it with mainstream sites.

          Fight the good fight.

          • By frollogaston 2025-06-1722:56

            The VPN could be on an IP address with a bad reputation that's getting soft-blocked by some stuff

      • By crazygringo 2025-06-1721:562 reply

        > I can't use adblockers on Chrome

        Why does this myth persist?

        uBlock Origin Lite works perfectly fine on Chrome, with the new Manifest V3. It blocks basically all the ads uBlock Origin did previously, including YouTube. But it uses fewer resources, so pages load even faster.

        There's an argument that adblocking could theoretically become less effective in the future but we haven't seen any evidence of that yet.

        So you can very much use adblockers on Chrome.

        • By frollogaston 2025-06-1722:551 reply

          If uBO Lite is really better, why does uBO exist?

          • By crazygringo 2025-06-1723:062 reply

            Because uBO Lite uses a newer Chrome API (declarativeNetRequest) that didn't exist previously (the original uBO was based on webRequest).

            webRequest is slower because it has to evaluate JavaScript for each request (plus the overhead of interprocess communication), instead of the blocking being done by compiled C++ code in the same process, as declarativeNetRequest does (a minimal rule is sketched at the end of this comment).

            uBO also has a bunch of extra features like zapping that the creator explicitly chose not to include in uBO Lite, in the interests of making the Lite version as fast and resource-light as possible. For zapping, there are other extensions you can install instead if you need that.

            They're two different products with two different philosophies based on two different underlying architectures. The older architecture has now gone away in Chrome, but the new one supports uBlock Origin Lite great.
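
            For illustration, here's roughly what "declarative" means here: a rule is just data that Chrome matches natively, with no extension JavaScript running per request. A minimal sketch (the ad host is hypothetical; real rules usually ship as a JSON ruleset referenced from manifest.json, or get registered via updateDynamicRules):

              // One declarativeNetRequest rule: block scripts and images from a host.
              const blockAdHost = {
                id: 1,
                priority: 1,
                action: { type: "block" },
                condition: {
                  urlFilter: "||ads.example.com^",    // Adblock-style host-anchored filter
                  resourceTypes: ["script", "image"], // only these request types are blocked
                },
              };

              // e.g. the contents of a static rules.json file
              console.log(JSON.stringify([blockAdHost], null, 2));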

            • By pikelet 2025-06-186:571 reply

              I think you're overstating it a bit. There were definitely features that couldn't be implemented due to MV3 limitations rather than because the developer chose to leave them out.

              https://github.com/uBlockOrigin/uBOL-home/wiki/Frequently-as...

              • By frollogaston 2025-06-1817:25

                Hm, I can't tell how serious these limitations are, but there are a lot, and this one stands out:

                  replace=, can't modify the response body (full support is only possible with Firefox MV2)

            • By frollogaston 2025-06-1723:34

              Thanks, I'll try it out then.

  • By carra 2025-06-187:289 reply

    We keep hearing different variants of "webp should take over because now it has good browser support". But that is not nearly enough for an image format to reach widespread adoption.

    People want to be able to open the images anywhere (what about viewing photos on a smart TV? or an old tablet? what about digital picture frames?). They want to edit the images in any program (what about this FOSS editor? what about the many people stuck on pre-subscription Photoshop versions?).

    They also want to ensure far future access to their precious photos, when formats like JPEG2000 or even WebP might be long gone. I mean, webp was made by Google and we know how many of their heavily promoted creations are dead already...

    • By account42 2025-06-1813:121 reply

      It also doesn't help that most people's experience with webp is force-recompressed versions of images that were originally jpeg. With relatively low quality settings.

      • By schmidtleonard 2025-06-1813:361 reply

        I'm not sure people consciously make that association, but you know which association they absolutely do make? The one where Google Images started using webp at the same time as they were locking it down. At the time, ecosystem support was practically nonexistent, so it functioned as "soft DRM", and among people who know the term "webp" at all, that's by far the #1 association.

        • By skylurk 2025-06-1816:16

          TIL webp is more than just potato shots that I have to delete off my hard drive once in a while

    • By jagermo 2025-06-188:43

      i don't like working with webp, especially since chrome pushes it on everything, but google's other tools like slides do not support it. Super annoying, especially since google developed it.

    • By Velocifyer 2025-06-1812:46

      JPEG XL is better than webp

    • By jokoon 2025-06-189:263 reply

      This reminds me of Rust: even if Rust is widespread in 10 or 20 years, it will not really displace C++.

      Although I'd like an engineering explanation as to why COBOL is still alive after all those years, because no tech can live forever.

      • By fsloth 2025-06-1811:15

        Can’t live forever?

        Latin is still going strong, as are water pipes (the oldest being several millennia old).

        Hard to predict which innovations remain resilient. The longer they stick around the more ”Lindy-proof” they are.

      • By jt2190 2025-06-1811:581 reply

        The explanation for COBOL is not an engineering one, but an economics one: “It’s cheaper to train a programmer to use COBOL than it is to rewrite the codebase in <language>.” (Perhaps LLMs might change the economics here.)

        • By leptons 2025-06-1814:59

          "If it ain't broke, don't fix it" also applies here.

          There is also no guarantee that whatever new language you port the COBOL code to won't also be seen as obsolete in a few years. Software developers are a fickle bunch.

      • By 0points 2025-06-1810:011 reply

        > COBOL

        Was popular in the 60s in fintech, so banks, ATMs, and related systems went digital using it.

        Those systems are still running.

        • By graealex 2025-06-1813:27

          Although you can do a bit of Ship-of-Theseus philosophy on those COBOL systems. After every software component has been rewritten multiple times and every hardware has died and subsequently got replaced, all that's left is the decision to stick with COBOL, not the fact that it's a legacy system built in the 60s.

    • By scotty79 2025-06-1916:19

      Anywhere? I would be super happy if I could open them all in all contexts on my desktop computer.

    • By arp242 2025-06-1813:11

      Last I checked you couldn't even upload .webp images to GitHub or Telegram. Well, for GitHub you can cheat by renaming it to .png and GitHub's content detection will make it work regardless, but meh.

    • By TacticalCoder 2025-06-1812:302 reply

      > But that is not nearly enough for an image format to reach widespread adoption

      I use WEBP extensively but WEBP has a major flaw: it can do both lossy and lossless.

      That's the most fucktarded thing to ever do for an image compression format. I don't understand the level of confusion and cluelessness that had to happen for such a dumb choice to have been made.

      I've got an entire process around determining and classifying WEBP files depending on whether they're lossless or lossy. In the past we had JPG or PNG: life was good. Simple. Easy.

      Then dumbfucks decided that it made sense to cram both lossy and lossless under the same umbrella.

      > They also want to ensure far future access to their precious photos, when formats like JPEG2000 or even WebP might be long gone.

      That however shall never be an issue. You can still open, today, old obscure formats from the DOS days. Even custom ones only used by a select few programs back then.

      It's not as if we didn't have emulators, converters, etc. and it's all open source.

      Opening old WEBP files in the future shall never ever be a problem.

      Determining if it's a lossy or lossless WEBP for non-technical users, however... ; )
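
      For what it's worth, a programmatic check is cheap: the lossy and lossless payloads sit in differently tagged RIFF chunks. A minimal Node/TypeScript sketch (simple and extended files only; animated WebPs keep their frame data inside ANMF chunks, which this doesn't descend into):

        // Classify a .webp by walking its RIFF chunks: 'VP8 ' marks a lossy
        // bitstream, 'VP8L' a lossless one ('VP8X' is just the extended header).
        import { readFileSync } from "node:fs";

        function webpKind(path: string): "lossy" | "lossless" | "unknown" {
          const buf = readFileSync(path);
          if (buf.toString("ascii", 0, 4) !== "RIFF" ||
              buf.toString("ascii", 8, 12) !== "WEBP") {
            throw new Error(`${path} is not a WebP file`);
          }
          let offset = 12;
          while (offset + 8 <= buf.length) {
            const fourcc = buf.toString("ascii", offset, offset + 4);
            const size = buf.readUInt32LE(offset + 4);
            if (fourcc === "VP8 ") return "lossy";
            if (fourcc === "VP8L") return "lossless";
            offset += 8 + size + (size % 2); // chunk payloads are padded to even sizes
          }
          return "unknown";
        }

        console.log(webpKind("image.webp"));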

      • By jacobgkau 2025-06-1921:49

        > In the past we had JPG our PNG: life was good. Simple. Easy.

        Except for if you wanted a compressed image with transparency for the web, in which case you had to sacrifice one of those two things or use a different format besides those two.

        > Then dumbfucks decided that it made sense to cram both lossy and lossless under the same umbrella.

        > I don't understand the level of confusion and cluelessness that had to happen for such a dumb choice to have been made.

        Besides many end users not caring which one it is as long as they recognize the file type and can open it, I found a few interesting reasons for having both in the same format from a simple web search. One was the possibility of having a lossy background layer with a lossless foreground layer (particularly in an animation).

        JPEG XL also decided to support either lossless or lossy compression, so it wasn't just WebP that decided it made sense.

      • By reaperducer 2025-06-1819:44

        > That however shall never be an issue. You can still open, today, old obscure formats from the DOS days. Even custom ones only used by a select few programs back then.

        Certainly not true.

        One example: I have many thousands of photos from my Sony digital camera that cannot be opened by any current operating system without installing third-party software.

        I'm lucky that the camera also output JPEG versions as it saved, so I'm able to view the JPEG thumbnails, then drag the Sony version into my photo editor of choice.

    • By prmoustache 2025-06-1810:053 reply

      I am pretty sure webp is supported everywhere nowadays. I think it is just inertia.

      • By chownie 2025-06-1810:12

        It isn't universal; my phone gallery doesn't support webp at all by default, and the Windows gallery only supports non-animated webp from what I can tell.

      • By 7bit 2025-06-1810:351 reply

        My smartphone camera does not output webp, and neither does my professional Nikon.

        As long as these two major sources of pictures stay on JPEG, I will too, simply because everything beyond that is subjective and completely debatable.

        • By prmoustache 2025-06-1810:451 reply

          To me what cameras support as an output is irrelevant. On a pro camera you generally use the raw files as a default format. What is important is the formats you can export to / manipulate afterwards for publication/exchange.

      • By xeromal 2025-06-1812:521 reply

        When I download a photo to send to my family webp always causes some kind of problem so I end up screenshotting it

        • By schmidtleonard 2025-06-1813:271 reply

          Always, always, always -- and it's often multiple problems: the filesystem preview generators don't support it, or don't support it over a network; the social media used by the other person doesn't support it (often egregiously so, where an unrecognized drop bubbles up to browser scope and blasts the page you were on); or there's a weird problem with a site/app that is supposed to support it, such as the image turning into a black box.

          Support for webp is still so rough that I have to wonder what one's ecosystem must look like for it to be seamless. Maybe if you are a googler and your phone/computer/browser use entirely google software and ditto for your friends and your friends friends and your spouses? Maybe?

          • By graealex 2025-06-1814:101 reply

            To my knowledge, not even every Google product supports it, but I have not verified support myself.

            I blame Google for pushing it, but I also blame every third-party product for not supporting it, when it is mostly free to do so (I'm sure all of them internally use libraries to decode images instead of rolling their own code).

            • By prmoustache 2025-06-1815:42

              I think it works better in a totally open source ecosystem. I can share webp pics to my daughters via xmpp for example regardless if I am on my smartphone (conversations) or desktop (gajim)

    • By whyever 2025-06-188:392 reply

      > I mean, webp was made by Google and we know how many of their heavily promoted creations are dead already...

      I don't understand this argument. WebP is an algorithm, not a service. You cannot kill it once it's published.

      • By jacobgkau 2025-06-189:091 reply

        JPEG XL is similarly an algorithm that's been published, but Google removed it from their browser and Mozilla followed suit, which effectively killed its usefulness as a web-friendly (and, more generally, usable-anywhere) format.

      • By carra 2025-06-189:071 reply

        Fair enough. What I meant by this is that, in the end, most software that decides to add webp support is doing it because of the huge push by Google to do so. But if they suddenly change that push to something else then webp might find itself growing more irrelevant.

        • By nottorp 2025-06-1810:321 reply

          I didn't know webp was pushed by Google. They should publicize that fact more so people know to avoid the format entirely.

          What Google pushes is in their self interest and has nothing to do with the good of the unwashed masses.

          • By graealex 2025-06-1813:33

            WebP is basically a single i-frame from VP8, the video codec behind WebM, which literally was developed by Google to avoid paying licensing costs for H.264. For which they had great incentive.

            WebP is to WebM what HEIC is to HEVC.

            You can argue that using free codecs is a collateral benefit here, even though Google did it for selfish reasons. It is not detrimental to the public or the internet.

  • By bob1029 2025-06-1715:365 reply

    Beyond the compression (which is amazing), JPEG is also extremely fast when implemented well. I'm not aware of any other image format that can encode at 60fps+ @ 1080p on a single CPU core. Only mild SIMD usage is required to achieve this. With dedicated hardware, the encode/decode cost quickly goes to zero.

    I struggle to understand the justification for other lossy image formats as our networks continue to get faster. From a computation standpoint, it is really hard to beat JPEG. I don't know if extra steps like intra-block spatial prediction are really worth it when we are now getting 100 Mbps to our smartphones on a typical day.
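
    As a rough sanity check of the single-core encode-speed claim above, here's an illustrative Node/TypeScript benchmark using the sharp library (libvips + libjpeg-turbo underneath). The file name is made up, and the numbers depend heavily on the build, the image content, and per-call overhead, so treat it as a ballpark, not a measurement:

      import sharp from "sharp";

      sharp.concurrency(1); // pin libvips to a single thread

      async function main() {
        // decode and resize once to raw RGB so the loop times mostly the JPEG encode
        const raw = await sharp("photo.jpg").resize(1920, 1080).raw().toBuffer();
        const frames = 120;
        const t0 = performance.now();
        for (let i = 0; i < frames; i++) {
          await sharp(raw, { raw: { width: 1920, height: 1080, channels: 3 } })
            .jpeg({ quality: 85 })
            .toBuffer();
        }
        const seconds = (performance.now() - t0) / 1000;
        console.log(`${(frames / seconds).toFixed(1)} 1080p frames/s at quality 85`);
      }

      main();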

    • By MrDOS 2025-06-1715:493 reply

      https://news.ycombinator.com/item?id=44298656

      You might be getting 100 Mbps to your smartphone; many people – yes, even within the United States – struggle to attain a quarter of that.

      • By bob1029 2025-06-1715:561 reply

        What is the likelihood of experiencing precisely marginal network conditions wherein webp improves the user experience so dramatically over jpeg that the user is able to notice?

        If jpeg is loading like ass, webp probably isn't going to arrive much faster.

        • By MrDOS 2025-06-1716:20

          I'm sorry, I misunderstood your doubt of the usefulness of other lossy formats as criticism of using lossy formats in general in the face of higher bandwidth. Reading too fast, never mind me... :)

      • By GuB-42 2025-06-1723:24

        If you have slow internet on your smartphone, chances are that you also have a slow smartphone, and therefore decoding performance matters. It may also save you a bit of battery life for the same reason, which may be important in places with little internet coverage.

        You have to find a balance, and unless (still) pictures are at the center of what you are doing, it is typically only a fraction of the bandwidth (and a fraction of the processing power too).

        We are not talking about 100 Mbps; we downloaded JPEGs over dial-up connections, you know. You don't even need to go into the Mbps unless you are streaming MJPEG (and why would you do that?).

      • By tossaway0 2025-06-184:41

        25 Mbps is extremely fast relative to the benefits that better image compression options than JPEG bring when browsing the web.

    • By Lammy 2025-06-1720:28

      > I struggle to understand the justification for other lossy image formats as our networks continue to get faster.

      Because Google's PageSpeed and Lighthouse both tell people to use WebP, and a large percentage of devs will do anything Google say in the hopes of winning better SERP placement:

      - https://web.dev/articles/serve-images-webp

      - https://developer.chrome.com/docs/lighthouse/performance/use...

    • By illiac786 2025-06-1719:25

      That’s why I am confident LLMs won’t change as much as some may think: after 20+ years of search engines, some still can’t be bothered to do a simple search. (Either that or you’re trolling, I can’t decide, I have to say.) Hence, we can wait another 20 years and some will still not use LLMs for everyday questions.

      To answer your (false?) question, there’s a long list of benefits, but I’d say HDR and storage efficiency are the two big ones I can think of. The storage efficiency in particular is massive, especially with large images.

    • By wizardforhire 2025-06-1715:543 reply

      Exactly! It’s like asking why we still use wheels when hovercrafts exist.

      If humans are still around in a thousand years they’ll be using jpegs, and they’ll still be using them a thousand years after that. When things work they have a pernicious tendency to stick around.

      • By dsr_ 2025-06-1720:39

        Wheels continue to support a load without power.

        Wheels are vastly superior to hover technologies in the crucial areas of steering and controlled braking. (For uncontrolled braking, you just cut the power to your hover fans and lift the skirts...)

        It turns out to be remarkably difficult to get a hovercraft to go up an incline...

        Wheels are both suspension and traction in one system.

        There's no particular physical advantage to JPEG over the others mentioned; it's just currently ubiquitous.

      • By pbhjpbhj 2025-06-1716:043 reply

        Can JPEG do 3D somehow (I'm thinking VR/AR)? DVDs lasted well, until the medium itself moved to cheap NAND flash and then various SSD technologies.

        When/if simple screens get usurped then we'll likely move on from JPEG.

        I'm sure you were being a little flippant but your last sentence shows good insight. Someone said "we just need it to work" to me the other day and the "if it works there will be little impetus to improve it"-flag went off in my brain.

        • By wizardforhire 2025-06-1717:06

          Thanks, that’s a great insight!

          Idk about 3d, but I’ll assume someone probably will tape something together out of necessity if they haven't already.

          …and yes, very flippant! But not without good reason. If we are to extrapolate: the popularity of jpeg, love it or hate it, will invariably necessitate its continued compatibility, which feeds into my previous statement. That compatibility will invariably lead to plausible hypothetical circumstances where future developers, out of laziness, ignorance, or just plain conformity to norms, keep choosing and using it, perpetuating the cycle. So short of a radical, mass-extinction-level event brought about by widespread adoption of some new technology such as what you describe, I don’t see it going away anytime soon. Not to say it couldn’t happen, I just feel it’s highly improbable because of the contributing human factors.

          That jpeg gets so many complaints is, I feel, for two reasons: one, its ubiquity, and two, that we actually see it! Some similar situations that don’t get nearly as much attention but are far more pervasive are tcp/ip, bash, ntpd, ad nauseam. All old, pervasive protocols so embedded as to be taken for granted, and also not able to be seen.

          I’ll leave with this engineering truism that I feel should be more widely adhered to in software development, especially by UI designers: if it ain’t broke don’t fix it!

        • By adgjlsfhk1 2025-06-1718:41

          depends what you mean by 3d. jpeg-xl does let you add arbitrary channels, so you could add a depth channel, but it's not going to do a good job for full 3d (e.g. light field/point cloud).

          one place I think jxl will really shine is PBR and game textures. for cases like that, it's very common to have color + transparency + bump map + normal map, and potentially even more. bundling all of those into a single file allows for way better compression

        • By tehjoker 2025-06-1718:55

          Jp3d can do 3d, but it is not well supported. It is an extension to the JPEG2000 specification iirc.

      • By eviks 2025-06-186:28

        Besides the awful wheel comparison, there are dozens of formats that worked and stuck around until they got replaced, so this also tells us nothing on such a huge timescale

    • By redeeman 2025-06-1719:50

      transparency? hdr? proper support for lossless? there's many things lacking in jpeg

HackerNews